Step-by-Step: Build a SharePoint 2013 Lab in the Cloud on Windows Azure
Leverage Windows Azure as your Lab in the Cloud!

My good friend and colleague, Tommy Patterson, recently blogged about leveraging the Windows Azure Infrastructure as a Service (IaaS) preview offering to build a FREE lab environment in the Cloud.  You can read Tommy’s step-by-step article at:

What about a SharePoint 2013 Lab in the Cloud?
Now that SharePoint Server 2013 has been released, I frequently get asked about ways to easily build a SharePoint 2013 lab environment for studying, testing and/or performing a proof-of-concept.  You could certainly build this lab environment on your own hardware, but given SharePoint 2013’s hardware requirements, many of us may not have sufficient spare hardware to implement an on-premise lab environment.

This makes a great scenario for leveraging our Windows Azure FREE 90-day Trial Offer to build a free lab environment for SharePoint 2013 in the cloud.  Using the process outlined in this article, you’ll be able to build a basic functional farm environment for SharePoint 2013 that will be accessible for approximately 105 hours of compute usage each month at no cost to you under the 90-day Trial Offer.

After the 90-day trial period is up, you can choose whether you’d like to convert to a full paid subscription.  If you choose to convert to a paid subscription, this lab environment will cost approximately $0.56 USD per hour of compute usage ( that’s right – just 56 cents per hour ) plus associated storage and networking costs ( which can typically be less than $10 USD per month for a lab of this nature ).  These estimated costs are based on published Pay-As-You-Go pricing for Windows Azure that is current as of this article’s date.
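
As a rough sanity check, the PowerShell sketch below combines these numbers for the three-VM lab described in this guide ( one Small, one Medium and one Large VM ).  Only the $0.56 per hour total and the 105 monthly compute hours come from this article; the per-size split is an assumption, so check the current pricing pages for exact rates.

    # Rough monthly cost estimate for this 3-VM lab under Pay-As-You-Go pricing.
    # The per-size split below is an assumption (roughly proportional to core count);
    # only the 0.56 USD/hour total is quoted in this article.
    $smallPerHour  = 0.08     # XXXlabad01  (Small,  1 core)  - assumed
    $mediumPerHour = 0.16     # XXXlabdb01  (Medium, 2 cores) - assumed
    $largePerHour  = 0.32     # XXXlabapp01 (Large,  4 cores) - assumed
    $labPerHour    = $smallPerHour + $mediumPerHour + $largePerHour    # ~ 0.56 USD per compute hour
    $hoursPerMonth = 105      # approximate monthly compute usage assumed in this article
    $labPerHour * $hoursPerMonth                                       # ~ 58.80 USD of compute per month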

Note: If you are testing advanced SharePoint 2013 scenarios and need more resources than available in the lab configuration below, you can certainly scale-up or scale-out elastically by provisioning larger VMs or additional SharePoint web and application server VMs.  To determine the specific costs associated with higher resource levels, please visit the Windows Azure Pricing Calculator for Virtual Machines.

SharePoint 2013 Lab Scenario
To deliver a functional and expandable lab environment, I’ll walk through provisioning SharePoint Server 2013 on Windows Azure VMs as depicted in the following configuration diagram.  This configuration requires three (3) VMs on a common Windows Azure Virtual Network.

[Figure] Lab Scenario: SharePoint 2013 on Windows Azure VM

In this lab, we’ll be using a naming convention of XXXlabYYY01, where XXX will be replaced with your unique initials and YYY will be replaced with an abbreviation representing the function of a virtual machine or Windows Azure configuration component (e.g., ad, db or app).

Note: This lab configuration is suitable for study, functional testing and basic proof-of-concept usage.  This configuration is not currently supported for pilot or production SharePoint 2013 farm environments.

Prerequisites

The following is required to complete this step-by-step guide:

  • A Windows Azure subscription with the Virtual Machines Preview enabled.

    DO IT: Sign up for a FREE Trial of Windows Azure

    NOTE: When activating your FREE Trial for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity and your credit card will not be charged, unless you explicitly convert your FREE Trial account to a paid subscription at a later point in time. 
  • Completion of the Getting Started tasks in the following article:

    DO IT: Getting Started with Servers in the Cloud
  • This step-by-step guide assumes that the reader is already familiar with configuring Windows Server Active Directory, SQL Server and SharePoint Server 2013 in an on-premise installation. This guide focuses on the unique aspects associated with configuring these components on the Windows Azure cloud platform.

Let’s Get Started!
In this step-by-step guide, you will learn how to:

  • Register a DNS Server in Windows Azure
  • Define a Virtual Network in Windows Azure
  • Configure Windows Server Active Directory in a Windows Azure VM
  • Configure SQL Server 2012 in a Windows Azure VM
  • Configure SharePoint Server 2013 in a Windows Azure VM
  • Export / Import Lab Environment via PowerShell

Exercise 1: Register a DNS Server in Windows Azure

Register the internal IP address that our domain controller VM will be using for Active Directory-integrated Dynamic DNS services by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Register DNS Server.
  4. Complete the DNS Server fields as follows:

    - NAME: XXXlabdns01
    - DNS Server IP Address: 10.0.0.4
  5. Click the REGISTER DNS SERVER button.

Exercise 2: Define a Virtual Network in Windows Azure

Define a common virtual network in Windows Azure for running Active Directory, Database and SharePoint virtual machines by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Quick Create.
  4. Complete the Virtual Network fields as follows:

    - NAME: XXXlabnet01
    - Address Space: 10.---.---.---
    - Maximum VM Count: 4096 [CIDR: /20]
    - Affinity Group: Select the Affinity Group defined in the Getting Started steps from the Prerequisites section above.
    - Connect to Existing DNS: Select XXXlabdns01 – the DNS Server registered in Exercise 1 above.
  5. Click the CREATE A VIRTUAL NETWORK button.
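
For reference, the DNS server registered in Exercise 1 and the virtual network defined above are both stored in your subscription’s network configuration, which can also be exported, edited and re-applied with the Windows Azure PowerShell module.  The sketch below is illustrative only: the XML fragment and the 10.0.0.0/20 address space are assumptions based on the values used in this guide, so export your own configuration and compare rather than applying this verbatim.

    # Export the subscription's current network configuration (DNS servers plus virtual networks)
    Get-AzureVNetConfig -ExportToFile "C:\ExportVMs\XXXlab-netconfig.xml"

    # The exported file should contain entries similar to the following
    # (element names and the /20 address space shown here are assumptions - verify against your export):
    #
    #   <DnsServers>
    #     <DnsServer name="XXXlabdns01" IPAddress="10.0.0.4" />
    #   </DnsServers>
    #   <VirtualNetworkSites>
    #     <VirtualNetworkSite name="XXXlabnet01" AffinityGroup="YourAffinityGroup">
    #       <AddressSpace><AddressPrefix>10.0.0.0/20</AddressPrefix></AddressSpace>
    #       <Subnets><Subnet name="Subnet-1"><AddressPrefix>10.0.0.0/23</AddressPrefix></Subnet></Subnets>
    #       <DnsServersRef><DnsServerRef name="XXXlabdns01" /></DnsServersRef>
    #     </VirtualNetworkSite>
    #   </VirtualNetworkSites>

    # After editing, re-apply the configuration to the subscription
    Set-AzureVNetConfig -ConfigurationPath "C:\ExportVMs\XXXlab-netconfig.xml"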

Exercise 3: Configure Windows Server Active Directory in a Windows Azure VM

Provision a new Windows Azure VM to run a Windows Server Active Directory domain controller in a new Active Directory forest by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabad01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Small (1 core, 1.75GB Memory)

    Click the Next button to continue.

    Note: It is suggested to use secure passwords for Administrator users and service accounts, as Windows Azure virtual machines could be accessible from the Internet by anyone who knows just their DNS name.  You can also read this document on the Microsoft Security website that will help you select a secure password: http://www.microsoft.com/security/online-privacy/passwords-create.aspx.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabad01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabad01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabad01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.4.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabad01, and go back to Exercise 2 and Exercise 3 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabad01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabad01-data01
    - Size: 10 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabad01.
  11. On the virtual machine details page for XXXlabad01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Logon at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabad01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  This volume will be used for the NTDS database, log and SYSVOL folder locations.  ( A PowerShell sketch covering Steps 12 through 14 appears at the end of this exercise. )

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Configure Local Storage
  13. Using the Server Manager tool, install Active Directory Domain Services and promote this server to a domain controller in a new forest with the following parameters:

    - Active Directory Forest name: contoso.com
    - Volume Location for NTDS database, log and SYSVOL folders: F:

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory
  14. After Active Directory has been installed, create the following user accounts that will be used when installing and configuring SharePoint Server 2013 later in this step-by-step guide:

    - CONTOSO\sp_farm – SharePoint Farm Data Access Account
    - CONTOSO\sp_serviceapps – SharePoint Farm Service Applications Account

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
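
If you prefer to script the in-guest portion of this exercise, the following PowerShell sketch corresponds to Steps 12 through 14 and runs inside XXXlabad01.  It is a minimal outline under the assumptions used in this guide; the disk number and the passwords are illustrative placeholders, and the wizard-driven steps above remain the reference procedure.

    # Run from an elevated PowerShell prompt inside XXXlabad01.

    # Step 12: bring the attached data disk online as an F: NTFS volume
    # (disk number 2 is an assumption - confirm with Get-Disk first)
    Initialize-Disk -Number 2 -PartitionStyle MBR
    New-Partition -DiskNumber 2 -UseMaximumSize -DriveLetter F |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "Data" -Confirm:$false

    # Step 13: install AD DS and promote this server to a domain controller in a new contoso.com forest
    Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
    Install-ADDSForest -DomainName "contoso.com" `
        -DatabasePath "F:\NTDS" -LogPath "F:\NTDS" -SysvolPath "F:\SYSVOL" `
        -SafeModeAdministratorPassword (ConvertTo-SecureString "P@ssw0rd123!" -AsPlainText -Force) -Force

    # Step 14 (after the server restarts): create the SharePoint service accounts
    $spPassword = ConvertTo-SecureString "P@ssw0rd123!" -AsPlainText -Force
    New-ADUser -Name "sp_farm" -SamAccountName "sp_farm" -AccountPassword $spPassword -Enabled $true
    New-ADUser -Name "sp_serviceapps" -SamAccountName "sp_serviceapps" -AccountPassword $spPassword -Enabled $true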

Exercise 4: Configure SQL Server 2012 in a Windows Azure VM

Provision a new Windows Azure VM to run SQL Server 2012 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select SQL Server 2012 Evaluation Edition and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabdb01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Medium (2 cores, 3.5GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabdb01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabdb01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabdb01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.5.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabdb01, and go back to Exercise 2 and Exercise 3 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabdb01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabdb01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabdb01.
  11. On the virtual machine details page for XXXlabdb01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Logon at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabdb01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.
  13. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and update default folder locations to the F: volume.

    1. Connect to the SQL Server 2012 default instance using your Windows Account.
    2. Now, you will update the default locations for the DATA, LOG and BACKUP folders. To do this, right-click on your SQL Server instance and select Properties.
    3. Select Database Settings from the left side pane.
    4. Locate the Database default locations section and update the default values for each path to point to the F: volume you previously formatted.
    5. Close SQL Server Management Studio.
  14. In order to allow SharePoint to connect to the SQL Server, you will need to add an Inbound Rule for the SQL Server requests in the Windows Firewall. To do this, open Windows Firewall with Advanced Security from Start | All Programs | Administrative Tools.

    1. Select the Inbound Rules node, right-click it and select New Rule to open the New Inbound Rule Wizard.

    2. In the Rule Type page, select Port and click Next.

    3. In the Protocol and Ports page, leave TCP selected, select Specific local ports, and set its value to 1433. Click Next to continue.

    4. In the Action page, make sure that Allow the connection is selected and click Next.

    5. In the Profile page, leave the default values and click Next.

    6. In the Name page, set the Inbound Rule's Name to SQLServerRule and click Finish.

    7. Close Windows Firewall with Advanced Security window.

  15. Using the Server Manager tool, join this server to the contoso.com domain and restart the server to complete the domain join operation.
  16. After the server restarts, connect again via Remote Desktop to the server’s console and login with the local Administrator credentials defined above in Step 5.

  17. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and add the CONTOSO\Administrator user to SQL Server with the Sysadmin server role selected.

    1. Expand Security folder within the SQL Server instance. Right-click Logins folder and select New Login.

    2. In the General section, set the Login name to CONTOSO\Administrator, and select the Windows Authentication option.

    3. Click Server Roles on the left pane.  Select the checkbox for the Sysadmin server role.

    4. Click the OK button and close SQL Server Management Studio.

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
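
For reference, Steps 14 through 17 of this exercise can also be performed from an elevated PowerShell prompt inside XXXlabdb01.  This is a minimal sketch under the naming used in this guide; it assumes the default SQL Server instance and that the account you are logged on with already has sysadmin rights on the instance, as in Step 13 above.

    # Step 14: allow inbound SQL Server traffic on TCP port 1433
    New-NetFirewallRule -DisplayName "SQLServerRule" -Direction Inbound `
        -Protocol TCP -LocalPort 1433 -Action Allow

    # Step 15: join the contoso.com domain and restart to complete the domain join
    Add-Computer -DomainName "contoso.com" -Credential "CONTOSO\Administrator" -Restart

    # Step 17 (after the restart, reconnected as the local Administrator):
    # add CONTOSO\Administrator as a SQL Server login with the sysadmin server role
    sqlcmd -S . -E -Q "CREATE LOGIN [CONTOSO\Administrator] FROM WINDOWS; ALTER SERVER ROLE [sysadmin] ADD MEMBER [CONTOSO\Administrator];"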

Exercise 5: Configure SharePoint Server 2013 in a Windows Azure VM

Provision a new Windows Azure VM to run SharePoint Server 2013 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabapp01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Large (4 cores, 7GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabapp01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabapp01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabapp01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.6.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabapp01, and go back to Exercise 2,  Exercise 3 and Exercise 4 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabapp01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabapp01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabapp01.
  11. On the virtual machine details page for XXXlabapp01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Logon at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabapp01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  
  13. In the Server Manager tool, click on Local Server in the left navigation pane and click on the Workgroup option.  Join this server to the contoso.com domain and restart the server to complete the domain join operation.
  14. After the server restarts, re-establish a Remote Desktop connection to the server and logon with the CONTOSO\Administrator domain user credentials defined earlier in Exercise 3.
  15. In the Server Manager tool, click on Local Server in the left navigation pane and select IE Enhanced Security Configuration. Turn off enhanced security for Administrators and click the OK button.

    Note: Disabling Internet Explorer Enhanced Security Configuration is not good practice and is done here only for the purposes of this particular step-by-step guide. A better approach would be to download the files locally and then copy them to a shared folder or directly to the VM.

  16. Press the Windows key to switch to the Start Screen and launch Internet Explorer.  Download the following files to the F:\INSTALL folder:

    - SharePoint Server 2013 Evaluation Edition

    Make a note of the SharePoint Product Key listed on this page, as you’ll need it for the installation of SharePoint Server 2013.

    - ASP.NET 4.5 hotfix for Windows Server 2012 ( KB2765317 )

  17. Navigate to the F:\INSTALL folder and double-click on the downloaded .IMG file to mount it.  Copy all files and folders from the mounted .IMG file to F:\INSTALL.
  18. Install the SharePoint Server 2013 software prerequisites by running F:\INSTALL\prerequisiteinstaller.exe.  Note that this process may require multiple server restarts to complete.  After all required software is successfully installed, continue with the next step in this step-by-step guide.
  19. Install the ASP.NET 4.5 hotfix downloaded to the F:\INSTALL folder in Step 16 above.
  20. Run F:\INSTALL\setup.exe to launch the SharePoint Server 2013 installation process.
  21. When prompted, on the Server Type tab of the setup program, select the Complete installation option.
  22. On the File Location tab of the setup program, change the data path to use the F: volume formatted in Step 12 above.
  23. At the end of the installation process, ensure the checkbox is selected to Run the SharePoint Products Configuration Wizard Now and click the Close button.
  24. In the SharePoint Products Configuration Wizard, when prompted on the Connect to server farm dialog, select the option to Create a new server farm.
  25. On the Specify Configuration Database Settings page, specify the following values for each field:

    - Database Server: XXXlabdb01
    - Username: CONTOSO\sp_farm
    - Password: Type the password specified when the sp_farm domain user account was created earlier in Exercise 3, Step 14.
  26. Click the Next > button and accept all other default values in the SharePoint Products Configuration Wizard.  Click the Finish button when prompted to complete the wizard.
  27. The SharePoint 2013 Central Administration web page should launch automatically.  When prompted, click the Start the Wizard button to begin the Initial Farm Configuration Wizard.
  28. When prompted for Service Account, type the CONTOSO\sp_serviceapps domain username and password specified when this account was created earlier in Exercise 3, Step 14.
  29. Accept all other default values and click the Next > button to continue.
  30. On the Create a Site Collection page, create a new top-level Intranet site collection using the following field values:

    - Title and Description: Enter your preferred Title and Description for the new site collection
    - URL: Select the root URL path – http://XXXlabapp01/
    - Select experience version: 2013
    - Select a template: Publishing | Publishing Portal

    Click the OK button to provision a new top-level Intranet site collection. 

    After the new top-level Intranet site collection is provisioned, test navigating to the URL for this site collection from within the Remote Desktop session to the server.
  31. On the SharePoint 2013 Central Administration site, configure a Public URL alternate access mapping for accessing the new top-level Intranet site collection from the Internet.
    1. On the Central Administration site home page, click the Configure alternate access mappings link.
    2. On the Alternate Access Mappings page, click the Edit Public URLs link.
    3. On the Edit Public Zone URLs page, select and specify the following values:

      - Alternate Access Mapping Collection: SharePoint - 80
      - Internet: http://XXXlabapp01.cloudapp.net

      Click the Save button to complete the Alternate Access Mapping configuration.
  32. Close the Remote Desktop session to the server.
  33. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  34. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  35. On the Virtual Machines page, click on the name of the SharePoint virtual machine – XXXlabapp01.
  36. On the XXXlabapp01 virtual machine details page, click on Endpoints in the top navigation area of the page.
  37. Click the +Add Endpoint button in the bottom navigation bar of the page to define a new virtual machine endpoint that will permit HTTP web traffic inbound to the SharePoint virtual machine. 
  38. On the Add an endpoint to a virtual machine form, select the Add Endpoint option and click the Next button to continue.
  39. On the Specify the details of the endpoint form, specify the following field values:

    - Name: Web HTTP
    - Protocol: TCP
    - Public Port: 80
    - Private Port: 80

    Click the Checkmark button to create a new endpoint definition that will permit inbound web traffic to the SharePoint virtual machine.
  40. After the endpoint configuration has been successfully applied, test browsing to the following public URL to confirm that you are able to access the Intranet site collection that is configured on SharePoint:

    - URL: http://XXXlabapp01.cloudapp.net

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
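
If you prefer to script the endpoint configuration from Steps 36 through 39, the Windows Azure PowerShell module can add the same HTTP endpoint from your management workstation.  A minimal sketch, assuming the cloud service and VM are both named XXXlabapp01 as in this guide:

    # Add an HTTP endpoint (public port 80 mapped to private port 80) to the SharePoint VM
    Get-AzureVM -ServiceName "XXXlabapp01" -Name "XXXlabapp01" |
        Add-AzureEndpoint -Name "Web HTTP" -Protocol tcp -LocalPort 80 -PublicPort 80 |
        Update-AzureVM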

Exercise 6: Export / Import Lab Environment via PowerShell

Our functional SharePoint lab environment is now complete, but if you’re like me, you won’t be using this lab environment around the clock.  As long as the virtual machines are provisioned, they will continue to accumulate compute hours against your Free 90-Day Windows Azure Trial account regardless of virtual machine state – even in a shutdown state!

To preserve as many of your free compute hours as possible for productive lab work, we can leverage the Windows Azure PowerShell module to de-provision our lab virtual machines when not in use and re-provision them when we need them again.  Once you’ve configured the PowerShell scripts below, you’ll be able to spin up your SharePoint lab environment when needed in as little as 5-10 minutes!

Note: Prior to beginning this exercise, please ensure that you’ve downloaded, installed and configured the Windows Azure PowerShell module as outlined in the Getting Started article listed in the Prerequisite section of this step-by-step guide.
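
Before running either snippet, it’s worth confirming that the Windows Azure PowerShell module is connected to the subscription you used for this lab.  A quick check ( the subscription name below is a placeholder for whatever your trial subscription is called ):

    # List the subscriptions the module knows about and confirm the right one is current
    Get-AzureSubscription

    # If needed, switch to the subscription used for this lab
    Select-AzureSubscription -SubscriptionName "Free Trial"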

  1. De-provisioning your lab. Use the PowerShell snippet below to shut down, export and de-provision your SharePoint lab environment when you’re not using it.  Prior to running this script, be sure to edit the first line to reflect the names of each of your VMs and confirm that the $ExportPath location exists.

    # Shut down, export and de-provision each lab VM.
    # Order matters: the SharePoint and SQL Server VMs go first, the domain controller last.
    $myVMs = @("XXXlabapp01","XXXlabdb01","XXXlabad01")
    Foreach ( $myVM in $myVMs ) {
        Stop-AzureVM -ServiceName $myVM -Name $myVM
        # Save the VM configuration so it can be re-imported later
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        Export-AzureVM -ServiceName $myVM -Name $myVM -Path $ExportPath
        # Remove the VM so it stops accumulating compute hours (its VHDs remain in storage)
        Remove-AzureVM -ServiceName $myVM -Name $myVM
    }

  2. Re-provisioning your lab. Use the PowerShell snippet below to import and re-provision your SharePoint lab environment when you’re ready to use it again.  Prior to running this script, be sure to edit the first two lines to reflect the names of your Virtual Network and VMs.

    # Re-import and start each lab VM.
    # Order matters: the domain controller comes back first, then SQL Server, then SharePoint.
    $myVNet = "XXXlabnet01"
    $myVMs = @("XXXlabad01","XXXlabdb01","XXXlabapp01")
    Foreach ( $myVM in $myVMs ) {
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        # Recreate the VM on the lab virtual network from its exported configuration, then start it
        Import-AzureVM -Path $ExportPath | New-AzureVM -ServiceName $myVM -VNetName $myVNet
        Start-AzureVM -ServiceName $myVM -Name $myVM
    }

To ensure safe de-provisioning and re-provisioning of your SharePoint lab environment, it is important to preserve the specific order of the VM names listed in both code snippets above: the domain controller ( XXXlabad01 ) is de-provisioned last and re-provisioned first, so that the dependencies across the VMs are handled properly.

What’s Next? Keep Learning!

Now that your SharePoint Server 2013 lab environment is running in the cloud, be sure to explore the resources below to continue your learning:

Build Your Lab! Download Windows Server 2012
Don’t Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group
About Keith Mayer
Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!

Latest Blog Posts
Regardless if you’ve migrated multiple applications or this is your first migration to a public Infrastructure-as-a-Service (IaaS) you will want to run a small proof-of-concept to make sure that the basic elements of data flow operate as expected and your components will run in the IaaS environment. This week I spent some time experimenting with the three top IaaS offerings: Amazon AWS, Google Compute Cloud and Microsoft Azure. The architecture was relatively simple: 3 docker containers, one hosting a LAMP—Linux, Apache, MySQL & PHP—stack running WordPress, one hosting Postfix mail server forwarding all mail, and one hosting CVS. The results of the testing were informative.
On August 28th, Samsung and LG announced their latest generation of smartwatches in separate press releases. The Samsung report flaunts its Gear S, a smartwatch with a curved screen that measures a full two-inches long. Capabilities of the watch include providing notifications from social networking websites, GPS, improved S Health features, turn-by-turn directions, a touch keyboard, and 3G connectivity. Samsung estimates battery life at two days for normal usage. Details regarding pricing and when the product goes to market were not released.
In today’s software-defined business era, uptime and availability are key to the business survival. The application is the business. However, ensuring proper application performance remains a daunting task for their production environments, where do you start? Enter Business Transactions. By starting to focus on the end-user experience and measuring application performance based on their interactions, we can correctly gauge how the entire environment is performing. We follow each individual user request as it flows through the application architecture, comparing the response time to its optimal performance. This inside-out strategy allows AppDynamics to instantly identify performance bottlenecks and allows application owners to get to the root-cause of issues that much faster.
Last week I was both learning and speaking at the Internet of @ThingsExpo in New York City. I taught a session on IoT, Code Halos and Digital Transformation Strategies. Then I had the privilege of interviewing Microsoft's mobile and IoT guru Nick Landry (Twitter: @ActiveNick). In this interview he shares Microsoft's solutions and strategies around the Internet of Things. Enjoy!
In June I presented at the ThingsExpo conference at the Javits Center in New York City. During my talk I demonstrated how to integrate consumer devices into a business workflow; in particular, I measured my own blood pressure live to show that integration in action. Internet […]
I came across a very interesting story posted on Reviewed.com, in the section covering home oven reviews. The headline read “The Smart Home Dominates the Internet of Things.” The story, based on a recent report published by Appinions, a market research firm, stated that while home automation “is used interchangeably” with the Internet of Things, the IoT is in fact far bigger than that. “It promises to unify a vast array of industrial and commercial devices and activities, not just your phone and dishwasher.” To be sure, home automation is leading the way right now, with Apple, Google, Microsoft and other companies jumping on the bandwagon. What is missing from the piece is the real challenge (and opportunity) of the IoT: monetization.
JavaScript and HTML5 are evolving to support a genuine component-based framework (Web Components), along with the tools needed to deliver something close to a native experience, including genuine real-time networking (UDP using WebRTC). HTML5 is evolving to offer built-in templating support, the ability to watch objects (which will speed up Angular) and Web Components (which offer what Angular directives provide). Native-level support will give a massive performance boost to frameworks such as Polymer and Angular that currently have to fake these features. It will also encourage people who are not familiar with these next-generation frameworks to get in on the action. Coming from a gaming background, I always complain that TCP (WebSockets) is not genuinely real-time, so I look forward to seeing UDP (WebRTC) solutions delivered, such as the desktop sharing in Chrome 34.
So exactly how do you kick-start a DevOps strategy? Say, for example, your organization is tied to a very sequential but cumbersome waterfall approach to software development that is wasting precious dollars and hindering productivity. In the following we’ve outlined some strategy tips that every business leader will need to consider as they start down the path of DevOps adoption. Whatever steps your organization takes on the DevOps path, rolling out software faster and deploying more effectively will require the support of your senior management team. Explain the advantages of DevOps to the executive team in terms they can easily understand. Provide an outline of how DevOps and cloud computing can improve ROI and get your new mobile application into the hands of customers faster, more effectively and with higher quality.
The IoT has the potential to transform the world, bringing new functions and efficiency to big problems such as disease, poverty, traffic, and government transparency. It also makes for a nice Christmas gift for kids’ science fair projects. This latter point was made by Broadcom and Brian Bedrosian, the company's senior director for wireless connectivity. Specifically, the company has introduced its Wiced Sense Kit at a price of US$20. The kit (suitable for adult use as well, of course) allows IoT developers and the millions of “makers” in the thriving Maker Movement to build sensors that communicate with iPhones, home communication hubs, and other devices. Suggested uses include baby monitors, door locks, and personal healthcare monitors. Broadcom's kit presumably competes with a US$25 kit from Texas Instruments, which also has Android and Bluetooth functionality.
The Internet of Things (IoT) is getting personal. Wearables, ingestables, even implantables – devices that not only help us with our fitness, but can monitor and manage disease and its treatment – are right around the corner. And where the IoT goes, money follows. In this case, Big Pharma smells opportunity. But while these new consumer doodads get all the press, health-related IoT technologies are only the tip of the iceberg for the Digital Transformation hitting the pharmaceutical industry, as it struggles with a turbulent business landscape. “There are very few blockbuster drugs,” explains Eric Pilkington, EVP Digital Strategy at McCann Health North America. “The pharma companies are trying to figure out what to do. The competitors are the generics: identical and cheaper. The pharmas have to provide value beyond the drug itself.”
Such devices and connected technologies exist today, and the sector is driving the leading edge of the Internet of Things under the label of the “Connected Home” or “Smart Home”. The fact is, some pretty traditional low-tech tasks are becoming technology-enabled, such as locking doors, turning on the lights, and heating the home. What this means is that technology companies are broadening into areas that have historically been outside their business plans. Apple is in the fitness business. Google is in the heating business. Samsung is in the appliance business. Others include cable TV and Internet provider Comcast, and Schlage-brand lock maker Ingersoll Rand.
When we talk about the impact of BYOD, BYOA and the Internet of Things, we often focus on the impact on data center architectures. That's because there will be an increasing need for authentication, access control, security and application delivery as the number of potential endpoints (clients, devices, things) increases. That means scale in the data center. What we gloss over, what we skip, is that before any of these "things" ever makes a request to access an application, it has to execute a DNS query. Every. Single. Thing.
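You can observe that hidden DNS step directly. The short Python sketch below times the name resolution separately from the TCP connect for an arbitrary hostname (example.com is just a placeholder); multiply that lookup by every device and every request, and the scale demand on DNS infrastructure becomes obvious.

# Time the DNS lookup that precedes every connection, separately from the
# TCP connect itself. The hostname is a placeholder.
import socket
import time

host = "example.com"

start = time.perf_counter()
addrinfo = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
dns_ms = (time.perf_counter() - start) * 1000

family, socktype, proto, _canonname, sockaddr = addrinfo[0]
start = time.perf_counter()
with socket.socket(family, socktype, proto) as sock:
    sock.connect(sockaddr)
connect_ms = (time.perf_counter() - start) * 1000

print(f"DNS lookup: {dns_ms:.1f} ms, TCP connect: {connect_ms:.1f} ms")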
Elasticity is hailed as one of the biggest benefits of cloud and software-defined architectures. It's more efficient than traditional scalability models that only went one direction: up. It's based on the premise that wasting money and resources all the time just to ensure capacity on a seasonal or periodic basis is not only unappealing, but unnecessary in the age of software-defined everything. The problem is that scaling down is much, much harder than scaling up. Oh, not from the perspective of automation and orchestration. That is, as the kids say these days, easy peasy lemon squeezy. APIs have made the ability to add and remove resources simplicity itself. There isn't a load balancing service available today without this capability - at least not one that's worth having.
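A rough sketch of that asymmetry, in Python: scale-up can be triggered aggressively, while scale-down needs hysteresis, a cooldown and connection draining before an instance is removed. The thresholds, cooldown and drain step below are illustrative assumptions, not any particular cloud provider's policy.

# Sketch of the scaling asymmetry: add capacity quickly, remove it cautiously.
# Thresholds, cooldown and the drain step are illustrative assumptions.
import time

SCALE_UP_AT = 0.80      # add capacity as soon as utilization is high
SCALE_DOWN_AT = 0.30    # remove capacity only when utilization is clearly low
COOLDOWN_SECONDS = 600  # wait between scale-down actions to avoid flapping

last_scale_down = 0.0

def decide(utilization, instances):
    global last_scale_down
    if utilization > SCALE_UP_AT:
        return instances + 1                      # the easy direction
    if utilization < SCALE_DOWN_AT and instances > 1:
        if time.time() - last_scale_down > COOLDOWN_SECONDS:
            # In a real system: mark the instance down in the load balancer,
            # drain its existing connections, then terminate it via the cloud API.
            last_scale_down = time.time()
            return instances - 1
    return instances

print(decide(0.90, 2))  # -> 3, scale up immediately
print(decide(0.20, 3))  # -> 2, but only outside the cooldown window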
As enterprises work to rapidly embrace the mobile revolution, both for their workforce and to engage more deeply with their customers, the pressure is on for IT to support the tools needed by their application developers. Mobile application developers are working with a massive variety of technologies and platforms, but one trend that stands out is the rapid adoption of NoSQL database engines and the use of Database-as-a-Service (DBaaS) platforms and services to run them. Gartner has predicted that by 2017, 20% of enterprises will have their own internal mobile app store, meaning that enterprises are deploying both commercial and custom applications to their workforce at increasing speeds. There’s no denying the massive growth in mobile applications within the enterprise.
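Part of the appeal is how little database plumbing a DBaaS leaves for the mobile backend to own. As an illustration only, the Python sketch below uses pymongo against a hosted MongoDB endpoint; the connection string, database and collection names are placeholders for whatever your DBaaS provider issues.

# Minimal sketch of a mobile backend talking to a NoSQL DBaaS.
# Requires: pip install pymongo. The URI is a placeholder for the connection
# string your DBaaS provider gives you.
from pymongo import MongoClient

client = MongoClient("mongodb://user:password@dbaas.example.com:27017/appdb")
db = client["appdb"]

# Store a document exactly as the mobile app produced it - no schema migration.
db.sessions.insert_one({"user": "alice", "device": "ios", "events": ["login", "view_cart"]})

# Read it back for the app's next sync.
print(db.sessions.find_one({"user": "alice"}))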
The Internet of Things is only going to make that even more challenging as businesses turn to new business models and services fueled by a converging digital-physical world. Applications, whether focused on licensing, provisioning, managing or storing data for these "things," will increase the already significant burden on IT as a whole. The inability to scale from an operational perspective is really what software-defined architectures are attempting to solve: operationalizing the network shifts the burden of provisioning and management from people to technology.
Past SYS-CON Events
    Cloud Expo West: cloudcomputingexpo2011west.sys-con.com
    Cloud Expo East: cloudcexpo2011east.sys-con.com
    Cloud Expo West: cloudcomputingexpo2010west.sys-con.com
    Virtualization Expo West: virtualization2010west.sys-con.com
    Cloud Expo Europe: cloudexpoeurope2010.sys-con.com
    Cloud Expo East: cloudcomputingexpo2010east.sys-con.com
    Virtualization Expo East: virtualizationconference2010east.sys-con.com
    Cloud Expo West: cloudcomputingexpo2009west.sys-con.com
    Virtualization Expo West: virtualizationconference2009west.sys-con.com
    GovIT Expo: govitexpo.com
    Cloud Expo Europe: cloudexpoeurope2009.sys-con.com

Cloud Expo 2011 Allstar Conference Faculty

S.F.S. (Dell), Singer (NRO), Pereyra (Oracle), Ryan (OpSource), Butte (PwC), Leone (Oracle), Riley (AWS), Varia (AWS), Lye (Oracle), O'Connor (AppZero), Crandell (RightScale), Nucci (Dell Boomi), Hillier (CiRBA), Morrison (Layer 7 Tech), Robbins (NYT), Schwarz (Oracle)
What The Enterprise IT World Says About Cloud Expo
 
"We had extremely positive feedback from both customers and prospects that attended the show and saw live demos of NaviSite's enterprise cloud based services."
  –William Toll
Sr. Director, Marketing & Strategic Alliances
NaviSite
 


 
"More and better leads than ever expected! I have 4-6 follow ups personally."
  –Richard Wellner
Chief Scientist
Univa UD
 


 
"Good crowd, good questions. The event looked very successful."
  –Simon Crosby
CTO
Citrix Systems
 


 
"It's the largest cloud computing conference I've ever seen."
  –David Linthicum
CTO
Bick Group