Step-by-Step: Build a SharePoint 2013 Lab in the Cloud on Windows Azure
Leverage Windows Azure as your Lab in the Cloud!

My good friend and colleague, Tommy Patterson, recently blogged about leveraging the Windows Azure Infrastructure as a Service (IaaS) preview offering to build a FREE lab environment in the Cloud; you can find the details in his step-by-step article.

What about a SharePoint 2013 Lab in the Cloud?
Now that SharePoint Server 2013 has been released, I'm frequently asked how to easily build a SharePoint 2013 lab environment for studying, testing, or running a proof-of-concept.  You could certainly build this lab on your own hardware, but given SharePoint 2013's hardware requirements, many of us don't have enough spare hardware to implement an on-premises lab environment.

This makes a great scenario for leveraging our Windows Azure FREE 90-day Trial Offer to build a free SharePoint 2013 lab environment in the cloud.  Using the process outlined in this article, you'll be able to build a basic, functional SharePoint 2013 farm that you can run for approximately 105 hours of compute usage each month at no cost to you under the 90-day Trial Offer.

After the 90-day trial period is up, you can choose whether to convert to a full paid subscription.  If you do convert to a paid subscription, this lab environment will cost approximately $0.56 USD per hour of compute usage ( that’s right – just 56 cents per hour ) plus associated storage and networking costs ( which are typically less than $10 USD per month for a lab of this nature ). These estimated costs are based on published Pay-As-You-Go pricing for Windows Azure that is current as of this article’s date.
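
For a rough sense of where these numbers come from (a back-of-the-envelope estimate on my part, not an official rate card): the three lab VMs use the Small, Medium and Large instance sizes, which bill at 1, 2 and 4 small-instance compute hours respectively, so each hour the whole farm runs consumes about 1 + 2 + 4 = 7 small-instance hours.  Assuming the trial’s monthly allotment of roughly 750 small-instance compute hours, that works out to about 750 / 7 ≈ 107 hours of whole-farm runtime per month – in line with the ~105 hours quoted above – and the $0.56 per hour figure is simply the sum of the Pay-As-You-Go rates for one Small, one Medium and one Large Windows VM.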

Note: If you are testing advanced SharePoint 2013 scenarios and need more resources than the lab configuration below provides, you can certainly scale up or scale out elastically by provisioning larger VMs or additional SharePoint web and application server VMs.  To determine the specific costs associated with higher resource levels, please visit the Windows Azure Pricing Calculator for Virtual Machines.

SharePoint 2013 Lab Scenario
To deliver a functional and expandable lab environment, I’ll walk through provisioning SharePoint Server 2013 on Windows Azure VMs as depicted in the following configuration diagram, which calls for three (3) VMs on a common Windows Azure Virtual Network.


Lab Scenario: SharePoint 2013 on Windows Azure VM

In this lab, we’ll use a naming convention of XXXlabYYY01, where XXX is replaced with your unique initials and YYY is replaced with an abbreviation representing the function of a virtual machine or Windows Azure configuration component (i.e., ad, db, or app).

Note: This lab configuration is suitable for study, functional testing and basic proof-of-concept usage.  It is not currently supported for pilot or production SharePoint 2013 farm environments.

Prerequisites

The following is required to complete this step-by-step guide:

  • A Windows Azure subscription with the Virtual Machines Preview enabled.

    DO IT: Sign up for a FREE Trial of Windows Azure

    NOTE: When activating your FREE Trial for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity and your credit card will not be charged, unless you explicitly convert your FREE Trial account to a paid subscription at a later point in time. 
  • Completion of the Getting Started tasks in the following article:

    DO IT: Getting Started with Servers in the Cloud
  • This step-by-step guide assumes that the reader is already familiar with configuring Windows Server Active Directory, SQL Server and SharePoint Server 2013 in an on-premises installation. This guide focuses on the unique aspects of configuring these components on the Windows Azure cloud platform.

Let’s Get Started!
In this step-by-step guide, you will learn how to:

  • Register a DNS Server in Windows Azure
  • Define a Virtual Network in Windows Azure
  • Configure Windows Server Active Directory in a Windows Azure VM
  • Configure SQL Server 2012 in a Windows Azure VM
  • Configure SharePoint Server 2013 in a Windows Azure VM
  • Export / Import Lab Environment via PowerShell

Exercise 1: Register a DNS Server in Windows Azure

Register the internal IP address that our domain controller VM will be using for Active Directory-integrated Dynamic DNS services by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Register DNS Server.
  4. Complete the DNS Server fields as follows:

    - NAME: XXXlabdns01
    - DNS Server IP Address: 10.0.0.4
  5. Click the REGISTER DNS SERVER button.

Exercise 2: Define a Virtual Network in Windows Azure

Define a common virtual network in Windows Azure for running Active Directory, Database and SharePoint virtual machines by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Quick Create.
  4. Complete the Virtual Network fields as follows:

    - NAME: XXXlabnet01
    - Address Space: 10.---.---.---
    - Maximum VM Count: 4096 [CIDR: /20]
    - Affinity Group: Select the Affinity Group defined in the Getting Started steps from the Prerequisites section above.
    - Connect to Existing DNS: Select XXXlabdns01 – the DNS Server registered in Exercise 1 above.
  5. Click the CREATE A VIRTUAL NETWORK button.
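
Optional: if you’ve already set up the Windows Azure PowerShell module (see the note at the start of Exercise 6), a quick way to sanity-check the new virtual network is to query it from the command line.  This is a minimal sketch assuming the module is connected to your subscription; the property names reflect how the Service Management cmdlets expose the network configuration and may vary slightly across module versions.

    # Retrieve the lab virtual network defined in this exercise
    $vnet = Get-AzureVNetSite -VNetName "XXXlabnet01"

    # Confirm the values chosen above: address space, Subnet-1 (10.0.0.0/23) and the XXXlabdns01 DNS server
    $vnet.AddressSpacePrefixes
    $vnet.Subnets
    $vnet.DnsServers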

Exercise 3: Configure Windows Server Active Directory in a Windows Azure VM

Provision a new Windows Azure VM to run a Windows Server Active Directory domain controller in a new Active Directory forest by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabad01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Small (1 core, 1.75GB Memory)

    Click the Next button to continue.

    Note: It is suggested to use strong passwords for Administrator users and service accounts, as Windows Azure virtual machines could be accessible from the Internet by anyone who knows just their DNS name.  You can also read this document on the Microsoft Security website that will help you select a secure password: http://www.microsoft.com/security/online-privacy/passwords-create.aspx.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabad01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabad01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabad01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.4.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabad01, and go back to Exercise 2 and Exercise 3 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabad01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabad01-data01
    - Size: 10 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabad01.
  11. On the virtual machine details page for XXXlabad01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabad01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  This volume will be used for the Active Directory database (NTDS.DIT), log, and SYSVOL folder locations. (If you prefer to script this, a PowerShell sketch covering Steps 12 through 14 appears after this exercise’s steps.)

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Configure Local Storage
  13. Using the Server Manager tool, install Active Directory Domain Services and promote this server to a domain controller in a new forest with the following parameters:

    - Active Directory Forest name: contoso.com
    - Volume Location for NTDS database, log and SYSVOL folders: F:

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory
  14. After Active Directory has been installed, create the following user accounts that will be used when installing and configuring SharePoint Server 2013 later in this step-by-step guide:

    - CONTOSO\sp_farm – SharePoint Farm Data Access Account
    - CONTOSO\sp_serviceapps – SharePoint Farm Service Applications Account

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory
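
If you’d rather script Steps 12 through 14, the following is a minimal PowerShell sketch of the same work, run from an elevated prompt inside XXXlabad01.  Treat it as a starting point rather than a tested script: the disk selection, folder paths and account options are assumptions on my part, and Install-ADDSForest will prompt you for a Safe Mode administrator password and restart the server.

    # Step 12: bring the attached (RAW) data disk online and format it as an F: NTFS volume
    Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
        Initialize-Disk -PartitionStyle MBR -PassThru |
        New-Partition -DriveLetter F -UseMaximumSize |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "Data" -Confirm:$false

    # Step 13: install AD DS and promote this server to the first domain controller in a new forest,
    # placing the NTDS database, logs and SYSVOL on the F: volume
    Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
    Install-ADDSForest -DomainName "contoso.com" -DatabasePath "F:\NTDS" -LogPath "F:\NTDS" -SysvolPath "F:\SYSVOL"

    # Step 14: after the post-promotion restart, create the SharePoint service accounts
    $svcPassword = Read-Host "Password for the SharePoint service accounts" -AsSecureString
    New-ADUser -Name "sp_farm" -SamAccountName "sp_farm" -AccountPassword $svcPassword -Enabled $true
    New-ADUser -Name "sp_serviceapps" -SamAccountName "sp_serviceapps" -AccountPassword $svcPassword -Enabled $true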

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.

Exercise 4: Configure SQL Server 2012 in a Windows Azure VM

Provision a new Windows Azure VM to run SQL Server 2012 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select SQL Server 2012 Evaluation Edition and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabdb01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Medium (2 cores, 3.5GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabdb01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabdb01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabdb01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.5.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabdb01, and go back to Exercise 2 and Exercise 3 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabdb01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabdb01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabdb01.
  11. On the virtual machine details page for XXXlabdb01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabdb01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.
  13. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and update default folder locations to the F: volume.

    1. Connect to the SQL Server 2012 default instance using your Windows Account.
    2. Now update the default locations for the DATA, LOG and BACKUP folders. To do this, right-click your SQL Server instance and select Properties.
    3. Select Database Settings from the left side pane.
    4. Locate the Database default locations section and update the default values for each path to point to the F: volume you previously formatted.
    5. Close SQL Server Management Studio.
  14. In order to allow SharePoint to connect to the SQL Server, you will need to add an Inbound Rule for the SQL Server requests in the Windows Firewall. To do this, open Windows Firewall with Advanced Security from Start | All Programs | Administrative Tools.

    1. Select the Inbound Rules node, right-click it, and select New Rule to open the New Inbound Rule Wizard.

    2. In the Rule Type page, select Port and click Next.

    3. In the Protocols and Ports page, leave TCP selected, select Specific local ports, and set its value to 1433. Click Next to continue.

    4. In the Action page, make sure that Allow the connection is selected and click Next.

    5. In the Profile page, leave the default values and click Next.

    6. In the Name page, set the Inbound Rule's Name to SQLServerRule and click Finish.

    7. Close Windows Firewall with Advanced Security window.

  15. Using the Server Manager tool, join this server to the contoso.com domain and restart the server to complete the domain join operation. (Steps 14 through 17 can also be scripted; see the PowerShell sketch after this exercise’s steps.)
  16. After the server restarts, connect again via Remote Desktop to the server’s console and log on with the local Administrator credentials defined above in Step 5.

  17. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and add the CONTOSO\Administrator user to SQL Server with the Sysadmin server role selected.

    1. Expand the Security folder within the SQL Server instance, right-click the Logins folder, and select New Login.

    2. In the General section, set the Login name to CONTOSO\Administrator, and select the Windows Authentication option.

    3. Click Server Roles on the left pane.  Select the checkbox for the Sysadmin server role.

    4. Click the OK button and close SQL Server Management Studio.
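
If you prefer to script Steps 14 through 17, the sketch below is a hedged example run from an elevated PowerShell prompt on XXXlabdb01.  The Invoke-Sqlcmd cmdlet comes from the sqlps module installed with SQL Server 2012, and the rule name and credential prompt are illustrative choices rather than requirements of this guide.

    # Step 14: allow inbound SQL Server traffic on TCP port 1433
    New-NetFirewallRule -DisplayName "SQLServerRule" -Direction Inbound -Protocol TCP -LocalPort 1433 -Action Allow

    # Step 15: join the contoso.com domain (you'll be prompted for domain credentials) and restart
    Add-Computer -DomainName "contoso.com" -Credential (Get-Credential) -Restart

    # Step 17: after the restart, while logged on as the local Administrator,
    # add CONTOSO\Administrator as a SQL Server login with the sysadmin role
    Import-Module sqlps -DisableNameChecking
    Invoke-Sqlcmd -ServerInstance "localhost" -Query "CREATE LOGIN [CONTOSO\Administrator] FROM WINDOWS; ALTER SERVER ROLE [sysadmin] ADD MEMBER [CONTOSO\Administrator];"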

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.

Exercise 5: Configure SharePoint Server 2013 in a Windows Azure VM

Provision a new Windows Azure VM to run SharePoint Server 2013 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabapp01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Large (4 cores, 7GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabapp01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabapp01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabapp01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.6.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabapp01, and go back to Exercise 2,  Exercise 3 and Exercise 4 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabapp01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabapp01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabapp01.
  11. On the virtual machine details page for XXXlabapp01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabapp01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  
  13. In the Server Manager tool, click on Local Server in the left navigation pane and click on the Workgroup option.  Join this server to the contoso.com domain and restart the server to complete the domain join operation.
  14. After the server restarts, re-establish a Remote Desktop connection to the server and logon with the CONTOSO\Administrator domain user credentials defined earlier in Exercise 3.
  15. In the Server Manager tool, click on Local Server in the left navigation pane and select IE Enhanced Security Configuration. Turn off enhanced security for Administrators and click the OK button.

    Note: Disabling Internet Explorer Enhanced Security Configuration is not a good practice and is done only for the purposes of this particular step-by-step guide. A better approach is to download files locally and then copy them to a shared folder or directly to the VM.

  16. Press the Windows key to switch to the Start Screen and launch Internet Explorer.  Download the following files to the F:\INSTALL folder:

    - SharePoint Server 2013 Evaluation Edition

    Make a note of the SharePoint Product Key listed on this page, as you’ll need it for the installation of SharePoint Server 2013.

    - ASP.NET 4.5 hotfix for Windows Server 2012 ( KB2765317 )

  17. Navigate to the F:\INSTALL folder and double-click on the downloaded .IMG file to mount it.  Copy all files and folders from the mounted .IMG file to F:\INSTALL.
  18. Install the SharePoint Server 2013 software prerequisites by running F:\INSTALL\prerequisiteinstaller.exe.  Note that this process may require multiple server restarts to complete.  After all required software is successfully installed, continue with the next step in this step-by-step guide.
  19. Install the ASP.NET 4.5 hotfix downloaded to the F:\INSTALL folder in Step 16 above.
  20. Run F:\INSTALL\setup.exe to launch the SharePoint Server 2013 installation process.
  21. When prompted, on the Server Type tab of the setup program, select the Complete installation option.
  22. On the File Location tab of the setup program, change the data path to use the F: volume formatted in Step 12 above.
  23. At the end of the installation process, ensure the checkbox is selected to Run the SharePoint Products Configuration Wizard Now and click the Close button.
  24. In the SharePoint Products Configuration Wizard, when prompted on the Connect to server farm dialog, select the option to Create a new server farm.
  25. On the Specify Configuration Database Settings page, specify the following values for each field:

    - Database Server: XXXlabdb01
    - Username: CONTOSO\sp_farm
    - Password: Type the password specified when the sp_farm domain user account was created earlier in Exercise 3, Step 14.
  26. Click the Next > button and accept all other default values in the SharePoint Products Configuration Wizard.  Click the Finish button when prompted to complete the wizard.
  27. The SharePoint 2013 Central Administration web page should launch automatically.  When prompted, click the Start the Wizard button to begin the Initial Farm Configuration Wizard.
  28. When prompted for Service Account, type the CONTOSO\sp_serviceapps domain username and password specified when this account was created earlier in Exercise 3, Step 14.
  29. Accept all other default values and click the Next > button to continue.
  30. On the Create a Site Collection page, create a new top-level Intranet site collection using the following field values:

    - Title and Description: Enter your preferred Title and Description for the new site collection
    - URL: Select the root URL path – http://XXXlabapp01/
    - Select experience version: 2013
    - Select a template: Publishing | Publishing Portal

    Click the OK button to provision a new top-level Intranet site collection. 

    After the new top-level Intranet site collection is provisioned, test navigating to the URL for this site collection from within the Remote Desktop session to the server.
  31. On the SharePoint 2013 Central Administration site, configure a Public URL alternate access mapping for accessing the new top-level Intranet site collection from the Internet. (This step and the endpoint configuration in Steps 36 through 39 can also be scripted; see the sketch after this exercise’s steps.)
    1. On the Central Administration site home page, click the Configure alternate access mappings link.
    2. On the Alternate Access Mappings page, click the Edit Public URLs link.
    3. On the Edit Public Zone URLs page, select and specify the following values:

      - Alternate Access Mapping Collection: SharePoint - 80
      - Internet: http://XXXlabapp01.cloudapp.net

      Click the Save button to complete the Alternate Access Mapping configuration.
  32. Close the Remote Desktop session to the server.
  33. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  34. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  35. On the Virtual Machines page, click on the name of the SharePoint virtual machine – XXXlabapp01.
  36. On the XXXlabapp01 virtual machine details page, click on Endpoints in the top navigation area of the page.
  37. Click the +Add Endpoint button in the bottom navigation bar of the page to define a new virtual machine endpoint that will permit HTTP web traffic inbound to the SharePoint virtual machine. 
  38. On the Add an endpoint to a virtual machine form, select the Add Endpoint option and click the Next button to continue.
  39. On the Specify the details of the endpoint form, specify the following field values:

    - Name: Web HTTP
    - Protocol: TCP
    - Public Port: 80
    - Private Port: 80

    Click the Checkmark button to create a new endpoint definition that will permit inbound web traffic to the SharePoint virtual machine.
  40. After the endpoint configuration has been successfully applied, test browsing to the following public URL to confirm that you are able to access the Intranet site collection that is configured on SharePoint:

    - URL: http://XXXlabapp01.cloudapp.net
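
As a reference, the alternate access mapping from Step 31 and the endpoint from Steps 36 through 39 can also be configured from the command line.  This is a rough sketch, not a drop-in replacement for the portal steps above: the first part runs in the SharePoint 2013 Management Shell on XXXlabapp01 (and adds an Internet-zone public URL rather than editing the existing one), while the second part runs from the Windows Azure PowerShell module on your own workstation.

    # Step 31 equivalent (SharePoint 2013 Management Shell on XXXlabapp01):
    # publish the site at the external cloudapp.net URL via the Internet zone
    New-SPAlternateURL -WebApplication "http://XXXlabapp01" -Url "http://XXXlabapp01.cloudapp.net" -Zone Internet

    # Steps 36-39 equivalent (Windows Azure PowerShell module on your workstation):
    # open TCP port 80 to the SharePoint VM by adding an input endpoint
    Get-AzureVM -ServiceName "XXXlabapp01" -Name "XXXlabapp01" |
        Add-AzureEndpoint -Name "WebHTTP" -Protocol tcp -PublicPort 80 -LocalPort 80 |
        Update-AzureVM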

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.

Exercise 6: Export / Import Lab Environment via PowerShell

Our functional SharePoint lab environment is now complete, but if you’re like me, you won’t be using it around the clock.  As long as the virtual machines are provisioned, they will continue to accumulate compute hours against your Free 90-Day Windows Azure Trial account regardless of virtual machine state – even when they’re shut down!

To preserve as many of your free compute hours as possible for productive lab work, we can leverage the Windows Azure PowerShell module to de-provision the lab virtual machines when they’re not in use and re-provision them when we need them again.  Once you’ve configured the PowerShell scripts below, you’ll be able to spin up your SharePoint lab environment in as little as 5-10 minutes!

Note: Prior to beginning this exercise, please ensure that you’ve downloaded, installed and configured the Windows Azure PowerShell module as outlined in the Getting Started article listed in the Prerequisites section of this step-by-step guide.
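
If you haven’t yet connected the module to your subscription, the one-time setup typically looks like the sketch below.  The subscription and storage account names are placeholders for your own values, and parameter names have varied slightly across module releases, so adjust as needed.

    # Download a .publishsettings file for your subscription (opens a browser window)
    Get-AzurePublishSettingsFile

    # Import the downloaded file and select the subscription to use (names are placeholders)
    Import-AzurePublishSettingsFile "C:\Downloads\MySubscription.publishsettings"
    Select-AzureSubscription -SubscriptionName "Free Trial"

    # Point the module at the storage account created in the Getting Started prerequisites
    Set-AzureSubscription -SubscriptionName "Free Trial" -CurrentStorageAccount "XXXlabstorage01"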

  1. De-provisioning your lab. Use the PowerShell snippet below to shut down, export and de-provision your SharePoint lab environment when you’re not using it.  Prior to running this script, be sure to edit the first line to reflect the names of each of your VMs and confirm that the C:\ExportVMs folder used by $ExportPath exists.

    # Shut-down order: SharePoint first, then SQL Server, then the domain controller
    $myVMs = @("XXXlabapp01","XXXlabdb01","XXXlabad01")
    Foreach ( $myVM in $myVMs ) {
        # Stop the VM, export its configuration to XML, then remove it to stop accruing compute hours
        Stop-AzureVM -ServiceName $myVM -Name $myVM
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        Export-AzureVM -ServiceName $myVM -Name $myVM -Path $ExportPath
        Remove-AzureVM -ServiceName $myVM -Name $myVM
    }

  2. Re-provisioning your lab. Use the PowerShell snippet below to import and re-provision your SharePoint lab environment when you’re ready to use it again.  Prior to running this script, be sure to edit the first two lines to reflect the names of your Virtual Network and VMs.

    # Start-up order: domain controller first, then SQL Server, then SharePoint
    $myVNet = "XXXlabnet01"
    $myVMs = @("XXXlabad01","XXXlabdb01","XXXlabapp01")
    Foreach ( $myVM in $myVMs ) {
        # Recreate each VM from its exported configuration on the lab virtual network, then start it
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        Import-AzureVM -Path $ExportPath | New-AzureVM -ServiceName $myVM -VNetName $myVNet
        Start-AzureVM -ServiceName $myVM -Name $myVM
    }

To ensure safe de-provisioning and re-provisioning of your SharePoint lab environment, preserve the order of the VM names listed in both code snippets above so that dependencies across the VMs are handled properly – the domain controller shuts down last and starts up first.

What’s Next? Keep Learning!

Now that your SharePoint Server 2013 lab environment is running in the cloud, be sure to explore the resources below to continue your learning:

Build Your Lab! Download Windows Server 2012
Don’t Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group
About Keith Mayer
Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!

Untitled Document
Cloud Expo - Cloud Looms Large on SYS-CON.TV


Cloud Expo 2013 East Opening Keynote by IBM
In this Cloud Expo Keynote, Danny Sabbah, CTO & General Manager, Next Generation Platform, will detail the critical architectural considerations and success factors organizations must internalize to successfully implement, optimize and innovate using next generation architectures.
Lisa Larson, Vice President of Enterprise Cloud Solutions of Rackspace Hosting Live From New York City
In the old world of IT, if you didn't have hardware capacity or the budget to buy more, your project was dead in the water. Budget constraints can leave some of the best, most creative and most ingenious innovations on the cutting room floor. It's a true dilemma for developers and innovators – why spend the time creating, when a project could be abandoned in a blink? That was the old world. In the new world of IT, developers rule. They have access to resources they can spin up instantly. A hybrid cloud ignites innovation and empowers developers to focus on what they need. A hybrid cloud blends the best of all worlds, public cloud, private cloud and dedicated servers to fit the needs of developers and offer the ideal environment for each app and workload without the constraints of a one-size-fits-all cloud.

Keynote: Driving Cloud Innovation: SSDs Change Cloud Storage Paradigm
Cloud is a transformational shift in computing that can have a powerful effect on enterprise IT when designed correctly and used to its full potential. Join Citrix in a discussion that centers on building, connecting and empowering users with cloud services and hear examples of how enterprises are solving real-world business challenges with an architecture and solution purpose-built for the cloud.

Go Beyond IaaS to Deliver "Anything As a Service"
Many organizations want to expand upon the IaaS foundation to deliver cloud services in all forms—software, mobility, infrastructure and IT. Understanding the strategy, planning process and tools for this transformation will help catalyze changes in the way the business operates and deliver real value. Join us to learn about the new ITaaS model and how to begin the transformation.


@CloudExpo Stories
A critical component of any IoT project is what to do with all the data being generated. This data needs to be captured, processed, structured, and stored in a way to facilitate different kinds of queries. Traditional data warehouse and analytical systems are mature technologies that can be used to handle certain kinds of queries, but they are not always well suited to many problems, particularly when there is a need for real-time insights.
CenturyLink has announced that application server solutions from GENBAND are now available as part of CenturyLink’s Networx contracts. The General Services Administration (GSA)’s Networx program includes the largest telecommunications contract vehicles ever awarded by the federal government. CenturyLink recently secured an extension through spring 2020 of its offerings available to federal government agencies via GSA’s Networx Universal and Enterprise contracts. GENBAND’s EXPERiUS™ Application...
"My role is working with customers, helping them go through this digital transformation. I spend a lot of time talking to banks, big industries, manufacturers working through how they are integrating and transforming their IT platforms and moving them forward," explained William Morrish, General Manager Product Sales at Interoute, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
Internet of @ThingsExpo, taking place November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with the 19th International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world and ThingsExpo Silicon Valley Call for Papers is now open.
Big Data engines are powering a lot of service businesses right now. Data is collected from users from wearable technologies, web behaviors, purchase behavior as well as several arbitrary data points we’d never think of. The demand for faster and bigger engines to crunch and serve up the data to services is growing exponentially. You see a LOT of correlation between “Cloud” and “Big Data” but on Big Data and “Hybrid,” where hybrid hosting is the sanest approach to the Big Data Infrastructure pro...
In his session at 18th Cloud Expo, Sagi Brody, Chief Technology Officer at Webair Internet Development Inc., and Logan Best, Infrastructure & Network Engineer at Webair, focused on real world deployments of DDoS mitigation strategies in every layer of the network. He gave an overview of methods to prevent these attacks and best practices on how to provide protection in complex cloud platforms. He also outlined what we have found in our experience managing and running thousands of Linux and Unix ...
The IoT is changing the way enterprises conduct business. In his session at @ThingsExpo, Eric Hoffman, Vice President at EastBanc Technologies, discussed how businesses can gain an edge over competitors by empowering consumers to take control through IoT. He cited examples such as a Washington, D.C.-based sports club that leveraged IoT and the cloud to develop a comprehensive booking system. He also highlighted how IoT can revitalize and restore outdated business models, making them profitable ...
We all know the latest numbers: Gartner, Inc. forecasts that 6.4 billion connected things will be in use worldwide in 2016, up 30 percent from last year, and will reach 20.8 billion by 2020. We're rapidly approaching a data production of 40 zettabytes a day – more than we can every physically store, and exabytes and yottabytes are just around the corner. For many that’s a good sign, as data has been proven to equal money – IF it’s ingested, integrated, and analyzed fast enough. Without real-ti...
"We view the cloud not really as a specific technology but as a way of doing business and that way of doing business is transforming the way software, infrastructure and services are being delivered to business," explained Matthew Rosen, CEO and Director at Fusion, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
"Software-defined storage is a big problem in this industry because so many people have different definitions as they see fit to use it," stated Peter McCallum, VP of Datacenter Solutions at FalconStor Software, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
With 15% of enterprises adopting a hybrid IT strategy, you need to set a plan to integrate hybrid cloud throughout your infrastructure. In his session at 18th Cloud Expo, Steven Dreher, Director of Solutions Architecture at Green House Data, discussed how to plan for shifting resource requirements, overcome challenges, and implement hybrid IT alongside your existing data center assets. Highlights included anticipating workload, cost and resource calculations, integrating services on both sides...
"We are a well-established player in the application life cycle management market and we also have a very strong version control product," stated Flint Brenton, CEO of CollabNet,, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
"We provide DevOps solutions. We also partner with some key players in the DevOps space and we use the technology that we partner with to engineer custom solutions for different organizations," stated Himanshu Chhetri, CTO of Addteq, in this SYS-CON.tv interview at DevOps at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
In his session at @DevOpsSummit at 19th Cloud Expo, Yoseph Reuveni, Director of Software Engineering at Jet.com, will discuss Jet.com's journey into containerizing Microsoft-based technologies like C# and F# into Docker. He will talk about lessons learned and challenges faced, the Mono framework tryout and how they deployed everything into Azure cloud. Yoseph Reuveni is a technology leader with unique experience developing and running high throughput (over 1M tps) distributed systems with extre...
Actian Corporation has announced the latest version of the Actian Vector in Hadoop (VectorH) database, generally available at the end of July. VectorH is based on the same query engine that powers Actian Vector, which recently doubled the TPC-H benchmark record for non-clustered systems at the 3000GB scale factor (see tpc.org/3323). The ability to easily ingest information from different data sources and rapidly develop queries to make better business decisions is becoming increasingly importan...
"Operations is sort of the maturation of cloud utilization and the move to the cloud," explained Steve Anderson, Product Manager for BMC’s Cloud Lifecycle Management, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
Enterprise networks are complex. Moreover, they were designed and deployed to meet a specific set of business requirements at a specific point in time. But, the adoption of cloud services, new business applications and intensifying security policies, among other factors, require IT organizations to continuously deploy configuration changes. Therefore, enterprises are looking for better ways to automate the management of their networks while still leveraging existing capabilities, optimizing perf...
Security, data privacy, reliability and regulatory compliance are critical factors when evaluating whether to move business applications from in-house client hosted environments to a cloud platform. In her session at 18th Cloud Expo, Vandana Viswanathan, Associate Director at Cognizant, In this session, will provide an orientation to the five stages required to implement a cloud hosted solution validation strategy.
The cloud competition for database hosts is fierce. How do you evaluate a cloud provider for your database platform? In his session at 18th Cloud Expo, Chris Presley, a Solutions Architect at Pythian, gave users a checklist of considerations when choosing a provider. Chris Presley is a Solutions Architect at Pythian. He loves order – making him a premier Microsoft SQL Server expert. Not only has he programmed and administered SQL Server, but he has also shared his expertise and passion with b...
We're entering the post-smartphone era, where wearable gadgets from watches and fitness bands to glasses and health aids will power the next technological revolution. With mass adoption of wearable devices comes a new data ecosystem that must be protected. Wearables open new pathways that facilitate the tracking, sharing and storing of consumers’ personal health, location and daily activity data. Consumers have some idea of the data these devices capture, but most don’t realize how revealing and...
Top Stories for Cloud Expo 2012 East

In this Big Data Power Panel at the 10th International Cloud Expo, moderated by Cloud Expo Conference Chair Jeremy Geelan, Govind Rangasamy, Director of Product Management at Eucalyptus Systems; Kevin Brown; CEO of Coraid, Inc.; Christos Tryfonas, CTO and Co-Founder of Cetas; and Max Riggsbee, CMO and VP of Products for WhipTail, discussed such topics as: Big Data has existed since the early days of computing; why, then, do you think there is such an industry buzz around it right now? How is Big Data impacting storage and networking architecture in data centers? How about the intersection of Big Data Analytics and Cloud Computing - how big a sector is that and why? What's the difference between Big Data and Fast Data? ... (more)

Best Recent Articles on Cloud Computing & Big Data Topics
As we enter a new year, it is time to look back over the past year and resolve to improve upon it. In 2014, we will see more service providers resolve to add more personalization in enterprise technology. Below are seven predictions about what will drive this trend toward personalization.
IT organizations face a growing demand for faster innovation and new applications to support emerging opportunities in social, mobile, growth markets, Big Data analytics, mergers and acquisitions, strategic partnerships, and more. This is great news because it shows that IT continues to be a key stakeholder in delivering business service innovation. However, it also means that IT must deliver new innovation despite flat budgets, while maintaining existing services that grow more complex every day.
Cloud computing is transforming the way businesses think about and leverage technology. As a result, the general understanding of cloud computing has come a long way in a short time. However, there are still many misconceptions about what cloud computing is and what it can do for businesses that adopt this game-changing computing model. In this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan, Rex Wang, Vice President of Product Marketing at Oracle, discusses and dispels some of the common myths about cloud computing that still exist today.
Despite the economy, cloud computing is doing well. Gartner estimates the cloud market will double by 2016 to $206 billion. The time for dabbling in the cloud is over! The 14th International Cloud Expo, co-located with 5th International Big Data Expo and 3rd International SDN Expo, to be held June 10-12, 2014, at the Javits Center in New York City, N.Y. announces that its Call for Papers is now open. Topics include all aspects of providing or using massively scalable IT-related capabilities as a service using Internet technologies (see suggested topics below). Cloud computing helps IT cut infrastructure costs while adding new features and services to grow core businesses. Clouds can help grow margins as costs are cut back but service offerings are expanded. Help plant your flag in the fast-expanding business opportunity that is The Cloud, Big Data and Software-Defined Networking: submit your speaking proposal today!
What do you get when you combine Big Data technologies….like Pig and Hive? A flying pig? No, you get a “Logical Data Warehouse.” In 2012, Infochimps (now CSC) leveraged its early use of stream processing, NoSQLs, and Hadoop to create a design pattern which combined real-time, ad-hoc, and batch analytics. This concept of combining the best-in-breed Big Data technologies will continue to advance across the industry until the entire legacy (and proprietary) data infrastructure stack will be replaced with a new (and open) one.
While unprecedented technological advances have been made in healthcare in areas such as genomics, digital imaging and Health Information Systems, access to this information has been not been easy for both the healthcare provider and the patient themselves. Regulatory compliance and controls, information lock-in in proprietary Electronic Health Record systems and security concerns have made it difficult to share data across health care providers.
Cloud Expo, Inc. has announced today that Vanessa Alvarez has been named conference chair of Cloud Expo® 2014. 14th International Cloud Expo will take place on June 10-12, 2014, at the Javits Center in New York City, New York, and 15th International Cloud Expo® will take place on November 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
12th International Cloud Expo, held on June 10–13, 2013 at the Javits Center in New York City, featured four content-packed days with a rich array of sessions about the business and technical value of cloud computing led by exceptional speakers from every sector of the cloud computing ecosystem. The Cloud Expo series is the fastest-growing Enterprise IT event in the past 10 years, devoted to every aspect of delivering massively scalable enterprise IT as a service.
Ulitzer.com announced "the World's 30 most influential Cloud bloggers," who collectively generated more than 24 million Ulitzer page views. Ulitzer's annual "most influential Cloud bloggers" list was announced at Cloud Expo, which drew more delegates than all other Cloud-related events put together worldwide. "The world's 50 most influential Cloud bloggers 2010" list will be announced at the Cloud Expo 2010 East, which will take place April 19-21, 2010, at the Jacob Javitz Convention Center, in New York City, with more than 5,000 expected to attend.
It's a simple fact that the better sales reps understand their prospects' intentions, preferences and pain points during calls, the more business they'll close. Each day, as your prospects interact with websites and social media platforms, their behavioral data profile is expanding. It's now possible to gain unprecedented insight into prospects' content preferences, product needs and budget. We hear a lot about how valuable Big Data is to sales and marketing teams. But data itself is only valuable when it's part of a bigger story, made visible in the right context.
Cloud Expo, Inc. has announced today that Larry Carvalho has been named Tech Chair of Cloud Expo® 2014. 14th International Cloud Expo will take place on June 10-12, 2014, at the Javits Center in New York City, New York, and 15th International Cloud Expo® will take place on November 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Everyone talks about a cloud-first or mobile-first strategy. It's the trend du jour, and for good reason as these innovative technologies have revolutionized an industry and made savvy companies a lot of money. But consider for a minute what's emerging with the Age of Context and the Internet of Things. Devices, interfaces, everyday objects are becoming endowed with computing smarts. This is creating an unprecedented focus on the Application Programming Interface (API) as developers seek to connect these devices and interfaces to create new supporting services and hybrids. I call this trend the move toward an API-first business model and strategy.
We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.
Register and Save!
Save $500
on your “Golden Pass”!
Call 201.802.3020
or click here to Register
Early Bird Expires June 10th.


Silicon Valley Call For Papers Now OPEN
Submit
Call for Papers for the
upcoming Cloud Expo in
Santa Clara, CA!
[November 5-8, 2012]


Sponsorship Opportunities
Please Call
201.802.3021
events (at) sys-con.com
SYS-CON's Cloud Expo, held each year in California, New York, Prague, Tokyo, and Hong Kong is the world’s leading Cloud event in its 5th year, larger than all other Cloud events put together. For sponsorship, exhibit opportunites and show prospectus, please contact Carmen Gonzalez, carmen (at) sys-con.com.


New York City Expo Floor Plan Revealed
Cloud Expo New York
[June 11-14, 2012]

Floor Plan Revealed


Introducing Big Data Expo
Introducing
There is little doubt that Big Data solutions will have an increasing role in the Enterprise IT mainstream over time. Get a jump on that rapidly evolving trend at Big Data Expo, which we are introducing in June at
Cloud Expo New York.

Follow @CloudExpo New York on Twitter


Testimonials
Cloud Expo was a fantastic event for CSS Corp - we easily exceeded our objectives for engaging with clients and prospects."
AHMAR ABBAS
SVP, Global Infrastructure Management, CSS Corp.
 
With our launch at Cloud Expo, we successfully transformed the company from a relatively unknown European player into the dominant player in the market. Our competitors were taken by surprise and just blown away. We got a huge number of really high quality leads..."
PETE MALCOLM
CEO, Abiquo
 
We were extremely pleased with Cloud Expo this year - I’d say it exceeded expectations all around. This is the same info we got from partners who attended as well. Nice job!"
MARY BASS
Director of Marketing, UnivaUD
 
Cloud Expo helps focus the debate on the critical issues at hand in effect connecting main street with the next frontier."

GREG O’CONNOR
President & CEO, Appzero


Who Should Attend?
Senior Technologists including CIOs, CTOs, VPs of technology, IT directors and managers, network and storage managers, network engineers, enterprise architects, communications and networking specialists, directors of infrastructure Business Executives including CEOs, CMOs, CIOs, presidents, VPs, directors, business development; product and purchasing managers.


Join Us as a Media Partner - Together We Can Rock the IT World!
SYS-CON Media has a flourishing Media Partner program in which mutually beneficial promotion and benefits are arranged between our own leading Enterprise IT portals and events and those of our partners.

If you would like to participate, please provide us with details of your website/s and event/s or your organization and please include basic audience demographics as well as relevant metrics such as ave. page views per month.

To get involved, email Marilyn Moux at marilyn@sys-con.com.

@CloudExpo Blogs
In his session at 18th Cloud Expo, Sagi Brody, Chief Technology Officer at Webair Internet Development Inc., and Logan Best, Infrastructure & Network Engineer at Webair, focused on real world deployments of DDoS mitigation strategies in every layer of the network. He gave an overview of methods to prevent these attacks and best practices on how to provide protection in complex cloud platforms. He also outlined what we have found in our experience managing and running thousands of Linux and Unix managed service platforms and what specifically can be done to offer protection at every layer. He o...
The age of computers is over. You are now living in the age of intelligent processing by just about everything else. Like vacuum tubes and tape drives, desktops and laptops are on their way to becoming odd relics of a distant age, if people remember them at all. That may sound a bit extreme, but the fact is that applications are not married to any technological substrate, not even the most advanced mobile devices. That is why smart developers have already turned their attention to using JavaScript for building out next-generation technology like drone controllers, big data management tools, an...
When digital laggards finally recognize the degree of change digital technologies will force upon their businesses, and desperately try to outrun the inescapable Darwinian effect of their slow start, they will be faced with not one, but three ages of digital transformation to navigate and survive. Understanding these ages, and what is unique about each one, is critical for business strategy, prioritizing, planning, sequencing and budgeting.
I wanted to gather all of my Internet of Things (IOT) blogs into a single blog (that I could later use with my University of San Francisco (USF) Big Data “MBA” course). However as I started to pull these blogs together, I realized that my IOT discussion lacked a vision; it lacked an end point towards which an organization could drive their IOT envisioning, proof of value, app dev, data engineering and data science efforts. And I think that the IOT end point is really quite simple…
Unless your company can spend a lot of money on new technology, re-engineering your environment and hiring a comprehensive cybersecurity team, you will most likely move to the cloud or seek external service partnerships. In his session at 18th Cloud Expo, Darren Guccione, CEO of Keeper Security, revealed what you need to know when it comes to encryption in the cloud.
This is another topic that has taken me a long time to write, but several conversations with Peter Burris (@plburris) from Wikibon finally helped me to pull this together. Thanks Peter! I’ve struggled to understand and define the Intellectual Capital (IC) components – or dimensions – of the new, Big Data organization; that is, what are the new Big Data assets that an organization needs to collect, enrich and apply to drive business differentiation and competitive advantage? These assets form the basis of the modern “collaborative value creation” process and are instrumental in helping organiza...
Are you still pondering whether to integrate cloud computing services into the structure of your IT network? You are not the only one. Most IT professionals are very concerned about data security, so they aren’t that willing to switch to cloud computing solutions that easily. It’s a fact that even advanced services like Amazon’s EC2 aren’t ready to cater to all privacy needs of data-sensitive companies.
Early adopters of IoT viewed it mainly as a different term for machine-to-machine connectivity or M2M. This is understandable since a prerequisite for any IoT solution is the ability to collect and aggregate device data, which is most often presented in a dashboard. The problem is that viewing data in a dashboard requires a human to interpret the results and take manual action, which doesn’t scale to the needs of IoT.
The Dean of the University of San Francisco School of Management, Elizabeth Davis, recently asked me to sit on a Big Data panel at the Direct Sales Association conference. I was given a 5-minute slot to “demystify” Big Data to a non-technical group of about 1,000 people; to help them understand where and how this thing called “Big Data” could help them. Well if you know me, I can barely introduce myself in 5 minutes. But this was particularly challenging for me, as I’m used to talking about Big Data with organizations with at least some level of Big Data experience or understanding (maybe the...
The Internet of Things (IoT) promises to change everything by enabling “smart” environments (homes, cities, hospitals, schools, stores, etc.) and smart products (cars, trucks, airplanes, trains, wind turbines, lawnmowers, etc.). I recently wrote about the importance of moving beyond “connected” to “smart” in a blog titled “Internet of Things: Connected Does Not Equal Smart”. The article discusses the importance of moving beyond just collecting the data, to transitioning to leveraging this new wealth of IoT data to improve the decisions that these smart environments and products need to make: t...
Cloud computing has taken over the business world! With almost maniacal focus, single proprietors and Board Directors of the world's largest conglomerates see this new model as a "must do". This rapid shift is, in fact, accelerating. As Jeff Bertolucci observes in "The Shift to Cloud Services Is Happening Faster Than Expected": "According to the sixth annual Uptime Institute Data Center Industry Survey, which examines the big-picture trends shaping IT infrastructure delivery and strategy, the move to cloud services is accelerating. The Uptime Institute's February 2016 poll of more than 1,00...
Machine learning, cloud services, and artificial intelligence-enabled human agents are all combining to change the way that companies can order services, buy goods, and even hire employees and contractors. To learn more about how new trends are driving innovation into invoicing and spend management, please join me in welcoming Pierre Mitchell, Chief Research Officer and Managing Director at Azul Partners, where he leads the Spend Matters Procurement research activities. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.
It feels like the barbarians are continually at the gate. We can’t seem to go more than a week before a new data breach is in the news, impacting potentially millions of individuals. The targets range from companies like Omni Hotels, which was breached, exposing the personal and credit card information of up to 50,000 customers, to North Carolina State University, where over 38,000 students’ personal information, including their SSNs, was at risk. As I mentioned in a recent blog ‘Internet of Things and Big Data – who owns your data?’, we have been storing our personal and cred...
The data warehouse and data lake are two different types of data storage repository. The data warehouse integrates data from different sources and suits business reporting. The data lake stores raw structured and unstructured data in whatever form the data source provides. It does not require prior knowledge of the analyses you think you want to perform. A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed. While a hierarchical data warehouse stores data in files or folders, a data lake uses a flat architecture to store data.
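To make the storage distinction above concrete, here is a minimal illustrative sketch in Python, assuming hypothetical file paths, a lake directory, and a sales table that are not part of the original post: the data lake side simply lands each source file as-is in a flat namespace, while the warehouse side parses the same data into a predefined relational schema suited to reporting.

import csv
import shutil
import sqlite3
from pathlib import Path

# Data lake: land the raw file as-is in a flat namespace; no schema is imposed up front.
def land_in_lake(source_file: str, lake_root: str = "lake") -> Path:
    Path(lake_root).mkdir(exist_ok=True)
    target = Path(lake_root) / Path(source_file).name  # flat layout, native format
    shutil.copy(source_file, target)
    return target

# Data warehouse: parse the data into a predefined schema that reporting tools can query.
def load_into_warehouse(source_csv: str, db_file: str = "warehouse.db") -> None:
    conn = sqlite3.connect(db_file)
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    with open(source_csv, newline="") as f:
        rows = [(r["region"], float(r["amount"])) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()
    conn.close()

The lake copy preserves the raw record for analyses you have not planned yet; the warehouse load trades that flexibility for a structure that answers known reporting questions directly.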
Big Data. Analytics. Internet of Things. Cloud. In the last few years, you cannot have a discussion around technology without those terms entering the conversation. They have been major technology disruptors impacting all aspects of the business. Change seems to occur at breakneck speeds and shows no sign of slowing. Today, it appears the one constant in technology is change. Constant change requires constant innovation which thereby introduces more new technologies. One of the new technologies entering the conversation is machine learning. Gartner identified machine learning as one of the top...
Past SYS-CON Events
Cloud Expo West - cloudcomputingexpo2011west.sys-con.com
Cloud Expo East - cloudcexpo2011east.sys-con.com
Cloud Expo West - cloudcomputingexpo2010west.sys-con.com
Virtualization Expo West - virtualization2010west.sys-con.com
Cloud Expo Europe - cloudexpoeurope2010.sys-con.com
Cloud Expo East - cloudcomputingexpo2010east.sys-con.com
Virtualization Expo East - virtualizationconference2010east.sys-con.com
Cloud Expo West - cloudcomputingexpo2009west.sys-con.com
Virtualization Expo West - virtualizationconference2009west.sys-con.com
GovIT Expo - govitexpo.com
Cloud Expo Europe - cloudexpoeurope2009.sys-con.com
Cloud Expo 2011 Allstar Conference Faculty

S.F.S., Dell
Singer, NRO
Pereyra, Oracle
Ryan, OpSource
Butte, PwC
Leone, Oracle
Riley, AWS
Varia, AWS
Lye, Oracle
O'Connor, AppZero
Crandell, RightScale
Nucci, Dell Boomi
Hillier, CiRBA
Morrison, Layer 7 Tech
Robbins, NYT
Schwarz, Oracle
What The Enterprise IT World Says About Cloud Expo
 
"We had extremely positive feedback from both customers and prospects that attended the show and saw live demos of NaviSite's enterprise cloud based services."
  –William Toll
Sr. Director, Marketing & Strategic Alliances
Navisite
 


 
"More and better leads than ever expected! I have 4-6 follow ups personally."
  –Richard Wellner
Chief Scientist
Univa UD
 


 
"Good crowd, good questions. The event looked very successful."
  –Simon Crosby
CTO
Citrix Systems
 


 
"It's the largest cloud computing conference I've ever seen."
  –David Linthicum
CTO
Brick Group