Step-by-Step: Build a SharePoint 2013 Lab in the Cloud on Windows Azure
Leverage Windows Azure as your Lab in the Cloud!

My good friend and colleague, Tommy Patterson, recently blogged about leveraging the Windows Azure Infrastructure as a Service (IaaS) preview offering to build a FREE lab environment in the Cloud.  You can read Tommy’s step-by-step article at:

What about a SharePoint 2013 Lab in the Cloud?
Now that SharePoint Server 2013 has been released, I frequently get asked about ways in which a SharePoint 2013 lab environment can be easily built for studying, testing and/or performing a proof-of-concept.  You could certainly build this lab environment on your own hardware, but given SharePoint 2013’s hardware requirements, many of us may not have sufficient spare hardware to implement an on-premises lab environment.

This makes a great scenario for leveraging our Windows Azure FREE 90-day Trial Offer to build a free lab environment for SharePoint 2013 in the cloud.  Using the process outlined in this article, you’ll be able to build a basic functional farm environment for SharePoint 2013 that will be accessible for approximately 105 hours of compute usage each month at no cost to you under the 90-day Trial Offer.

After the 90-day trial period is up, you can choose whether you’d like to convert to a full paid subscription.  If you choose to convert, this lab environment will cost approximately $0.56 USD per hour of compute usage (that’s right – just 56 cents per hour) plus associated storage and networking costs (which can typically be less than $10 USD per month for a lab of this nature). These estimated costs are based on published Pay-As-You-Go pricing for Windows Azure that is current as of this article’s date.

Note: If you are testing advanced SharePoint 2013 scenarios and need more resources than available in the lab configuration below, you can certainly scale-up or scale-out elastically by provisioning larger VMs or additional SharePoint web and application server VMs.  To determine the specific costs associated with higher resource levels, please visit the Windows Azure Pricing Calculator for Virtual Machines.

SharePoint 2013 Lab Scenario
To deliver a functional and expandable lab environment, I’ll walk through provisioning SharePoint Server 2013 on Windows Azure VMs as depicted in the following configuration diagram, which requires three (3) VMs on a common Windows Azure Virtual Network.


Lab Scenario: SharePoint 2013 on Windows Azure VM

In this lab, we’ll be using a naming convention of XXXlabYYY01, where XXX will be replaced with your unique initials and YYY will be replaced with an abbreviation representing the function of a virtual machine or Windows Azure configuration component (i.e., ad, db or app).

Note: This lab configuration is suitable for study, functional testing and basic proof-of-concept usage.  It is not currently supported for pilot or production SharePoint 2013 farm environments.

Prerequisites

The following is required to complete this step-by-step guide:

  • A Windows Azure subscription with the Virtual Machines Preview enabled.

    DO IT: Sign up for a FREE Trial of Windows Azure

    NOTE: When activating your FREE Trial for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity and your credit card will not be charged, unless you explicitly convert your FREE Trial account to a paid subscription at a later point in time. 
  • Completion of the Getting Started tasks in the following article:

    DO IT: Getting Started with Servers in the Cloud
  • This step-by-step guide assumes that the reader is already familiar with configuring Windows Server Active Directory, SQL Server and SharePoint Server 2013 in an on-premises installation. This guide focuses on the unique aspects associated with configuring these components on the Windows Azure cloud platform.

Let’s Get Started!
In this step-by-step guide, you will learn how to:

  • Register a DNS Server in Windows Azure
  • Define a Virtual Network in Windows Azure
  • Configure Windows Server Active Directory in a Windows Azure VM
  • Configure SQL Server 2012 in a Windows Azure VM
  • Configure SharePoint Server 2013 in a Windows Azure VM
  • Export / Import Lab Environment via PowerShell

Exercise 1: Register a DNS Server in Windows Azure

Register the internal IP address that our domain controller VM will be using for Active Directory-integrated Dynamic DNS services by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Register DNS Server.
  4. Complete the DNS Server fields as follows:

    - NAME: XXXlabdns01
    - DNS Server IP Address: 10.0.0.4
  5. Click the REGISTER DNS SERVER button.

Exercise 2: Define a Virtual Network in Windows Azure

Define a common virtual network in Windows Azure for running Active Directory, Database and SharePoint virtual machines by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Networks located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Networks | Virtual Network | Quick Create.
  4. Complete the Virtual Network fields as follows:

    - NAME: XXXlabnet01
    - Address Space: 10.---.---.---
    - Maximum VM Count: 4096 [CIDR: /20]
    - Affinity Group: Select the Affinity Group defined in the Getting Started steps from the Prerequisites section above.
    - Connect to Existing DNS: Select XXXlabdns01 – the DNS Server registered in Exercise 1 above.
  5. Click the CREATE A VIRTUAL NETWORK button.
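
If you prefer scripting, the DNS registration in Exercise 1 and the virtual network in this exercise can also be applied from the Windows Azure PowerShell module using a network configuration file. The sketch below is illustrative only: Set-AzureVNetConfig replaces your subscription’s entire network configuration, so merge this fragment with the output of Get-AzureVNetConfig first, and replace the hypothetical YourAffinityGroup placeholder with the Affinity Group you created in the Getting Started steps.

```powershell
# Sketch: define the XXXlabdns01 DNS server and XXXlabnet01 virtual network
# via a network configuration file, then apply it to the subscription.
$netcfg = @"
<NetworkConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration">
  <VirtualNetworkConfiguration>
    <Dns>
      <DnsServers>
        <DnsServer name="XXXlabdns01" IPAddress="10.0.0.4" />
      </DnsServers>
    </Dns>
    <VirtualNetworkSites>
      <VirtualNetworkSite name="XXXlabnet01" AffinityGroup="YourAffinityGroup">
        <AddressSpace>
          <AddressPrefix>10.0.0.0/20</AddressPrefix>
        </AddressSpace>
        <Subnets>
          <Subnet name="Subnet-1">
            <AddressPrefix>10.0.0.0/23</AddressPrefix>
          </Subnet>
        </Subnets>
        <DnsServersRef>
          <DnsServerRef name="XXXlabdns01" />
        </DnsServersRef>
      </VirtualNetworkSite>
    </VirtualNetworkSites>
  </VirtualNetworkConfiguration>
</NetworkConfiguration>
"@
$netcfg | Out-File "$env:TEMP\netcfg.xml" -Encoding utf8

# Applies the configuration to the current subscription (a destructive
# replace, not a merge - see the note above)
Set-AzureVNetConfig -ConfigurationPath "$env:TEMP\netcfg.xml"
```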

Exercise 3: Configure Windows Server Active Directory in a Windows Azure VM

Provision a new Windows Azure VM to run a Windows Server Active Directory domain controller in a new Active Directory forest by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabad01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Small (1 core, 1.75GB Memory)

    Click the Next button to continue.

    Note: It is suggested to use strong passwords for Administrator users and service accounts, as Windows Azure virtual machines can be reached from the Internet by anyone who knows their DNS name.  You can also read this document on the Microsoft Security website to help you select a secure password: http://www.microsoft.com/security/online-privacy/passwords-create.aspx.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabad01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabad01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabad01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.4.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabad01, and go back to Exercise 2 and Exercise 3 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabad01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabad01-data01
    - Size: 10 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabad01.
  11. On the virtual machine details page for XXXlabad01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabad01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  This volume will be used for NTDS DIT database, log and SYSVOL folder locations.

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Configure Local Storage
  13. Using the Server Manager tool, install Active Directory Domain Services and promote this server to a domain controller in a new forest with the following parameters:

    - Active Directory Forest name: contoso.com
    - Volume Location for NTDS database, log and SYSVOL folders: F:

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory
  14. After Active Directory has been installed, create the following user accounts that will be used when installing and configuring SharePoint Server 2013 later in this step-by-step guide:

    - CONTOSO\sp_farm – SharePoint Farm Data Access Account
    - CONTOSO\sp_serviceapps – SharePoint Farm Service Applications Account

    If you need additional guidance to complete this step, feel free to leverage the following study guide for assistance: Windows Server 2012 “Early Experts” Challenge – Install and Administer Active Directory

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
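
For reference, steps 12 through 14 above can also be scripted from an elevated PowerShell prompt inside the XXXlabad01 Remote Desktop session. This is a sketch only, assuming the attached data disk is the only uninitialized (RAW) disk on the VM; the forest promotion reboots the server, so the service accounts are created afterwards.

```powershell
# Step 12 sketch: bring the empty data disk online as an F: NTFS volume
Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -DriveLetter F -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "ADData" -Confirm:$false

# Step 13 sketch: install AD DS and promote to a new forest on the F: volume
# (prompts for a Directory Services Restore Mode password, then reboots)
Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
Install-ADDSForest -DomainName contoso.com `
    -DatabasePath F:\NTDS -LogPath F:\NTDS -SysvolPath F:\SYSVOL

# Step 14 sketch: run AFTER the reboot to create the SharePoint service accounts
$svcPwd = Read-Host -AsSecureString -Prompt "Service account password"
New-ADUser -Name sp_farm -SamAccountName sp_farm `
    -AccountPassword $svcPwd -Enabled $true
New-ADUser -Name sp_serviceapps -SamAccountName sp_serviceapps `
    -AccountPassword $svcPwd -Enabled $true
```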

Exercise 4: Configure SQL Server 2012 in a Windows Azure VM

Provision a new Windows Azure VM to run SQL Server 2012 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select SQL Server 2012 Evaluation Edition and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabdb01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Medium (2 cores, 3.5GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabdb01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabdb01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabdb01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.5.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabdb01, and go back to Exercise 2, Exercise 3 and Exercise 4 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabdb01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabdb01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabdb01.
  11. On the virtual machine details page for XXXlabdb01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabdb01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.
  13. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and update default folder locations to the F: volume.

    1. Connect to the SQL Server 2012 default instance using your Windows Account.
    2. Next, update the default locations for the DATA, LOG and BACKUP folders. To do this, right-click your SQL Server instance and select Properties.
    3. Select Database Settings from the left side pane.
    4. Locate the Database default locations section and update the default values for each path to point to the F: volume you previously formatted.
    5. Close SQL Server Management Studio.
  14. To allow SharePoint to connect to SQL Server, you will need to add an inbound rule for SQL Server traffic in the Windows Firewall. To do this, open Windows Firewall with Advanced Security from Start | All Programs | Administrative Tools.

    1. Right-click the Inbound Rules node and select New Rule to open the New Inbound Rule Wizard.

    2. On the Rule Type page, select Port and click Next.

    3. On the Protocol and Ports page, leave TCP selected, select Specific local ports, and set the value to 1433. Click Next to continue.

    4. On the Action page, make sure that Allow the connection is selected and click Next.

    5. On the Profile page, leave the default values and click Next.

    6. On the Name page, set the inbound rule's Name to SQLServerRule and click Finish.

    7. Close the Windows Firewall with Advanced Security window.

  15. Using the Server Manager tool, join this server to the contoso.com domain and restart the server to complete the domain join operation.
  16. After the server restarts, connect again via Remote Desktop to the server’s console and log in with the local Administrator credentials defined above in Step 5.

  17. Open SQL Server Management Studio from Start | All Programs | Microsoft SQL Server 2012 | SQL Server Management Studio and add the CONTOSO\Administrator user to SQL Server with the Sysadmin server role selected.

    1. Expand the Security folder within the SQL Server instance. Right-click the Logins folder and select New Login.

    2. In the General section, set the Login name to CONTOSO\Administrator, and select the Windows Authentication option.

    3. Click Server Roles on the left pane.  Select the checkbox for the Sysadmin server role.

    4. Click the OK button and close SQL Server Management Studio.

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
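
Steps 14 through 17 above can likewise be scripted from an elevated PowerShell prompt on XXXlabdb01. This is a sketch only; the firewall and domain-join cmdlets ship with Windows Server 2012, and the Invoke-Sqlcmd step assumes the sqlps module that SQL Server 2012 installs on the gallery image.

```powershell
# Step 14 sketch: allow inbound SQL Server traffic on TCP port 1433
New-NetFirewallRule -DisplayName "SQLServerRule" -Direction Inbound `
    -Protocol TCP -LocalPort 1433 -Action Allow

# Step 15 sketch: join the contoso.com domain (prompts for credentials, then reboots)
Add-Computer -DomainName contoso.com -Credential "CONTOSO\Administrator" -Restart

# Step 17 sketch: run AFTER the restart to grant CONTOSO\Administrator
# the sysadmin server role on the default instance
Import-Module sqlps -DisableNameChecking
Invoke-Sqlcmd -ServerInstance "." -Query @"
CREATE LOGIN [CONTOSO\Administrator] FROM WINDOWS;
ALTER SERVER ROLE [sysadmin] ADD MEMBER [CONTOSO\Administrator];
"@
```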

Exercise 5: Configure SharePoint Server 2013 in a Windows Azure VM

Provision a new Windows Azure VM to run SharePoint Server 2013 by performing the following steps:

  1. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  2. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  3. Click the +NEW button located on the bottom navigation bar and select Compute | Virtual Machines | From Gallery.
  4. In the Virtual Machine Operating System Selection list, select Windows Server 2012, December 2012 and click the Next button.
  5. On the Virtual Machine Configuration page, complete the fields as follows:

    - Virtual Machine Name: XXXlabapp01
    - New Password and Confirm Password fields: Choose and confirm a new local Administrator password.
    - Size: Large (4 cores, 7GB Memory)

    Click the Next button to continue.
  6. On the Virtual Machine Mode page, complete the fields as follows:

    - Standalone Virtual Machine: Selected
    - DNS Name: XXXlabapp01.cloudapp.net
    - Storage Account: Select the Storage Account defined in the Getting Started steps from the Prerequisites section above.
    - Region/Affinity Group/Virtual Network: Select XXXlabnet01 – the Virtual Network defined in Exercise 2 above.
    - Virtual Network Subnets: Select Subnet-1 (10.0.0.0/23)

    Click the Next button to continue.
  7. On the Virtual Machine Options page, click the Checkmark button to begin provisioning the new virtual machine.

    As the new virtual machine is being provisioned, you will see the Status column on the Virtual Machines page of the Windows Azure Management Portal cycle through several values including Stopped, Stopped (Provisioning), and Running (Provisioning).  When provisioning for this new Virtual Machine is completed, the Status column will display a value of Running and you may continue with the next step in this guide.
  8. After the new virtual machine has finished provisioning, click on the name ( XXXlabapp01 ) of the new Virtual Machine displayed on the Virtual Machines page of the Windows Azure Management Portal.
  9. On the virtual machine details page for XXXlabapp01, make note of the Internal IP Address displayed on this page.  This IP address should be listed as 10.0.0.6.

    If a different internal IP address is displayed, the virtual network and/or virtual machine configuration was not completed correctly.  In this case, click the DELETE button located on the bottom toolbar of the virtual machine details page for XXXlabapp01, and go back to Exercise 2,  Exercise 3 and Exercise 4 to confirm that all steps were completed correctly.
  10. On the virtual machine details page for XXXlabapp01, click the Attach button located on the bottom navigation toolbar and select Attach Empty Disk.  Complete the following fields on the Attach an empty disk to the virtual machine form:

    - Name: XXXlabapp01-data01
    - Size: 50 GB
    - Host Cache Preference: None

    Click the Checkmark button to create and attach a new virtual hard disk to virtual machine XXXlabapp01.
  11. On the virtual machine details page for XXXlabapp01, click the Connect button located on the bottom navigation toolbar and click the Open button to launch a Remote Desktop Connection to the console of this virtual machine.  Log on at the console of your virtual machine with the local Administrator credentials defined in Step 5 above.
  12. From the Remote Desktop console of XXXlabapp01, create a new partition on the additional data disk attached above in Step 10 and format this partition as a new F: NTFS volume.  
  13. In the Server Manager tool, click on Local Server in the left navigation pane and click on the Workgroup option.  Join this server to the contoso.com domain and restart the server to complete the domain join operation.
  14. After the server restarts, re-establish a Remote Desktop connection to the server and logon with the CONTOSO\Administrator domain user credentials defined earlier in Exercise 3.
  15. In the Server Manager tool, click on Local Server in the left navigation pane and select IE Enhanced Security Configuration. Turn off enhanced security for Administrators and click the OK button.

    Note: Modifying the Internet Explorer Enhanced Security Configuration is not a good practice and is done here only for the purposes of this particular step-by-step guide. A better approach is to download the files on another machine and then copy them to a shared folder or directly to the VM.

  16. Press the Windows key to switch to the Start Screen and launch Internet Explorer.  Download the following files to the F:\INSTALL folder:

    - SharePoint Server 2013 Evaluation Edition

    Make a note of the SharePoint Product Key listed on this page, as you’ll need it for the installation of SharePoint Server 2013.

    - ASP.NET 4.5 hotfix for Windows Server 2012 ( KB2765317 )

  17. Navigate to the F:\INSTALL folder and double-click on the downloaded .IMG file to mount it.  Copy all files and folders from the mounted .IMG file to F:\INSTALL.
  18. Install the SharePoint Server 2013 software prerequisites by running F:\INSTALL\prerequisiteinstaller.exe.  Note that this process may require multiple server restarts to complete.  After all required software is successfully installed, continue with the next step in this step-by-step guide.
  19. Install the ASP.NET 4.5 hotfix downloaded to the F:\INSTALL folder in Step 16 above.
  20. Run F:\INSTALL\setup.exe to launch the SharePoint Server 2013 installation process.
  21. When prompted, on the Server Type tab of the setup program, select the Complete installation option.
  22. On the File Location tab of the setup program, change the data path to use the F: volume formatted in Step 12 above.
  23. At the end of the installation process, ensure the checkbox is selected to Run the SharePoint Products Configuration Wizard Now and click the Close button.
  24. In the SharePoint Products Configuration Wizard, when prompted on the Connect to server farm dialog, select the option to Create a new server farm.
  25. On the Specify Configuration Database Settings, specify the following values for each field:

    - Database Server: XXXlabdb01
    - Username: CONTOSO\sp_farm
    - Password: Type the password specified when the sp_farm domain user account was created earlier in Exercise 3, Step 14.
  26. Click the Next > button and accept all other default values in the SharePoint Products Configuration Wizard.  Click the Finish button when prompted to complete the wizard.
  27. The SharePoint 2013 Central Administration web page should launch automatically.  When prompted, click the Start the Wizard button to begin the Initial Farm Configuration Wizard.
  28. When prompted for Service Account, type the CONTOSO\sp_serviceapps domain username and password specified when this account was created earlier in Exercise 3, Step 14.
  29. Accept all other default values and click the Next > button to continue.
  30. On the Create a Site Collection page, create a new top-level Intranet site collection using the following field values:

    - Title and Description: Enter your preferred Title and Description for the new site collection
    - URL: Select the root URL path – http://XXXlabapp01/
    - Select experience version: 2013
    - Select a template: Publishing | Publishing Portal

    Click the OK button to provision a new top-level Intranet site collection. 

    After the new top-level Intranet site collection is provisioned, test navigating to the URL for this site collection from within the Remote Desktop session to the server.
  31. On the SharePoint 2013 Central Administration site, configure a Public URL alternate access mapping for accessing the new top-level Intranet site collection from the Internet.
    1. On the Central Administration site home page, click the Configure alternate access mappings link.
    2. On the Alternate Access Mappings page, click the Edit Public URLs link.
    3. On the Edit Public Zone URLs page, select and specify the following values:

      - Alternate Access Mapping Collection: SharePoint - 80
      - Internet: http://XXXlabapp01.cloudapp.net

      Click the Save button to complete the Alternate Access Mapping configuration.
  32. Close the Remote Desktop session to the server.
  33. Sign in at the Windows Azure Management Portal with the logon credentials used when you signed up for your Free 90-Day Windows Azure Trial.
  34. Select Virtual Machines located on the side navigation panel on the Windows Azure Management Portal page.
  35. On the Virtual Machines page, click on the name of the SharePoint virtual machine – XXXlabapp01.
  36. On the XXXlabapp01 virtual machine details page, click on Endpoints in the top navigation area of the page.
  37. Click the +Add Endpoint button in the bottom navigation bar of the page to define a new virtual machine endpoint that will permit HTTP web traffic inbound to the SharePoint virtual machine. 
  38. On the Add an endpoint to a virtual machine form, select the Add Endpoint option and click the Next button to continue.
  39. On the Specify the details of the endpoint form, specify the following field values:

    - Name: Web HTTP
    - Protocol: TCP
    - Public Port: 80
    - Private Port: 80

    Click the Checkmark button to create a new endpoint definition that will permit inbound web traffic to the SharePoint virtual machine.
  40. After the endpoint configuration has been successfully applied, test browsing to the following public URL to confirm that you are able to access the Intranet site collection that is configured on SharePoint:

    - URL: http://XXXlabapp01.cloudapp.net

The configuration for this virtual machine is now complete, and you may continue with the next exercise in this step-by-step guide.
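
As an alternative to steps 33 through 39, the HTTP endpoint can also be added from your management workstation with the Windows Azure PowerShell module (a sketch, assuming the module is installed and configured as described in the Prerequisites section):

```powershell
# Sketch: permit inbound web traffic by mapping public TCP port 80
# through to port 80 on the SharePoint VM
Get-AzureVM -ServiceName "XXXlabapp01" -Name "XXXlabapp01" |
    Add-AzureEndpoint -Name "Web HTTP" -Protocol tcp -PublicPort 80 -LocalPort 80 |
    Update-AzureVM
```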

Exercise 6: Export / Import Lab Environment via PowerShell

Our functional SharePoint lab environment is now complete, but if you’re like me, you won’t be using this lab environment around the clock.  As long as the virtual machines are provisioned, they will continue to accumulate compute hours against your Free 90-Day Windows Azure Trial account regardless of virtual machine state – even in a shutdown state!

To preserve as many of your free compute hours as possible for productive lab work, we can leverage the Windows Azure PowerShell module to de-provision our lab virtual machines when not in use and re-provision them when we need them again.  Once you’ve configured the PowerShell scripts below, you’ll be able to spin up your SharePoint lab environment in as little as 5-10 minutes!

Note: Prior to beginning this exercise, please ensure that you’ve downloaded, installed and configured the Windows Azure PowerShell module as outlined in the Getting Started article listed in the Prerequisite section of this step-by-step guide.

  1. De-provisioning your lab. Use the PowerShell snippet below to shut down, export and de-provision your SharePoint lab environment when you’re not using it.  Prior to running this script, be sure to edit the first line to reflect the names of each of your VMs and confirm that the $ExportPath location exists.

    $myVMs = @("XXXlabapp01","XXXlabdb01","XXXlabad01")
    Foreach ( $myVM in $myVMs ) {
        Stop-AzureVM -ServiceName $myVM -Name $myVM
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        Export-AzureVM -ServiceName $myVM -Name $myVM -Path $ExportPath
        Remove-AzureVM -ServiceName $myVM -Name $myVM
    }

  2. Re-provisioning your lab. Use the PowerShell snippet below to import and re-provision your SharePoint lab environment when you’re ready to use it again.  Prior to running this script, be sure to edit the first two lines to reflect the names of your Virtual Network and VMs.

    $myVNet = "XXXlabnet01"
    $myVMs = @("XXXlabad01","XXXlabdb01","XXXlabapp01")
    Foreach ( $myVM in $myVMs ) {
        $ExportPath = "C:\ExportVMs\ExportAzureVM-$myVM.xml"
        Import-AzureVM -Path $ExportPath | New-AzureVM -ServiceName $myVM -VNetName $myVNet
        Start-AzureVM -ServiceName $myVM -Name $myVM
    }

To ensure safe de-provisioning and re-provisioning of your SharePoint lab environment, it is important to preserve the specific order of the VM names listed in both code snippets above: the domain controller (XXXlabad01) is de-provisioned last and re-provisioned first, so that dependencies across the VMs are properly handled.

What’s Next? Keep Learning!

Now that your SharePoint Server 2013 lab environment is running in the cloud, be sure to explore the resources below to continue your learning:

Build Your Lab! Download Windows Server 2012
Don't Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group
About Keith Mayer
Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects, in diverse roles, such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!
