Essential Cloud Computing Characteristics
According to NIST, the cloud model is composed of five essential characteristics, three service models, and four deployment models

If you ask five different experts what cloud computing is, you may well get five different opinions, and all five may be correct. The best definition of cloud computing I have ever found is the National Institute of Standards and Technology (NIST) Definition of Cloud Computing. According to NIST, the cloud model is composed of five essential characteristics, three service models, and four deployment models. In this post I will look at the essential characteristics only and compare them to the traditional computing model; in future posts I will look at the service and deployment models.

Because computing always implies resources (CPU, memory, storage, networking, etc.), the premise of the cloud is an improved way to provision, access and manage those resources. Let's look at each essential characteristic of the cloud:

On-Demand Self-Service
Essentially, this means that you (as a consumer of the resources) can provision resources at any time you want, and you can do so without assistance from the resource provider.

Here is an example. In the old days, if your application needed additional computing power to support growing load, the process you normally went through was, briefly, as follows: call the hardware vendor and order new machines; once the hardware arrives, install the operating system, connect the machine to the network, and configure any firewall rules; next, install your application and add the machine to the pool of machines that already handle the load for your application. This is a very simplistic view of the process, but it still requires you to interact with many internal and external teams to complete it - hardware vendors, IT administrators, network administrators, database administrators, operations, and others. As a result, it can take weeks or even months to get the hardware ready to use.

Thanks to cloud computing, though, you can reduce this process to minutes. The whole lengthy process comes down to the click of a button or a call to the provider's API, and you can have the additional resources available within minutes, without assistance from the provider. Why is this important?

Because in the past the process involved many steps and usually took months, application owners often over-provisioned the environments that host their applications. Of course, this resulted in huge capital expenditures at the beginning of the project, resource underutilization throughout the project, and huge losses if the project didn't succeed. With cloud computing, though, you are in control, and you can provision only enough resources to support your current load.
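To put rough numbers on this, here is a toy calculation comparing the traditional buy-for-peak approach with on-demand provisioning. The hourly price and the monthly load curve are invented purely for illustration, not real provider figures:

```python
# Toy cost comparison: fixed peak-capacity provisioning vs. on-demand.
# All numbers (hourly price, load curve) are illustrative assumptions.

HOURLY_PRICE = 0.10      # assumed price per server-hour, in dollars
HOURS_PER_MONTH = 730

# Hypothetical average number of servers needed in each of 12 months.
servers_needed = [2, 2, 3, 4, 6, 9, 12, 10, 7, 5, 3, 2]

# Traditional: buy enough capacity for the peak and run it all year.
peak = max(servers_needed)
fixed_cost = peak * HOURS_PER_MONTH * 12 * HOURLY_PRICE

# Cloud: pay only for what each month actually requires.
on_demand_cost = sum(n * HOURS_PER_MONTH for n in servers_needed) * HOURLY_PRICE

print(f"fixed (peak) cost: ${fixed_cost:,.2f}")
print(f"on-demand cost:    ${on_demand_cost:,.2f}")
print(f"utilization of the fixed fleet: {sum(servers_needed) / (peak * 12):.0%}")
```

Even with made-up numbers the pattern is clear: the fixed fleet sits more than half idle for most of the year, while on-demand spend tracks the actual load.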

Broad Network Access
Well, this is not something new - we've had the Internet for more than 20 years already, and the cloud did not invent it. And although NIST states that the cloud promotes the use of heterogeneous clients (like smartphones, tablets, etc.), I think this would be possible even without the cloud. However, there is one important thing that, in my opinion, the cloud enabled that would be very hard to do with the traditional model: the cloud made it easier to bring your application closer to your users around the world. "What is the difference?", you will ask. "Isn't that the same as the Internet or the Web?" Yes and no. Thanks to the Internet you were able to make your application available to users around the world, but there were significant differences in the user experience in different parts of the world. Let's say that your company is based in California and you have a very popular application with millions of users in the US. Because you are based in California, all the servers that host your application are either in your basement or in a nearby datacenter, so that you can easily go and fix any hardware issues that occur. Now, think about the experience your users get across the country! Users on the East Coast will see slower response times, and possibly more errors, than users on the West Coast. If you wanted to expand globally, these problems would be amplified. The way to solve this issue was to deploy servers on the East Coast and in any other part of the world you wanted to expand to.

With cloud computing, though, you can simply provision new resources in the region you want to expand to, deploy your application, and start serving your users.

It again comes down to the cost you incur by deploying new datacenters around the world versus just using resources on demand and releasing them if you are not successful. Because the cloud is broadly accessible, you can rely on having the ability to provision resources in different parts of the world.
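As a sketch of what serving users from the nearest region looks like, here is a trivial region picker. The user locations, region names and latency figures are all hypothetical placeholders, not real measurements:

```python
# Toy illustration: with resources available in many regions, each user
# can be served from the closest one. All names and numbers are invented.

# Assumed round-trip latency (ms) from each user location to each region.
LATENCY_MS = {
    "new-york-user": {"us-west": 75, "us-east": 12, "eu-west": 90},
    "london-user":   {"us-west": 140, "us-east": 80, "eu-west": 15},
}

def closest_region(user: str) -> str:
    """Pick the region with the lowest latency for the given user."""
    latencies = LATENCY_MS[user]
    return min(latencies, key=latencies.get)

print(closest_region("new-york-user"))  # us-east
print(closest_region("london-user"))    # eu-west
```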

Resource Pooling
One can argue whether resource pooling is good or bad. The part that raises the most concerns among users is the co-location of applications on the same hardware or on the same virtual machine. Very often you hear that this compromises security, can impact your application's performance, and can even bring it down. Those were real concerns in the past, but with the advances in virtualization technology and the latest application runtimes you can consider them outdated. That doesn't mean that you should not think about security and performance when you design your application.

The good side of resource pooling is that it enables cloud providers to achieve higher application density on a single piece of hardware and much higher resource utilization (sometimes going up to 75%-80%, compared to 10%-12% in the traditional approach). As a result, the price of resource usage continues to fall. Another benefit of resource pooling is that resources can easily be shifted to where the demand is, without the customer needing to know where those resources come from or where they are located. Once again, as a customer you can request as many resources from the pool as you need at a certain time; once you are done utilizing them, you return them to the pool so that somebody else can use them. Because you as a customer are not aware of the size of the resource pool, your perception is that the resources are unlimited. In contrast, in the traditional approach application owners have always been constrained by the resources available on a limited number of machines (i.e. the ones they ordered and installed in their own datacenter).
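The acquire/release cycle described above can be sketched in a few lines. The ResourcePool class, its capacity and the customer requests are of course invented for illustration; a real provider hides this logic behind its provisioning API:

```python
# Minimal sketch of a shared resource pool: customers acquire and release
# capacity without ever seeing the pool's size or where resources live.

class ResourcePool:
    def __init__(self, capacity: int):
        self._capacity = capacity   # hidden from customers
        self._in_use = 0

    def acquire(self, n: int) -> bool:
        """Hand out n units if available; the caller never sees capacity."""
        if self._in_use + n > self._capacity:
            return False  # a real provider would add capacity instead
        self._in_use += n
        return True

    def release(self, n: int) -> None:
        """Return units to the pool so other customers can use them."""
        self._in_use = max(0, self._in_use - n)

    @property
    def utilization(self) -> float:
        return self._in_use / self._capacity

pool = ResourcePool(capacity=100)
pool.acquire(40)   # customer A
pool.acquire(35)   # customer B
print(f"utilization: {pool.utilization:.0%}")   # 75%
pool.release(35)   # customer B is done; the capacity goes back to the pool
print(f"utilization: {pool.utilization:.0%}")   # 40%
```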

Rapid Elasticity
Elasticity is tightly related to the pooling of resources, and it allows you to easily expand and contract the amount of resources your application is using. The best part is that this expansion and contraction can be automated, which saves you money when your application is under light load and doesn't need many resources.

To achieve this elasticity in the traditional case, the process would look something like this: when the load on your application increases, you power up more machines and add them to the pool of servers that run your application; when the load decreases, you remove servers from the pool and power them off. Of course, we all know that nobody does this, because it is much more expensive to constantly add and remove machines from the pool; instead, everybody runs the maximum number of machines all the time, with very low utilization. And we all know that if the resource planning is not done right and the load on the application is so heavy that the maximum number of machines cannot handle it, the result is an increase in errors, dropped requests, and unhappy customers.

In the cloud scenario, where you can add and remove resources within minutes, you don't need to spend a great deal of time on capacity planning. You can start very small, monitor the usage of your application, and add more resources as you grow.
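The automated expand-and-contract behavior might be sketched like this. The per-server capacity, the target utilization and the daily load curve are all assumed values for illustration, not any provider's defaults:

```python
# Sketch of automated elasticity: a trivial autoscaler that sizes the
# fleet so per-server load stays at or below a target. Numbers invented.
import math

CAPACITY_PER_SERVER = 100   # requests/sec one server can handle (assumed)
TARGET_UTILIZATION = 0.7    # keep servers at ~70% load (assumed)

def desired_servers(load_rps: float) -> int:
    """Enough servers to keep utilization at or below the target."""
    return max(1, math.ceil(load_rps / (CAPACITY_PER_SERVER * TARGET_UTILIZATION)))

# Simulated load over a day: quiet overnight, peak in the afternoon.
hourly_load = [30, 20, 20, 40, 120, 300, 650, 820, 500, 250, 90, 40]

fleet = [desired_servers(load) for load in hourly_load]
for load, n in zip(hourly_load, fleet):
    print(f"load {load:4d} rps -> {n:2d} server(s)")
```

In the traditional model you would run the peak-hour fleet around the clock; here the fleet shrinks back automatically as soon as the load does.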

Measured Service
In order to make money, cloud providers need the ability to measure resource usage. Because in most cases cloud monetization is based on the pay-per-use model, they need to be able to give customers a breakdown of how much of which resources they have used. As mentioned in the NIST definition, this provides transparency for both the provider and the consumer of the service.

The ability to measure resource usage is important to you, the consumer of the service, in several different ways. First, based on historical data you can budget for the future growth of your application. It also allows you to budget more accurately for new projects that deliver similar applications. It is also important for application architects and developers, who can optimize their applications for lower resource utilization (in the end, everything comes down to dollars on the monthly bill).

On the other side, it helps cloud providers better optimize their datacenter resources and achieve higher density per unit of hardware. It also helps them with capacity planning, so that they don't end up with 100% utilization and no excess capacity to cover unexpected consumer growth.

Compare this to the traditional approach, where you never knew how much of your compute capacity was utilized, how much of your network capacity was used, or how much of your storage was occupied. In rare cases companies were able to collect such statistics, but those were almost never used to provide financial benefit to the enterprise.
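As a sketch of how metered usage turns into a pay-per-use bill, consider the following. The meter names, unit prices and usage records are hypothetical, but the shape (meter, quantity, unit price) is how pay-per-use billing generally works:

```python
# Sketch of a measured service: metered usage records aggregated into a
# pay-per-use bill. Meter names and unit prices are invented examples.

# Assumed price per unit of each metered resource.
PRICES = {
    "compute-hours": 0.12,      # $ per server-hour
    "storage-gb-months": 0.02,  # $ per GB-month stored
    "egress-gb": 0.09,          # $ per GB transferred out
}

# Usage records as a provider's metering system might emit them.
usage = [
    ("compute-hours", 340.0),
    ("storage-gb-months", 250.0),
    ("egress-gb", 120.0),
    ("compute-hours", 60.0),    # a second instance, same meter
]

def monthly_bill(records):
    """Aggregate usage per meter and price it out."""
    totals = {}
    for meter, amount in records:
        totals[meter] = totals.get(meter, 0.0) + amount
    return {meter: round(qty * PRICES[meter], 2) for meter, qty in totals.items()}

bill = monthly_bill(usage)
for meter, cost in bill.items():
    print(f"{meter:18s} ${cost:8.2f}")
print(f"{'total':18s} ${sum(bill.values()):8.2f}")
```

The same records serve both sides: the provider uses them to invoice, and you use them to spot which resource is driving the bill.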

Knowing those five essential characteristics, you should be able to recognize the "true" cloud offerings available on the market. In the next posts I will go over the service and deployment models for cloud computing.

Read the original blog entry...

About Toddy Mladenov

Toddy Mladenov has more than 15 years of experience in software development and technology consulting at companies like Microsoft, SAP and 3Com. Currently he is CTO of Agitare Technologies, Inc., a boutique consulting company that specializes in cloud computing and Big Data solutions. Before Agitare Tech, Toddy spent a few years with the PaaS startup Apprenda and more than six years working on Microsoft's cloud computing platform Windows Azure, as well as on Windows Client and MSN/Windows Live. During his career at Microsoft he managed different aspects of the software development process for Windows Azure and Windows Services. He also evangelized Microsoft cloud services among open source communities like PHP and Java. In the past he developed enterprise software for the German software giant SAP and several startups in Europe, and managed technical sales for 3Com in the Balkan region.

With his broad industry experience, international background and end-user point of view, Toddy has a unique approach to technology. He believes that technology should be developed to improve people's lives, and he is eager to share his knowledge on topics like cloud computing, mobile and web development.



Untitled Document
Cloud Expo - Cloud Looms Large on SYS-CON.TV


Cloud Expo 2013 East Opening Keynote by IBM
In this Cloud Expo Keynote, Danny Sabbah, CTO & General Manager, Next Generation Platform, will detail the critical architectural considerations and success factors organizations must internalize to successfully implement, optimize and innovate using next generation architectures.
Lisa Larson, Vice President of Enterprise Cloud Solutions of Rackspace Hosting Live From New York City
In the old world of IT, if you didn't have hardware capacity or the budget to buy more, your project was dead in the water. Budget constraints can leave some of the best, most creative and most ingenious innovations on the cutting room floor. It's a true dilemma for developers and innovators – why spend the time creating, when a project could be abandoned in a blink? That was the old world. In the new world of IT, developers rule. They have access to resources they can spin up instantly. A hybrid cloud ignites innovation and empowers developers to focus on what they need. A hybrid cloud blends the best of all worlds, public cloud, private cloud and dedicated servers to fit the needs of developers and offer the ideal environment for each app and workload without the constraints of a one-size-fits-all cloud.

Keynote: Driving Cloud Innovation: SSDs Change Cloud Storage Paradigm
Cloud is a transformational shift in computing that can have a powerful effect on enterprise IT when designed correctly and used to its full potential. Join Citrix in a discussion that centers on building, connecting and empowering users with cloud services and hear examples of how enterprises are solving real-world business challenges with an architecture and solution purpose-built for the cloud.

Go Beyond IaaS to Deliver "Anything As a Service"
Many organizations want to expand upon the IaaS foundation to deliver cloud services in all forms—software, mobility, infrastructure and IT. Understanding the strategy, planning process and tools for this transformation will help catalyze changes in the way the business operates and deliver real value. Join us to learn about the new ITaaS model and how to begin the transformation.


@CloudExpo Stories
VictorOps is making on-call suck less with the only collaborative alert management platform on the market. With easy on-call scheduling management, a real-time incident timeline that gives you contextual relevance around your alerts and powerful reporting features that make post-mortems more effective, VictorOps helps your IT/DevOps team solve problems faster.
Skeuomorphism usually means retaining existing design cues in something new that doesn’t actually need them. However, the concept of skeuomorphism can be thought of as relating more broadly to applying existing patterns to new technologies that, in fact, cry out for new approaches. In his session at DevOps Summit, Gordon Haff, Senior Cloud Strategy Marketing and Evangelism Manager at Red Hat, will discuss why containers should be paired with new architectural practices such as microservices ra...
Roberto Medrano, Executive Vice President at SOA Software, had reached 30,000 page views on his home page - http://RobertoMedrano.SYS-CON.com/ - on the SYS-CON family of online magazines, which includes Cloud Computing Journal, Internet of Things Journal, Big Data Journal, and SOA World Magazine. He is a recognized executive in the information technology fields of SOA, internet security, governance, and compliance. He has extensive experience with both start-ups and large companies, having been ...
The industrial software market has treated data with the mentality of “collect everything now, worry about how to use it later.” We now find ourselves buried in data, with the pervasive connectivity of the (Industrial) Internet of Things only piling on more numbers. There’s too much data and not enough information. In his session at @ThingsExpo, Bob Gates, Global Marketing Director, GE’s Intelligent Platforms business, to discuss how realizing the power of IoT, software developers are now focu...
Operational Hadoop and the Lambda Architecture for Streaming Data Apache Hadoop is emerging as a distributed platform for handling large and fast incoming streams of data. Predictive maintenance, supply chain optimization, and Internet-of-Things analysis are examples where Hadoop provides the scalable storage, processing, and analytics platform to gain meaningful insights from granular data that is typically only valuable from a large-scale, aggregate view. One architecture useful for capturing...
SYS-CON Events announced today that Vitria Technology, Inc. will exhibit at SYS-CON’s @ThingsExpo, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. Vitria will showcase the company’s new IoT Analytics Platform through live demonstrations at booth #330. Vitria’s IoT Analytics Platform, fully integrated and powered by an operational intelligence engine, enables customers to rapidly build and operationalize advanced analytics to deliver timely business outcomes ...
DevOps is about increasing efficiency, but nothing is more inefficient than building the same application twice. However, this is a routine occurrence with enterprise applications that need both a rich desktop web interface and strong mobile support. With recent technological advances from Isomorphic Software and others, it is now feasible to create a rich desktop and tuned mobile experience with a single codebase, without compromising performance or usability.
SYS-CON Events announced today Arista Networks will exhibit at SYS-CON's DevOps Summit 2015 New York, which will take place June 9-11, 2015, at the Javits Center in New York City, NY. Arista Networks was founded to deliver software-driven cloud networking solutions for large data center and computing environments. Arista’s award-winning 10/40/100GbE switches redefine scalability, robustness, and price-performance, with over 3,000 customers and more than three million cloud networking ports depl...
The speed of software changes in growing and large scale rapid-paced DevOps environments presents a challenge for continuous testing. Many organizations struggle to get this right. Practices that work for small scale continuous testing may not be sufficient as the requirements grow. In his session at DevOps Summit, Marc Hornbeek, Sr. Solutions Architect of DevOps continuous test solutions at Spirent Communications, will explain the best practices of continuous testing at high scale, which is r...
SYS-CON Events announced today that Open Data Centers (ODC), a carrier-neutral colocation provider, will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place June 9-11, 2015, at the Javits Center in New York City, NY. Open Data Centers is a carrier-neutral data center operator in New Jersey and New York City offering alternative connectivity options for carriers, service providers and enterprise customers.
Thanks to Docker, it becomes very easy to leverage containers to build, ship, and run any Linux application on any kind of infrastructure. Docker is particularly helpful for microservice architectures because their successful implementation relies on a fast, efficient deployment mechanism – which is precisely one of the features of Docker. Microservice architectures are therefore becoming more popular, and are increasingly seen as an interesting option even for smaller projects, instead of bein...
Security can create serious friction for DevOps processes. We've come up with an approach to alleviate the friction and provide security value to DevOps teams. In her session at DevOps Summit, Shannon Lietz, Senior Manager of DevSecOps at Intuit, will discuss how DevSecOps got started and how it has evolved. Shannon Lietz has over two decades of experience pursuing next generation security solutions. She is currently the DevSecOps Leader for Intuit where she is responsible for setting and driv...
The explosion of connected devices / sensors is creating an ever-expanding set of new and valuable data. In parallel the emerging capability of Big Data technologies to store, access, analyze, and react to this data is producing changes in business models under the umbrella of the Internet of Things (IoT). In particular within the Insurance industry, IoT appears positioned to enable deep changes by altering relationships between insurers, distributors, and the insured. In his session at @Things...
Even as cloud and managed services grow increasingly central to business strategy and performance, challenges remain. The biggest sticking point for companies seeking to capitalize on the cloud is data security. Keeping data safe is an issue in any computing environment, and it has been a focus since the earliest days of the cloud revolution. Understandably so: a lot can go wrong when you allow valuable information to live outside the firewall. Recent revelations about government snooping, along...
In his session at DevOps Summit, Tapabrata Pal, Director of Enterprise Architecture at Capital One, will tell a story about how Capital One has embraced Agile and DevOps Security practices across the Enterprise – driven by Enterprise Architecture; bringing in Development, Operations and Information Security organizations together. Capital Ones DevOpsSec practice is based upon three "pillars" – Shift-Left, Automate Everything, Dashboard Everything. Within about three years, from 100% waterfall, C...
PubNub on Monday has announced that it is partnering with IBM to bring its sophisticated real-time data streaming and messaging capabilities to Bluemix, IBM’s cloud development platform. “Today’s app and connected devices require an always-on connection, but building a secure, scalable solution from the ground up is time consuming, resource intensive, and error-prone,” said Todd Greene, CEO of PubNub. “PubNub enables web, mobile and IoT developers building apps on IBM Bluemix to quickly add sc...
Data-intensive companies that strive to gain insights from data using Big Data analytics tools can gain tremendous competitive advantage by deploying data-centric storage. Organizations generate large volumes of data, the vast majority of which is unstructured. As the volume and velocity of this unstructured data increases, the costs, risks and usability challenges associated with managing the unstructured data (regardless of file type, size or device) increases simultaneously, including end-to-...
The excitement around the possibilities enabled by Big Data is being tempered by the daunting task of feeding the analytics engines with high quality data on a continuous basis. As the once distinct fields of data integration and data management increasingly converge, cloud-based data solutions providers have emerged that can buffer your organization from the complexities of this continuous data cleansing and management so that you’re free to focus on the end goal: actionable insight.
Between the compelling mockups and specs produced by your analysts and designers, and the resulting application built by your developers, there is a gulf where projects fail, costs spiral out of control, and applications fall short of requirements. In his session at DevOps Summit, Charles Kendrick, CTO and Chief Architect at Isomorphic Software, will present a new approach where business and development users collaborate – each using tools appropriate to their goals and expertise – to build mo...
The Internet of Things (IoT) is causing data centers to become radically decentralized and atomized within a new paradigm known as “fog computing.” To support IoT applications, such as connected cars and smart grids, data centers' core functions will be decentralized out to the network's edges and endpoints (aka “fogs”). As this trend takes hold, Big Data analytics platforms will focus on high-volume log analysis (aka “logs”) and rely heavily on cognitive-computing algorithms (aka “cogs”) to mak...
Top Stories for Cloud Expo 2012 East

In this Big Data Power Panel at the 10th International Cloud Expo, moderated by Cloud Expo Conference Chair Jeremy Geelan, Govind Rangasamy, Director of Product Management at Eucalyptus Systems; Kevin Brown; CEO of Coraid, Inc.; Christos Tryfonas, CTO and Co-Founder of Cetas; and Max Riggsbee, CMO and VP of Products for WhipTail, discussed such topics as: Big Data has existed since the early days of computing; why, then, do you think there is such an industry buzz around it right now? How is Big Data impacting storage and networking architecture in data centers? How about the intersection of Big Data Analytics and Cloud Computing - how big a sector is that and why? What's the difference between Big Data and Fast Data? ... (more)

Best Recent Articles on Cloud Computing & Big Data Topics
As we enter a new year, it is time to look back over the past year and resolve to improve upon it. In 2014, we will see more service providers resolve to add more personalization in enterprise technology. Below are seven predictions about what will drive this trend toward personalization.
IT organizations face a growing demand for faster innovation and new applications to support emerging opportunities in social, mobile, growth markets, Big Data analytics, mergers and acquisitions, strategic partnerships, and more. This is great news because it shows that IT continues to be a key stakeholder in delivering business service innovation. However, it also means that IT must deliver new innovation despite flat budgets, while maintaining existing services that grow more complex every day.
Cloud computing is transforming the way businesses think about and leverage technology. As a result, the general understanding of cloud computing has come a long way in a short time. However, there are still many misconceptions about what cloud computing is and what it can do for businesses that adopt this game-changing computing model. In this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan, Rex Wang, Vice President of Product Marketing at Oracle, discusses and dispels some of the common myths about cloud computing that still exist today.
Despite the economy, cloud computing is doing well. Gartner estimates the cloud market will double by 2016 to $206 billion. The time for dabbling in the cloud is over! The 14th International Cloud Expo, co-located with 5th International Big Data Expo and 3rd International SDN Expo, to be held June 10-12, 2014, at the Javits Center in New York City, N.Y. announces that its Call for Papers is now open. Topics include all aspects of providing or using massively scalable IT-related capabilities as a service using Internet technologies (see suggested topics below). Cloud computing helps IT cut infrastructure costs while adding new features and services to grow core businesses. Clouds can help grow margins as costs are cut back but service offerings are expanded. Help plant your flag in the fast-expanding business opportunity that is The Cloud, Big Data and Software-Defined Networking: submit your speaking proposal today!
What do you get when you combine Big Data technologies….like Pig and Hive? A flying pig? No, you get a “Logical Data Warehouse.” In 2012, Infochimps (now CSC) leveraged its early use of stream processing, NoSQLs, and Hadoop to create a design pattern which combined real-time, ad-hoc, and batch analytics. This concept of combining the best-in-breed Big Data technologies will continue to advance across the industry until the entire legacy (and proprietary) data infrastructure stack will be replaced with a new (and open) one.
While unprecedented technological advances have been made in healthcare in areas such as genomics, digital imaging and Health Information Systems, access to this information has been not been easy for both the healthcare provider and the patient themselves. Regulatory compliance and controls, information lock-in in proprietary Electronic Health Record systems and security concerns have made it difficult to share data across health care providers.
Cloud Expo, Inc. has announced today that Vanessa Alvarez has been named conference chair of Cloud Expo® 2014. 14th International Cloud Expo will take place on June 10-12, 2014, at the Javits Center in New York City, New York, and 15th International Cloud Expo® will take place on November 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
12th International Cloud Expo, held on June 10–13, 2013 at the Javits Center in New York City, featured four content-packed days with a rich array of sessions about the business and technical value of cloud computing led by exceptional speakers from every sector of the cloud computing ecosystem. The Cloud Expo series is the fastest-growing Enterprise IT event in the past 10 years, devoted to every aspect of delivering massively scalable enterprise IT as a service.
Ulitzer.com announced "the World's 30 most influential Cloud bloggers," who collectively generated more than 24 million Ulitzer page views. Ulitzer's annual "most influential Cloud bloggers" list was announced at Cloud Expo, which drew more delegates than all other Cloud-related events put together worldwide. "The world's 50 most influential Cloud bloggers 2010" list will be announced at the Cloud Expo 2010 East, which will take place April 19-21, 2010, at the Jacob Javitz Convention Center, in New York City, with more than 5,000 expected to attend.
It's a simple fact that the better sales reps understand their prospects' intentions, preferences and pain points during calls, the more business they'll close. Each day, as your prospects interact with websites and social media platforms, their behavioral data profile is expanding. It's now possible to gain unprecedented insight into prospects' content preferences, product needs and budget. We hear a lot about how valuable Big Data is to sales and marketing teams. But data itself is only valuable when it's part of a bigger story, made visible in the right context.
Cloud Expo, Inc. has announced today that Larry Carvalho has been named Tech Chair of Cloud Expo® 2014. 14th International Cloud Expo will take place on June 10-12, 2014, at the Javits Center in New York City, New York, and 15th International Cloud Expo® will take place on November 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Everyone talks about a cloud-first or mobile-first strategy. It's the trend du jour, and for good reason as these innovative technologies have revolutionized an industry and made savvy companies a lot of money. But consider for a minute what's emerging with the Age of Context and the Internet of Things. Devices, interfaces, everyday objects are becoming endowed with computing smarts. This is creating an unprecedented focus on the Application Programming Interface (API) as developers seek to connect these devices and interfaces to create new supporting services and hybrids. I call this trend the move toward an API-first business model and strategy.
We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.
Register and Save!
Save $500
on your “Golden Pass”!
Call 201.802.3020
or click here to Register
Early Bird Expires June 10th.


Silicon Valley Call For Papers Now OPEN
Submit
Call for Papers for the
upcoming Cloud Expo in
Santa Clara, CA!
[November 5-8, 2012]


Sponsorship Opportunities
Please Call
201.802.3021
events (at) sys-con.com
SYS-CON's Cloud Expo, held each year in California, New York, Prague, Tokyo, and Hong Kong, is now in its 5th year and is the world's leading Cloud event, larger than all other Cloud events put together. For sponsorship, exhibit opportunities, and a show prospectus, please contact Carmen Gonzalez, carmen (at) sys-con.com.


New York City Expo Floor Plan Revealed
Cloud Expo New York
[June 11-14, 2012]

Floor Plan Revealed


Introducing Big Data Expo
There is little doubt that Big Data solutions will have an increasing role in the Enterprise IT mainstream over time. Get a jump on that rapidly evolving trend at Big Data Expo, which we are introducing in June at
Cloud Expo New York.

Follow @CloudExpo New York on Twitter


Testimonials
"Cloud Expo was a fantastic event for CSS Corp - we easily exceeded our objectives for engaging with clients and prospects."
AHMAR ABBAS
SVP, Global Infrastructure Management, CSS Corp.
 
"With our launch at Cloud Expo, we successfully transformed the company from a relatively unknown European player into the dominant player in the market. Our competitors were taken by surprise and just blown away. We got a huge number of really high quality leads..."
PETE MALCOLM
CEO, Abiquo
 
"We were extremely pleased with Cloud Expo this year - I’d say it exceeded expectations all around. This is the same info we got from partners who attended as well. Nice job!"
MARY BASS
Director of Marketing, UnivaUD
 
"Cloud Expo helps focus the debate on the critical issues at hand, in effect connecting main street with the next frontier."

GREG O’CONNOR
President & CEO, Appzero


Who Should Attend?
Senior Technologists, including CIOs, CTOs, VPs of technology, IT directors and managers, network and storage managers, network engineers, enterprise architects, communications and networking specialists, and directors of infrastructure; Business Executives, including CEOs, CMOs, CIOs, presidents, VPs, directors of business development, and product and purchasing managers.


Join Us as a Media Partner - Together We Can Rock the IT World!
SYS-CON Media has a flourishing Media Partner program in which mutually beneficial promotion and benefits are arranged between our own leading Enterprise IT portals and events and those of our partners.

If you would like to participate, please provide us with details of your website(s) and event(s) or your organization, including basic audience demographics as well as relevant metrics such as average page views per month.

To get involved, email Marilyn Moux at marilyn@sys-con.com.

@CloudExpo Blogs
Kenichi (Kevin) Mori is the director of Sony Electronics’ Security Systems Division, based in Park Ridge, New Jersey. He oversees business development and partnerships for the security group, which is part of Sony Electronics’ Professional Solutions of America group. He started his Sony career doing business-to-business sales and product marketing in China before moving to the security marketing division in Japan and then coming to the U.S. SecuritySolutionsWatch.com: Thank you for joining us today, Kevin. Before discussing Sony's security capabilities in greater detail, please tell us about...
One of the more interesting data points to come out of our State of Application Delivery 2015 was the overwhelming importance placed on availability - even over security. When respondents were asked which service they would not deploy an application without, they chose availability. Security came in a close second. This caused a great deal of discussion. After all, one of the most often cited impediments to adopting, well, everything has been and remains security. One would think, then, that security is top of mind and clearly a priority for everyone. Yet availability beat it out for what...
Our guest on the podcast this week is Mark Thiele, EVP of Data Center Technology at Switch. We discuss the idea that private clouds are often equated with do-it-yourself, why that should change, and how to make sure you are receiving the private environment you need at a cost that can support your business. Listen in to learn the different ways to own and manage a private cloud.
DevOps was created to reduce many of these same conflicts and while DevOps has had several high-profile successes it still presents a challenge for larger organizations. Large enterprises managing mission-critical systems still have separate silos for development and operations. In this post I discuss how DevOps fits into the enterprise and what release managers can do to adapt and extend DevOps to meet the challenges present in larger businesses. First, I’m going to define DevOps. Then I’m going to discuss the impedance mismatch between DevOps and a larger enterprise. In conclusion I’m going...
Experts predict that cloud computing will continue to grow this year. According to the Computerworld Forecast Study, spending on cloud computing is expected to rise by 42% in 2015. Moreover, Esna projects that in a bid to increase the use of mobile, social, customer-facing and collaboration technologies, 40% of IT teams will spend more on software as a service and a mix of public, private and hybrid solutions this year.
Creating global change that is actually good for the entire world is a mammoth task. With a population of almost 7 billion people as of 2015, the planet is straining under the burden of keeping everything running. What role can cloud computing play in making it easier for all of us?
DevOps is all about removing barriers to rapid, safe delivery of new experiences to your customers. Much of this revolves around automating error-prone, human-driven processes so that processes can be standardized, scaled, and varied programmatically. Some of the types of tools used in a DevOps-minded organization might include version control systems, automation servers, and configuration management systems. Many tools can be used across categories, with varying amounts of success. Some vendors offer products that claim to address all of these needs with one solution – most rarely deliver on ...
Application metrics, logs, and business KPIs are a goldmine. It’s easy to get started with the ELK stack (Elasticsearch, Logstash and Kibana) – you can see lots of people coming up with impressive dashboards, in less than a day, with no previous experience. Going from proof-of-concept to production tends to be a bit more difficult, unfortunately, and it tends to gobble up our attention, time, and money. In his session at DevOps Summit, Otis Gospodnetić, co-author of Lucene in Action and founder of Sematext, will share the architecture and decisions behind Sematext’s services for handling larg...
It’s become easy to monitor applications that are deployed on hundreds of servers – thanks to the advances in application performance management tools. But the more data you collect the harder it is to visualize the health state in a way that a single dashboard tells you both the overall status as well as the problematic component. Eugene Turetsky (Dynatrace) and Stephan Levesque (SSQ Financial Group) shared their solution for monitoring large IT infrastructures that contain several hundred components that support SSQ’s most-critical applications running on a variety of technology stacks incl...
We explore how retailer Columbia Sportswear has made great strides in improving their business results through modernized IT, and where they expect to go next with their software-defined strategy. To learn more about the new wave of IT, we sat down with Suzan Pickett, Manager of Global Infrastructure Services at Columbia Sportswear in Portland, Oregon; Tim Melvin, Director of Global Technology Infrastructure at Columbia, and Carlos Tronco, Lead Systems Engineer at Columbia Sportswear. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.
I recently had the opportunity to attend UI19 in Boston, a long-running conference focused on user experience design and ways to be more effective in a UX role as part of a larger team. One of the presentations in particular stuck with me as I returned to Boulder thinking about VictorOps and our evolution as an early stage startup. Kim Goodwin's talk on Principles, Values, and Effective Design Teams touched on a number of challenges we’ve experienced first-hand here at VictorOps as we strive to balance the delivery of a great product with the necessity to move quickly, while...
A friend of mine's son recently returned from an extended absence which basically removed him from nearly all aspects of technology, including the Internet, for a bit longer than 5 years. Upon return, observing him restore his awareness of technologies and absorb all things new developed over the past 5 years was both exciting and moving. To be fair, the guy grew up in an Internet world, with access to online resources including Facebook, Twitter, and other social applications. The interesting part of his re-introduction to the "wired" world was watching the comprehension flashes he went t...
Over the last couple of years I have talked to numerous enterprise customers, analysts, industry pundits, and others interested in cloud technologies, and one thing is abundantly clear – Platform-as-a-Service (PaaS) seems to mean different things to different people. But the term PaaS is irrelevant – it's just noise. What is relevant, and what is important, is what PaaS does: enable applications. That's what enterprises care about. They want to accelerate application development to get products to market faster and into users' hands sooner.
It’s easy to fall into a pattern of dysfunctional releases, release processes that are characterized by delay, inefficiency, and endless meetings that encourage people to view releases as a problem. These are the kinds of meetings that inspire references to the movie Office Space or emails that include clippings of the cartoon Dilbert - repetitive meetings to answer the same questions over and over again all because people lack the tools to connect the issue tracker with the change management systems. In organizations without a reliable process a release is also a time for production system o...
RealTime Medicare Data analyzes huge volumes of Medicare data and provides analysis to their many customers on the caregiver side of the healthcare sector using HP Vertica. Here to explain how they manage such large data requirements for quality, speed, and volume, we're joined by Scott Hannon, CIO of RealTime Medicare Data and he's based in Birmingham, Alabama. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.
Past SYS-CON Events
    Cloud Expo West: cloudcomputingexpo2011west.sys-con.com
    Cloud Expo East: cloudcexpo2011east.sys-con.com
    Cloud Expo West: cloudcomputingexpo2010west.sys-con.com
    Virtualization Expo West: virtualization2010west.sys-con.com
    Cloud Expo Europe: cloudexpoeurope2010.sys-con.com
    Cloud Expo East: cloudcomputingexpo2010east.sys-con.com
    Virtualization Expo East: virtualizationconference2010east.sys-con.com
    Cloud Expo West: cloudcomputingexpo2009west.sys-con.com
    Virtualization Expo West: virtualizationconference2009west.sys-con.com
    GovIT Expo: govitexpo.com
    Cloud Expo Europe: cloudexpoeurope2009.sys-con.com

Cloud Expo 2011 All-Star Conference Faculty

S.F.S. (Dell)
Singer (NRO)
Pereyra (Oracle)
Ryan (OpSource)
Butte (PwC)
Leone (Oracle)
Riley (AWS)
Varia (AWS)
Lye (Oracle)
O'Connor (AppZero)
Crandell (RightScale)
Nucci (Dell Boomi)
Hillier (CiRBA)
Morrison (Layer 7 Tech)
Robbins (NYT)
Schwarz (Oracle)

What The Enterprise IT World Says About Cloud Expo
 
"We had extremely positive feedback from both customers and prospects that attended the show and saw live demos of NaviSite's enterprise cloud based services."
  –William Toll
Sr. Director, Marketing & Strategic Alliances
Navisite
 


 
"More and better leads than ever expected! I have 4-6 follow ups personally."
  –Richard Wellner
Chief Scientist
Univa UD
 


 
"Good crowd, good questions. The event looked very successful."
  –Simon Crosby
CTO
Citrix Systems
 


 
"It's the largest cloud computing conference I've ever seen."
  –David Linthicum
CTO
Brick Group