From the Blogosphere
Back to Basics: The Theory of (Performance) Relativity
Choice of load balancing algorithms is critical to ensuring consistent and acceptable performance
By: Lori MacVittie
Nov. 16, 2012 10:00 AM
One of the primary reasons folks use a load balancer is scalability, with maintaining performance as a secondary driver. We all know the data exists to prove that "seconds matter", and today's web users have itchy fingers, ready to head for the competition the microsecond they experience any kind of delay.
Similarly, we know that productivity is inherently tied to performance. With more and more critical business functions "webified", the longer a page takes to load, the longer the delay a customer or help desk representative experiences, and the fewer calls or customers that can be serviced in any given period.
So performance is paramount; I see no reason to persuade you further to come to that conclusion.
Ensuring performance, then, is a vital operational directive. One of the ways operations tries to meet that objective is through load balancing. Distributing load ensures availability and, by increasing capacity, can offset latency (again, I don't think there's anyone who'll argue against the premise that load and performance degradation are inherently tied together).
But just adding a load balancing service isn't enough. The algorithm used to distribute load will invariably impact performance – for better or for worse.
Consider the industry-standard "fastest response time" algorithm. This algorithm distributes load based on the historical performance of each instance in the pool (farm). On the surface, this seems like a good choice. After all, what you want is the fastest response time, so why not base load balancing decisions on the metric against which you are going to be measured?
The answer is simple: "fastest" is relative. With very light load on a pool of, say, three servers, "fastest" might mean sub-second responses. But as load increases and performance decreases, "fastest" might start creeping up into the seconds – if not more. Sure, you're still automagically choosing the fastest of the three servers, but "fastest" is absolutely relative to the measurements of all three servers.
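To make the relativity concrete, here's a minimal Python sketch of that selection logic (not any particular vendor's implementation; the server names and timings are illustrative):

```python
# A minimal sketch of a "fastest response time" pick: the balancer tracks
# an average response time per instance and sends the next request to the
# current leader. Names and timings below are illustrative.

servers = {
    "app-1": 0.8,   # average response time in seconds
    "app-2": 1.2,
    "app-3": 0.9,
}

def pick_fastest(pool):
    # "Fastest" means lowest observed average relative to the pool,
    # not fast in any absolute sense. If every average degrades to
    # 5 seconds under load, this still dutifully returns a winner.
    return min(pool, key=pool.get)

print(pick_fastest(servers))  # app-1
```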
Thus, "fastest response time" is probably a poor choice if one of your goals is measured in response time to the ultimate customer – unless you combine it with an upper connection limit.
HOW TO USE "FASTEST RESPONSE TIME" ALGORITHMS CORRECTLY
One of the negatives of adopting a cloud computing paradigm with near-religious zeal is that you buy into the notion that utilization is the most important metric in the data center. You simply do not want to be wasting CPU cycles, because that means you're inefficient and not leveraging cloud to its fullest potential.
Well, horse-puckey. The reality is that 100% utilization and consistently well-performing applications do not go hand in hand. Period. You can have one, but not the other. You're going to have to choose which measurement is more important: fast applications or full utilization.
In the six years I spent load testing everything from web applications to web application firewalls to load balancers to XML gateways, one axiom always, always remained true:
As load increases, performance decreases.
You're welcome to test and retest and retest again to prove that wrong, but good luck. I've never seen performance increase or even stay the same as utilization approaches 100%.
Now, once you accept that reality, you can use it to your advantage. You know that performance is going to decrease as load increases; you just don't know at what point the degradation will become unacceptable to your users. So you need to test to find that breaking point. Stress the application, measure the degradation, and note the number of concurrent connections at which performance starts to degrade into unacceptable territory. That is your connection limit.
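A rough sketch of that kind of test follows; the URL, ramp steps, and the two-second threshold are all assumptions, so substitute your own application and your own definition of "unacceptable":

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://app.example.com/"   # hypothetical application under test
THRESHOLD = 2.0                   # assumed "unacceptable" response time, seconds

def timed_get(_):
    # Issue one request and return its wall-clock response time.
    start = time.perf_counter()
    urllib.request.urlopen(URL).read()
    return time.perf_counter() - start

# Ramp up concurrency and watch for the point where response time
# degrades past the threshold; that is the connection limit to record.
for concurrency in range(10, 501, 10):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        times = list(pool.map(timed_get, range(concurrency)))
    print(f"{concurrency} concurrent: worst {max(times):.2f}s")
    if max(times) > THRESHOLD:
        print(f"degradation point: roughly {concurrency} connections")
        break
```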
Keep track of that limit (per application, because not all applications will have the same limits). When you configure your load balancing service, you can now select fastest response time, but you also need to set hard connection limits on a per-instance basis. This prevents each instance from passing the load-performance confluence that causes end users to start calling the help desk or sighing "the computer is slow" while on the phone with their customers.
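Extending the earlier sketch, a per-instance cap might look like this (the limits and connection counts are illustrative, not recommendations): an instance at its limit simply isn't a candidate, so "fastest" can never push a server past its tested breaking point.

```python
pool = {
    # name: [avg response time (s), active connections, tested limit]
    "app-1": [0.8, 250, 250],   # at its limit: excluded from selection
    "app-2": [1.2, 90, 250],
    "app-3": [0.9, 250, 250],   # at its limit: excluded from selection
}

def pick(pool):
    # Only instances below their hard connection limit are candidates;
    # among those, choose the fastest.
    candidates = {k: v for k, v in pool.items() if v[1] < v[2]}
    if not candidates:
        raise RuntimeError("all instances at their connection limits")
    return min(candidates, key=lambda k: candidates[k][0])

print(pick(pool))  # app-2: the faster instances are already at their caps
```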
This means testing. Not once, not twice, but at least three runs. Make sure you've found the right load-performance confluence point and write it down. On your hand, in permanent marker.
While cloud computing and virtualization have certainly simplified load balancing services in terms of deployment, it's still up to you to figure out the right settings and configuration options to ensure that your applications are performing with the appropriate levels of "fast".