From the Blogosphere
Bare Metal Blog: Testing for Numbers or Performance?
What you test can say a lot about you
By: Don MacVittie
Dec. 14, 2012 10:00 AM
Along the lines of the first blog in the testing portion of the Bare Metal Blog series, I’d like to talk a bit more about how the testing environment, the device configuration, and the payloads translate into test results.
One of the problems most advanced mass education systems run into is standardized testing. While it is true that you cannot fix what you have not determined is broken, like most things involving people, testing students on specific areas of knowledge all but guarantees that those doing the teaching will err on the side of preparing students to take the test rather than to succeed in life. The mere fact that there IS a test changes what is taught. It is of course possible to make this a massively positive proposition by targeting the standardized tests at the most important things students need to learn, but for our discussion purposes the result is the same: students will be taught whatever is on that test first, and all else secondarily.
This is far too often true of vendor product testing also. The mere fact that the equipment will be tested, combined with how competitive most high-tech markets are, creates a pull toward tweaking the device (or the test) to maximize test performance, regardless of what real-world performance will be.
The most flagrant problem with testing today is a variant on an old theme. Way back when testing the throughput of network switches made sense, there was a lot of "packets per second" testing with no payload. That tests the ability of the switch to send packets to the right place, but does not test the device in a manner consistent with real-world usage of switches. Today we have a whole slew of similar tests for ADCs. The purpose of an ADC is to load balance, optimize, and if needed secure the passage of packets, primarily application traffic, because they are Application Delivery Controllers. But application traffic lives at layer seven, which means a test must exercise layer seven decision-making if the device is to be tested as it will be used in the real world. If the traffic is layer seven traffic but only layer four switching is performed on it, the test is completely useless for determining the actual capabilities of the device. And yet there is a lot of that type of testing going on out there right now. It's time – way past time – to drive ADC testing into the real world. Layer seven decision-making is much more complex and requires a deep look at the packets in question, meaning the results will not be nearly as pretty as simple layer four switching numbers. You cannot directly compare all of the optional features of two different ADCs, simply because the range of optional functionality is so broad once a solid ADC platform is deployed, but you can test the basic capabilities and responsiveness of the core products.
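To see why layer seven work costs more than layer four switching, consider a rough sketch of the two decisions a device might make per request. This is illustrative only: the function names, backend pools, and routing rule are invented for the example and do not represent any vendor's implementation.

```python
import hashlib

# Hypothetical backend pools for the sketch.
BACKENDS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
IMAGE_POOL = ["10.0.1.1", "10.0.1.2"]


def choose_backend_l4(src_ip, src_port, dst_ip, dst_port):
    """Layer 4: a cheap hash over the connection 4-tuple is enough."""
    key = f"{src_ip}:{src_port}-{dst_ip}:{dst_port}".encode()
    return BACKENDS[int(hashlib.md5(key).hexdigest(), 16) % len(BACKENDS)]


def choose_backend_l7(raw_request: bytes):
    """Layer 7: the device must parse the HTTP request before it can decide."""
    request_line, _, rest = raw_request.partition(b"\r\n")
    method, path, _version = request_line.split(b" ", 2)
    headers = {}
    for line in rest.split(b"\r\n"):
        if not line:
            break
        name, _, value = line.partition(b": ")
        headers[name.lower()] = value
    # Content-aware routing: static images go to a dedicated pool.
    pool = IMAGE_POOL if path.endswith((b".png", b".jpg")) else BACKENDS
    key = headers.get(b"host", b"") + path
    return pool[int(hashlib.md5(key).hexdigest(), 16) % len(pool)]


req = b"GET /img/logo.png HTTP/1.1\r\nHost: example.com\r\n\r\n"
print(choose_backend_l4("192.0.2.1", 51515, "203.0.113.9", 80))
print(choose_backend_l7(req))
```

Even in this toy form, the layer seven path has to read and parse the request bytes before it can route, which is exactly the work a payload-free layer four test never forces the device to do.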
And that is what we, as an industry, must begin to insist on. I used one single oddity in ADC testing here, but every branch of high-tech testing I've been involved in over the years – security, network gear, storage, application – has similar "this is not good enough" testing that we need to demand be dropped in favor of solid testing that reflects a real-world device. Not your real-world device, unless you are running the test lab, but a device that is seeing – and more importantly acting upon – data it will encounter in an actual network, doing the job it was designed for.
As I mentioned in the last testing installment, you can make an ADC look astounding if your tests don't actually force it to do anything. For our public testing, we have standards, and we offer up our configuration and testing goals on DevCentral. Whether you use them to validate the test results F5 publishes or to set up the tests in your own environment, publicly documenting how testing is performed is a big deal. Ask your vendor for the configuration files and test plan when numbers are tossed at you, and make certain you know what they tested when they try to impress you with over-the-top performance numbers. In my career, I have seen cases where "double the performance of our nearest competitor" was used publicly and was as close to an outright lie as possible, since the test and configuration differed between the two products the test claimed to compare.
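The principle above can be made concrete with a minimal, fully disclosed micro-benchmark: real HTTP requests carrying a real payload, with every parameter (request count, payload size, endpoint) stated up front so anyone can rerun it. This is a sketch under assumed values, not F5's published methodology; the payload size and request count are arbitrary choices a real test plan would justify.

```python
import threading
import time
import http.client
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Assumed "realistic" response body size for the sketch: 4 KB.
PAYLOAD = b"x" * 4096
N = 200  # total requests; a real plan would state and justify this


class Handler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # keep-alive, like real app traffic

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.end_headers()
        self.wfile.write(PAYLOAD)

    def log_message(self, *args):
        pass  # keep the benchmark output clean


# Stand-in origin server on an ephemeral loopback port.
server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

conn = http.client.HTTPConnection("127.0.0.1", port)
start = time.perf_counter()
for _ in range(N):
    conn.request("GET", "/index.html")
    body = conn.getresponse().read()
    assert len(body) == len(PAYLOAD)  # payload actually traversed the path
elapsed = time.perf_counter() - start
conn.close()
server.shutdown()

print(f"{N} requests in {elapsed:.2f}s -> {N / elapsed:.0f} req/s")
```

The point is not the number it prints; it is that every knob is visible in the script, which is exactly what you should demand of any vendor benchmark before trusting it.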
When you buy any form of datacenter equipment, you're going to be stuck with it for a good long while. Make certain you know how the testing that informs your decision was performed, no matter who did it. Independent third-party testing sometimes isn't so independent, and knowing that can make you more cautious before saddling your company with gear you'll have to live with.
Technorati Tags: Testing,Application Delivery Optimization,Bare Metal Blog,F5 Networks,Don MacVittie