Bare Metal Blog: Introduction to FPGAs
FPGAs change a lot. Here’s why they’re a big deal
By: Don MacVittie
Nov. 17, 2012 09:00 AM
We’re having all of our sidewalks redone right this instant. In fact, I’ll include a picture of the “pavers” – the fancy new word for the stones used to build the sidewalk. If the construction and design team does something wrong, it will cost them a pretty penny to come back out, rip up the pavers (and the columns or knee wall they’re putting in with the pavers on the patio), and move things around or replace pavers to make it right. We hired a great company that has done good work for us in the past, so I’m not terribly worried about this possibility. It happens in construction, but it happens a lot less with a reputable installer.
It does offer a solid analogy for introducing Field Programmable Gate Arrays (FPGAs), though. Before there were FPGAs, most hardware shipped with a well-defined, unchangeable logic path. It did what it did, and if the hardware designers made a mistake in an increasingly complex product, you were stuck with the results. Some devices shipped with reprogrammable EEPROMs, but the vast majority of hardware had no way to be updated. If a bug appeared, you lived with it, or the vendor took the very expensive step of replacing the hardware. Much like what happens when pavers are installed incorrectly. The difference, of course, is that you can look at pavers and judge whether the work is right, while hardware needs to be run – and run a lot – before weaknesses show. Kind of like the case where pavers are laid down but the material underneath them is not properly prepared: the next spring you can expect a jungle to grow up between the pavers, but until then they look nice.
EEPROMs (Electrically Erasable Programmable Read-Only Memory) and then FPGAs brought the ability to fix bugs in the field into the realm of hardware. As FPGAs progressed and became more complex, even real-time (as in on-the-fly) updating became a possibility. At this point, a single FPGA can hold millions of logic elements, and they’re used in a wide variety of devices. If you’ve ever “flashed the ROM” or “updated firmware,” there is a good chance you’ve been updating the configuration that an FPGA in the device loads (though of course, these terms are vague enough that it could be other things you’re updating too).
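As a rough mental model (my own illustration, not tied to any particular vendor’s device), each logic element in an FPGA is essentially a small lookup table (LUT) whose contents can be rewritten: “programming” the chip largely means filling in thousands of these truth tables. A few lines of Python can sketch the idea:

```python
# Illustrative model only: a 4-input lookup table (LUT), the rough
# equivalent of one FPGA logic element. Reprogramming the device
# amounts to loading new truth-table contents, not changing wiring.

class LUT4:
    def __init__(self, truth_table):
        # truth_table: 16 output bits, one per combination of 4 inputs
        assert len(truth_table) == 16
        self.table = list(truth_table)

    def evaluate(self, a, b, c, d):
        index = (a << 3) | (b << 2) | (c << 1) | d
        return self.table[index]

# Configure the element as a 4-input AND gate...
and4 = LUT4([0] * 15 + [1])
assert and4.evaluate(1, 1, 1, 1) == 1
assert and4.evaluate(1, 0, 1, 1) == 0

# ...then "reprogram" the very same element as a 4-input XOR,
# with no physical change to the hardware.
xor4 = LUT4([bin(i).count("1") % 2 for i in range(16)])
assert xor4.evaluate(1, 0, 0, 0) == 1
assert xor4.evaluate(1, 1, 0, 0) == 0
```

Real logic elements add registers, carry chains, and routing fabric on top of this, but the LUT is why the same silicon can become a different circuit after a field update.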
But the power of updating on-the-fly is huge – if for nothing else than prototyping and training. Need to teach people hardware design? How better than on a device that you can program, test, reprogram, test again… Indeed, for at-home use (having nothing to do with F5, just one of my many geek toys), I use an Actel FPGA to set up complex circuits. Actel is now part of Microsemi, but I haven’t dealt with them since the change, so I don’t know any details there. But for designing circuits, you can’t beat it. I’ve abused mine, and it still does what I tell it to. Note I said “what I tell it to,” not “what I expect it to”… I’m not a professional at FPGA programming, but it is a lot of fun.
But in a professional setting, the power is even greater. Not only can you train staff in FPGA programming and prototype solutions with FPGAs, you can also ship products with FPGAs installed. That means a huge percentage of the logic that makes a device go can be updated as needed. This helps the vendor by giving them a path to fixing logic errors that were not discovered before ship time (say, because the error is not obvious until the device is under massive load for a long period of time). It helps the customer by giving them an obsolescence-resistant product. If the logic of the hardware can be updated, then the device is much more forward-compatible than one that cannot be. When an FPGA can have 500,000 to millions of logic elements on it, the level of re-programmability becomes amazing. No support for the newest standard that impacts your device? Download the update, and BAM! You’ve got support for a standard that might not have even existed when your device was originally designed.
This does, of course, come with some risks. A part of your system that was stable forever now has changes introduced to it dynamically, but most reputable vendors have tools, processes, and safeguards in place to protect their customers from hardware problems bringing down the entire system. I can’t speak for everyone – in fact, at this instant I can’t even authoritatively speak for F5 – but this next week I’ll be talking to the hardware folks about what we do, and the next two installments in this blog will cover both what we do with FPGAs and how we protect our customers.
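I can’t say what any particular vendor’s safeguards look like, but one common pattern is to verify an update image’s integrity before ever touching the device, and to keep a known-good image to fall back on. Here is a hypothetical sketch – the function names and flow are my own illustration, not any vendor’s actual process:

```python
# Hedged sketch of a field-update safeguard: refuse any image whose
# digest doesn't match the vendor manifest, so a corrupted or tampered
# download never reaches the hardware. Names here are hypothetical.

import hashlib

def verify_image(image_bytes, expected_sha256):
    """Return True only if the image's SHA-256 matches the manifest."""
    return hashlib.sha256(image_bytes).hexdigest() == expected_sha256

def apply_update(image_bytes, expected_sha256, program_device):
    if not verify_image(image_bytes, expected_sha256):
        raise ValueError("update image failed integrity check; keeping current logic")
    # Only program the device once the image checks out. Real systems
    # typically also read back and verify after writing, and can boot
    # from a factory "golden" image if configuration ever fails.
    program_device(image_bytes)

# Usage, with a stand-in for the actual device-programming step:
image = b"example bitstream contents"
digest = hashlib.sha256(image).hexdigest()
apply_update(image, digest, program_device=lambda img: None)
```

The point is simply that the update path itself can be engineered so a bad download degrades gracefully instead of bricking the device.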