
Cloud computing is a logical evolution of connectivity, software abstraction and commoditized computing power. It was predicted in 1961 by artificial intelligence expert John McCarthy, who envisioned utility computing with “time-shared computers” supplying the logic for “book-size home information terminals.” Like so many other developments, cloud computing simply seems to make sense, even if the steps that paved the way didn’t always.

In 1999, when VMware resuscitated IBM’s virtual machine concept and applied it to the problematic x86 architecture, who cared that most x86 servers were only running at 10% capacity? Today, virtualization is key to data center optimization.

And remember how those Salesforce.com ads with the red line through the word “software” seemed an annoying rehash of 1990s application service providers and 1970s service bureaus? Nearly 93,000 customers later, software-as-a-service is driving interest in the cloud market.

And in mid-2006, didn’t Amazon’s leasing of its excess data center capacity on a utility computing basis pique the interest of only webmasters? Yet it didn’t take long for Amazon to lead the infrastructure-as-a-service charge.

As automatic and incremental as its horizontal crawl has been—and as overhyped as it has been during these depressed economic times—cloud computing has had an undeniable effect on data centers, software shops, major vendors, the U.S. government and Microsoft-ad-aware “to the cloud” consumers.

Start with hosting giant Rackspace. It had to turn on a dime in mid-2009, using agile techniques when recession pressures and new acquisitions forced internal software development to switch its hosting priorities to cloud solutions. “We changed from supporting large numbers of customized solutions to supporting really large numbers of standardized solutions,” said Troy Toman, director of software development at Rackspace.

Or look at Perforce Software, whose lean configuration management tools aren’t yet geared for the cloud. That hasn’t deterred customers from asking the proverbial “Are we there yet?” question. “I’ve been with Perforce for 11 years,” said Tony Smith, Perforce’s European technical director, responsible for the U.K. engineering team developing cloud solutions.

“For most of that time, you could count on one hand the number of requests we got for a managed solution. Nowadays they’re coming thick and fast. People are much more willing to adopt the software-as-a-service model, thanks to the success of companies like Workday and Salesforce.”

And what of Big Blue or Redmond? Both companies have platform-as-a-service plays as well as private cloud appliances. IBM launched its Smart Business Development and Test Cloud in 2009, while Microsoft commercialized Windows Azure and SQL Azure in 2010. They’ve tinkered with internal strategy and the sizes of their offerings, trying to home in on what developers and enterprises want from the cloud. But the behemoths are understandably wary of what the overall cloud market could mean to their existing licensed software businesses.

“Everybody who sells hardware and software, like IBM and Hewlett-Packard, you notice they are slow moving into the cloud,” said David Linthicum, author of “Cloud Computing and SOA Convergence in Your Enterprise.”

“That’s because IBM’s going to cannibalize their own sales if they put WebSphere into the cloud. That’s why they’re promoting the use of private clouds and why the private cloud space is booming.”

Last and largest, there is the federal government. Launched in 2009, the Obama administration’s Federal Cloud Computing Initiative and online cloud computing storefront were the first salvos.

“The U.S. Government is the largest buyer of IT on the planet,” said Vivek Kundra, former federal CIO, in a July 2010 Congressional hearing on moving costly federal IT systems into the cloud. “We spend approximately US$80 billion annually on information technology systems.”

According to Kundra, the number of federal data centers has gone from 432 to 1,100 in a decade. “That is not sustainable in the long-term as we continue to plow capital in data center after data center,” he said.

While no one expects the U.S. government migration to public, private or hybrid utility computing models to be nimble or quick, the imperative has spurred a wave of procurements (including $2.5 billion for consolidated e-mail services), and savings, such as plans to close nearly 100 data centers this year. That’s had a ripple effect, with the General Services Administration vetting vendors both familiar and “disadvantaged” to offer on-demand IT resources through Apps.gov. These include Amazon, AT&T, Autonomic Resources, Computer Literacy World, Computer Technologies Consultants, Dell, Alaskan-native-owned Eyak Technology, General Dynamics Information Technology, Microsoft, Savvis and Verizon.

Blurring the view
“ ‘Cloud’ is, shall we say, not exactly the most well-defined term in the world,” said Slashdot cofounder Jeff Bates, now the head of Perforce’s cloud and community initiatives. “There are lots of different ways our customers describe their interest. To some, it means Perforce is hosted at their company, but we’re somehow doing the maintenance—but we’ve offered that for years; it’s P4Admin. Others want private cloud, or are interested in cloud bursting that may or may not be related to Perforce at all.”

Linthicum concurred: “We’re kind of in a hysteria now where customers are looking for apps to be delivered in ways they think they should be based on the hype they’re hearing.”

Semantic subtleties aside, there’s no denying application development will change. As with the Internet, cloud computing puts individual developers and small businesses on par with multinational corporations, and in an agile world, smaller entities may be better prepared to leverage the cloud. Microsoft, for instance, recommends against migrating existing applications to Azure in most cases. It simply doesn’t make sense yet to take significant investment in technology and legacy software off-site.

And cloud computing may be poorly defined in terms of market size thanks to the inclusion of private clouds, which sometimes do nothing more than leverage virtualization on existing internal IT resources. A good guess at the percentage of all applications that are currently cloudy? Ten percent.

“Some would say that an enterprise virtualizing its existing IT resources doesn’t make any difference in what we call cloud computing. But that’s only if you’re trying to define the market size,” said John Rhoton, author of “Cloud Computing Architected” and “Cloud Computing Explained: Implementation Handbook for Enterprises.”

“I don’t mind it because I see it more as a journey. Cloud is going to gain, but it’s never going to be a point where it takes over existing IT.”

“I don’t think it will be significantly above 50% in the next 10 years,” said Linthicum, which would also explain why he sees platform-as-a-service as destined to be a niche market. “PaaS is fairly new and not a huge deal. A lot of people are using it, but it’s not a huge inflecting market. If you consider it’s aimed at software developers, you realize it’s like the software development tools market, and it will grow in a similar way.”

It’s not just about infrastructure
But software developers can’t complain that the cloud doesn’t portend big changes. It is a revolutionary convergence of technologies, policies, standards and architectures. In fact, rumors that the utility computing model spells developers’ demise are greatly exaggerated; they may be the ones best poised to evolve the cloud market.

It’s been 16 years since the Internet began to have a visible effect on the world; last year marked that point for the cloud. According to a 2010 PricewaterhouseCoopers report (“The cloud you don’t know: An engine for new business growth” by Vinod Baya, Bud Mathaisel and Bo Parker), the big opportunity for the cloud isn’t IT infrastructure; it’s the extensible enterprise. In that sense, Amazon EC2 is an example of exposing latent value to other businesses, which symbiotically built on that infrastructure, drove additional traffic to it and created new value around it.

That’s why a key requirement for cloud-based applications is to “architect for scalability,” said Rhoton. “Even if you think your app is only going to be serving a few thousand users, you never know where it’s going to go. But if it’s successful, people tend to repurpose it.

“So it’s wise to implement in a way that will allow it to self-scale easily in two directions: one, for a whole bunch of users, and two, for multi-tenancy, such as multiple admin domains, different security boundaries, or catering to different user groups.”

Elasticity is not optional, said Rhoton. “If you look at any successful app today, the people who developed it didn’t know it was going to be that successful.”
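As a rough illustration of Rhoton’s two scaling directions, here is a minimal sketch, assuming a stateless service in which every request carries a tenant identifier; TenantStore and handle_request are illustrative names, not part of any particular framework:

```python
# A minimal sketch of scaling in the two directions Rhoton describes:
# stateless handlers that can be replicated for more users, and tenant-keyed
# data access so separate admin domains or security boundaries stay isolated.
# TenantStore and handle_request are illustrative, not a real API.
from dataclasses import dataclass, field

@dataclass
class TenantStore:
    # One logical partition per tenant; in a real system each partition
    # might live in a separate database, schema or security domain.
    partitions: dict = field(default_factory=dict)

    def for_tenant(self, tenant_id: str) -> dict:
        return self.partitions.setdefault(tenant_id, {})

def handle_request(store: TenantStore, tenant_id: str, key: str, value: str) -> str:
    # Stateless apart from the store, so any number of identical instances
    # can sit behind a load balancer and serve any tenant's request.
    data = store.for_tenant(tenant_id)
    data[key] = value
    return f"{tenant_id}:{key}={value}"
```

Because the handler holds no per-user state of its own, adding capacity is a matter of running more copies; because every read and write is keyed by tenant, the same code can serve multiple administrative domains without them bleeding into one another.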

Breathing new life into services
Another key characteristic is that this revolution expands the “network of networks” into a network of business platforms. Successfully exposing solutions, be they insurance risk calculations or travel reservation engines, requires modular logic that can be decoupled from other internal processes. It’s no wonder that an acronym has been revived in the cloud hype-cycle: SOA, or service-oriented architecture.

In his “Cloud Computing and SOA Convergence in Your Enterprise,” Linthicum defines SOA as “a strategic framework of technology that allows all interested systems, inside and outside of an organization, to expose and access well-defined services, and information bound to those services, that may be further abstracted to process layers and composite applications for solution development. In essence, SOA adds the agility aspect to architecture, allowing us to deal with system changes using a configuration layer rather than constantly having to redevelop these systems.”

“SOA has been modernized and localized for cloud computing,” said Linthicum. “For many moving into the cloud, it’s the first time they’ve heard of SOA. They are looking for guidance to get from point A to B.”

As he expands on the definition in the book, he hits upon some key benefits of SOA:

• Reuse and “leveraging remote application behavior as if it existed locally”
• Business process agility
• Monitoring of “points of information and points of service, in real time, to determine the well-being of an enterprise or trading community”
• Extensibility for inter-enterprise collaboration or process-sharing

These sound like excellent points for any modern application. Just as agility depends heavily on metrics as well as iterative product discovery, cloud computing should use information to validate or evolve offerings.
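One way to picture the first of those benefits is a thin local wrapper around a remote, well-defined service. The sketch below is hypothetical; the risk-scoring service, its URL and its JSON contract are invented for illustration:

```python
# A sketch of consuming a well-defined remote service as if it existed
# locally, per the SOA benefits above. The service, URL and JSON contract
# are hypothetical examples, not a real API.
import json
import urllib.request

class RiskService:
    """Thin local wrapper around a remote insurance risk calculation."""

    def __init__(self, base_url: str):
        self.base_url = base_url

    def score(self, policy: dict) -> float:
        body = json.dumps(policy).encode("utf-8")
        req = urllib.request.Request(
            f"{self.base_url}/risk/score",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            return float(json.load(resp)["score"])

# Callers treat it like any local object:
# risk = RiskService("https://risk.example.com")
# print(risk.score({"age": 40, "vehicle": "sedan"}))
```

The caller never sees whether the calculation runs in-house or in someone else’s cloud, which is exactly the decoupling that lets the logic be recomposed into new process layers.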

Perhaps the most compelling SOA concept Linthicum espoused is that of governance, which itself can become a service to be consumed, and which is in sore need of updating, as pundits have noted. (The American Institute of Certified Public Accountants has a new Service Organization Control framework that replaces the early 1990s-era SAS 70 risk assessment for cloud computing.)

This, too, could become an opportunity for collective intelligence to rule, Linthicum proposed. Similar to the “given enough eyeballs, all bugs are shallow” tenet of open-source software development, in cloud governance, “Design patterns are already defined for you around specific categories of services, and pre-built policies exist around the operation of those services,” he wrote in “Cloud Computing and SOA Convergence.”

“In short, you are taking advantage of the community aspect of SOA governance delivered as a service to do most of the work for you—100,000 heads are better than one.”

Platform proliferation
It’s fascinating to watch the platform plays emerging from the ISV space. Perforce, for example, has put free Amazon Machine Images on EC2 for trial use purposes. As it looks for ways to broaden that offering, it seeks feedback from customers as to what tools could be shrink-wrapped with Perforce-as-a-Service.

Cloud-based storage could ameliorate long-distance file-transfer latency. But a cloud offering would likely appeal to only a minority of customers, since it could not support a 4,000-user installation.

“Our cloud machine is best for handling a few hundred users—two to three hundred,” said Perforce’s Smith. “It has to do with the latency of the disk I/O.”

Electric Cloud is an ISV whose name predates cloud computing. Its niche parallel build solution (ElectricCommander 3.8) is being rebranded for private clouds. The workflow and task automation engine sequences and runs test and build tasks across parallel workloads. What’s not clear is whether this differs technically from the company’s existing solution, but perhaps cloud computing will be the big break this vendor has been waiting for.

Oracle, for its part, has staked a claim to vertical markets with a platform play around healthcare and life sciences. Oracle Health Sciences Cloud caters to researchers while maintaining privacy and security around patient data and HIPAA compliance. It includes software (Fusion Middleware) and hardware (the Oracle Exadata Database Machine and ZFS Storage Appliance). The SaaS portion includes services for electronic data capture, electronic patient-reported outcomes, study design, coding and dictionary management, trial randomization, drug supply management, research clinic automation, and safety management.

In the consumer space, Google’s open-source Android platform has been highly effective in achieving smartphone market share, but Apple has responded with a cloud storage offering: iCloud, which includes iTunes services. Lesser-known HipServ is a software platform for the home cloud media market with over 400,000 home users in 130 countries.

As indicated by its “to the cloud” marketing campaign, Microsoft has kept busy with multiple consumer offerings: Windows Phone 7, for example, will integrate messages seamlessly across SMS, social networks and chat; offer location-aware services; and allow knowledge workers to multitask and synchronize files across devices. The Microsoft Xbox Kinect may foster a thriving ecosystem for game programming.

Amazon, too, plays well with consumers. The Kindle looks to continue to grow in features, and it is coupled with the company’s incredibly long tail of content. Upcoming consumer cloud apps may involve practically sentient appliances that use the cloud as a hub (TVs that turn off when you leave the room, or teapots that turn on when you wake) and smart houses (A/C, lighting, security). It’s worth noting, however, that futurist McCarthy pooh-poohed the value of “toasters having websites.” It remains to be seen which of these consumer cloud integrations will take off and which will be too costly or annoying to implement.

Then there are all the developer-focused PaaS offerings: CloudBees, DotCloud, Heroku and the like. These support Java, .NET, Ruby on Rails and other languages and frameworks, and they supply developers with all necessary databases, servers, networks and tools—no installation or maintenance required.
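That hands-off model rests on a few simple conventions. Here is a minimal sketch, assuming the common (but not universal) convention that the platform passes the listening port in a PORT environment variable:

```python
# A minimal sketch of an app written for a generic PaaS: the platform
# provisions servers, network and add-on databases, and the app simply
# binds to whatever port the platform assigns. The PORT convention is an
# assumption; check your provider's documentation.
import os
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # Trivial handler; the developer supplies only this application code.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from the platform\n"]

if __name__ == "__main__":
    port = int(os.environ.get("PORT", "8000"))  # assumed platform-assigned port
    make_server("", port, app).serve_forever()
```

Everything else (routing, load balancing, database connection strings) arrives as configuration from the platform rather than as infrastructure the developer installs.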

Learning from history
Clearly, many cloud innovations will become as ubiquitous and automatic as the Web is today. But there are plenty of risks, too.

According to an old story about the dot-com boom, a clothing retailer was admonished by venture capitalists for having a 1-800 number on its marketing materials. “Take that off, it looks dated. All you need is a website,” they said. But the retailer stuck with a multi-modal approach to taking catalog orders and watched as others prematurely banked on consumers having an instant comfort level with online shopping. The cloud is heavy with such silver-lined thinking.

The cloud simply may not make sense for I/O-intensive operations. Database sharding is a concern, and the entire data model for cloud computing is evolving, with SQL Azure in the lead but by no means the last word on the subject. Indeed, it’s worth revisiting Peter Deutsch’s classic fallacies of distributed computing (a short defensive-coding sketch follows the list):
• The network is reliable
• Latency is zero
• Bandwidth is infinite
• The network is secure
• Topology doesn’t change
• There is one administrator
• Transport cost is zero
• The network is homogeneous
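A short defensive-coding sketch against the first two fallacies, using only the Python standard library (the function name and parameters are illustrative):

```python
# A minimal sketch: assume every remote call can fail or stall, so wrap it
# in a timeout and retry with exponential backoff rather than trusting that
# "the network is reliable" or "latency is zero".
import time
import urllib.request

def fetch_with_retries(url, attempts=4, timeout=5.0, base_delay=0.5):
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError:  # covers URLError and socket timeouts
            if attempt == attempts - 1:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** attempt))  # back off: 0.5s, 1s, 2s...
```

None of this makes the network reliable; it just keeps one flaky call from taking the rest of the application down with it.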

Of course, maturity is an issue. “If you look at the problems people are running into, especially with Azure or Google App Engine, they’re still pretty immature,” said Rhoton. Unplanned outages or requests not completing require a new architectural perspective. “You have to plan for components of your app going down. You have to have redundancy and loose enough coupling so mistakes don’t propagate to all other modules. Since the platform you’re running on is sort of opaque, you get results back that just tell you things aren’t working.”

As a result, he said, many developers take the approach of not even bothering to discover what has gone wrong. “If you run into the problem, you shoot down the instance and start up another. If you have a complex app running on Amazon, some of your components will stall or just won’t work. It may be difficult to diagnose why. The key isn’t to try to troubleshoot everything to the extreme.”
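A rough sketch of that disposable-instance approach follows. CloudClient, provision(), terminate() and is_healthy() are hypothetical stand-ins for whatever API a real provider SDK exposes:

```python
# A sketch of "shoot down the instance and start up another": rather than
# diagnosing a sick instance in place, terminate it and launch a fresh one.
# CloudClient and its methods are hypothetical, not a real provider API.
class CloudClient:
    def provision(self, image: str) -> str: ...
    def terminate(self, instance_id: str) -> None: ...
    def is_healthy(self, instance_id: str) -> bool: ...

def keep_fleet_healthy(cloud: CloudClient, instance_ids: list, image: str) -> list:
    healthy = []
    for instance_id in instance_ids:
        if cloud.is_healthy(instance_id):
            healthy.append(instance_id)
        else:
            # Replace rather than troubleshoot: the opaque platform rarely
            # tells you why a component stalled, so recycle it.
            cloud.terminate(instance_id)
            healthy.append(cloud.provision(image))
    return healthy
```

The point isn’t that failures don’t matter, but that recovery is cheaper than root-cause analysis when the layers below you are opaque.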

That said, transparency—from the cloud provider to you, and from you to your users—is key when things go awry.

What’s next
There were two main types of reaction after the big Amazon outage in the spring: First, cloud computing is less predictable than we thought; and second, it is fundamentally flawed. However, the outage and resultant less-than-1% data losses made two things clear: First, redundancy and regional isolation are critical if massive cloud vendors are to withstand outages; and second, a significant number of application developers have not architected well enough to avoid being felled by outages, attacks or data center server seizures by the FBI.

If this redundancy is so hard to come by, can application-level security be much of a priority either? Developers can’t use cloud vendor security and load-balancing as an excuse to avoid intelligent application design in the first place.

“We’ve had our first big hiccup this year with Amazon,” said Perforce’s Smith. “Netflix is the one that survived well because they hedged their bets and were spread over two clouds: Amazon and Rackspace. They also used their Chaos Monkey during tests. In the next 18 months, there will be a big security scare—Amazon’s data center will be rich pickings.”

But, he said, once that debacle is behind us, cloud computing will have achieved another level of maturity.

In the meantime, the application development life cycle will continue to adapt to the cloud, as will the tools and processes that enterprises should be aware of as they embrace multi-tenancy.