With the growing acceptance of cloud computing as the next disruptive technology after the Internet, sensationalism around data-center colocation risks abounds (though stories about FBI raids and seizures often apply to traditional Web-hosting setups, not utility computing).

That said, a recent seizure of data-center servers that were leased to a reseller made it clear that how vendors communicate these problems to their customers is of the utmost importance. Two small, affected websites offering bookmarking services (Instapaper and Pinboard) took markedly different approaches to blogging about the problems. The latter was clear, conciliatory and offered information on how to close out accounts, given that user data was possibly in the hands of the FBI (though the seizure had nothing to do with either service). The former was pugnacious and rambling.

Granted, neither party really knew the why or how of the seizure, and DigitalOne, the Web-hosting reseller, didn’t seem to know much more. Ultimately, while these events may be rarer than power or network outages, they require the same architectural concern for redundancy and failover.

Pinboard has demanded to see the FBI’s warrant to confirm that the seizure of an enclosure that included its main database server had nothing to do with the site. It also promises to make the site, whose design is already simple (scripted in PHP and Perl, with MySQL for data storage, Sphinx for search, and Amazon S3 for backups), more resilient to data-center downtime. On the plus side, the raid has brought Pinboard more press and prominence than it might otherwise be receiving.
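
The backup piece of that design is what matters most in a seizure scenario. Below is a minimal sketch of the idea: dump the database, compress it, and push a copy offsite to S3. It is written in Python with the boto3 library and uses hypothetical names purely for illustration (Pinboard’s actual scripts are in PHP and Perl):

import datetime
import gzip
import subprocess

import boto3  # assumes AWS credentials are already configured

DB_NAME = "bookmarks"          # hypothetical database name
BUCKET = "example-db-backups"  # hypothetical S3 bucket

def backup_to_s3():
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
    dump_path = f"/tmp/{DB_NAME}-{stamp}.sql.gz"

    # --single-transaction takes a consistent snapshot of InnoDB tables
    # without locking the live site.
    dump = subprocess.run(
        ["mysqldump", "--single-transaction", DB_NAME],
        stdout=subprocess.PIPE, check=True,
    )
    with gzip.open(dump_path, "wb") as f:
        f.write(dump.stdout)

    # The offsite copy survives the loss (or seizure) of the primary rack.
    boto3.client("s3").upload_file(dump_path, BUCKET, f"{DB_NAME}/{stamp}.sql.gz")

backup_to_s3()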

When it comes to data-center assets, however, enforcers are interpreting search-and-seizure laws with the creativity of a bebop soloist. Virtualization, data caching and multi-tenant applications will only make things worse, more likely for small customers cutting cost corners with shady neighbors than for large corporations that have crafted careful hosting or hybrid cloud strategies.

Will the cloud be legal?
“The big issue is that the laws are really outdated,” said John Rhoton, author of “Cloud Computing Architected” and “Cloud Computing Explained: Implementation Handbook for Enterprises.” “They barely cope with the Internet, and don’t cope with cloud computing at all. On the Internet, data can go anywhere. Copying a file from your house to your neighbor’s might be routed through several foreign countries.”

Questions around location, jurisdictions and conflicting requirements are hurdles that legislators and industry regulators will have to come to terms with soon.

Reform efforts aim to do just that. The Electronic Communications Privacy Act, enacted in 1986, determines when law enforcement may tap into electronic communications, protecting the privacy of those using wireless devices, e-mail and the Internet. It has not been updated since its passage, however.

The Cloud Computing Act of 2011 is another proposal to deal with the transitory, transnational nature of communications and data in the cloud. It sets escalating criminal penalties for hacking into cloud services, depending on whether a single computer is breached or full-scale botnet warfare occurs. It also addresses data storage across national borders.

Will the cloud be global?
Within the European Union, stringent rules around data storage and privacy are causing consternation for American cloud vendors, and may be the reason European interest in deploying to the cloud lags far behind America’s.

“Europe has a history of really strict privacy laws,” said Rhoton. “In Austria or Germany or France, the regulations are strict and things tend to be more bureaucratic. Here in Austria, on my own phone bill, I can’t see the full phone numbers of the people I’ve called.”

An organization called EuroCloud is trying to clarify legislation for software vendors and companies that might use the cloud, Rhoton explained. Elsewhere in the world, there are countries that will not have the infrastructure required anytime soon. Asian interest in the cloud could well be a reaction to American needs and a sign of outsourced activity, Rhoton surmised.

But the EU overall is aware that it needs to push businesses toward the cloud, going so far as to incorporate it into the Digital Agenda, Europe’s strategy for fostering a healthy digital economy by 2020.

“EuroCloud is using its growing influence to encourage industry and government to adopt cloud services and set the correct parameters around regulation and interoperability. At the same time, the industry needs better infrastructure, more skills and lower barriers to enhance market success,” said EuroCloud president Pierre-José Billotte. “The success of this year’s EuroCloud Congress shows a growing awareness of the benefits of cloud computing that will help motivate investment in these areas.”

According to the European Commission’s Information Society, Europe’s IT investment levels are less than half of America’s. Among the proposals to spur cloud improvements are creating large-scale pilots as proofs of concept, and increasing Internet speeds to at least 30Mbps for all households (with at least 50% able to achieve 100Mbps) to match rates seen in Japan and South Korea. Though some pundits have predicted that Europe may leapfrog American development in the cloud due to a lack of legacy-application drag, the infrastructure, investment and privacy hurdles indicate otherwise.
Will the cloud be private?
While small and medium-sized businesses are deploying to the cloud with few compunctions (and often under duress from investors unwilling to finance another data center), big corporations are taking things slower. For those that have already invested in on-premises infrastructure, there may be no need to make the switch. Anecdotally, however, private cloud deployments are garnering the most interest, according to cloud pundit David Linthicum.

“Private clouds often involve a cloud appliance,” he said. “It’s a multi-tenant environment that offers APIs and autoprovisioning. The private cloud conundrum is that it’s not different from the traditional multi-tenant systems of the past.” Those systems? Mainframes.

And big iron could well persist—or even shine—in the cloud. While data-center utilization can be tricky to push above 30%, even with virtualization and workload-management tools, mainframes tend to run at over 80% of capacity.
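
Some back-of-the-envelope arithmetic shows why those utilization figures matter. Using arbitrary, illustrative workload and capacity numbers (only the 30% and 80% rates come from the article), the same demand needs far fewer machines when each one runs hotter:

# Machines needed to carry a fixed workload at the utilization levels
# cited above; workload and per-machine capacity units are arbitrary.
workload = 1000.0
per_machine_capacity = 10.0

for name, utilization in [("commodity servers", 0.30), ("mainframe-class", 0.80)]:
    needed = workload / (per_machine_capacity * utilization)
    print(f"{name}: ~{needed:.0f} machines at {utilization:.0%} utilization")

# Prints roughly 333 machines at 30% vs. 125 at 80%.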

In fact, IBM now claims that, compared to server farms, the newest mainframes cost less to power and cool, and take up less space. Since 2009, the company has been offering private cloud computing on IBM System z mainframes: The zEnterprise all-in-one mainframe comprises the zEnterprise 196, the zEnterprise Unified Resource Manager and the zEnterprise BladeCenter Extension, and can run up to 100,000 virtual machines. The analyst group ITCandor predicted that mainframe and Unix machine sales will grow this year thanks to the private/hybrid cloud market.

Will the cloud be green?
The relative efficiency of big iron over blades in terms of utilization, power and cooling brings up another point: Data centers are emitting ever-larger amounts of carbon dioxide and consuming growing amounts of power and land. The cost to power and cool 100 server racks can run in the US$2 million to $3 million range.

The demand for data centers has spurred a building boom. Rackspace recently began retrofitting an abandoned shopping mall as a data center in San Antonio, Texas, where nuclear power will save millions of dollars annually.

Rackspace’s chief technology evangelist, Dirk Elmendorf, knows how important it is in his business to watch every cent; the company builds out only as much capacity as it will need.

“There’s nothing worse than being under the thumb of a lot of cost,” he said in an interview with blogger Robert Scoble. “I think that’s a side-effect of our nature but also from living through the bubble, when people made bad business decisions and that kept building on itself. It’s like being at that blackjack table and you keep doubling down, hoping it’s going to work out in the end. And for most companies, it didn’t work out.”

Data-center design, too, has improved under duress. Early designs (if they can be called designs) often oriented servers haphazardly, so that the fans of one might blow hot air into the cooling intake of another. In today’s economy, such mistakes can no longer be afforded. Demand is sizzling for dynamic resource-management software that saves energy and improves utilization by consolidating workloads and switching off idle servers. Researchers out of IBM India have proposed pMapper, a tool that facilitates “power and migration cost-aware application placement in virtualized systems.”

According to the tool’s authors, “The current power density of data centers is typically around 100 watts per square foot and growing at the rate of 15–20% per year.” Optimizing power management alone is a difficult proposition: Virtualized platforms manage power via CPU idling in the hypervisor, throttling, and consolidation actions.
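
That growth rate compounds quickly. A quick calculation, taking the authors’ figures at face value, shows that power density roughly doubles in four to five years:

# At 15-20% annual growth, the quoted 100 W/sq ft roughly doubles
# in four to five years.
density = 100.0  # watts per square foot
for rate in (0.15, 0.20):
    d, years = density, 0
    while d < 2 * density:
        d *= 1 + rate
        years += 1
    print(f"At {rate:.0%} growth, density doubles in ~{years} years")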

However, the authors said, “commercial hypervisors drop all the power management actions that are taken by the OS. Further, for multi-tiered applications, a single VM instance may not be able to determine the application end-to-end QoS, thus necessitating the need for a power management channel from the management.” This could spawn a wave of development around intelligent power conservation.
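
The consolidation half of the problem is easier to picture in code. The sketch below is not pMapper itself (which also weighs migration costs and end-to-end QoS), just a first-fit-decreasing illustration of the core move: pack virtual machines onto as few hosts as possible so that emptied hosts can be switched off.

def consolidate(vm_loads, host_capacity):
    """Place VM loads on as few hosts as possible (first-fit decreasing)."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # power on a new host only when forced to
    return hosts

vms = [0.6, 0.1, 0.3, 0.2, 0.5, 0.4, 0.2]  # hypothetical CPU demands
placement = consolidate(vms, host_capacity=1.0)
print(f"{len(placement)} hosts active instead of {len(vms)}: {placement}")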

What’s more, the energy consumption implications will become increasingly politicized. According to Greenpeace, an environmental lobbying group, “…At current growth rates, data centers and telecommunication networks will consume about 1,963 billion kilowatt hours of electricity in 2020. That is more than triple their current consumption and more than the current electricity consumption of France, Germany, Canada and Brazil combined.”

While data centers bring jobs to rural Washington state, they also bring environmental concerns—and a somewhat depressing aerial view of farms interspersed with sprawling, diesel-fueled data centers. Ever the power player, Google has a subsidiary, Google Energy, that resells clean energy from wind farms and other renewable sources.

A 2010 Greenpeace report rated data centers from Apple, Google, Microsoft and Yahoo on square footage, number of servers, power usage effectiveness (PUE), and percentages of dirty vs. renewable energy. With conservationists sensitized to the potential impact, it will take more than simply using cloud resources for a company to label its IT strategy as environmentally sound.
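
Power usage effectiveness, one of the metrics in that report, is simply the ratio of total facility power to the power that actually reaches IT equipment; 1.0 is the theoretical ideal. A worked example, with hypothetical numbers:

def pue(total_facility_kw, it_equipment_kw):
    # PUE = total facility power / IT equipment power
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,500 kW at the meter, 1,000 kW reaching servers;
# the other 500 kW goes to cooling, power conversion and lighting.
print(f"PUE = {pue(1500.0, 1000.0):.2f}")  # PUE = 1.50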

Will the cloud be hijacked?
The interesting new angle on illegal cloud activity is that it may come from your multi-tenant neighbors. Amazon EC2 has been surreptitiously employed to run the Zeus password-stealing botnet, and it was openly leveraged, via a purchased account, to attack Sony’s online entertainment systems. Now that the FBI’s focus on data-center hosting has made traditional hosting riskier for hackers, a new wave of attacks using cloud services purchased with stolen credit-card numbers could follow.

According to Michigan-based security expert Larry Ponemon, “This year, malicious attacks were the root cause of 31% of the data breaches studied. This is up from 24% in 2009 and 12% in 2008.”

He said that malicious attacks are costly because they often go undetected, require extensive investigations, are designed to make the criminals money, and are hard to remediate. “However, it’s not always the bad guys doing bad things that cause data breaches. It’s often your best employees making silly mistakes. Negligence is still the leading cause of data breaches at 41%,” he said.
Will the government grow the cloud?
Regardless of the security risks, the success of the U.S. Defense Advanced Research Projects Agency in spurring the Internet shows that, for all its bureaucracy, the government is still a formidable force for innovation. In the last year, Microsoft has provided a secure, private government cloud; the Navy is toying with public clouds for ship communications; IBM is supplying the Air Force with a “military-grade” cloud; and NASA’s Nebula uses a community cloud to give researchers inexpensive compute cycles in minutes rather than months.

According to former federal CIO Vivek Kundra’s February 2011 cloud computing strategy report, “An estimated $20 billion of the Federal Government’s $80 billion in IT spending is a potential target for migration to cloud computing solutions.” He noted the poor asset utilization (under 30%) of current Federal server farms and claimed cloud services could raise that rate to as much as 70%. With that elasticity, popular Federal programs such as Cash for Clunkers would be able to scale rather than crash under the weight of citizen demand.

Finally, it’s interesting that Kundra drew an analogy between utility computing and such amenities as wells and electricity. Could the day come when cloud infrastructure is as important to average citizens as those other utilities?

What does the future hold?
Utility computing is just ascending the bell curve. In “The Cloud at Your Service: The when, how, and why of enterprise cloud computing,” author Jothy Rosenberg predicted that it will affect application development in myriad ways, from the emergence of application frameworks and mashups, to new database storage mechanisms and sharding techniques:

• Application logic and storage will migrate to the cloud.
• “…companies with valuable data repositories will offer higher-level services hosted on existing clouds, each with a unique API.”
• “What most call PaaS (for example, Google’s App Engine) and its natural extension—Framework-as-a-Service—will become the predominant way applications are constructed in 10 years.”
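
On the sharding point, the core technique is easy to sketch: deterministically route each key to one of N database shards. The snippet below is a minimal illustration with hypothetical shard names (production systems layer replication and resharding strategies on top of this):

import hashlib

SHARDS = ["db0.example.com", "db1.example.com",
          "db2.example.com", "db3.example.com"]  # hypothetical shard hosts

def shard_for(key):
    # A stable hash (not Python's per-process hash()) keeps the mapping
    # consistent across processes and restarts.
    digest = hashlib.md5(key.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

print(shard_for("user:42"))  # every caller routes user:42 to the same shard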

Linthicum concurred: “I think in three years we won’t be as ‘hype-y’ about cloud computing; we’ll just bake it into the infrastructure. It will be systemic and will be consumed from homes and enterprises.” He estimated that cloud computing might represent 10-20% of IT in that timeframe.

“In five years, we’re going to see a lot of additional best practices emerge,” said Linthicum. “A number of large companies will be making money and profit from cloud. You’ll see people reducing their IT spending by 20%. Those who have waited for providers to mature and consolidate will now be placing big bets.”

Last will come the commoditization phase, he believed. “We’ll see foreign countries coming into this space. You’ll be able to get everything that Amazon and Rackspace does from China as a service—from the same guys who are attacking our systems now.”

But John Rhoton offered up a more radical vision beyond data centers. “The next step past data centers is that there could be enough processing power and resources on all these different client systems spread around the world. People will have computers and devices with 100x more performance than you could actually use. Think about the extent to which we could share those resources and disk space.”

Will the cloud prevail, or will it be supplanted by the “Internet of things”? It’s simply too soon to tell.