Is literally everything about the cloud? You’d think so, going by the chatter from the biggest industry players. Seemingly every company with something to say is pushing something to do with cloud computing: new service offerings from hosting providers; new tools for optimizing application performance, for easing migration, or for making cloud-based development more agile.
The cloud sure is seductive. In our company, we’re considering a migration to cloud technologies within the next 12 months. BZ Media, the organization behind SD Times, is a small company, and frankly I’d rather not be maintaining servers, whether in-house or on dedicated hardware in a colocation center. If the economics of cloud computing work out, and if reliability and scalability deliver what we need, then it’s a good thing.
Yet I’m puzzled. How much is cloud computing a software development conversation, rather than an operations conversation? Obviously the platforms differ: Windows Azure is different from Windows Server 2008. Microsoft’s SQL Azure is different from Microsoft’s SQL Server. The Java EE that VMware is pushing into Salesforce.com’s cloud isn’t the same Java EE running on your current in-house app server. Working with Amazon S3 is not the same as working with an EMC storage array.
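To make that last contrast concrete, here is a minimal sketch, in Python, of how the same "save a file" task looks against a local disk versus Amazon S3. It assumes the boto3 AWS SDK is installed and credentials are configured; the function and parameter names here are illustrative, not from any particular codebase.

```python
def save_report_local(path, data):
    # On local or SAN-backed storage, persistence is an ordinary file write.
    with open(path, "wb") as f:
        f.write(data)

def save_report_s3(bucket, key, data):
    # On S3, the "same" operation is an HTTP PUT of an object into a bucket:
    # no file paths, no open handles, and different consistency semantics.
    import boto3  # AWS SDK for Python; assumes credentials are configured
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=data)
```

The code is short, but the mental model shifts: the S3 call is a network request to a web service, with its own failure modes, latency, and access-control scheme, which is exactly the kind of difference a development team has to absorb.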
So yes, there’s an undeniable learning curve ahead. But that’s what you’d encounter in any significant server platform change, whether cloud, on-premises or colocated.
Thus my confusion. How much does a software development team need to know about the cloud, beyond how to deploy to it and integrate applications with cloud-based apps? Tell me what you think at firstname.lastname@example.org.