With all the supposed technological benefits of the cloud, one has emerged as possibly the most exciting for development managers: lowering your monthly bills. Every month, Amazon, Rackspace, GoGrid, Salesforce, or whichever service you’re using will bill you for the resources that you consume. And only for what you consume.

A cloud bill certainly represents a different expense model for your business than buying or leasing physical servers, in large part because your software is directly generating these bills.

In theory, the better your software is written, the lower your cloud charges. Conversely, the less efficient your software, the higher your bill.

This is a throwback to the days of mainframe computing, when departments or customers were charged for computing resource units. A rogue algorithm or a misfiring loop could incur hundreds or thousands of dollars in charges. With a close eye on their budget, programmers and systems analysts learned how to make sure their batch jobs would run cleanly and efficiently.

Those skills are needed today. In a model where you own your servers, and those servers live inside your data center, poorly written software means software that runs slowly. If a deployed application runs too slowly, the development team might work to optimize it—if anyone even notices. Alternatively, the IT staff could throw more hardware at the problem by moving the application to a bigger box, adding some memory or balancing the load across more machines. Either way, once the fix is done, it’s done.

That’s not true in the cloud. Inefficient code requires more resources, and therefore results in higher costs for your business—and more profit for the cloud provider. Depending on the application, it may be worth dedicating staff resources to code or database optimization. Reduce the bandwidth, reduce the memory, reduce the database access, improve the algorithms. In some cases, you may even be able to reduce the size of the virtual server your app needs in the cloud. All these efforts may pay off in next month’s bill.
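As a minimal sketch of one such optimization—caching to cut repeated database access—consider the following. The class name and time-to-live value here are invented for illustration; the point is simply that each cache hit is one fewer billable query:

```python
import time

class QueryCache:
    """Cache query results briefly to cut repeated database hits."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (timestamp, value)

    def get(self, key, fetch):
        """Return the cached value for key, or call fetch() and cache it."""
        entry = self.store.get(key)
        if entry is not None and time.time() - entry[0] < self.ttl:
            return entry[1]          # cache hit: no database round trip
        value = fetch()              # cache miss: hit the database once
        self.store[key] = (time.time(), value)
        return value
```

In a metered environment, a cache like this directly converts avoided queries into avoided charges.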

The cost equation offers a tantalizing reason to take advantage of in-cloud services, such as dynamic scaling and server winnowing. Shrinking cloud servers during non-peak hours is another way to save money.
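A scheduled scale-down can be as simple as mapping the hour of the day to a server count. The peak window and fleet sizes below are invented for illustration; a real deployment would feed the result to its cloud provider's scaling API:

```python
def desired_servers(hour, peak_servers=8, offpeak_servers=2):
    """Return how many servers to run for a given hour (0-23).

    Assumes peak traffic falls between 08:00 and 20:00; outside
    that window the fleet shrinks, and so does the bill.
    """
    if not 0 <= hour <= 23:
        raise ValueError("hour must be 0-23")
    return peak_servers if 8 <= hour < 20 else offpeak_servers
```

Running two servers instead of eight for half the day cuts the fleet's billable server-hours by more than a third in this hypothetical schedule.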

The cloud has the real potential to reduce your computing costs, and careful planning and performance optimization may make the costs even lower.

Changing of the guard
For the past eight years, this newspaper has recognized our industry’s top leaders and influencers in the SD Times 100. What we’ve seen is that while large companies, such as IBM and Microsoft, dominate many of the longer-running categories, innovation still occurs in smaller companies that are taking the industry in exciting new directions. Meanwhile, many of the old standbys are holding up as well as ever.

In 2003, the inaugural SD Times 100 featured 10 categories, each with 10 companies. We recognized leaders and innovators in Test & Debug, Deployment Platforms, Components & Libraries, Modeling, and others. We had a category for Standards Bodies & Consortia, which has morphed into today’s Influencers category. This year, Agile & Collaboration joins the list of SD Times 100 categories.

That debut SD Times 100 listed such companies as Rational Software, Merant, Mercury Interactive and BEA Systems. All are gone now, as is Sun Microsystems, but the legacy of their work continues at IBM, Serena, HP and Oracle, perennial leaders of the industry.

This year, honorees include open-source software projects and websites as well as companies. So, StackOverflow.com, Git and Gizmox take their place alongside the graybeards of the industry, as do the NoSQL movement, Django, Engine Yard and the OSGi Alliance.

We hope you enjoy the 2010 SD Times 100.