It’s no secret that the cycle of progress in computing is a large step forward in processor technology, followed by catch-up steps in supporting hardware, followed far later by software. This phenomenon has been in place ever since the emergence of the PC as a mass-computing platform. (Prior to that, vendors relied principally on closed systems, so they released hardware advances in conjunction with new software to exploit them.) Processor advances generally lead software by the widest margin; network advances are probably next; and at the other end of the spectrum, memory architecture leads the least.

Processor breakthroughs, such as multicore and what is now called many-core, are still far from being fully utilized. Most cores on desktops tend to go unused. And even on x86 servers, it took virtualization to sop up all the execution pipelines that today’s server boxes deliver. (Consider that a low-end, four-way box with quad-core CPUs and hyper-threading enabled provides 32 execution paths: four sockets × four cores × two hardware threads per core.)

With Intel pre-announcing a strategy of putting far more cores on each chip, it’s clear that parallelism in hardware will greatly exceed software’s reach for years to come. It’s hard to know exactly what factors drive the continued development of hardware features that go unused for so long, but it’s a trend that shows no sign of abating.

An advance that predates multiple cores but that is only now coming into its own is 64-bit computing on the desktop. By 2005, 64-bit extensions were available across the x86 architecture, after the famous AMD-Intel stare-down was won by AMD. Since then, OS vendors have made available 64-bit operating systems that worked well enough on x86 servers but gave desktop users little in the way of benefits. In fact, users who were tempted to use Windows x64 (or the Linux equivalents) found themselves contending with a frustrating lack of working device drivers, as well as an absence of software that would take advantage of their 64-bitness. Even today, many PCs that run an x64 version of Windows place most of their apps in the “Program Files (x86)” directory, which is where 32-bit software lives and breathes.

However, two trends have suddenly pushed 64-bit desktops and laptops to the fore, where I believe they will finally find traction.

The first is the dramatic drop in the cost of RAM. Multi-gigabyte sticks of fast RAM from brand-name vendors can be had for less than US$30/GB. This means the 4GB ceiling of 32-bit addressing is no longer a price barrier, so much so that notebooks with 6GB of RAM are commonly available for under $1,000.

The second important contributor is the new wave of operating systems. Mac OS X led the way with its release of Snow Leopard in the middle of last year. The bundled apps are now all 64-bit. On the Microsoft side, the release of Windows 7 stimulated interest in upgrading operating systems and, in many instances, in moving to new hardware whose RAM capacity now dictated 64 bits.

In addition, Microsoft has committed to making 64-bit the primary delivery format for an increasing number of its apps. A year from now, I predict, when someone identifies which OS they’re using, they’ll be obliged to tack on “32-bit” when that’s the case, because the understood covenant will be that all are 64-bit.

While this change is indeed upon us, it’s unclear that everyday users will have any need for so much capacity. Even today, it’s exceedingly difficult to fill 4GB on a notebook or PC unless you’re in massive Photoshop sessions or running workstation applications. The average user—the so-called knowledge worker—will be hard pressed to exceed 2GB, let alone 4GB. Still, as developers, we’re increasingly going to have to meet the expectation of 64-bitness and exploit its capacities.

Fortunately, the issues are well known. In C and C++, there are the famous ILP (integer, long, pointer) width issues, which are discussed in numerous places and increasingly resolved by using integer types that state their width explicitly, rather than relying on ambiguously sized types such as “short” and “long.”
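To make the issue concrete, here is a minimal C sketch (assuming a C99 compiler and the standard <stdint.h> and <inttypes.h> headers) that prints the sizes of the built-in types alongside their fixed-width counterparts. Compiled once as 32-bit and once as 64-bit, the built-in sizes shift with the platform’s data model, while the fixed-width types stay put.

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* Built-in types: sizes depend on the platform data model. On 64-bit
       Linux and Mac OS X (LP64), long is 8 bytes; on 64-bit Windows
       (LLP64), it remains 4. */
    printf("int:      %zu bytes\n", sizeof(int));
    printf("long:     %zu bytes\n", sizeof(long));
    printf("void *:   %zu bytes\n", sizeof(void *));

    /* Fixed-width and pointer-sized types: the same on every platform. */
    printf("int32_t:  %zu bytes\n", sizeof(int32_t));
    printf("int64_t:  %zu bytes\n", sizeof(int64_t));
    printf("intptr_t: %zu bytes\n", sizeof(intptr_t));

    /* The classic 64-bit bug is stashing a pointer in an int, which
       truncates it; intptr_t holds a pointer safely in either build. */
    int value = 42;
    intptr_t addr = (intptr_t)&value;
    printf("pointer as intptr_t: %" PRIdPTR "\n", addr);
    return 0;
}

The same discipline pays off at API boundaries: sizes and counts that cross a 32/64-bit line travel best as size_t or an explicit int64_t, not as int or long.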

In Java and .NET, the issues are subtler: Garbage collection is a very different experience when there’s a 4GB heap to clean up. Fortunately, JVMs from the server side (where 64-bit has been the norm for several years) are available. These, such as Oracle’s JRockit Real Time, specialize in careful management of the garbage collection cycle, doing incremental, background cleanup rather than the all-at-once collections that can freeze a machine.

The machines are there, the OSes are in place, and the development tools are ready too. This means that, at last, ISVs and in-house IT departments should seriously consider the inevitable migration, begin planning for it, and embrace 64 bits for green-field projects.

Andrew Binstock is the principal analyst at Pacific Data Works. Read his blog at binstock.blogspot.com.