The world is not going to end on Dec. 21, 2012.
The operative meme here is that the great stone wheel of the Mayan calendar ends on that date. Therefore, the world is going to end.
Actually, the Mayans had a 400-year calendar cycle very much like our own. We add one day to our calendar every four years. Every hundred years, we don’t add a leap day, but every 400 years, we do. So 1900 didn’t have a leap day, but 2000 did. We diddle the calendar like this to keep it in sync with the actual progression of the Earth around the sun, so that our solstices and equinoxes always occur on the same day of our calendar.
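Put into code, the Gregorian rule is almost trivial. Here's a quick sketch in C, just to make the arithmetic concrete:

```c
#include <stdio.h>

/* Gregorian leap-year rule: every 4th year is a leap year,
   except century years, except every 4th century year. */
static int is_leap_year(int year)
{
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main(void)
{
    printf("1900: %s\n", is_leap_year(1900) ? "leap" : "not leap");
    printf("2000: %s\n", is_leap_year(2000) ? "leap" : "not leap");
    return 0;
}
```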
The Mayans did the same thing, and every 400 years they started a new cultural cycle. Their civilization eventually collapsed, evaporated, vanished when they were no longer able to irrigate and fertilize their crops. They left behind some stepped pyramids and some calendar wheels. The Illiterati promptly assumed that the ancients knew some vast profound secrets that have remained unknown to humankind for thousands of years, and are still a mystery to modern science and technology. Ancient aliens usually figure into this meme, sometimes with crystal skulls.
The world did not end at the stroke of midnight on Dec. 31, 1999 either. That was a far more real and knowable threat to the stability of the information age. But we survived that one too.
On Jan. 1, 2000, airplanes did not fall out of the sky. Nuclear reactors did not melt down. Electrical grids did not go dark. Cell phones did not go dead. The Internet did not disappear. The predicted apocalypse did not occur.
Does anyone remember the Y2K panic? Or why it happened in the first place? Let’s recap.
Back in the Mesozoic era of computing, bytes were expensive, so programs had to be small. Whether you were using Fortran or COBOL or SNOBOL or hand-coding in assembly language, you had to be efficient.
A single byte can hold a numeric value between 0 and 255. That would have been enough to encode 256 different years as an offset, but turning that binary value back into readable digits would have taken extra lines of code. So programmers stored the year in two bytes, each byte holding a single decimal digit, 0 through 9. That gave the programmer 100 numeric values, 00 to 99. This was generally considered an efficient use of RAM and a good way to save space on precious storage media. In the early 1970s, an 8-inch floppy could hold only 80K. (That's K as in kilobytes.)
With memory so scarce, storing a year in two bytes made more sense than using four, with two of them wasted on a "19" that never changed. So year values were stored as 63 and 75 and 81 instead of 1963 and 1975 and 1981. At the time, 2000 was so far off that programmers operated under the assumption that everything then current would long since have been replaced by far more efficient machines and better software. It was a fair assumption. Moore's Law was in high gear. Chip speeds and RAM capacities were doubling, and the price per megabyte was falling. And we didn't hit the heat ceiling until after the millennium. The word "legacy" was not part of the conversation, because most people in the industry were looking ahead; very few seemed to be considering the baggage we were dragging along from the past.
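To make the thrift concrete, here's a hypothetical rendering in C of the kind of record layout that thinking produced. The real code was far more likely to be a COBOL record definition or a few lines of assembly, but the idea was the same:

```c
#include <stdio.h>
#include <string.h>

/* A hypothetical sketch of the era's thinking: a date packed into six
   bytes, one ASCII digit per byte. The century is nowhere to be found. */
struct packed_date {
    char year[2];   /* "75", not "1975" */
    char month[2];  /* "12" */
    char day[2];    /* "31" */
};

int main(void)
{
    struct packed_date d;
    memcpy(d.year,  "75", 2);
    memcpy(d.month, "12", 2);
    memcpy(d.day,   "31", 2);

    /* Six bytes on disk instead of eight, and a time bomb. */
    printf("stored: %.2s/%.2s/%.2s\n", d.month, d.day, d.year);
    return 0;
}
```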
#!
Toward the end of the 1980s, only a few Cassandras were predicting the collapse of Western civilization. They warned that on Jan. 1, 2000, every machine still storing date values in only two bytes instead of four would see the year as 1900, not 2000, and that this would cause massive system failures everywhere. Accounting systems in particular would fail, and the global economy would evaporate in an instant. This turned out to be a valid concern. Many companies, including many financial institutions, had not updated their existing software.
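The failure mode they were warning about is easy to demonstrate, along with the "windowing" patch that many remediation teams eventually applied. What follows is a toy sketch in C, not anyone's actual production code, and the pivot year of 50 is an arbitrary choice:

```c
#include <stdio.h>

/* Naive two-digit arithmetic: in the year "00", an account
   opened in "63" suddenly appears to be -63 years old. */
static int naive_age(int yy_now, int yy_opened)
{
    return yy_now - yy_opened;
}

/* The common remediation, "windowing": two-digit years below a pivot
   (here 50) are read as 20xx, the rest as 19xx. It buys decades, not
   permanence. */
static int windowed_year(int yy)
{
    return (yy < 50) ? 2000 + yy : 1900 + yy;
}

int main(void)
{
    printf("naive age in '00' of an account from '63': %d\n",
           naive_age(0, 63));                      /* prints -63 */
    printf("windowed age: %d\n",
           windowed_year(0) - windowed_year(63));  /* prints 37 */
    return 0;
}
```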
You could visit almost any hotel or restaurant and see that their clerks were still using DOS-based systems, probably some variant of dBase II. It was a little more startling to see a DOS-based inventory screen at Fry's Electronics. And a look at the computer screens of your local bank might have been even more disturbing. But consider the state of the industry in 1995. Windows was only five years old as a mainstream platform (Windows 95 had just come out), and there were still concerns about the ruggedness of Windows for critical applications. And besides, all those DOS applications still worked fine. Why spend money fixing something that ain't broke? IT managers are rarely in a hurry to make more work for themselves.
Most personal computers are obsolete within five to seven years. Even for diehards, systems get upgraded or replaced at least once a decade. Software advances also take advantage of the increased power available. In the 1990s, it felt like we were in a mad caucus race of achievement, so it was easy to assume that most of the machines and most of the software in use in 1995 would probably be replaced by 1999, and the Y2K bug would only be an issue for those who hadn’t kept up.
But the real issue was never the PC. It was the embedded systems in banks and factories and distribution centers. It was the legacy code written in Fortran and COBOL and SNOBOL and assembly, code that hadn’t been updated in years. This was the real source of the Y2K “panic.”
Many of the original programmers were no longer available, there were fewer programmers working in those “ancient” languages, and much of that code was not well documented. Written before the days of structured languages and object-oriented programming, a lot of it was even spaghetti code. Programmers with any sort of skill in any of these languages had the opportunity to earn some serious bucks in the final months of the last millennium. Based on the evidence of what didn’t happen, the Y2K bug was firmly squashed before New Year’s Eve.
But what the Y2K non-event demonstrated was the creeping obsolescence of our information infrastructure. As widespread as Fortran, COBOL, ALGOL and SNOBOL were in the days of mainframes, they’ve been pretty much forgotten—put away like crazy old Aunt Emily off in the nursing home. While it might be premature to throw them on the cart when the Monty Python fellows trundle through hollering “Bring out your dead,” neither can they be considered mainstream languages anymore.
Other languages, like Forth and Lisp (and a few other experiments that never gained any real traction), have also been forgotten. Even the rock star of the 1980s, Turbo Pascal, has faded from the scene. Many younger programmers, if they're aware of this history at all, regard these older languages as artifacts of an ancient time, obsolete and primitive. The most popular languages in use today include Java, PHP, C, C++, C#, Python, JavaScript, Perl, Ruby and Visual Basic.
#!
Every so often, one of the anthropology or science magazines will report that the last surviving speaker of an indigenous language has died, making that language officially dead. Access to its pronunciation, grammar, definitions and contextual usage has disappeared, and whatever may still be known of the language ends up in some journal of anthropology or, even worse, on some back shelf of a museum.
Computer languages tend to be better documented, of course, and future students of “ancient technology” should have no shortage of manuals and textbooks to use as a starting point for their digital spelunking. And if all of our ancient computer languages were safely tucked away on some back shelf where they could do little more than trouble the sleep of computer science majors, there’d be no need for this exercise.
The deeper question is how many embedded systems are still running antiquated legacy code, code that cannot be maintained or upgraded because the pool of people proficient in the language keeps shrinking. What happens when the last speaker of a dying language passes from the scene?
Twelve years ago, this problem got a lot of attention because we could foresee the consequences of failure. Code was inspected, updated or replaced, and we stepped into the third millennium assuming that everything was okay. But is it really? How much legacy code is still out there? How many systems are still running on automatic, without anyone in a position to monitor or patch or upgrade the software?
Over here in the PC world, a majority of users experience a near-total replacement of hardware and software every five to seven years, but even here in PC land there are people still running DOS. There are people still running Windows 2000 and Windows XP. It works for them, they’re used to it, they get their work done, and they see no reason to upgrade.
The PC version of the problem is that we are losing the ability to access documents saved in obsolete file formats. WordStar and Ami Pro are gone, and WordPerfect is a shadow of its former self, but files created by those programs remain scattered across a myriad of backup media. The older you are, the more likely you are to have valuable information stored in formats you can no longer access. If you haven't kept a copy of the legacy software, you need to find a conversion program.
Getting a script from Scriptware into Final Draft can be a time-consuming chore. It's not impossible, but it will ultimately require hand-checking the formatting of the script. Getting a document from Ami Pro into .rtf format is easy, provided you still have Ami Pro. If not, prepare to do some googling.
Converting a classic WordStar file means stripping off the high bits. I have a utility I wrote in Turbo Pascal to do that. It runs under Windows XP, but it won't run under 64-bit Windows 7, which no longer supports 16-bit programs. Fortunately, I have an old laptop running XP. Notice how the pathways are growing ever more tenuous?
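The whole trick fits in a few lines. Here's the same idea sketched in C rather than Turbo Pascal, purely as an illustration:

```c
#include <stdio.h>

/* Rough WordStar-to-plain-text filter: WordStar set the high bit on
   certain characters, so clearing bit 7 of every byte recovers readable
   ASCII. (WordStar's embedded control codes would still need further
   cleanup; this is only the first pass.) */
int main(int argc, char *argv[])
{
    FILE *in, *out;
    int c;

    if (argc != 3) {
        fprintf(stderr, "usage: %s input.ws output.txt\n", argv[0]);
        return 1;
    }
    in = fopen(argv[1], "rb");
    out = fopen(argv[2], "wb");
    if (!in || !out) {
        perror("fopen");
        return 1;
    }
    while ((c = fgetc(in)) != EOF)
        fputc(c & 0x7F, out);   /* strip the high bit */

    fclose(in);
    fclose(out);
    return 0;
}
```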
Where will we be in 10 or 15 years? What happens when we’re running Windows 12 and we can’t boot our legacy software at all? (Hmm. That might be a useful market niche for a small company to explore.) How much of our own history, both public and personal, will no longer be available to us? That might end up being far more significant than either the Mayan calendar or the Y2K non-event.
What do you think?
David Gerrold is the author of over 50 books, several hundred articles and columns, and over a dozen television episodes, including the famous “Star Trek” episode, “The Trouble with Tribbles.” He is also an authority on computer software and programming, and takes a broad view of the evolution of advanced technologies. Readers may remember Gerrold from the Computer Language Magazine forum on CompuServe, where he was a frequent and prolific contributor in the 1990s.