Java’s emergence 20 years ago was the last time the industry became enamored of a programming language. It was not the first time: It had been the rhythm of the programming community to anoint a new “it” mainstream language every seven years or so. While that pattern has clearly been disrupted, I believe it is more than possible that another language will sweep into popularity in the coming years.
There were several things that set Java apart. First was a syntax that seemed, initially, to be close to C++. C++ at the time was greatly in demand, but many programmers were finding it challenging to master. By 1995, most teams paid at least lip service to the object-oriented paradigm, but many developers struggled with C++, whose flexibility was a double-edged sword: You could program C++ in so many ways that it was difficult to know which was the “correct” object-oriented approach.
The syntax of Java led many programmers to embrace it as a simpler C++: There was the thought that you could prototype an application in Java and then port it to C++, and there was the opposite thought, that you could take your C++ codebase and port it to Java to run on Unix, Windows or the Macintosh (not that the clueless Apple was likely to remain in business for much longer).
The familiar syntax led many programmers to think that not only would their software projects potentially be portable to Java, but that they themselves would become portable, improving their résumés with minimal investment. (It may seem incredible, but 20 years ago programmers worried about finding employment.)
Another benefit of Java was that Java programs seemed much shorter than the equivalent C++ programs. This was, to some extent, an illusion, since the ratio of memory-management code to logic decreases as applications become larger. But it certainly seemed dramatic in the brief listings that fit in magazine articles.
I think Java’s single best design decision was “Almost everything is an object.” Developers at the time divided their world neatly into compiled versus interpreted languages, and the common wisdom was that interpreted languages were crippled in their performance. That prejudice could often be bolstered with a simplistic benchmark, and the conversation rarely went further (as if most enterprise development centered around integer matrix manipulations). But Java’s “almost,” in the form of its stack-based primitive types, made it resistant to such easy dismissals, and its use of a virtual machine shifted the conversation from a neat “compiled is fast, interpreted is slow” narrative into a more nuanced discussion of the state of the art.
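To make that “almost” concrete, here is a minimal, admittedly crude sketch, written in modern Java (autoboxing did not exist in 1995, and the class and variable names are mine), of why stack-based primitives blunted the usual toy benchmark. It is an illustration of the distinction, not a rigorous measurement:

    // Crude illustration, not a rigorous benchmark: summing with a stack-based
    // primitive versus a heap-allocated wrapper object. The primitive loop
    // allocates nothing on the heap, which is roughly the story Java advocates
    // could tell when the inevitable "interpreted is slow" demo came up.
    public class PrimitiveVsBoxed {
        public static void main(String[] args) {
            final int N = 10_000_000;

            long start = System.nanoTime();
            int primitiveSum = 0;                 // lives on the stack, no allocation
            for (int i = 0; i < N; i++) {
                primitiveSum += i;
            }
            long primitiveNanos = System.nanoTime() - start;

            start = System.nanoTime();
            Integer boxedSum = 0;                 // a heap object standing in for the "everything is an object" cost
            for (int i = 0; i < N; i++) {
                boxedSum += i;                    // unboxes, adds, and re-boxes a new Integer each pass
            }
            long boxedNanos = System.nanoTime() - start;

            System.out.printf("primitive: %d ms, boxed: %d ms%n",
                    primitiveNanos / 1_000_000, boxedNanos / 1_000_000);
        }
    }

The point is not the exact numbers (a modern JIT will cheerfully distort them) but the design choice: by keeping int, double and friends out of the object system, Java could claim object-oriented purity where it mattered and raw-machine arithmetic where the benchmarks looked.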
Finally, one cannot talk about Java’s rise without talking about the World Wide Web. In 1995, the Web was perhaps where the Internet of Things is today: Companies might have had some awareness that maybe this thing would have some impact, but it was primarily something that geeks enthused about at 9600 baud or less on CompuServe forums and Usenet newsgroups. (Fun fact: NNTP used to be for something other than porn! Or at least, in addition to porn.)
While it was not clear what the Web might become, the one thing widely agreed upon was that developers needed more than basic HTML. Java had originally targeted embedded systems on consumer devices, but Sun capitalized on the explosion of interest in Netscape’s browser and announced that Java would be “integrated” into Navigator. Many developers (including yours truly) took this to mean that the browser was going to become a universal window (or window frame), that HTML’s text-readable tree was going to be generalized into a common data representation (imagine a hybrid JSON-DOM concept), and that developers would be able to enhance or override the evaluation of the DOM.
Yeah, so that didn’t happen. Instead, we got the travesty that was the browser plug-in model, about which the less said the better.
Instead of plug-ins, Web apps today are the realm of JavaScript, which is also celebrating its 20th birthday. Brendan Eich’s language was originally called “Mocha” and then “LiveScript,” but by the end of 1995 Java so dominated the conversation that it was rebranded “JavaScript,” despite having absolutely no connection to the work of James Gosling and his team.
JavaScript is, today, a programming language that every developer must master. Not only is it at the core of every browser-based app, it has gained a substantial server-side presence through Node.js as well. But while I think JavaScript is important, I think it’s a poor foundation for enterprise development. Of course, Java has flaws too, and as I’ve just discussed, its success was aided by contingency, misperception and lucky timing. There’s a strong case to be made that such things are critical to programming language popularity.
But it seems to me that if anything can restart the cycle of programming-language dominance, it’s the manycore era. Admittedly, distributed cloud computing and post-PC mobile form factors have diminished the primacy of desktop performance, but I believe this is only delaying an inevitable demand for a new concurrency model for mainstream development.
My guess is that the most appealing candidate will be a hybrid object-functional language, but that may be based more on advocacy than analysis. What do you think? What language can you imagine writing a retrospective on in 2035?