The Node.js community is a hoot. Shortly after the creation of a new software development paradigm, but just before that paradigm takes off and becomes mainstream, there is a “fun” period when enthusiastic developers around the world flesh out support for the new development environment, language or platform. It’s when Java was at its best, and when the Ruby on Rails community was still breaking the rules just by existing.

And it’s that “fun” period in which the Node.js community currently resides, as evidenced by the number of excited developers attending the Node Summit late last month. The best practices do not yet exist, and no one has really found a place where Node code doesn’t belong. It’s that happy time for an open-source project when the enthusiasts can make gobs of money on consulting, while the more playful among the community can still crank out new code and changes at a breakneck pace.

This is, perhaps, the most exciting time in the history of Node.js. You can be sure the project isn’t going to vanish anytime soon. But you can also be sure that the culture around Node.js is only going to become more corporate and stodgy, starting now. If you’re looking to join in on a software movement at just the right time, Node.js really is in that sweet spot where you can do real work with the project, but you can also still do just about anything you’d like with it in the name of experimentation.

Hadoop 2.0 to bring one cluster to rule them all
Hadoop, Hadoop, Hadoop. We certainly have spilled plenty of ink on the subject in these pages. For more than three years now, Hadoop has been the hottest of hot enterprise topics. But it’s also been a fairly limited platform. Map/reduce is all fine and dandy, but there’s an awful lot of computing that doesn’t use the algorithm at all.
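For readers who haven’t worked with the pattern, here is a minimal single-machine sketch of map/reduce (the classic word count), with hypothetical function names chosen for illustration. Real Hadoop distributes the map, shuffle and reduce phases across a cluster; this version collapses them into plain Python to show the shape of the algorithm.

```python
from collections import defaultdict

def map_phase(lines):
    # Emit a (word, 1) pair for every word occurrence
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key, as Hadoop's shuffle/sort step does
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts emitted for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big cluster", "big analytics"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 1, 'cluster': 1, 'analytics': 1}
```

Plenty of analytics jobs decompose naturally into these two phases, but plenty more (iterative machine learning, graph traversal, interactive queries) do not, which is exactly the limitation Hadoop 2.0 aims to remove.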

But all of that is about to change. Hadoop 2.0 will be a data center operating system for processing big data, not just a massive box for storing map/reduce data. The rewriting of Hadoop’s core will produce a newer, more multitalented data platform, ready for performing any type of data processing imaginable. After all, half the appeal of Hadoop is having one place to put terabytes of data for analytics. What good is having all of that information if you are bound by a set of functions limited to map/reduce variants?

Indeed, Hadoop 2.0 could, if executed properly, turn Hadoop into the centerpiece of your entire data center. It’s conceivable that in a Hadoop beyond 2.0, your applications, data and analytics all live inside your Hadoop cluster, running live queries on HBase and storing user responses inside HDFS. It’s a compelling vision for a platform that has already taken the enterprise by storm. After all, the whole joy of Hadoop is that it’s really not that useful to anyone but big business.

Readin’, writin’ and software engineerin’
New York City is opening up a school for software engineers, and we think this is a great step forward for the industry. How many of you wish you had been able to learn C++ right along with algebra, geometry and Spanish?

Learning to be tech-minded at an early age can only further the innovation we’ve seen thus far. Think about it: Ten years ago, when this magazine first started, no one knew what a tweet or status update was, and a “pin” was your identifying number, not a photo that you like online. Agile was just a concept and waterfall still reigned supreme.