The past five years have seen the incredible growth of Hadoop, from a small Apache project to the “next big thing.” Of course, as the next big thing, everything up to this point has largely been experimentation. Some organizations already have Hadoop deployments, but the vast majority of enterprises are still looking at this newfangled open-source project and weighing their options.
The truth is, now that Hadoop 2.0 has arrived, there are no more excuses for not starting down the road to Hadoop adoption. If your organization has not already begun experimenting, it should seriously consider doing so. Why? Because Hadoop competency can drive business advantages. Those advantages were severely limited when MapReduce was the only processing model and the platform carried a steep learning curve; Hadoop 2.0 allows nearly any type of workload to be moved onto the platform.
The time to get started is now. The Hadoop platform is approaching enterprise readiness. All that remains is for the various distribution providers to polish their 2.0 branch offerings, and for the various enterprise necessities (security, governance, training) to sort themselves out. By the time those problems are solved, your experiments should have given you a clear idea of exactly what you’ll need to adopt this growing platform.