Last time, I wrote of my conversation with Grady Booch, a legendary thinker who first made his mark in the mid-1990s (as co-inventor of the Unified Modeling Language and contributor to the Rational Unified Process). He continues to be on the cutting edge of development in his role as IBM Fellow.
He spoke of a new generation of information worker who’s not afraid to roll up his sleeves and get dirty in code. With websites such as Code.org, Booch said he sees “an effort to teach the average person coding as a skill. There is a degree of fragility in non-programmers building things, and it produces a security risk as well. You might meet a short-term need, but you don’t realize the technical debt as people build on it in ways that were never intended.”
Meanwhile, professional developers have a responsibility to deliver frameworks that can glue many existing systems together, he said. “Like sewers and plumbing, they don’t get press but they are essential. Building large software systems is more like city planning. Facebook and LinkedIn had it great for a long time because they had no legacy code. [Facebook founder Mark] Zuckerberg would say, ‘Move fast and break things often.’ Now, they don’t want to break things as often” because of all the legacy code.
“Even Facebook is beginning to have legacy issues,” said Booch. “It’s not a lot of fun dealing with [code] hygiene. It’s easier not to floss, but you suffer the consequences if you don’t. In software, that means incurring technical debt, creating legacy code…”
Businesses today underfund their legacy systems, Booch said, because they see more of a competitive edge in Web and mobile user experiences. He noted a “transit point you see in startups”: as companies mature, they confront the tedious work involved in software. That has led to the rise of such things as microservices and containers: small pieces of configured code that can easily be swapped for other microservices in other containers.
Booch pointed out that it’s challenging to build large systems in today’s world of agile development and continuous software delivery. “So today we build lots of small systems,” he said. “We build systems that we teach and that learn, and that becomes an element of software design we didn’t have in the past.”
Systems that learn. Artificial intelligence. The rise of the machines.
Some fear a takeover by super-intelligent machines that could lead to the extinction of the human race, as foretold in books and films such as “Ex Machina.” Philosopher Nick Bostrom, founding director of the Future of Humanity Institute, has done work on the existential risk of artificial intelligence. “I don’t fear the rise of robotic overlords,” Booch said. “I fear the shaky software on which today’s world exists.”
Booch referred to the Future of Life Institute’s open letter calling for research on the societal impacts of artificial intelligence, signed by Stephen Hawking, Elon Musk, Noam Chomsky, Steve Wozniak and many others. They fear the rise “of autonomous killing machines,” Booch said. “We need transparency and public discourse” around the research and construction of artificially intelligent machines. Machines can learn, he continued, but when they learn how to learn, “we’ve reached a threshold where there’s no turning back.”
But he cautioned that we’re still at least a generation away from building a system with the intelligence of a reptile: flexible and adaptable, but lacking the versatility of the human mind. “And we’re years off from where machines can move from system to system. The systems we’re building now change the way we work. Not only machines evolve. There are deep-learning algorithms and neural networks, and we still don’t have good theories about why they work.”
Today, Booch is working on cognitive systems. “What does it mean to take Watson and embody it in the world?” As for his research, he said, “It’s at a point in time we don’t know the journey but know the direction to walk. We can see the end point but don’t know the mountains or crevices along the way.”