Personal digital assistants like Apple’s Siri and Microsoft’s Cortana are just the beginning. Soon we’ll be interacting with all manner of applications and online services—everything from managing our calendars to ordering pizzas and reserving hotel rooms—not with the buttons, forms and controls that we’re used to, but by chatting with them using natural language. Or, at least, so say the big brains at some of tech’s largest companies. Could they be right?
Lately it seems as though the whole industry has gone chatbot crazy. Facebook reportedly already has more than 11,000 bots running on its messaging network, with some 23,000 developers signed up to use its bot development tools. And many of the technology’s most fervent proponents say chatbots will replace traditional mobile apps altogether in just a few years.
That’s bound to leave some developers with their heads spinning. For some, developing a chatbot instead of a GUI application will seem like such an alien concept that they won’t know where to begin. But the truth is that a bot-driven future needn’t be scary, and it probably won’t look much different from the developer landscape of today. Developers will still be using familiar tools and APIs; they’ll just be using them in new ways.
What bots are and aren’t
First things first: Obviously not every traditional application is going away. A chatbot can’t effectively replace Angry Birds, let alone something like Excel or Photoshop. But some studies show mobile app downloads are declining, with many mobile users becoming unwilling to install a new app for every brand they engage with.
If that’s the case, then a single app that can respond to a variety of queries—from “remind me to pick up the kids at 6 p.m.” to “I need to order some new eyeglasses”—is an interesting idea.
Still another driver of the rise of chatbots is the massive adoption of messaging apps like Facebook Messenger and WhatsApp. According to figures from analyst firm Activate, around 2.5 billion people are already signed up for at least one such app, each of which makes an ideal front end for connecting with chatbots.
In addition, a “chatty” interface opens the door to a variety of new devices beyond PCs, tablets and phones. Traditional GUIs work fine for devices with screens, but what about a wearable, an in-car system, or a connected home device like Amazon Echo?
Make no mistake, though: We’re still a long way from the fully conversational, human-sounding robots of sci-fi stories. While the tech titans are working on artificial intelligence and machine learning to power their bots, attempts to bolt primitive AI onto bots have so far been disappointing. (Witness Microsoft’s Twitter bot, “Tay,” which started spewing racist hate speech within a day of being “trained” by online troublemakers.)
Stepping up to bot
So what will the typical chatbot look like? Simply put, it will be a new kind of interface, one that lets humans interact with information systems through conversation rather than clicks, much as mobile apps were when they first began to displace web apps. The main difference is that while users are comfortable clicking through several pages of a GUI, making decisions by interacting with controls and filling out forms, chatbot interfaces will need to reduce tasks to a minimal number of exchanges and automate more steps along the way. For example, if you say “I need a large pepperoni pizza right away,” the bot might respond with, “Would you like to use the credit card on file?”
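To make that idea concrete, here is a minimal sketch of a single bot turn that collapses the usual multi-page checkout flow into one follow-up question. The slot names and the "saved_payment_on_file" flag are hypothetical, invented purely for illustration.

```python
# A minimal sketch of one bot turn that collapses a multi-page checkout
# flow into a single follow-up question. The slot names and the
# "saved_payment_on_file" flag are hypothetical, for illustration only.

def handle_order_turn(message, user_profile, pending_order):
    """Return the updated order state and the bot's next reply."""
    if pending_order is None:
        # First turn: assume an upstream language-understanding step has
        # already recognized an order intent and the requested item.
        pending_order = {"item": "large pepperoni pizza", "paid": False}
        if user_profile.get("saved_payment_on_file"):
            # Skip the usual checkout form entirely: one yes/no question.
            return pending_order, "Would you like to use the credit card on file?"
        return pending_order, "How would you like to pay?"

    # Follow-up turn: interpret the confirmation and close the transaction.
    if message.strip().lower() in ("yes", "yep", "sure"):
        pending_order["paid"] = True
        return pending_order, "Great, your pizza is on its way."
    return pending_order, "No problem. What payment method should I use?"


profile = {"saved_payment_on_file": True}
order, reply = handle_order_turn("I need a large pepperoni pizza right away", profile, None)
print(reply)   # Would you like to use the credit card on file?
order, reply = handle_order_turn("Yes", profile, order)
print(reply)   # Great, your pizza is on its way.
```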
Behind the scenes, though, a chatbot’s back-end needs are remarkably similar to those of a mobile app. It needs to authenticate and send credentials, query databases, store information, and interact with systems ranging from payroll and sales automation to e-commerce engines. And the keys to unlock these systems for tomorrow’s bots will be—you guessed it—APIs.
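As a rough illustration of that back-end work, the sketch below shows what might happen behind a single “yes” from the user. It assumes a hypothetical internal REST service; the endpoints, URLs and payload shapes are invented for the example, and only the widely used requests library is real.

```python
# A sketch of the back-end work hiding behind one conversational turn,
# assuming a hypothetical internal REST service. The /auth/token and
# /orders endpoints, URLs and payload shapes are invented for illustration.
import requests

API_BASE = "https://api.example-pizza.com"   # hypothetical service


def place_order(user_id, item):
    # 1. Authenticate and obtain a short-lived access token.
    auth = requests.post(f"{API_BASE}/auth/token",
                         json={"client_id": "chatbot", "user_id": user_id})
    auth.raise_for_status()
    token = auth.json()["access_token"]

    # 2. Create the order through the same e-commerce API a mobile app would use.
    response = requests.post(f"{API_BASE}/orders",
                             json={"item": item, "payment": "card_on_file"},
                             headers={"Authorization": f"Bearer {token}"})
    response.raise_for_status()
    return response.json()["order_id"]
```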
In fact, most developers probably won’t write the linguistic portion of their bots themselves, either. Even fairly simple natural language processing can be tricky to get right. Instead, they’ll probably want to use one of the several public APIs that provide access to the big players’ cloud-based machine learning systems. Similarly, if they want their bots to interface with messaging platforms from the likes of Amazon, Facebook or Twitter, they’ll connect through APIs for that, too.
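The division of labor might look something like this: the bot posts the raw utterance to a hosted natural-language-understanding service and gets back an intent plus extracted entities. The endpoint and response shape below are hypothetical stand-ins for whichever provider’s API a team actually picks.

```python
# A sketch of delegating language understanding to a hosted NLU service.
# The endpoint and response shape are hypothetical stand-ins; real providers
# expose comparable REST APIs that return an intent plus extracted entities.
import requests

NLU_URL = "https://nlu.example-cloud.com/v1/parse"   # hypothetical endpoint


def parse_utterance(text, api_key):
    response = requests.post(NLU_URL,
                             json={"text": text},
                             headers={"Authorization": f"Bearer {api_key}"})
    response.raise_for_status()
    result = response.json()
    # e.g. {"intent": "order_pizza",
    #       "entities": {"size": "large", "topping": "pepperoni"}}
    return result["intent"], result.get("entities", {})
```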
Much as with modern mobile apps, you can think of a bot as essentially a UI that sits on top of a layer of APIs that let it orchestrate whatever it needs to do. This API-first development style is a powerful tool, one that has made it relatively easy to move from web apps to mobile apps, chatbots and beyond. But it also presents challenges for organizations more accustomed to building monolithic apps with traditional, waterfall-style development practices. Thankfully, the emergence of API-management solutions as a new category of middleware has helped mitigate many of these concerns by automating functions such as discovery, security and life-cycle management within the API layer.
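Put together, a bot built API-first can stay remarkably thin: it parses the message, calls the relevant business API, and phrases a reply, with no business logic of its own. The sketch below stubs out the two service calls so it runs standalone; in practice they would be the NLU and ordering calls sketched earlier.

```python
# A sketch of the "thin UI over an API layer" idea: the bot itself holds no
# business logic, it only orchestrates service calls and phrases a reply.
# The two helpers are stubbed so the example runs standalone; in practice
# they would be the (hypothetical) NLU and ordering API calls shown earlier.

def parse_utterance(text):
    # Stand-in for a hosted NLU call.
    return "order_pizza", {"size": "large", "topping": "pepperoni"}


def place_order(user_id, item):
    # Stand-in for an e-commerce API call.
    return "A1234"


def handle_message(text, user_id):
    intent, entities = parse_utterance(text)
    if intent == "order_pizza":
        item = f"{entities['size']} {entities['topping']} pizza"
        order_id = place_order(user_id, item)
        return f"Order {order_id} is confirmed. Your {item} is on its way."
    return "Sorry, I didn't catch that. Could you rephrase?"


print(handle_message("I need a large pepperoni pizza right away", "user-42"))
```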
Chatbots certainly aren’t the first technology in recent years to ask developers to change how they think about application design, and they won’t be the last. The takeaway is that shedding old ideas and practices now and adopting an API-first mindset will not only give an organization a head start on the chatbot wave, but will also leave it better prepared for whatever comes next.