Augmented intelligence is gaining ground as an approach to artificial intelligence in which machines help humans complete tasks faster, rather than replacing them entirely.

In an IBM report called “AI in 2020: From Experimentation to Adoption,” 45% of respondents from large companies said they have adopted AI, while 29% of respondents from small and medium-sized businesses said the same.

All of these companies are still in the early days of AI adoption, and are looking for ways to infuse it into their operations to bolster their workforces.

Ginni Rometty, the former CEO of IBM, said in a talk at the World Economic Forum that augmented intelligence is the preferable lens through which to look at AI in the future.


“I actually don’t like the word AI because cognitive is much more than AI. And so, AI says replacement of people, it carries some baggage with it and that’s not what we’re talking about,” Rometty said. “By and large we see a world where this is a partnership between man and machine and that this is in fact going to make us better and allows us to do what the human condition is best able to do.” 

Augmented intelligence is a cognitive technology approach to AI adoption that focuses on the assistive role of AI.

“I would explain augmented intelligence as something where you are augmenting a human being to do their tasks a lot better or more efficiently,” said Dinesh Nirmal, the vice president of Data and AI Development at IBM. “You’re not replacing anyone, but you are augmenting the skill set of that individual.”

The choice of the word augment, which means “to improve,” reinforces the role human intelligence plays when machine learning and deep learning algorithms are used to discover relationships and solve problems.

While a sophisticated AI program is certainly capable of making a decision after analyzing patterns in large data sets, that decision is only as good as the data that human beings gave the system to use.

Full automation is a ‘delusion’
“Full automation is a myth,” said Svetlana Sicular, research vice president at Gartner. “There is quite a bit of delusion that everything can be automated. There are certain things that can be automated, but humans have to be in the loop.”

She added that in most situations, full automation is very expensive and very hard to achieve.

“Once in a while, AI will go berserk for multiple reasons simply because I’m using it at the edge. So if you consider my phone the edge, I might lose the connection or there might be too many people in this area. Like in one instance, my navigation kept telling me on the highway to turn left while there was nowhere to turn left and I was actually thinking, ‘What if I am in an autonomous car?’ ” Sicular said. 

There are tasks requiring expertise and decision-making that can only be accomplished with the creativity that humans bring to the table, according to David Schubmehl, research director of Cognitive/AI Systems at IDC.

“AI is really an assistant to help you get done with the mundane and mindless tasks more quickly so you can focus on the more challenging creative aspects,” Schubmehl said. 

Autonomous AI is typically used across organizations for highly repeatable tasks, such as predicting customer churn in telecommunications or making recommendations in retail and supply chain, Sicular noted.

Since AI adoption is in its early stages, enterprises don’t necessarily know whether adopting AI models would significantly improve their efficiency. Sicular said there must first be a great deal of analysis into whether AI is really worth adopting for a given use case.

Sicular also said that there are two large trends happening in the world of AI: the industrialization of machine learning and AI to make them better for scaling, and also the democratization of AI to spread the benefits of the technology evenly.

AI’s move into industry has led companies to look for all-in-one solutions.

“Data scientists are far fewer compared to developers. That’s why there’s a big effort to try to deliver some kinds of AI in a box that could be slightly modified by developers, and another big effort is how to scale it,” Sicular said. 

Until recently, most machine learning was done manually, with PhD-level specialists developing their own custom algorithms, and most of those algorithms were never deployed at scale, according to Sicular.

“You buy a service off the shelf, you go through the cloud, you can get image recognition, speech recognition, forecasting, text analytics. All of this is available without having specialists in your organization or skilled people and so on,” Sicular said. “But the question that’s at the core of augmented intelligence is how this is being adopted and for what purposes.”

Augmented intelligence can be seen in sales coaching, in which inside sales employees get advice as they talk to customers.

In healthcare, AI is used to help doctors and specialists catch things they may have missed, rather than having them sift through hundreds of thousands of documents.

AI for IT operations is called AIOps; it augments the work of IT managers and workers by helping them detect anomalies or disruptions much faster.

“A lot of customers are having trouble with the amount of data that’s coming in,” IBM’s Nirmal said. “Let’s say you have IoT data, transaction data, behavioral data, social data and other places, and the question is how do you do data discovery and its classification. At least every enterprise that we are working with, there’s a lot of interest in adopting AI.”

An example of augmentation would be a data engineer finding a value that does not look like a zip code sitting in the zip code field. The data engineer can then determine whether the data makes sense where it is. If it’s not a zip code, but instead a social security number, the data engineer can go and change it, and the machine learning model will then know that such a number is not a zip code next time.
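This kind of human-in-the-loop workflow can be sketched roughly as follows. This is a hypothetical illustration, not IBM's implementation: the field names, regular expressions, and `triage_zip_field` / `apply_correction` helpers are all assumptions made for the example.

```python
import re

# Illustrative patterns: US 5-digit (or ZIP+4) zip codes, and values that
# look like social security numbers instead.
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")
SSN_RE = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def triage_zip_field(records):
    """Split records into clean rows and rows needing human review."""
    clean, review = [], []
    for rec in records:
        value = rec.get("zip_code", "")
        if ZIP_RE.match(value):
            clean.append(rec)
        else:
            # Suggest a likely classification to speed up the engineer's decision.
            guess = "ssn" if SSN_RE.match(value) else "unknown"
            review.append({**rec, "suggested_label": guess})
    return clean, review

def apply_correction(record, correct_field):
    """Record the engineer's decision so it can feed the next training run."""
    record["corrected_field"] = correct_field
    return record
```

The machine flags and pre-classifies suspect values, but the final call on each flagged row stays with the engineer, whose corrections become labeled examples for the model going forward.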

Another big area of interest is in creating alerts that can detect anomalies and can be used in data centers, according to Nirmal. 

“Customers are always wanting to figure out a problem before it happens. So anomaly detection becomes pretty critical to make sure that you see alerts as data or logs come in,” Nirmal said. “There are some tasks such as fraud detection in which AI tends to generate a lot of false alerts and humans with deep vertical knowledge need to then oversee the process.”
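A minimal sketch of this kind of alerting is simple statistical anomaly detection on a metric stream, with a human operator triaging whatever gets flagged. The window size and threshold below are illustrative assumptions, not values from the article, and real AIOps systems use far more sophisticated models.

```python
from statistics import mean, stdev

def detect_anomalies(values, window=20, threshold=3.0):
    """Flag points more than `threshold` standard deviations away from
    the mean of the preceding `window` observations."""
    alerts = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            # Raise an alert; an operator with domain knowledge decides
            # whether it is a real incident or a false positive.
            alerts.append((i, values[i]))
    return alerts
```

The point of the augmented approach is the last step: the system surfaces candidate anomalies quickly, but a person with vertical knowledge still judges which alerts matter.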

Augmented intelligence can also refer to tools such as AI meeting transcription or PowerPoint add-ons that recommend improvements to slides as one goes through them.

Developers also have access to tools that use AI to make code reviews more efficient and speed up the SDLC. For example, DeepCode flags errors in code based on troves of similar scenarios that occurred before, and then provides context on how to fix them.

“What Grammarly does for written texts, we do the exact same thing for developers,” said Boris Paskalev, the co-founder and CEO of DeepCode.  “There are many areas where I believe that augmented intelligence actually helped developers because now we have a machine learning representation for code. In the DeepCode platform, you can really add any service on top of it because you really have the knowledge of the global development community that you can index in real-time. So we can get the answers in seconds, which is quite exciting, considering these capabilities did not exist just a couple of years ago.”

All in all, companies are growing their investments in AI and it is becoming a fundamental part of every industry’s digital transformation, according to IDC’s Schubmehl. 

“Amazon wouldn’t be where it was without machine learning and AI, because it’s a huge part of its offering,” Schubmehl said. “We’re really in the early days. We’ve finally gotten out of the prototyping phase and we’re actually starting to put real AI and machine learning into production. I think we still have years and years to go before AI is what I would call fully mature.”