Neural networks can now tackle one of the most important problems our smartphone generation has ever faced: What emoji to use?

On a more serious note, neural networks (computer systems modeled loosely after the neurons in the brain) have become a powerful tool for software and robotics. Facebook uses neural networks to identify faces in photos, Google uses them in search, and neural networks are behind plenty of virtual assistants as well.

(Related: Neural nets try to figure out fonts)

Mobile app developers have realized just how much neural networks can do, including the team behind a new app called Dango. Dango was released this week for Android (with iOS compatibility on the way).

Dango is not just another app or keyboard for your phone; it is a floating assistant that runs on an Android device, and it predicts things like emojis, stickers and GIFs based on what you are writing to someone.

To suggest an emoji, Dango needs to understand what a person is writing and predict an appropriate emoji in response.

That’s where neural networks come in. The network is trained by randomly initializing its parameters and then showing it millions of real-world examples of emoji use from across the web. Each time the network receives a new training example, it adjusts those parameters so that it can better predict things like:

[Screenshots: sample messages alongside the emoji Dango predicts for them]
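
For a concrete sense of what that training loop looks like, here is a minimal sketch in Python. Everything in it is invented for illustration (the tiny vocabulary, the emoji classes and the four training examples); Dango’s actual model is far larger and trained on web-scale data, but the shape of the loop is the same: initialize randomly, see an example, compare the prediction, and nudge the parameters.

```python
# Minimal sketch of the training recipe described above: randomly
# initialized parameters, adjusted a little on every (text, emoji) example.
# The vocabulary, emoji classes and examples are all hypothetical.
import numpy as np

rng = np.random.default_rng(0)

vocab = {"pizza": 0, "love": 1, "party": 2, "sad": 3}
emojis = ["🍕", "❤️", "🎉", "😢"]

# Randomly initialized parameters: one weight vector per emoji class.
W = rng.normal(scale=0.01, size=(len(emojis), len(vocab)))
b = np.zeros(len(emojis))

def featurize(text):
    """Bag-of-words count vector over the toy vocabulary."""
    x = np.zeros(len(vocab))
    for word in text.lower().split():
        if word in vocab:
            x[vocab[word]] += 1.0
    return x

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Stand-in for "millions of real-world examples of emojis across the web".
examples = [("i love pizza", "🍕"), ("love you", "❤️"),
            ("party tonight", "🎉"), ("so sad today", "😢")]

learning_rate = 0.5
for epoch in range(200):
    for text, emoji in examples:
        x = featurize(text)
        probs = softmax(W @ x + b)
        target = np.zeros(len(emojis))
        target[emojis.index(emoji)] = 1.0
        grad = probs - target            # cross-entropy gradient
        # Each new example nudges the parameters toward a better prediction.
        W -= learning_rate * np.outer(grad, x)
        b -= learning_rate * grad

print(emojis[int(np.argmax(W @ featurize("pizza party") + b))])  # 🍕 or 🎉
```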

According to Dango’s website, as of this writing the Unicode Consortium has standardized 1,624 emojis, all of which can be combined to form more elaborate meanings. That means the semantic concepts Dango can represent far outnumber the emoji themselves, and because it has learned about Internet culture along the way, it can suggest stickers and GIFs, too.

The process of creating Dango was a little complicated in the beginning. The developers first tried an approach that mapped words directly to emoji: the pizza emoji would map to the word “pizza,” and so on. This wasn’t effective, because countless combinations of words determine meaning in ways that a simple one-to-one mapping can’t describe. For example:

[Screenshot: a message whose meaning a word-for-word emoji lookup misses]
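
To see concretely why the literal approach breaks down, here is roughly what a direct word-to-emoji lookup looks like in Python; the table and test sentences are made up for illustration, not taken from Dango:

```python
# The first, abandoned approach: a direct word-to-emoji lookup table.
# The words and mappings here are illustrative, not Dango's actual table.
word_to_emoji = {
    "pizza": "🍕",
    "happy": "😊",
    "beer": "🍺",
}

def suggest(text):
    """Suggest an emoji for every known word in the message."""
    return [word_to_emoji[w] for w in text.lower().split() if w in word_to_emoji]

print(suggest("who wants pizza"))     # ['🍕'] — works for literal mentions
print(suggest("i am not happy"))      # ['😊'] — wrong: the "not" is ignored
print(suggest("let's grab a slice"))  # []    — misses the pizza entirely
```

The lookup handles literal mentions, but it ignores negation and misses any meaning that isn’t spelled out word for word.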

Because of those endless possibilities, Dango uses recurrent neural networks (RNNs), an architecture in which connections between units form a directed cycle, letting the network use its internal memory to process arbitrary sequences of inputs. Because an RNN keeps track of what it has seen earlier in a sentence, it can tell the difference between the emojis that fit “I am happy” and the ones that fit “I am not very happy.”
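
Here is a minimal, untrained sketch of that idea using PyTorch; the vocabulary, layer sizes and two emoji classes are invented for illustration. The recurrent layer reads the message one token at a time, so by the time it reaches “happy,” its hidden state already reflects whether a “not” came earlier, which a word-by-word lookup can never know.

```python
# Sketch of a recurrent emoji classifier in the spirit described above.
# The vocabulary, sizes and emoji classes are invented; the model is untrained.
import torch
import torch.nn as nn

vocab = {"i": 0, "am": 1, "not": 2, "very": 3, "happy": 4}
emoji_classes = ["😊", "😞"]

class EmojiRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        _, h = self.rnn(x)          # h: final hidden state after the whole sequence
        return self.out(h[-1])      # one score per emoji class

def encode(sentence):
    return torch.tensor([[vocab[w] for w in sentence.split()]])

model = EmojiRNN(len(vocab))
# The hidden state carries the earlier "not" forward, so these two messages
# arrive at the classifier in different states.
print(model(encode("i am happy")))
print(model(encode("i am not very happy")))
```

Until the network is trained, the scores themselves are meaningless; the point is structural: the same word “happy” produces a different hidden state depending on what came before it.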

When texting friends or family members, it’s sometimes easier to communicate with GIFs or emojis. Why ask a friend if they want to go out for drinks when you can type three beer emojis followed by a question mark? The team behind Dango agrees, writing that emojis, stickers and GIFs have become a popular part of language, even though it’s still “labor-intensive to use them in an advanced way.”

“This visual language has matured alongside technology, and this symbiotic relationship will continue, with new technology informing new language, which in turn informs the technology again,” according to Dango’s website. “Communication in the future will have artificial intelligence tools adapted to you, helping you seamlessly weave imagery with text.”

Let’s just let artificial intelligence say what we are feeling, especially when words just aren’t enough.