
Facebook is continuing its mission to make the world more connected with its latest research effort. The Facebook Artificial Intelligence Research (FAIR) team announced a new convolutional approach to neural machine translation. According to the researchers, the approach achieves nine times the speed of existing recurrent neural systems while maintaining accuracy.

This approach is designed to break down language barriers so everyone can consume content in their preferred language, Facebook said.

As part of its research, the team is making the sequence modeling toolkit's source code and trained systems available to the open-source community. With the code, researchers can build custom models for translation, text summarization and other tasks, according to the FAIR team.

“A major consideration with neural machine translation for practical applications is how long it takes to get a translation once we show the system a sentence. The FAIR CNN model is computationally very efficient and nine times faster than strong RNN systems. Much research has focused on speeding up neural networks through quantizing weights or distillation, to name a few methods, and those can be equally applied to the CNN model to increase speed even more, suggesting significant future potential,” the team wrote in a post.
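The speed difference the team describes comes down to dependency structure: a recurrent network must compute hidden states one after another, while a convolution looks only at a fixed window of inputs, so every output position can be computed independently and in parallel. A minimal sketch of that contrast, using toy hand-picked weights rather than anything from the actual FAIR model:

```python
# Toy contrast between recurrent and convolutional sequence encoders.
# Hypothetical minimal example with fixed scalar weights; real systems
# use learned, multi-layer models over embedding vectors.

def rnn_encode(xs, w_in=0.5, w_rec=0.9):
    """Each hidden state depends on the previous one, so the loop
    over time steps cannot be parallelized."""
    h, out = 0.0, []
    for x in xs:
        h = w_in * x + w_rec * h  # sequential dependency on h
        out.append(h)
    return out

def cnn_encode(xs, kernel=(0.25, 0.5, 0.25)):
    """Each output depends only on a fixed window of the input, so all
    positions can be computed independently (hence in parallel)."""
    k = len(kernel) // 2
    padded = [0.0] * k + list(xs) + [0.0] * k
    return [sum(w * padded[i + j] for j, w in enumerate(kernel))
            for i in range(len(xs))]  # no cross-position dependency

seq = [1.0, 2.0, 3.0, 4.0]
print(rnn_encode(seq))  # each value built from the one before it
print(cnn_encode(seq))  # each value computable on its own
```

On hardware like GPUs, the independent per-position computations in the convolutional case map directly onto parallel execution, which is the property the researchers credit for the speedup.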

The researchers hope their approach will pave the way for other text processing tasks, such as dialogue systems that respond better to complex questions.