MetaMind Pushes Deep Learning Boundary of Natural Language Processing

Friday, June 26, 2015

Deep learning start-up MetaMind has published details of a system that is more accurate than other language processing methods.  The company is developing technology designed to be capable of a range of different artificial-intelligence tasks.

Google and Facebook are investing huge sums into the research and development of improved artificial-intelligence algorithms for processing language.

Around the world, research groups are making steady progress toward improving computers' language skills, especially by using recent advances in machine learning.

"The insight—and it’s almost trivial—is that every task in NLP is actually a question-and-answer task."


The most recent development in Natural Language Processing (NLP) comes from a start-up called MetaMind, which has developed a language recognition system that is more accurate than the leading systems available on the market.

MetaMind has published new research detailing how their neural networking system uses a kind of artificial short-term memory to answer a wide range of questions about a piece of natural language.

According to MetaMind, the system can answer everything from very specific queries about what a text describes to more general questions like “What’s the sentiment of the text?” or “What’s the French translation?” The research, due to appear next week at Arxiv.org, a popular online repository for academic papers, echoes similar work from Facebook and Google, but takes it a step further.

MetaMind was founded by Richard Socher, a prominent machine-learning expert who obtained his PhD from Stanford where he worked with Chris Manning and Andrew Ng. Socher tested his algorithms using a data set compiled by Facebook for measuring machine performance at routine comprehension tasks. MetaMind's software ended up outperforming Facebook's own algorithms.

MetaMind Image Classifier

The new technology is designed to handle a range of artificial-intelligence tasks, including image classification and sentiment analysis. The work reflects ongoing progress in giving machines more efficient learning and comprehension.

“The insight—and it’s almost trivial—is that every task in NLP is actually a question-and-answer task,” Socher told Wired.
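Socher's framing can be illustrated with a toy sketch. This is purely illustrative, not MetaMind's model: the `answer` function and the wording of the questions are hypothetical. The point is that tasks usually treated separately, such as sentiment analysis and keyword lookup, can all be posed as questions to a single answering interface.

```python
# Hypothetical sketch of "every NLP task is a question-answer task".
# A real system would use one trained neural model; here a toy dispatcher
# stands in for it so the interface is visible.

def answer(passage: str, question: str) -> str:
    """Answer a question about a passage; one entry point for many tasks."""
    if question == "What is the sentiment of the text?":
        # Crude sentiment: count a few positive words.
        positives = sum(w in {"great", "good", "love"} for w in passage.lower().split())
        return "positive" if positives else "neutral"
    if question.startswith("Is the word"):
        # Extract the quoted word and check membership.
        word = question.split("'")[1]
        return "yes" if word in passage.split() else "no"
    return "unknown"

# Two different "tasks", same question-answering interface:
print(answer("I love this movie", "What is the sentiment of the text?"))  # positive
print(answer("the cat sat", "Is the word 'cat' in the text?"))            # yes
```

The design point is the shared signature: once every task is a `(passage, question) -> answer` call, one model can in principle be trained to cover all of them.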

A key to this progress is an approach known as deep learning, a relatively new field of artificial intelligence research that aims to perfect tasks such as face and language recognition.

MetaMind Natural Language Processing

"[MetaMind's] deep learning technology is going to have enormous impact in multiple industries …” says Marc Benioff, the CEO of Salesforce.

The MetaMind system processes data using what Socher calls “episodic memory.” Much as Demis Hassabis describes DeepMind's algorithms, this is akin to the way the brain handles short-term memory in the hippocampus. The system must “remember” one fact before determining another, based on the natural-language data supplied.

“You can’t do transitive reasoning without episodic memory,” Socher says.
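A minimal sketch can show why, assuming a bAbI-style story (the kind of benchmark Facebook compiled); the parsing rules and function below are illustrative, not the dynamic memory network itself. Answering "where is the milk?" requires chaining two remembered facts: who took the object, then where that person went.

```python
# Toy illustration of transitive reasoning over stored "episodes".
# Each fact must be remembered before the next inference can build on it.

def where_is(story, obj):
    location = {}  # person -> last known place
    holding = {}   # object -> person carrying it
    for sentence in story:
        w = sentence.rstrip(".").split()
        if w[1] == "took":            # e.g. "Mary took the milk."
            holding[w[-1]] = w[0]
        elif w[1] == "moved":         # e.g. "Mary moved to the kitchen."
            location[w[0]] = w[-1]
    carrier = holding[obj]            # episode 1: recall who has the object
    return location[carrier]          # episode 2: recall where that person is

story = ["Mary took the milk.", "Mary moved to the kitchen."]
print(where_is(story, "milk"))  # kitchen
```

Neither stored fact answers the question alone; only holding one in memory while consulting the other does, which is the role episodic memory plays in Socher's description.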

And, he explains, much the same setup can be used to analyze sentiment or translate words into another language. “One model—one dynamic memory network—can solve these very different problems,” he says.

MetaMind's founder says that his work has made significant progress toward more generalizable artificial intelligence. “This idea of adding memory components is something that’s in the air right now,” he says. “A lot of people are building different kinds of models, but our goal is to try to find one model that can perform lots of different tasks.”

In the talk below, Socher describes how deep learning algorithms can learn language.




SOURCE: MIT Technology Review

By 33rd Square
