Embedding Language Processing


This post provides an overview of how the RM2 Network employs unsupervised learning to process natural language, pairing reference visual inputs with object labels, much as humans do. We believe that effective machine-human communication requires integrating visual cues with language; this gives the system the ability to learn, reason, explain abstract concepts, understand the sentiment in a conversation, and maintain context at all times.

To explain how the language processing works, we first present an overview of the entire network and of how unsupervised learning enables autonomous learning.

RM2 Network

The RM2 Network is a hybrid model for unsupervised learning that combines aspects of Kohonen's Self-Organizing Map (SOM) and recurrent networks such as the Hopfield network. Click here to read more on the existing models that influence the hybrid.
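To make the two ingredients of the hybrid concrete, here is a minimal sketch of each in isolation: a tiny one-dimensional SOM that clusters inputs by moving the best-matching node toward each sample, and a Hopfield-style network that stores a binary pattern and recalls it from a corrupted cue. The sizes, learning rate, and data are illustrative assumptions, not the RM2 Network's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- SOM ingredient: map 2-D inputs onto a line of 5 nodes (toy sizes) ---
weights = rng.random((5, 2))  # one weight vector per SOM node

def som_step(x, lr=0.5, radius=1):
    """Move the best-matching node and its neighbours toward input x."""
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
    for i in range(len(weights)):
        if abs(i - bmu) <= radius:  # simple neighbourhood function
            weights[i] += lr * (x - weights[i])
    return bmu

for _ in range(100):  # train on two synthetic clusters
    som_step(rng.normal([0.2, 0.2], 0.05))
    som_step(rng.normal([0.8, 0.8], 0.05))

# --- Hopfield ingredient: store a binary pattern, recall it from a noisy cue ---
pattern = np.array([1, -1, 1, -1, 1])
W = np.outer(pattern, pattern)  # Hebbian weight matrix
np.fill_diagonal(W, 0)          # no self-connections

cue = pattern.copy()
cue[0] = -cue[0]                # corrupt one bit
for _ in range(5):              # synchronous updates until stable
    cue = np.sign(W @ cue)

print(np.array_equal(cue, pattern))  # → True: the stored pattern is recalled
```

The SOM contributes topology-preserving clustering of inputs, while the Hopfield-style recurrence contributes associative recall; the hybrid combines both behaviours in one network.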


Natural Language Conversation with a machine

Ever wondered how Ava in Ex Machina was able to communicate so fluently with Caleb? This article attempts to decode that contextual ability of Ava's, which requires cognitive computing in real time.

For a machine to understand what humans are saying and to respond pertinently, it must be a learning machine that decodes language and assembles replies on the fly, taking into account the context, intent, and sentiment behind the dialogue with the human.

Technically, this means decomposing sentences in real time (as the human speaks), extracting the context, intent, and sentiment in those sentences, and constructing an answer that satisfies the expectations of the dialogue through relevance and intelligence.
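The three steps above — decompose, extract, construct — can be sketched as a minimal pipeline. The word lists, the question/statement rule, and the reply templates are illustrative assumptions for the sketch, not the actual extraction algorithm described later.

```python
# Toy signal lists; a real system would learn these rather than hard-code them.
POSITIVE = {"great", "good", "love", "happy"}
NEGATIVE = {"bad", "hate", "sad", "terrible"}
QUESTION_WORDS = {"what", "why", "how", "who", "when", "where"}

def decompose(sentence):
    """Step 1: split a sentence into lowercase tokens, stripping punctuation."""
    return [w.strip(".,!?").lower() for w in sentence.split()]

def analyze(tokens):
    """Step 2: extract intent, sentiment, and crude context keywords."""
    intent = "question" if (tokens and tokens[0] in QUESTION_WORDS) else "statement"
    score = sum(w in POSITIVE for w in tokens) - sum(w in NEGATIVE for w in tokens)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    keywords = [w for w in tokens
                if w not in POSITIVE | NEGATIVE | QUESTION_WORDS and len(w) > 4]
    return intent, sentiment, keywords

def reply(sentence):
    """Step 3: construct an answer from the extracted signals."""
    intent, sentiment, keywords = analyze(decompose(sentence))
    topic = keywords[0] if keywords else "that"
    if intent == "question":
        return f"Let me think about {topic}."
    return f"I see you feel {sentiment} about {topic}."

print(reply("Why does the weather change so fast?"))  # → Let me think about weather.
print(reply("I love this music!"))  # → I see you feel positive about music.
```

Each stage runs per sentence, so the pipeline can keep pace with a speaker in real time.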

This article explains how a simple algorithm can detect context and sentiment in real time to generate relevant answers for a better user experience. The method is an improvisation on Joseph Weizenbaum's ELIZA, a natural language conversation program. The improvisation auto-creates keywords and their ranks and is able to retain context for an ongoing conversation.
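A minimal sketch of that improvisation: unlike classic ELIZA, whose keywords and ranks are hand-written, here the ranks are created automatically from the conversation itself (by running word frequency, an assumed ranking rule), and the highest-ranked keyword is retained as the conversation's context even when a later utterance contains no keywords. The class name, content-word filter, and reply template are illustrative, not the author's actual program.

```python
from collections import Counter

class MiniEliza:
    def __init__(self):
        self.ranks = Counter()  # auto-created keyword ranks (frequency-based)
        self.context = None     # retained topic of the ongoing conversation

    def respond(self, sentence):
        tokens = [w.strip(".,!?").lower() for w in sentence.split()]
        content = [w for w in tokens if len(w) > 3]  # crude content-word filter
        self.ranks.update(content)                   # rank grows with each use
        if content:
            # the highest-ranked keyword in this sentence becomes the context
            self.context = max(content, key=lambda w: self.ranks[w])
        if self.context:
            return f"Tell me more about {self.context}."
        return "Please go on."

bot = MiniEliza()
print(bot.respond("My project is stuck."))              # picks up "project"
print(bot.respond("The project deadline worries me."))  # "project" outranks the rest
print(bot.respond("Yes."))  # no keywords, yet the context "project" is retained
```

The third exchange shows the key difference from a stateless responder: a contentless reply like "Yes." still gets a contextual answer because the topic persists across turns.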