
Big companies like Google, Apple, and Amazon have designed impressive AI tech that can respond to voice commands. In the history of technology, this is a major advancement. Still, Siri, Alexa, and other AIs fall well short of understanding human language as well as humans do.

For example, an AI like Siri can respond to a simple voice command like “call Jane.” However, Siri cannot handle a more complex voice command like “call Jack, no Jane.” In this scenario, Siri will dial Jack, which is not what you want her to do.

An up-and-coming tech startup called Pat wants to change this sad scenario. Pat aims to improve the way AIs like Siri and Alexa understand the complexities of human language. So, when you tell Siri “call Jack, no Jane,” she can call the right person.

Why Is It So Difficult for AI to Understand Human Language?

Human language is extremely complex. This complexity is, in fact, part of what makes us human. Tech companies have struggled to design algorithms that capture all the grammatical and semantic complexities of language. There are a number of ways to approach this problem. Pat has chosen to tackle it by building machine learning on a theory of grammar called Role and Reference Grammar (RRG).

The RRG model was developed by noted linguist Robert Van Valin, Jr., with help from William A. Foley, another linguist best known for his research on Austronesian languages. Both Van Valin and Foley are actively involved with Pat in developing a proprietary neural network intended to help AIs interpret natural language more accurately.

Unlike many other theories of grammar, the RRG model is not based on the grammatical structures of English. Syntax, of course, varies from language to language, but semantics, the way meaning is conveyed, is far more universal. The RRG model aims to explain how meaning is conveyed in context in a natural sentence. Understanding the semantics of a sentence is crucial to making an AI fully comprehend a complex phrase like “call Jack, no Jane.”

An RRG-based algorithm can make an AI understand a sequence of words with relation to grammar and context in order to convey the intended meaning. It’s this type of algorithm that Pat ultimately hopes to master.
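To make the “call Jack, no Jane” problem concrete, here is a deliberately simplified Python sketch of the kind of mid-utterance self-correction an assistant would need to resolve. This is purely illustrative and is not Pat’s algorithm: a real RRG-based system would derive the correction from the semantic structure of the utterance, not from a hard-coded pattern like the one below.

```python
import re

def resolve_call_target(utterance: str) -> str:
    """Return the intended contact, honoring a 'no X' self-correction.

    Toy rule only: matches 'call <name>' optionally followed by
    ', no <name>', in which case the later name wins.
    """
    match = re.match(
        r"call (\w+)(?:,?\s*no,?\s*(\w+))?",
        utterance.strip(),
        re.IGNORECASE,
    )
    if not match:
        raise ValueError("not a call command")
    first, correction = match.groups()
    # If the speaker corrected themselves, honor the correction.
    return correction if correction else first

print(resolve_call_target("call Jane"))           # Jane
print(resolve_call_target("call Jack, no Jane"))  # Jane
```

A pattern like this obviously breaks the moment the phrasing changes (“actually, make that Jane”), which is exactly why a semantics-driven approach is needed rather than an ever-growing pile of regular expressions.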

Great Potential for the Future of AI

If successful, an RRG-based algorithm could open up immense opportunities for AI. Tech companies now understand that semantics, rather than the rigid syntax that drives programming languages, is the crucial aspect of advancing human-AI interaction.

Companies are doing their best to make computers learn natural language using machine learning. Another startup, DefinedCrowd, uses crowdsourced big data to enable better machine learning. Tech giants like Google and Microsoft have invested billions in research that uses long short-term memory (LSTM) techniques to help AI put words in context. Pat’s algorithm in development could potentially automate machine learning so there’s no need for extensive data gathering or expensive tools.
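For readers curious how an LSTM carries context from word to word, here is a minimal, self-contained NumPy sketch of a single LSTM cell step. The weights are random placeholders, not a trained model, and the dimensions are toy values chosen for illustration; the point is the gating arithmetic, where the forget gate decides how much earlier context survives into the next step.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM cell update; W, U, b pack the four gates (i, f, o, g)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # joint pre-activation, shape (4*H,)
    i = 1 / (1 + np.exp(-z[0:H]))       # input gate: admit new information
    f = 1 / (1 + np.exp(-z[H:2*H]))     # forget gate: how much context to keep
    o = 1 / (1 + np.exp(-z[2*H:3*H]))   # output gate
    g = np.tanh(z[3*H:4*H])             # candidate cell state
    c = f * c_prev + i * g              # memory mixes old context with new input
    h = o * np.tanh(c)                  # exposed hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 8, 4                             # toy embedding and hidden sizes
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for word_vec in rng.normal(size=(3, D)):  # feed three "word" vectors in sequence
    h, c = lstm_step(word_vec, h, c, W, U, b)
print(h.shape)  # (4,)
```

Because the hidden state `h` is threaded through the loop, each word is processed in light of the words before it, which is what lets LSTM-based systems keep “Jack” and “Jane” in context across an utterance.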

Pat’s Beta Testing Underway

Pat recently announced that the company is preparing to release a private beta. More details will be unveiled in the future. The company has already raised $2.5 million for development, and is seeking $3 million more in investments.

Others in the field have noted that Pat would need substantial investment to make the algorithm a reality. It’s also important to note that natural language is in a constant state of evolution. New words and phrases get added to the lexicon, and new rules emerge along with them. In this context, what Pat can truly achieve is indeed fascinating.

Marty Rogers is a lifestyle, family and business blogger from the UK. He owns multiple online businesses and makes a living working from home, and writing about it on his blog. His interests include SEO, MMA, Snooker and the Countryside.
