Natural Language Processing — What’s the big deal?

With recent AI advances, where algorithms have been called sentient and can explain jokes, this article takes a 30,000-foot view of the most important breakthroughs in NLP and their impact

Skanda Vivek
Emergent Phenomena

--

Grammar and Machines

In 1957, a struggling linguistics graduate wrote a revolutionary book with a revolutionary premise: by understanding the rules behind grammar, it is possible to predict all the grammatical sentences of a language. The author was Noam Chomsky, and the book was Syntactic Structures. It planted a seed in linguistics and other fields: that machines could learn language. After all, what are machines good for, if not following complex rules?

Chomsky proposed a hierarchy of grammars in which sentences can be represented as trees. In the example below, the sentence "John hit the ball" breaks into a noun phrase ("John") and a verb phrase ("hit the ball"), which in turn contains another noun phrase ("the ball").
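To make the tree idea concrete, here is a minimal sketch (my own illustration, not from Chomsky): the parse of "John hit the ball" as nested tuples of `(label, children...)`, with a helper that reads the leaves back off in order to recover the sentence.

```python
# Parse tree for "John hit the ball" as nested (label, children...) tuples.
# S = sentence, NP = noun phrase, VP = verb phrase, Det = determiner.
tree = (
    "S",
    ("NP", ("N", "John")),
    ("VP",
        ("V", "hit"),
        ("NP", ("Det", "the"), ("N", "ball"))),
)

def leaves(node):
    """Recover the sentence by reading the tree's leaves left to right."""
    label, *children = node
    # A leaf node wraps a single word string, e.g. ("N", "John").
    if len(children) == 1 and isinstance(children[0], str):
        return [children[0]]
    words = []
    for child in children:
        words.extend(leaves(child))
    return words

print(" ".join(leaves(tree)))  # John hit the ball
```

The point of the structure is that grammaticality lives in the tree, not the word list: the same rules that license this tree license "Mary caught the train," which is exactly the kind of rule-following a machine can do.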

As the intersection between language and machines shifted toward computer science, a breakthrough came in 1966. Dr. Joseph Weizenbaum at MIT built a computer program, ELIZA, that could act as a psychotherapist and hold conversations of the sort below:

ELIZA (1966) Conversation
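Under the hood, ELIZA had no understanding at all: it matched the user's input against keyword patterns and reflected their own words back as a question. The toy sketch below (my own illustration, not Weizenbaum's actual rules or code) shows the core trick.

```python
import re

# Pronoun swaps so the reply reflects the user's words back at them.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A few illustrative keyword rules: (pattern, response template).
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(phrase):
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in phrase.split())

def respond(text):
    """Return the first matching rule's reply, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."

print(respond("I feel sad about my job"))
# -> Why do you feel sad about your job?
```

A handful of rules like these, plus stock fallbacks such as "Please go on," was enough to convince some users they were talking to a real therapist, which is precisely why ELIZA became famous.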

--