
What’s missing from AI — Part 1

11 min read · Jul 25, 2025

… the missing link to usher in “Next Generation AI” since the 1930s

Brain in hand, a robot ponders the use of ‘contextual meaning’ to help it emulate humans. But how do we store meaning? Photo by Julien Tromeur on Unsplash

Background

In the 1930s, the American focus on behaviourism turned the linguistics world away from the science of signs (semiotics) and toward the work of one of the great scientists in history, Pāṇini, who lived perhaps as far back as the 7th century BC. Leonard Bloomfield’s use of Pāṇini’s linguistic model led linguistics to exclude meaning, as exemplified by the influential Chomsky monograph, Syntactic Structures, published in 1957.

My proposed move back to semiotics is a side effect of the highly influential work of Robert D. Van Valin, Jr., whose development of Role and Reference Grammar (RRG) over the past 40+ years draws a clear distinction between the words and phrases of a language (morphosyntax) and their meaning in context (contextual meaning). RRG views the world’s diverse languages through a layered model that links morphosyntax to meaning and back using a single algorithm.

For AI to emulate humans, it needs far more capability than it has today: a simple hierarchical model providing basic animal-brain capability, and, crucially, it needs meaning!

The Use of Meaning Starts Next Generation AI



Written by John Ball

I'm a cognitive scientist working on NLU (Natural Language Understanding) systems based on RRG (Role and Reference Grammar). A mouthful, I know!
