32 Comments
Rex Eloquens:

If I remember correctly, Russell praised the early Wittgenstein's philosophy of words being a sort of picture of the world: that they were atomic little bits, and that anything that could not be boiled down to those bits was jargon. Russell's hardheadedness could not appreciate the later Wittgenstein, which focused more on ways of life and on culture, as well as branching out into areas that were unimportant to a lot of analytic philosophers, such as aesthetics.

I think I prefer Derrida's views of language as opposed to the later Wittgenstein's, and it's amazing how critics of both typically pull the same card that "it can lead to relativism, and what about meaning!" But those are weak attacks on both. Most who are well-versed in either will understand that they were not harbingers of relativism, but that meaning in language was never static or eternally fixed.

Linguistically Yours!:

1. Formal Grammars and Syntax Trees

Chomsky’s hierarchy of formal grammars (1956) directly influenced the design of programming languages and early natural language processing (NLP) systems. His classification of grammars—regular, context-free, context-sensitive, and recursively enumerable—provided a theoretical framework for parsing human and machine languages.

Example: Context-free grammars (CFGs), which underpin Chomsky’s early work, are still used in compilers and some NLP parsing tools.

Application: These grammars were foundational for symbolic AI and early rule-based machine translation systems like SYSTRAN.
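The CFG machinery mentioned above can be sketched with the classic CYK recognition algorithm, which decides membership for grammars in Chomsky Normal Form. This is a minimal illustrative sketch, not code from any actual compiler or NLP toolkit; the toy grammar and rule names are invented for the example:

```python
# Toy grammar in Chomsky Normal Form (all rule names are illustrative).
binary_rules = {          # A -> B C
    "S":  [("NP", "VP")],
    "NP": [("Det", "N")],
    "VP": [("V", "NP")],
}
lexical_rules = {         # A -> terminal word
    "Det": {"the", "a"},
    "N":   {"dog", "cat"},
    "V":   {"chased", "saw"},
}

def cyk_recognize(words, start="S"):
    """CYK chart parsing: True iff `words` is derivable from `start`."""
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i..j] inclusive
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):                      # fill the diagonal
        for nt, terminals in lexical_rules.items():
            if w in terminals:
                chart[i][i].add(nt)
    for span in range(2, n + 1):                       # longer spans
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):                      # split point
                for a, rhss in binary_rules.items():
                    for b, c in rhss:
                        if b in chart[i][k] and c in chart[k + 1][j]:
                            chart[i][j].add(a)
    return start in chart[0][n - 1]

print(cyk_recognize("the dog chased a cat".split()))   # True
print(cyk_recognize("dog the chased".split()))         # False
```

The cubic-time chart here is exactly why context-free grammars were attractive for early parsers: membership is decidable efficiently, unlike for grammars higher up Chomsky's hierarchy.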

2. Universal Grammar and Language Modelling

Chomsky's idea of Universal Grammar (UG), which posits an innate, language-specific cognitive structure, influenced AI researchers attempting to model human language understanding through structured symbolic rules.

Contribution: UG inspired early knowledge-based systems which tried to encode syntactic and semantic rules in a top-down fashion.

Limitation: UG assumes a rich internal structure not easily captured in algorithms, making it hard to implement computationally without vast hand-coded rule sets.

3. Critique of Statistical Methods

Ironically, Chomsky's strong criticism of behaviourist and statistical models (notably his 1959 review of B.F. Skinner's Verbal Behavior) shaped the debate within AI about data-driven vs rule-based approaches. His scepticism of probabilistic models stood in contrast to what later became standard in machine learning and NLP.

His opposition to behaviourist approaches foreshadowed concerns about black-box models in deep learning, although he also remains critical of today's data-heavy systems.

mvo:

Great short essay on a topic that seems to attract long-winded commentators. Chomsky's followers have had 60-70 years to discover universal commonalities among the thousands of languages they have studied. Few academic projects have failed so completely over such a long period. The lesson of AI is that the brain just learns by example. No need for a universal grammar or special language modules. Very young children figure out meaning from context. Wittgenstein was right.

Linguistically Yours!:

AI and human language learning are fundamentally different processes. AI systems, like large language models, rely on vast amounts of curated data and computational power to identify patterns, whereas children acquire language with limited input, often containing errors and ambiguity. This supports Chomsky’s claim that humans must have some innate predisposition to deduce complex grammatical structures from incomplete data.

Additionally, while context and interaction undeniably aid language acquisition, they do not fully explain how children can generate novel sentences they have never heard before, a hallmark of linguistic creativity. Chomsky’s Universal Grammar provides a framework for understanding this ability by positing innate structural principles common to all languages. While AI demonstrates remarkable pattern recognition, it does not possess understanding or the biological mechanisms underpinning human language, thus making direct comparisons problematic.

This debate is likely to go on for a while yet!

Nathan Ormond:

A lot of mention of the critics, but not much of the push in the other direction (which seems to be the more empirically robust set of theories!).

Andrew Robinson:

Bernard Suits decisively challenges Wittgenstein's analysis of 'games,' demonstrating that they can, in fact, be clearly defined. Suits argues that games are voluntary attempts to overcome unnecessary obstacles, a definition precise enough to counter Wittgenstein's claim that 'game' is an indefinable family resemblance concept. If Suits is correct, then Wittgenstein's analogy collapses: language cannot be compared to games if games themselves have a strict definition.

Kaiser Basileus:

A system of thought does not have to be exhaustive or even well organized to be complete. He took his ideas where he wanted to in the way he wanted to. If you want them to be something else, something more, that's on you. Be your own philosopher, don't bemoan the insufficiencies of others. If you think it wants organisation, organise it!

ConfusedWanderer:

Are there any scholars who agreed with Wittgenstein on the concept of language games or other ideas in the Philosophical Investigations?

Linguistically Yours!:

Yes. Even if they don't agree entirely, several scholars have built upon or adapted Wittgenstein's concepts.

I can think of Alasdair MacIntyre, Hanna Pitkin, Peter Winch, and a few more.

Charles Lambdin:

Similarities with Saussure?

Linguistically Yours!:

There are overlaps.

Both reject the idea that words have intrinsic meaning, instead focusing on relational or contextual meaning.

They also both challenge the idea that language merely mirrors reality.

They agree that language is dependent on collective rules.

For both, meaning is context-dependent and not fixed: it arises from a word's place within a wider system or practice rather than from any intrinsic link between word and world.

However, Saussure viewed language as an abstract system of signs structured by internal relations, while Wittgenstein saw language as a set of practical activities whose meaning emerges from social use.

Lance S. Bush:

Probably quite a lot. That they aren't more prominent in contemporary academic philosophy says something bad about contemporary academic philosophy. Not Wittgenstein. Wittgenstein's views of language are far closer to the empirical reality we're now converging on, while Chomsky's views will be eclipsed and seen as an unfortunate nadir in the study of language.

Linguistically Yours!:

I am no Chomskyan, but his contribution, however flawed in places, cannot be ignored. It has given lines of enquiry to neurolinguists and psycholinguists, and he forced linguistics to move away from a narrow behaviourist framework.

Lance S. Bush:

I'm not sure that was a good thing.

Mikael Lind:

I think the fact that humans understand some grammatical features of our language better than AI does, even though AI has been trained on millions of texts, at least gives some credit to Chomsky's legacy. Chomsky did make many important contributions to the field, despite eventually being proven wrong on many accounts.

Lance S. Bush:

Why would that give credit to Chomsky?

Linguistically Yours!:

For being one of the stepping stones. Chomskyan linguistics contributed significantly to early AI, particularly in areas related to formal grammar, language parsing, and symbolic computation. However, it is true that its influence has waned in contemporary AI dominated by statistical and data-driven models.

Mikael Lind:

Because he claimed that the human ability to learn grammar is more than just a complex parroting ability. We don't just say what we hear; we extract a very complex grammar from what we hear, something that seems to go deeper than surface-level structure. Trace theory could be said to explain one such feature.

Nathan Ormond:

Most who did moved out of philosophy into different fields.
