Discussion about this post

Rex Eloquens:

If I remember correctly, Russell praised the early Wittgenstein's philosophy, on which words are a sort of picture of the world, atomic little bits, and anything that could not be boiled down to those bits was jargon. Russell's hardheadedness could not appreciate the later Wittgenstein, who focused more on ways of life and on culture, and who branched out into areas that were unimportant to a lot of analytic philosophers, such as aesthetics.

I think I prefer Derrida's views of language to the later Wittgenstein's, and it's amazing how critics of both typically pull the same card: “it can lead to relativism, and what about meaning!” But those are weak attacks on them both. Most people well-versed in either will understand that they were not harbingers of relativism, but that meaning in language was never static or eternally fixed.

Linguistically Yours!:

1. Formal Grammars and Syntax Trees

Chomsky’s hierarchy of formal grammars (1956) directly influenced the design of programming languages and early natural language processing (NLP) systems. His classification of grammars—regular, context-free, context-sensitive, and recursively enumerable—provided a theoretical framework for parsing human and machine languages.

Example: Context-free grammars (CFGs), which underpin Chomsky’s early work, are still used in compilers and some NLP parsing tools (a minimal parsing sketch follows below).

Application: These grammars were foundational for symbolic AI and early rule-based machine translation systems like SYSTRAN.
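
To make the parsing point concrete, here is a minimal sketch of bottom-up CFG recognition using the classic CYK algorithm. The toy grammar, lexicon, and sentences are invented for illustration; real compilers and NLP tools use far larger grammars and more sophisticated parsers.

```python
# Minimal CYK recogniser for a toy context-free grammar in Chomsky
# normal form (CNF): every rule is either A -> B C or A -> terminal.
# Grammar and lexicon are invented for illustration.
GRAMMAR = {         # (B, C) -> A  for binary rules A -> B C
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
LEXICON = {         # terminal -> set of pre-terminal categories
    "the": {"Det"},
    "dog": {"N"},
    "cat": {"N"},
    "chased": {"V"},
}

def cyk_recognise(words):
    """Return True if the word sequence is derivable from S."""
    n = len(words)
    # table[i][j] holds the categories spanning words[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][0] = set(LEXICON.get(w, ()))
    for span in range(2, n + 1):              # span length
        for start in range(n - span + 1):     # span left edge
            for split in range(1, span):      # split point
                left = table[start][split - 1]
                right = table[start + split][span - split - 1]
                for b in left:
                    for c in right:
                        parent = GRAMMAR.get((b, c))
                        if parent:
                            table[start][span - 1].add(parent)
    return "S" in table[0][n - 1]

print(cyk_recognise("the dog chased the cat".split()))  # True
print(cyk_recognise("dog the cat chased the".split()))  # False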

2. Universal Grammar and Language Modelling

Chomsky's idea of Universal Grammar (UG), which posits an innate, language-specific cognitive structure, influenced AI researchers attempting to model human language understanding through structured symbolic rules.

Contribution: UG inspired early knowledge-based systems that tried to encode syntactic and semantic rules in a top-down fashion (sketched after this section).

Limitation: UG assumes a rich internal structure not easily captured in algorithms, making it hard to implement computationally without vast hand-coded rule sets.
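
As a rough illustration of what top-down, hand-coded rules looked like in practice, the sketch below hard-wires a single S -> NP V NP pattern that fills a semantic frame. Everything here (the lexicon, the rule, the frame slots) is hypothetical, meant to show the style of early knowledge-based systems rather than any particular one.

```python
# A hypothetical top-down rule system in the style of early
# knowledge-based NLP: categories and semantic roles are hand-coded,
# not learned. Lexicon and rule are invented for illustration.
LEXICON = {
    "mary":  ("NP", {"type": "person"}),
    "john":  ("NP", {"type": "person"}),
    "sees":  ("V",  {"predicate": "SEE"}),
    "likes": ("V",  {"predicate": "LIKE"}),
}

def analyse(sentence):
    """Expand the single rule S -> NP V NP into a semantic frame.

    Every new construction needs another hand-written rule, which is
    the scaling limitation noted above: broad coverage demands vast
    hand-coded rule sets.
    """
    words = sentence.lower().split()
    if len(words) != 3:
        raise ValueError("only S -> NP V NP is encoded")
    entries = [LEXICON.get(w) for w in words]
    if None in entries:
        raise ValueError("word missing from the hand-coded lexicon")
    if tuple(e[0] for e in entries) != ("NP", "V", "NP"):
        raise ValueError("sentence does not match S -> NP V NP")
    # Fill the frame's role slots directly from the rule's positions.
    return {"predicate": entries[1][1]["predicate"],
            "agent": words[0],
            "patient": words[2]}

print(analyse("Mary sees John"))
# -> {'predicate': 'SEE', 'agent': 'mary', 'patient': 'john'}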

3. Critique of Statistical Methods

Ironically, Chomsky's strong criticism of statistical and behaviourist models (notably his 1959 review of B.F. Skinner's Verbal Behavior) shaped the debate within AI about data-driven versus rule-based approaches. His scepticism of probabilistic models stood in contrast to what later became standard in machine learning and NLP.

His opposition to behaviourist approaches foreshadowed concerns about black-box models in deep learning, although he also remains critical of today's data-heavy systems.
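
For contrast with the rule-based sketches above, here is a minimal data-driven sketch: a maximum-likelihood bigram model over a tiny invented corpus. It scores sentences from co-occurrence counts alone, with no grammar, and assigns probability zero to a perfectly grammatical sentence containing an unseen bigram, roughly the failure mode Chomsky's criticism of purely statistical models targeted. The corpus and test sentences are made up for illustration.

```python
from collections import Counter
from itertools import pairwise  # requires Python 3.10+

# A tiny invented corpus; a real model would be trained on vastly
# more data, with smoothing to avoid zero probabilities.
corpus = [
    "the dog chased the cat",
    "the cat saw the dog",
    "a dog saw a cat",
]

unigrams, bigrams = Counter(), Counter()
for sentence in corpus:
    words = sentence.split()
    unigrams.update(words)
    bigrams.update(pairwise(words))

def bigram_score(sentence):
    """Approximate P(sentence) as a product of P(w2 | w1) estimates."""
    score = 1.0
    for w1, w2 in pairwise(sentence.split()):
        if unigrams[w1] == 0:
            return 0.0
        score *= bigrams[(w1, w2)] / unigrams[w1]
    return score

# A sentence built entirely from seen bigrams gets a positive score...
print(bigram_score("the dog saw the cat"))   # ~0.042
# ...while a grammatical sentence with one unseen bigram scores zero,
# the kind of case rule-based critics pointed to.
print(bigram_score("the dog chased a cat"))  # 0.0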
