Understanding Language: The Linguistic Mesh
Language is a Great Mystery
Linguistic definition itself is a great mystery. It seems that we cannot escape recursion, circularity, and reification in the way we define words. There are no linguistic primitives.
Language is not a simple set of discrete building blocks but rather a vast, interconnected mesh of meanings, each term defined in relation to others. This interdependence is particularly evident when we attempt to define mental operations. Words like "consciousness," "awareness," "attention," "intention," "understanding," and "meaning" form a closed set, where each term is ultimately dependent on others for its definition. There is no final, independent grounding for any of these concepts within language itself.
Even function words—prepositions, conjunctions, pronouns, and articles—are essential elements of this mesh. They derive meaning not from direct reference to the world but through their relational role in structuring discourse. Prepositions such as "in" and "on" have no intrinsic meaning outside of their contextual usage, and conjunctions like "and" and "but" serve functions that require further interpretation. This suggests that meaning is not an atomic property of words but an emergent phenomenon of the entire linguistic system.
The dependency of words on other words creates an apparent infinite regress, though it is not truly infinite, as language is bounded by the number of words available in a given lexicon. However, this bounded system still lacks a foundational base, making definitions inherently circular at some level. This is why dictionaries cannot be fully self-contained; they require an external reference point—shared human experience and cognitive context—to be fully understood.
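The point can be made concrete with a small illustration. The sketch below (a toy lexicon invented for this purpose, not drawn from any real dictionary) treats a lexicon as a directed graph in which each word points to the words used to define it. Because the lexicon is finite and every word is defined only in terms of other words, following definitions must eventually lead back to a word already visited; the circle closes somewhere, which is why no dictionary can be fully self-contained.

```python
# A minimal sketch: the toy lexicon below is invented for illustration,
# not drawn from any real dictionary. It models a lexicon as a directed
# graph in which each word points to the words used in its definition,
# then follows definitions until some word repeats. In any finite lexicon
# where every word is defined only by other words, the walk must
# eventually revisit a word, i.e. hit a definitional cycle.

toy_lexicon = {
    "consciousness": ["awareness"],
    "awareness": ["attention", "understanding"],
    "attention": ["intention"],
    "intention": ["meaning"],
    "meaning": ["understanding"],
    "understanding": ["meaning"],
}

def follow_definitions(lexicon, start):
    """Follow the first defining word at each step until a word recurs."""
    path, seen = [], set()
    word = start
    while word not in seen:
        seen.add(word)
        path.append(word)
        word = lexicon[word][0]  # every word is defined by at least one other word
    return path + [word]  # the repeated word closes the circle

print(" -> ".join(follow_definitions(toy_lexicon, "consciousness")))
# consciousness -> awareness -> attention -> intention -> meaning -> understanding -> meaning
```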
We seem to learn many words from context without being explicitly taught, just by hearing them or reading them and inferring their meaning from the surroundings. Sometimes we get it wrong. Sometimes we get it more or less right. What any one of us believes a word to mean is almost idiosyncratic, but in general we can communicate because we share a rough common understanding, imperfect though it often is.
One of the issues is that we can tie some words to the concrete, to events and objects that are immediate to perception. Other times we deal with abstractions, and abstractions piled upon abstractions, yet we seem to be able to use these to communicate somehow—more or less, anyway. So often we end up in the metaphysical mire with our abstractions, so divorced from concrete reality that they become gobbledygook, although we don't often recognize that.
I've read the assertion that infants don't actually need to be taught words; they learn them anyway. I don't know if there's more than anecdotal evidence pointing to this conclusion, but it's interesting. For many words we never receive explicit instruction. We don't go to dictionaries. We just somehow or other come up with an understanding, and that understanding is going to be idiosyncratic. Words have multiple meanings, and we have to learn to distinguish them from context, and we often get it wrong. That's why we have notions like the category mistake, reification, and conflation, all mistakes we make with language.
Over time, as we age, we find that the younger generation repurposes words and invents new ones, and we may be perplexed when they use familiar words with entirely different meanings. Whether by mistake, misunderstanding, or deliberate innovation, words get repurposed and their meanings revised. New words are introduced too, sometimes to fill a need, sometimes for reasons that seem quite opaque. Some of them may never reach us, remaining restricted to channels of communication we do not use. But when we do encounter them, we have to figure out what they mean: we look them up, have someone explain them, or piece the meaning together from context, and sometimes we do not grasp their full significance for quite a while. Our understanding of a word's meaning often evolves over time.
Language is not static. We have no way of knowing just when and how language arose, but it's pretty clear that there are proto-languages amongst many species of animals, although conventional linguists deny them. Many animals have languages of some complexity, perhaps not at the level of human language, but still functional. Some studies suggest that animals can even name other animals in their sphere, using different sounds for individuals, including offspring. Whether or not this is fully substantiated, it suggests that language evolves and likely developed through evolutionary pathways rather than appearing fully formed.
Languages transmute over time, diverging into mutually incomprehensible versions across generations and geographic regions. This ongoing, continuous process is influenced by necessity—creating terms for new concepts—but also by randomness, misunderstandings, and factors beyond prediction. Despite these changes, language remains our primary tool for structuring thought and communicating ideas, however imperfectly.
The issue extends beyond individual words to grammar itself. The grammatical framework we use to describe language is not a natural structure but an imposed system, often inherited from older linguistic traditions, such as Latin in the case of English grammar. While useful as a descriptive tool, grammar does not capture the full complexity of how language operates in practice.
This structural entanglement of language reveals something profound about human cognition: thought itself is not easily reducible to a set of linguistic primitives. Instead, our conceptual understanding is shaped by a vast web of interrelations, where meaning arises from patterns of use rather than from fixed definitions. This is why attempts to precisely define abstract concepts such as "truth," "justice," or "reason" inevitably lead to further interpretation rather than resolution.
Thus, language is not merely a tool for communication but a reflection of the structure of thought itself. The linguistic mesh is not a flaw—it is an essential feature of how we make sense of the world.

