Understanding Language: Rewording
Once again, I try to come to grips with the mysteries of language and how we can say the same thing in a potentially infinite number of ways without losing inherent meaning. This is all metaphysical.
Note: It’s something like this in my view. My words this time. Introduction, summary and readings courtesy of ChatGPT 4.0. Given more motivation, I could tighten it up a bit I suppose. Would that be rewording?
Introduction
Language is a profoundly flexible and mysterious phenomenon, enabling the rewording of ideas in an infinite number of ways while preserving meaning. The ability to manipulate language—whether through paraphrasing, translation, summarization, or abstraction—is central to human communication and thought. Yet, how we achieve this remains largely enigmatic. Thought itself appears to precede language in an inchoate form, only later becoming structured through words. The act of rewording extends beyond individual creativity; it is a fundamental part of lexicography, technical writing, and even artificial intelligence (AI) language models. However, while humans engage in rewording with intentionality and understanding, AI mimics this process through brute-force statistical predictions without true comprehension. This essay explores the nature of rewording, its implications for meaning, the recursive structure of language, and the role of AI in attempting to simulate this uniquely human capacity.
The Nature of Rewording
I know that we have any number of examples of words translating other words, whether across languages or within the same language: rewording, manipulating for greater clarity, or maybe for greater obscurity: précising, summarizing, condensing, abridging, abstracting, expanding, revising, revising, and revising, but always rewording. It's a great mystery just how we do that, but I maintain that with skill, even the densest and most cryptic prose can be reworded for greater clarity.
Empirically, we've been able to do this in innumerable cases. I posit that we can do it in any case, perhaps at the cost of less concision, but in principle, I think even the densest of writing, the most opaque prose, can be reworded. And experience shows this is true. There's a whole industry of authors trying to rewrite the obscure thoughts of other authors and succeeding, to some extent. Is the meaning exact? Is the original prose incoherent or completely absurd? Well, that's hard to say, since very few of us can understand the original words. It could be. Maybe yes, maybe no.
The Inchoate Nature of Thought and Language
When we have a thought, inchoate, without language, sometimes the language emerges so quickly we don't even recognize that the inchoate precedes the words. But the words that come out may have meaning, to ourselves and to others. Yet just how we express something seems quite random. And if we want to change the wording, we can do it. If we want to change the wording again, we can do that as well.
I suppose with any individual there might be a limit to how clever they can be. But in principle, we can change the wording endlessly while preserving the essential meaning. Of course, in practice, it's unproven and unprovable. But I see no reason in principle why, given an infinitely long lifetime and an infinitely strong desire to do so, we could not come up with an infinite variety of wordings an infinite number of times.
Thoughts emerge in an inchoate, formless state before language shapes them. The transition from thought to language is often imperceptible but fundamental. These thoughts are not necessarily sensation, or visualization, or imagination. They're inchoate, they're formless. How they emerge is unclear, perhaps through associations, perhaps seemingly quite randomly, but they emerge. And they're not language. But they get turned into language, at least by most of us.
Some people don't have language, so that change into words can't happen. Infants don't have language, so it doesn't happen for them. A few people born deaf have never heard words, never learned to read or write, never learned to sign. Animals don't have language, so it doesn't happen for them either. But most of us have language.
The Infinite Potential of Rewording
Language allows for infinite (or near-infinite) rewording of the same essential meaning. The ability to paraphrase is an art and a skill, central to human communication.
So it's a great mystery that we can express the same inchoate thought in so many different ways. And these inchoate thoughts, turned into language, lead to other thoughts associatively. Sometimes the inchoate thoughts slip away, just on the tip of the tongue, as in “I forgot what I was going to say.”
It's all very mysterious, bizarre really, and it's tied intimately to the whole idea of thinking and consciousness. Understanding, misunderstanding, meaning: all deep mysteries. Yet rewording is the essence of how we work. We can say any number of words to capture the same essential meaning.
And it's an art to do this with clarity so that others can understand. It's something that can be taught formally, but only learned through practice.
Denotation vs. Connotation in Rewording
Preserving the essential meaning (denotation) in rewording is possible, but connotation is trickier. The distinction between the two is often unclear, yet it matters in communication.
So a question arises of how we may preserve the essential denotation. Can we do so without losing some connotation? The concepts themselves are fuzzy; it's not always clear just how to differentiate between the two terms.
Lexicography and the Circularity of Definitions
Lexicographers engage in structured rewording, attempting to define words using other words. This leads to inherent circularity in dictionaries and raises questions about the limits of defining meaning.
Well, the lexicographer's art is, in essence, the art of rewording things. I'm not even sure how lexicographers work, but they must consult reams of reference sources to get a sense of what a given word means and its origin. I'm sure there's a great deal of research involved.
The results are variable. Each dictionary has a different description or descriptions. The oddities of words make it even harder: they can have multiple meanings, synonyms, antonyms, homonyms. That they have connotations as well as denotations makes it harder still.
It's amazing that a lexicographer can actually come up with coherent dictionaries. Quite often, they're circular. As a kid, I used to look up all these naughty words and find they were always defined in terms of other naughty words. So, it was infinite regress and circularity.
Is this deliberate, or just the way lexicographers happen to end up with such structures? So again, lexicographers are engaged in an exercise of rewording, trying to explain words with other words, and coming up with an infinite variety of ways of doing so. Perhaps infinite is just a metaphor.
Recursive Structure of Language
Words are defined by other words, creating an unavoidable recursive structure. This is most evident in dictionary definitions but is present throughout all language use.
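To make this circularity concrete, here is a minimal sketch, assuming only a hypothetical toy lexicon of my own invention: it treats definitions as a directed graph and walks it depth-first until a word turns up in its own chain of definitions, the same loop I used to find as a kid.

```python
# Toy lexicon: each headword is "defined" by other headwords.
# (A hypothetical miniature for illustration; real dictionaries are vastly larger.)
TOY_LEXICON = {
    "big": ["large"],
    "large": ["great", "big"],
    "great": ["large"],
    "happy": ["glad"],
    "glad": ["happy"],
}

def find_cycle(word, lexicon, path=None):
    """Depth-first walk of definitions; returns the first circular chain found."""
    path = path or [word]
    for defining_word in lexicon.get(word, []):
        if defining_word in path:  # looped back: a circular definition
            return path + [defining_word]
        cycle = find_cycle(defining_word, lexicon, path + [defining_word])
        if cycle:
            return cycle
    return None

for headword in TOY_LEXICON:
    cycle = find_cycle(headword, TOY_LEXICON)
    if cycle:
        print(" -> ".join(cycle))  # e.g. big -> large -> great -> large
        break
```

With a finite vocabulary and every word defined by other words, some such cycle is unavoidable; the only question is how long the chain is before it closes.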
Some people can refine wording for better clarity, but how they do it remains mysterious. Technical writers and editors excel at this skill, but the process itself is difficult to formalize.
Abstraction and the Limits of Rewording
Abstraction involves distilling core elements, but it also means removing specificity. The word “abstraction” itself has multiple meanings, overlapping with summarization, précis, and synopsis.
While rewording seems infinitely possible in principle, practical constraints exist. The act of translation, summarization, and reinterpretation always involves trade-offs.
Even understanding, deciphering meaning, is inexplicable. Rewording is inexplicable. How do we know which words will be simpler and more clearly understood by others? Still, people can do it. Skilled writers can do it.
Abstraction is a mystery. It has multiple meanings. One of them is distilling the core elements. But there are other, probably very different, meanings, and yet we use one word. Abstraction, summarization, précis, abstract, synopsis: they're all similar.
Translation as a Form of Rewording
Translation is essentially a form of rewording, differing only in requiring knowledge of different vocabulary and grammar. This positions translation as a fundamentally similar process to paraphrasing within a single language.
I see no essential difference in this from any other type of rewording, other than the difficulty of learning a different vocabulary and grammar, and different cultural contexts. But essentially, it's a rewording exercise. Rewording never involves a one-to-one translation.
AI and the Imitation of Rewording
AI models mimic rewording but do so through brute-force statistical predictions. They synthesize text without understanding meaning, revealing patterns that human readers then interpret.
A great mystery, of course, is how a large-language-model AI does something similar with a presumably totally different mechanism: brute-force, deterministic computation with a pseudo-random element, using certain algorithms trained on a database of human understandings and misunderstandings and lies, and coming out with some synthesis of all of this based on trained-in word frequencies, weightings, if you will.
Yet it comes out with prose that is usually grammatically correct, despite all the imperfections of the underlying dataset. Sometimes it comes out as absolute garbage and sometimes as stuff that even appears to be insightful and intelligent. It's not.
But the algorithm reveals that, somehow inherent within the large corpus of text a large-language-model AI draws on, there is meaning when interpreted by readers.
AI outputs can appear insightful or nonsensical, depending on human interpretation. Meaning does not reside in the algorithm but emerges from language patterns and human inference.
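As a crude sketch of what “deterministic computation with a pseudo-random element” based on trained-in word frequencies might mean, here is a toy next-word generator, a drastic simplification of my own devising and nothing like a real transformer-based model: it counts bigram frequencies in a tiny corpus and then samples from those weightings with a seeded pseudo-random generator.

```python
import random
from collections import Counter, defaultdict

# Tiny training corpus: the "database of human understandings" in miniature.
corpus = "the cat sat on the mat and the cat saw the rat on the mat".split()

# "Training": count how often each word follows each other word.
bigram_weights = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    bigram_weights[current_word][next_word] += 1

def generate(start, length, seed=0):
    """Deterministic computation with a pseudo-random element: a seeded sampler."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        followers = bigram_weights.get(words[-1])
        if not followers:
            break  # dead end: no word ever followed this one in the corpus
        choices, counts = zip(*followers.items())
        words.append(rng.choices(choices, weights=counts)[0])
    return " ".join(words)

print(generate("the", 8))  # grammatical-looking text, with no understanding anywhere
```

Run twice with the same seed, it produces identical text: deterministic. Change the seed and the “creativity” changes, with no comprehension anywhere in the process, only frequencies.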
Summary
Rewording is a fundamental aspect of human communication, allowing ideas to be expressed in multiple ways without necessarily losing their core meaning. This essay explores how thoughts emerge in an inchoate state before being shaped into language and how rewording operates at various levels, from lexicography to abstraction and translation. The distinction between denotation and connotation complicates the process, as meaning is not always preserved in exact form. Dictionaries and language structures inherently rely on circular definitions, illustrating the recursive nature of words defining other words. Technical writers and editors refine language for clarity, but the underlying process remains elusive. While AI language models simulate rewording, they presumably do so through statistical pattern recognition rather than understanding. Ultimately, language’s infinite flexibility and recursive structure make it both a powerful tool and an ongoing mystery.
References
Crystal, D. (2010). The Cambridge encyclopedia of language (3rd ed.). Cambridge University Press.
Note: A broad exploration of language structure, meaning, and translation, relevant to the essay's discussion on rewording and meaning.
Eco, U. (2003). Mouse or rat? Translation as negotiation. Phoenix.
Note: Discusses the complexities of translation, rewording, and how meaning shifts across languages.
Jakobson, R. (1959). On linguistic aspects of translation. In R. A. Brower (Ed.), On translation (pp. 232–239). Harvard University Press.
Note: A foundational work on the nature of translation and rewording, including the distinction between different types of linguistic equivalence.
Landauer, T. K., & Dumais, S. T. (1997). A solution to Plato’s problem: The latent semantic analysis theory of acquisition, induction, and representation of knowledge. Psychological Review, 104(2), 211–240.
Note: Introduces Latent Semantic Analysis, a key model in understanding how words relate to meaning statistically, relevant to both human cognition and AI.
Lyons, J. (1995). Linguistic semantics: An introduction. Cambridge University Press.
Note: Covers the difference between denotation and connotation, a major theme in the essay.
Manning, C. D., & Schütze, H. (1999). Foundations of statistical natural language processing. MIT Press.
Note: Discusses how AI processes rewording and translation using statistical techniques.
McCarthy, J. (1981). Artificial intelligence: A century-long perspective. AI Magazine, 2(3), 1–11.
Note: Touches on AI’s attempts to replicate human language processing, relevant to the essay's discussion of AI-generated text.
Nunberg, G. (1995). Transfers of meaning. Journal of Semantics, 12(1), 109–132.
Note: Explores how meaning shifts across different phrasings and rewordings.
Pinker, S. (2007). The stuff of thought: Language as a window into human nature. Viking.
Note: Examines how thoughts and concepts become language, tying into the inchoate nature of thought discussed in the essay.
Searle, J. R. (1978). Literal meaning. Erkenntnis, 13(1), 207–224.
Note: Discusses whether a core "literal" meaning remains intact across different rewordings.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27(3), 379–423.
Note: While not strictly linguistic, this classic work on information theory underpins how language is structured and processed, both in human cognition and AI.

