Sunday, October 23, 2016

Language of Thought

Atkins and Levin focused on semantic differences among several very similar words in English. They examined the "official" dictionary definitions of shake, quake, tremble, shiver, vibrate, and other words commonly treated as synonyms by English speakers. They concluded that while most native speakers carry slightly different connotations for these words, the dictionary definitions are largely similar to one another, even interchangeable. Together with Haspelmath, they raise the question of whether dictionaries should, or even need to, register such fine semantic distinctions. Official dictionaries are currently very slow to pick up on gradual semantic change, and Haspelmath, Atkins, and Levin ask whether lexicographers should "be informed by linguists' analyses" of language.

Slobin takes a different tack: rather than analyzing the semantics of a single language, he analyzes grammatical structure across multiple languages. Specifically, Slobin divides languages into two broad categories, verb-framed and satellite-framed. Verb-framed languages (such as Spanish) encode the core of an event, for motion events the path, in the verb itself, while satellite-framed languages (such as English) express it in particles and prepositions surrounding the verb, leaving the verb free to convey manner: compare Spanish salir, "to exit," with English go out. Because of these differences in how meaning is distributed across the sentence, the two types of language follow different narrative patterns when telling the same story. Some focus more on the actions performed by the subjects, while others express those same actions but give more attention to the objects and incidentals in the background.

All of these readings highlighted differences, within a single language and across languages, in how a similar thought or meaning is conveyed. I found myself connecting these evolved differences to the language of thought hypothesis, a theory in cognitive science, linguistics, and the philosophy of mind which holds that thought is represented in the mind and processed through a language-like system of symbols and rules. One consequence of this theory is that natural language and thought are so closely intertwined that, over our early years of mental development, they nearly become the same: structured thought and abstract reasoning cannot exist without a symbolic vocabulary. We need some kind of systematic vocabulary in order to manipulate ideas abstractly in our minds, and that vocabulary in turn gives rise to a need for syntax, rules for putting the symbols together in a way that makes sense. I found myself wondering to what extent the structure of our first language informs the structure of our individual syntaxes of thought. Does the fundamental structure of our natural language determine the way in which we fundamentally think? For instance, does it matter whether our first language has a rich verb vocabulary, in which the utterance of a single verb tells us how an action is performed and in what context, or only a set of very basic action verbs whose exact meaning we must extrapolate from context each time we hear them? Does the confinement of thought to language limit the thoughts we can have? After all, if there exists no word for a certain phenomenon, idea, or feeling, we cannot give it life with words, we cannot immortalize it with language, we cannot share it with others, and it slips away from our fingertips as fast as it appeared -- how can we be sure in our minds that it exists at all?

1 comment:

  1. You have an interesting take on these articles -- the connection between language and thought has long been debated, and I personally think there is something to it. However, I think it is also important not to overstate the connection, as older linguists like Whorf did. Strong "Whorfian" ideology, which presents language as absolutely essential to thought, has since been largely rejected by the linguistics community, although there is undoubtedly something about the way we think that has to do with the tongue we speak.
