Sunday, October 16, 2016

Carnie’s chapter “Generative Grammar” provided immense insight into contrasting theories of language acquisition, particularly the debate surrounding Chomsky’s concept of Universal Grammar. The Chomskyan notion that “language is an instinct” seems to rule out the possibility of a strong AI with human-level linguistic capabilities. Indeed, when Carnie states in Chapter 1 that “Some facts about language seem to be built into our brains, or innate” (Carnie, 15), it is implied, by extension, that without a functioning “brain” one cannot obtain the necessary facts about language. If we flip this argument on its head, we could ask: is linguistic ability, or knowledge of these “facts” of language, evidence of some sort of “brain”, or innate mind? If so, could we ever construct an AI that has the linguistic capabilities and understanding of Universal Grammar, and say that it, too, has a “brain”?
These questions came to mind throughout Carnie’s presentation of the premises behind Universal Grammar, and they connected to the themes of my Symbolic Systems lectures as well. In an assigned SYMSYS1 reading, “How Language Works” from Pinker’s The Language Instinct, Pinker argues that the discrete combinatorial system of grammar in Language is innately human, which amounts to answering these questions with a resolute yes and no, respectively. Pinker suggests that “the way language works, then, is that each person’s brain contains a lexicon of words and the concepts they stand for (a mental dictionary) and a set of rules that combine the words to convey relationships (a mental grammar)” (Pinker, 85). It can be deduced that while an AI can be programmed to understand the mental dictionary, such as the language-specific vocabulary variation that must “clearly be learned or memorized” (Carnie, 23), the mental grammar contains built-in aspects, such as the “that-trace effect” (Carnie, 20), that cannot simply be programmed into an AI. Most importantly, because such a system amounts to building and encoding a specific set of functions to respond to linguistic stimuli, it is highly unlikely that an AI could ever have a truly discrete combinatorial grammar.
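To make Pinker’s picture concrete for myself, here is a rough Python sketch of a “mental dictionary” plus a “mental grammar”. The categories, words, and rewrite rules are my own toy inventions, not Pinker’s or Carnie’s actual formalism, but even this tiny system shows the discrete combinatorial point in miniature: a handful of rules and words combines into many distinct sentences.

import itertools

# A toy "mental dictionary": words paired with the categories they belong to.
lexicon = {
    "N": ["linguist", "machine"],
    "V": ["parses", "understands"],
    "Det": ["the", "a"],
}

# A toy "mental grammar": rewrite rules that combine categories into phrases.
rules = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}

def generate(symbol):
    """Yield every string this grammar can build from `symbol`."""
    if symbol in lexicon:                 # terminal category: look up its words
        yield from lexicon[symbol]
        return
    for expansion in rules[symbol]:       # nonterminal: expand each rule in turn
        parts = [list(generate(s)) for s in expansion]
        for combo in itertools.product(*parts):
            yield " ".join(combo)

sentences = list(generate("S"))
print(len(sentences), "sentences, for example:", sentences[0])

With just three rules and six words, this sketch already generates 32 different sentences, which hints at how quickly a genuinely combinatorial grammar outruns anything that could be listed in advance.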
Because of this deep distinction between what can and cannot be learned, there is also an obvious divide between the syntactic and semantic capabilities of a human and a machine. While Apple’s Siri can use recursion and encoded word-chain systems to respond to a prompt, for example, it would likely be thrown off by a human-coined neologism, or by any prompt that was not initially encoded into it. This spotlights one of the central challenges bridging the fields of Linguistics and Symbolic Systems: how can one program the humanly “innate” capabilities of Language into a non-human entity?
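As a small illustration of that brittleness, here is a minimal word-chain (bigram) sketch in Python, again entirely my own toy example rather than anything Siri actually implements. It only accepts word-to-word transitions it has already seen, so a single unseen neologism breaks the whole chain.

from collections import defaultdict

# Two "training" sentences stand in for whatever responses were encoded ahead of time.
training = [
    "the linguist parses the sentence",
    "the machine parses the prompt",
]

# Record which word may follow which: the encoded "chain".
chain = defaultdict(set)
for sentence in training:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        chain[prev].add(nxt)

def accepts(sentence):
    """Accept a sentence only if every adjacent word pair appeared in training."""
    words = sentence.split()
    return all(nxt in chain[prev] for prev, nxt in zip(words, words[1:]))

print(accepts("the machine parses the sentence"))  # True: recombines links it has seen
print(accepts("the machine parses the yeet"))      # False: a new word breaks the chain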
Only time – and innovation – can tell.

Works Cited
Carnie, Andrew. Syntax: A Generative Introduction. Malden, MA: Blackwell Pub., 2007. Print.
Pinker, Steven. "How Language Works." The Language Instinct: How the Mind Creates Language. N.p.: n.p., n.d. N. pag. Print.




