Sunday, October 16, 2016

Syntax Need We Do Even?

Andrew Carnie begins his book by making the reader conscious of his/her/everything-in-between's ability to understand the words and sentences being presented. He suggests that linguistics studies how this language perception occurs, and that understanding syntax, the sentence structure of a language, is part of that pursuit. In his first chapter, Carnie lays out a scientific approach to defining and describing syntax, highlighting a methodology of gathering data, making observations, and forming hypotheses about the syntax of languages.

In his next chapter, Carnie goes on to talk about the individual words that make up these sentences, maintaining that an understanding of the different types of building blocks is integral to a concept of the overall framework. He raises an interesting point, with an accompanying example: even given a seemingly gibberish phrase, an English speaker could identify the parts of speech of the nonsensical words based on their positions in the sentence and their affixes. I wonder whether, in languages where word order and grammar rules are less strict and parts of speech follow less predictable patterns, such a task would be as possible. I also wonder whether, given enough context clues, we could determine meaning even with muddied or non-existent syntax, such as in a jumbled-up sentence that still makes sense.
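To make that idea concrete, here is a toy sketch in Python. The suffix table and the nonsense sentence are my own invented illustration, not anything from Carnie; it just shows how affixes alone can hint at a word's category even when the stem is gibberish.

```python
# Toy sketch: guessing parts of speech for nonsense words from affixes alone.
# The suffix-to-tag table and the example sentence are invented for illustration;
# positional cues (the real second half of the story) are not modeled here.

SUFFIX_HINTS = [
    ("ly", "adverb"),
    ("ing", "verb (participle)"),
    ("ed", "verb (past)"),
    ("s", "noun (plural) or verb (3sg)"),
]

def guess_pos(word):
    """Return a rough part-of-speech guess based only on the word's ending."""
    for suffix, tag in SUFFIX_HINTS:
        if word.lower().endswith(suffix):
            return tag
    return "unknown (needs positional/context cues)"

sentence = "the slithy toves gyred quickly in the wabe".split()
for w in sentence:
    print(f"{w:10s} -> {guess_pos(w)}")
```

Even this crude heuristic tags "gyred" as a past-tense verb and "quickly" as an adverb, which is roughly what an English speaker does instinctively with a Jabberwocky-style sentence.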


The third chapter moves on from the smaller parts of the sentence, the individual words, to the overall structure. It introduces the useful hierarchical representation of a sentence as a tree diagram with constituent branches, built from parts of speech and phrases that follow certain rules and allow for the recursive nature of language, something that came up frequently in SYMSYS 1 in the discussion of “context-free grammars.” Astonishing, and something Carnie addresses in his book, is the fundamental human ability to understand Language even with its infinite possibilities, and the question of whether that is indicative of some innate ability, some Universal Grammar. If any part of it is truly innate, does that mean teaching computers to speak and understand Language is no longer possible? Or can we somehow discover and model the structures of the brain built to understand it?
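As a follow-up on the context-free grammar idea, here is a minimal sketch with a toy grammar of my own (not Carnie's or SYMSYS 1's) showing how a handful of recursive rewrite rules can generate an unbounded set of sentences.

```python
# A minimal context-free grammar with a recursive rule (NP can contain a PP,
# and PP contains an NP). The grammar and vocabulary are invented for
# illustration; real syntactic trees are far richer.
import random

GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"], ["Det", "N", "PP"]],
    "VP":  [["V", "NP"]],
    "PP":  [["P", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["sentence"], ["tree"]],
    "V":   [["parses"], ["draws"]],
    "P":   [["near"], ["inside"]],
}

def generate(symbol="S", depth=0, max_depth=5):
    """Expand a symbol by randomly choosing one of its rewrite rules."""
    if symbol not in GRAMMAR:          # terminal word: return it as-is
        return [symbol]
    rules = GRAMMAR[symbol]
    # Past max_depth, take the shortest rule so the recursion terminates.
    rule = min(rules, key=len) if depth >= max_depth else random.choice(rules)
    words = []
    for sym in rule:
        words.extend(generate(sym, depth + 1, max_depth))
    return words

for _ in range(3):
    print(" ".join(generate()))
```

Because PP rewrites to P NP, and NP can again contain a PP, the rules can nest phrases indefinitely ("the tree near the sentence inside the tree..."), which is the recursion that makes the set of possible sentences infinite even though the rule set is finite.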

1 comment:

  1. Regarding your last question about the possibility of computers understanding and speaking language, one important question that Carnie does not answer is the extent to which our language faculty is innate. He definitely champions the Universal Grammar theory, but he talks about statistical models of language acquisition only briefly. It seems like the more we uncover about the processes underlying language acquisition and use, the closer we will be to training computers to model that behavior.
