Sunday, October 16, 2016

Carnie's Intro to Syntax

Chapter One: Carnie provides a definition of syntax by contrasting it with other subfields of linguistics. Whereas morphology looks at how words are built up out of smaller meaningful units and semantics looks at what sentences/phrases mean, syntax studies how words (the output of morphology) are organized into sentences and phrases. The prevailing theory of syntax among linguists (though it may be better deemed a family of theories) is Generative Grammar, which says that sentences are generated by unconscious processes which it is the job of syntactic theory to describe/model. Carnie uses the scientific method to conduct his investigations, meaning that he formulates putative rules, checks them against the data (often using intuitions of well-formedness, or ‘judgments’ as Carnie calls them), and revises them in accordance with the results. But where does this knowledge of grammar come from? Carnie, following Noam Chomsky, maintains that at least some of this knowledge is innate, in us from birth, rather than acquired.

Chapter Two: Carnie spends this chapter discussing the various parts of speech that make up a language, as well as how to identify which part of speech a particular word falls under. One answer, familiar to most of us from elementary school, is that the part of speech of a word can be determined by asking what the word means. This is, of course, not a very scientific method, and is unsurprisingly rejected by Carnie. Instead, he proposes to identify parts of speech by an investigation of a word’s morphological distribution, i.e., a determination of the prefixes and suffixes a word allows to appear on it, as well as its syntactic distribution, i.e., a determination of what other words can appear near the word.

Chapter Three: With the parts of speech identified in the previous chapter, Carnie begins a close investigation of sentence structure. Sentences consist of constituents (roughly, units that function together in a sentence, like “the student” in “the student ate a sandwich”), and these constituents are arranged hierarchically: smaller constituents are glued together to form bigger and bigger constituents, until at the end of this process a sentence arises. In the remainder of the chapter, Carnie anatomizes the major phrase types, determining rules for how their subconstituents may combine.

I thought the arguments adduced in chapter one for the innateness of grammatical knowledge were very interesting. However, the one offered on page 17 for premise (ii) [rule-governed infinite systems are unlearnable] struck me as potentially problematic: not necessarily in the sense that it has an easily identifiable flaw, but in the sense that it seems to prove too much. The argument is roughly that children, in figuring out the rules for correlating sentences with situations, are never given access to the infinite number of possible sentences and correlations, and so nothing would seem to underlie the confidence children have with the rules of language (which budget for an infinite number of circumstances). But does this not work equally well to demonstrate that knowledge of arithmetic, say the addition function, is also innate? We are, after all, only given a finite number of correlations (e.g., 3+3 = 6, 7+3 = 10), so what might underlie our confidence that we’ve gotten ahold of the right rule of addition?
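The underdetermination worry above can be made concrete with a toy sketch. The rival rule below (the “quus”-style function, borrowed from Kripke’s rule-following discussion, not from Carnie) is a hypothetical illustration: it agrees with ordinary addition on every correlation a learner has actually seen, yet diverges on unseen cases, so the finite data alone cannot single out the right rule.

```python
# A sketch of how finitely many observed correlations (3+3=6, 7+3=10, ...)
# are consistent with more than one underlying rule.

def plus(x, y):
    # ordinary addition
    return x + y

def quus(x, y):
    # hypothetical rival rule: matches addition on all "small" cases,
    # but gives 5 whenever either argument exceeds a threshold the
    # learner has never encountered in the data
    return x + y if x < 57 and y < 57 else 5

# the learner's finite evidence
observed = [(3, 3), (7, 3), (10, 2)]

# both rules fit every observed correlation...
print(all(plus(x, y) == quus(x, y) for x, y in observed))  # True

# ...yet they disagree on cases outside the data
print(plus(100, 100), quus(100, 100))  # 200 5
```

The threshold 57 is arbitrary; any cutoff beyond the observed cases yields a rival rule the data cannot rule out, which is exactly the point of the analogy with language learning.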
