Sunday, October 16, 2016

Grammar Is Actually Interesting

I really enjoyed this week's reading because it connected to so many things: another one of my classes, SymSys 1, and my study of grammar and Latin in middle and high school.

Carnie gave a brief and interesting overview of the basics of syntax. According to him, linguists studying syntax operate largely by the scientific method: observing data about language, making generalizations based on patterns, testing those generalizations, and revising them as needed. One interesting point was that linguists especially look for exceptions to expected patterns, which they can then try to explain. He described how they use corpora to find patterns. This reminded me of an article I read about Google Translate - instead of doing some sort of structural parsing and translating word by word, it works by trawling vast corpora of translation examples to find similar ones.

Pretty stupidly, I never seriously thought about the human capacity for language, even though I love to read and write. I didn't know that most linguists have agreed on “Universal Grammar” - the idea that the language faculty is innate. The argument for this is that 1) syntax is a productive, recursive, and infinite system, 2) the rules governing infinite systems are unlearnable from finite data, and therefore 3) syntax could not have been learned. Since we have it anyway, it must be innate. We seem to be born with the knowledge that certain sentences are ungrammatical; indeed, when parents correct their kids' language, they're usually correcting content, not grammar. Carnie suggested that children learn early on which structural conventions their language uses, like subject-verb-object order vs subject-object-verb order.

Chapters 2 and 3 reminded me of studying grammar when I was younger. I did a very strict grammar curriculum from around third to eighth grades that involved diagramming sentences and labelling parts of speech in many sentences. Diagramming sentences was a fun puzzle to me back at age eleven, and now I see how sentence diagrams can show the recursive, tree-like nature of language's structure. While the sentence diagrams I did weren't quite trees, they did allow branching and depicted components as a tree would. It was interesting to learn new notation for grammar, like depicting intransitive sentences as [NP _ ], transitive sentences as [NP _ NP] or [NP _ {NP | CP} ], and ditransitive sentences as [NP _ NP NP], [NP _ NP PP], [NP _ NP {NP | PP} ], or [NP _ NP {NP | PP | P}]. Though I knew about direct and indirect objects, I'd never learned the term “ditransitive” before.
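Playing with this notation, I realized the frames are basically a lookup table from verbs to the argument patterns they allow. Here's a rough sketch in Python of that idea - the verbs and frames are my own illustrative picks, not taken from Carnie:

```python
# A toy subcategorization lexicon: each verb lists the frames it allows.
# "_" marks the verb's slot; the labels follow the bracket notation above.
SUBCAT = {
    "sleep":  [["NP", "_"]],                         # intransitive
    "devour": [["NP", "_", "NP"]],                   # transitive
    "say":    [["NP", "_", "NP"],                    # transitive: NP object
               ["NP", "_", "CP"]],                   # ...or a clause (CP)
    "give":   [["NP", "_", "NP", "NP"],              # ditransitive: double object
               ["NP", "_", "NP", "PP"]],             # ditransitive: NP + PP
}

def licenses(verb, frame):
    """Check whether a verb allows a given argument frame."""
    return frame in SUBCAT.get(verb, [])

print(licenses("devour", ["NP", "_", "NP"]))      # "The lion devoured the zebra"
print(licenses("sleep",  ["NP", "_", "NP"]))      # *"The cat slept the mouse"
print(licenses("give",   ["NP", "_", "NP", "PP"]))  # "She gave a book to him"
```

The starred sentence is the fun part: the table predicts exactly which strings are ungrammatical, which is the kind of exception-hunting Carnie describes.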

Last week for another class, SymSys 1, I read excerpts from Steven Pinker's work on language. It included a deeper explanation of the recursive nature of language, generative grammar, principles and parameters theory, and the structure of phrases (heads, determiners, etc.). The Carnie reading was a good review of this material. It was interesting to connect principles of grammar to principles of computer science - particularly the idea that common structures, like recursion and abstraction, unite both programs and language. Later this quarter in CS 103, I'll also be learning about context-free grammars and regular expressions. I love exploring all these connections - I'm starting to see more and more the beauty of Symbolic Systems!
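To see the recursion-in-both-worlds idea concretely, a context-free grammar can be written as a few rewrite rules and expanded with a recursive function. This is just a toy sketch I put together (the rules and words are mine, not from any of the readings) - the key line is the VP rule that embeds a whole sentence inside another, which is why the grammar generates infinitely many sentences from finite rules:

```python
import random

# A toy context-free grammar: "thinks that" embeds a full sentence (S)
# inside a verb phrase, so the grammar is recursive and unbounded.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the cat"], ["the linguist"]],
    "VP": [["sleeps"], ["thinks that", "S"]],   # recursion: VP -> ... S
}

def generate(symbol="S"):
    """Expand a symbol by picking a random rule; terminals pass through."""
    if symbol not in GRAMMAR:                   # a terminal string
        return symbol
    rule = random.choice(GRAMMAR[symbol])
    return " ".join(generate(sym) for sym in rule)

print(generate())
# e.g. "the linguist thinks that the cat thinks that the cat sleeps"
```

Each run can nest the "thinks that" clause a different number of times - the same tree-shaped expansion my old sentence diagrams were hinting at.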
