This reading introduces some of the basics of syntax, situating the field
between morphology and semantics. It also introduces the concept of generative grammar,
which refers to the idea of creating sentences by following a set of
subconscious principles; referencing generative grammar early in the chapter
prepares the reader for the strong theme of ‘language approached
scientifically’ that runs through the rest of the reading. This first chapter
also takes the reader through rule formulation in syntax (using anaphors as an
example), explains some of the complexities of language acquisition, and touches
on infinite systems and the logical problem of language acquisition, among other
topics.
Chapter two talks about parts of speech and why they matter
in syntax. This section emphasizes that, often contrary to popular convention,
a word’s part of speech is determined by analyzing its distribution in the
sentence, not its meaning. Chapter three elaborates on sentence structure,
covering syntactic trees, the rules for drawing them, and constituents (along
with the tests used to identify them).
Overall, this outline supports one of the most interesting
facts about the human capacity to use language: We are capable of generating
never-before-seen sentences and understanding them based on a set of rules that
we subconsciously acquire. Moreover, these rules determine how we structure the
sentences that we speak, and we alter the structures based on context. For
instance, when a sentence is ambiguous, we resolve it by matching its phrase
structure to the situation at hand. A sentence like "I saw an elephant in my
house" doesn’t make clear whether the speaker or the elephant is in the house,
yet we usually understand what is meant from the surrounding context. The
process might seem trivial, something we experience every day, but the
underlying mechanisms are remarkable.
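The ambiguity above can be made concrete with a toy parser. The sketch below is my own illustration, not anything from Carnie's text: a hand-written context-free grammar (the rules and category names are hypothetical simplifications) and a CKY-style chart that counts how many distinct tree structures the grammar assigns to a sentence. The elephant sentence gets exactly two structures, one where "in my house" attaches to the verb phrase and one where it attaches to the noun phrase.

```python
from collections import defaultdict

# Toy lexicon and grammar in Chomsky normal form (illustrative only,
# not a serious fragment of English).
lexicon = {
    "I": {"NP"}, "saw": {"V"}, "an": {"Det"}, "my": {"Det"},
    "elephant": {"N"}, "house": {"N"}, "in": {"P"},
}
binary_rules = {        # (B, C) -> A, i.e. A -> B C
    ("NP", "VP"): "S",
    ("V", "NP"): "VP",
    ("VP", "PP"): "VP",  # "in my house" modifies the seeing
    ("Det", "N"): "NP",
    ("NP", "PP"): "NP",  # "in my house" modifies the elephant
    ("P", "NP"): "PP",
}

def count_parses(tokens, start="S"):
    """CKY chart: chart[i][j][A] counts distinct trees rooted in A
    spanning tokens[i:j]."""
    n = len(tokens)
    chart = [[defaultdict(int) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, word in enumerate(tokens):
        for cat in lexicon.get(word, ()):
            chart[i][i + 1][cat] += 1
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point
                for (B, C), A in binary_rules.items():
                    if chart[i][k][B] and chart[k][j][C]:
                        chart[i][j][A] += chart[i][k][B] * chart[k][j][C]
    return chart[0][n][start]

print(count_parses("I saw an elephant in my house".split()))  # prints 2
```

The two counted trees correspond exactly to the two readings: the listener's job, on this view, is to pick one of them using context the grammar itself doesn't see.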
I wonder how these syntactic concepts feed into the computational
theory of mind. If the mental process for producing sentences is thought of as
strict and methodical, taking as input an idea we want to express, then the
parallels between syntactic processes in the mind and software (often modeled
as an input-output process) seem clear. But if the process of generating
sentences is thought of as iterative and dynamic, shaped by the interplay
between external changes in context and internal adaptations to those changes,
then the similarities between the brain and the computer begin to blur.
Carnie's outline leaves room for either perspective.