-------- Original Message --------
Subject: [evol-psych] On Grammar vs. Language in Neurolinguistics
Date: Fri, 18 Oct 2002 18:26:56 -0500
From: Ian Pitchford <ian.pitchford@...>
Reply-To: Ian Pitchford <ian.pitchford@...>
Organization: http://human-nature.com/
To: evolutionary-psychology@yahoogroups.com


_________________________________________________________________


From ScienceWeek October 4, 2002 Vol. 6 Number 40


=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

5. On Grammar vs. Language in Neurolinguistics

Massimo Piattelli-Palmarini (University of Arizona, US) discusses
grammar vs. language, the author making the following points:

1) Two styles of explanation in the science of mind and behavior
have been competing for as long as anyone cares to remember:
empiricist, centering on habit formation, statistical learning,
imitation and association; and rationalist, focusing on the
projection of internally represented rules. Despite relentless
effort, the former has delivered rather meager results, whereas
the latter, with its pivotal concept of an internally
represented grammar, has produced the solid "conceptual
cognitive revolution".

2) For a rationalist cognitive scientist, a grammar is a finite
mental object, systematically assigning abstract structures to
all the well-formed expressions of a language -- that is, to each
member of a set that, for natural languages (such as Chinese or
Italian), is infinite and discrete. Infinite, because every
speaker of a language can produce and understand an unlimited
number of new grammatical sentences. Discrete, because
continuous modification of a sentence to change it into another
is impossible. No sentence could be halfway between "It's a good
car, but they don't sell it" and "It's a good car, but they
don't tell it."
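
As a rough illustration of both properties, here is a minimal
Python sketch (the four-rule grammar and its vocabulary are our
own invention, not taken from the article) that enumerates the
sentences a finite grammar generates:

    import itertools

    # A toy context-free grammar: a finite object (four rules, invented
    # for illustration), yet the sentence set it generates is unbounded.
    RULES = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # NP may contain a clause
        "VP": [["sleeps"], ["sees", "NP"]],
        "N":  [["cat"], ["rat"]],
    }

    def expand(symbol, depth):
        """Yield every word sequence derivable from `symbol` in <= `depth` steps."""
        if symbol not in RULES:        # a terminal word stands for itself
            yield [symbol]
            return
        if depth == 0:
            return
        for rhs in RULES[symbol]:
            for parts in itertools.product(*(list(expand(s, depth - 1)) for s in rhs)):
                yield [word for part in parts for word in part]

    for depth in (3, 4, 5):
        print(depth, len({" ".join(s) for s in expand("S", depth)}))
    # -> 3 4 / 4 20 / 5 60: deeper derivations yield strictly more sentences,
    #    and each sentence is a discrete word sequence with nothing "in between".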

3) A grammar capable of generating complex structures for all
well-formed sentences of a natural language must have recursive
rules, because phrasal constituents can contain other phrasal
constituents of the same or higher kinds ("The young doctor's
three beautiful sisters" is a noun phrase containing another
noun phrase; "The spy who came in from the cold" is a noun
phrase containing a sentence). Moreover, structural rules of
sentence formation can be applied recursively to embed relative
clauses embedding other relative clauses, without limit (as in
"This is the cat that killed the rat that ate the malt that lay
in the house that Jack built"). Because such grammars are
finite, whereas the languages they generate are infinite and
contingently shaped by use, it is advantageous, and
methodologically cogent, to consider the concept of grammar as
primary, and that of language as derived.
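
The rhyme's embedding pattern comes from one recursive rule
applied over and over; a minimal Python sketch (using only the
rhyme's own words):

    # One recursive rule -- NP -> "the" N "that" V NP -- yields unbounded
    # embedding; adding pairs to LEVELS deepens the sentence with no new rules.
    LEVELS = [("cat", "killed"), ("rat", "ate"), ("malt", "lay in")]

    def noun_phrase(i=0):
        """Recursively embed one relative clause per entry in LEVELS."""
        if i == len(LEVELS):
            return "the house that Jack built"   # base case ends the recursion
        noun, verb = LEVELS[i]
        return f"the {noun} that {verb} {noun_phrase(i + 1)}"

    print("This is " + noun_phrase())
    # -> This is the cat that killed the rat that ate the malt that lay
    #    in the house that Jack built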

4) Since the mid-1950s, powerful formal criteria, derived from
analysis of the artificial languages of mathematics and computer
programming, have been applied to the study of natural languages
to determine principles by which a given class of grammars can
generate a given target language. A universal ('Chomsky')
hierarchy of grammars (automata) was established: the most
powerful class contains as a subclass the immediately less
powerful one, and so on. In tune with the dominant
empiricist-inductivist tradition of the 1950s, the first
grammars to be explored were those at the lowest level of the
hierarchy: probabilistic and finite-state grammars. From a very
large corpus of
ascertained utterances of the language, one can compute the
conditional probability that a word (or string of words) will
follow another.
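
That corpus computation is easy to make concrete. A minimal
Python sketch, assuming an invented three-sentence mini-corpus
(a real finite-state model would be estimated from millions of
words):

    from collections import Counter

    corpus = ["it is a good car", "it is a fast car", "it is a good deal"]

    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        words = sentence.split()
        unigrams.update(words[:-1])            # words that have a successor
        bigrams.update(zip(words, words[1:]))  # adjacent word pairs

    def p_next(w1, w2):
        """Conditional probability P(w2 | w1) = count(w1 w2) / count(w1)."""
        return bigrams[(w1, w2)] / unigrams[w1]

    print(p_next("a", "good"))    # 2/3: "good" followed "a" in two of three cases
    print(p_next("good", "car"))  # 1/2

A table of exactly such conditional probabilities is what a
probabilistic finite-state grammar encodes.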

References:

1. Wasow, T. in Foundations of Cognitive Science (ed. Posner,
M.) 161-205 (MIT Press, Cambridge, Massachusetts, 1991)

2. Chomsky, N. The Minimalist Program (MIT Press, Cambridge,
Massachusetts, 1995)

3. Pullum, G. K. & Scholz, B. C. Nature 413, 367 (2001)

Nature 2002 416:129


Related Background Brief:

MORE THAN WORDS. In the popular view, a language is merely a
fixed stock of words. Purists worry about foreign loanwords;
conservatives decry slang; and groundless claims that there are
hundreds of Eskimo words for snow are constantly made in popular
writing, as if nothing matters about languages but their
lexicons. But the popular view cannot be right, because (as
linguist Paul Postal has observed) membership in the word stock
of a natural language is open. Consider this example: "GM's new
Zabundra makes even the massive Ford Expedition look
economical." If English had an antecedently given set of words,
then this expression would not be an English sentence at all,
because 'Zabundra' is not a word (we just invented it). Yet the
sentence is not just grammatical English, it is readily
interpretable (it clearly implies that the Zabundra is a large,
fuel-hungry sports utility vehicle produced by General Motors).
Similar points could be made regarding borrowed words, personal
names, scientific nomenclature, onomatopoeia, acronyms, and so
on; English is not a fixed set of words. A more
fundamental reason that a language cannot just be a word stock
is that expressions have syntactic structure. For example, in
most languages, the order of words can be significant: "Mohammed
will come to the mountain" contains the same words as "The
mountain will come to Mohammed", but the expressions are very
different. Geoffrey K. Pullum: Nature 2001 413:367.

Related Background:

ON THE ACQUISITION OF LANGUAGE BY CHILDREN

J. R. Saffran et al. (University of Wisconsin-Madison, US) discuss
the acquisition of language by children, the authors making the
following points:

1) Before infants can begin to map words onto objects in the
world, they must determine which sound sequences are words. To
do so, infants must uncover at least some of the units that
belong to their native language from a largely continuous stream
of sounds in which words are seldom surrounded by pauses.
Despite the difficulty of this reverse-engineering problem,
infants successfully segment words from fluent speech from
approximately 7 months of age.

2) How do infants learn the units of their native language so
rapidly? One fruitful approach to answering this question has
been to present infants with miniature artificial languages that
embody specific aspects of natural language structure. Once an
infant has been familiarized with a sample of this language, a
new sample, or a sample from a different language, is presented
to the infant. Subtle measures of surprise (e.g., duration of
looking toward the new sounds) are then used to assess whether
the infant perceives the new sample as more of the same or
something different. In this fashion, we can ask what the infant
extracted from the artificial language, which can lead to
insights regarding the learning mechanisms underlying the
earliest stages of language acquisition.

3) Syllables that are part of the same word tend to follow one
another predictably, whereas syllables that span word boundaries
do not. In a series of experiments, it has been found that
infants can detect and use the statistical properties of
syllable co-occurrence to segment novel words. More
specifically, infants do not detect merely how frequently
syllable pairs occur, but rather the probabilities with which
one syllable predicts another. Thus, infants may find word
boundaries by detecting syllable pairs with low transitional
probabilities. What makes this finding astonishing is that
infants as young as 8 months begin to perform these computations
with as little as 2 minutes of exposure. By soaking up the
statistical regularities of seemingly meaningless acoustic
events, infants are able to rapidly structure linguistic input
into relevant and ultimately meaningful units.
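
The transitional-probability computation described above can be
sketched in the same spirit as the miniature artificial
languages; the two-syllable "words" below are invented for the
purpose:

    import random
    from collections import Counter

    random.seed(1)
    # Invented two-syllable words, concatenated in random order without
    # pauses: within-word transitions are fully predictable (probability
    # 1.0 here), while across-word transitions hover near 1/3.
    words = ["bida", "kupa", "doti"]
    stream = [w[i:i + 2]
              for w in (random.choice(words) for _ in range(200))
              for i in (0, 2)]                 # continuous syllable stream

    firsts = Counter(stream[:-1])
    pairs = Counter(zip(stream, stream[1:]))

    def tp(s1, s2):
        """Transitional probability: how strongly syllable s1 predicts s2."""
        return pairs[(s1, s2)] / firsts[s1]

    # Posit a word boundary wherever the transitional probability dips.
    segmented, current = [], stream[0]
    for s1, s2 in zip(stream, stream[1:]):
        if tp(s1, s2) < 0.9:                   # low probability -> boundary
            segmented.append(current)
            current = s2
        else:
            current += s2
    segmented.append(current)
    print(sorted(set(segmented)))   # the three invented words, recovered
                                    # from the statistics alone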

Proc. Nat. Acad. Sci. 2001 98:12874

ScienceWeek
http://www.scienceweek.com

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=



-- 
M. Hubey
-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o-o
The only difference between humans and machines is that humans
can be created by unskilled labor. Arthur C. Clarke

/\/\/\/\//\/\/\/\/\/\/ http://www.csam.montclair.edu/~hubey