Universal grammar
Universal grammar (UG), in modern linguistics, is the theory of the genetic component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that a certain set of structural rules is innate to humans, independent of sensory experience. As more linguistic stimuli are received in the course of psychological development, children then adopt specific syntactic rules that conform to UG. UG is sometimes known as "mental grammar", and stands in contrast with other "grammars", e.g. prescriptive, descriptive, and pedagogical. Advocates of the theory emphasize and partially rely on the poverty of the stimulus argument and on the existence of some universal properties of natural human languages. The latter, however, has not been firmly established, as some linguists have argued that languages are so diverse that such universality is rare. It remains a matter of empirical investigation to determine precisely which properties are universal and which linguistic capacities are innate.
Argument
The theory of universal grammar proposes that human beings brought up under normal conditions will always develop language with certain properties. It further proposes that there is an innate, genetically determined language faculty that knows these rules, making it easier and faster for children to learn to speak than it otherwise would be. This faculty does not know the vocabulary of any particular language, and there remain several parameters that vary freely among languages and must also be learned. Evidence in favor of this idea can be found in studies like Valian's, which show that children of surprisingly young ages understand syntactic categories and their distribution before this knowledge shows up in production. As Chomsky puts it, "Evidently, development of language in the individual must involve three factors: genetic endowment, which sets limits on the attainable languages, thereby making language acquisition possible; external data, converted to the experience that selects one or another language within a narrow range; principles not specific to the Faculty of Language."
Occasionally, aspects of universal grammar seem describable in terms of general facts about cognition. For example, if a predisposition to categorize events and objects as different classes of things is part of human cognition, and directly results in nouns and verbs showing up in all languages, then this aspect of universal grammar could be taken to be not specific to language but part of human cognition more generally. To distinguish properties of languages that can be traced to other facts about cognition from properties that cannot, the abbreviation UG* can be used. UG is the term often used by Chomsky for those aspects of the human brain that cause language to be the way it is, whereas UG* is reserved here, for the purposes of discussion, for those aspects that are furthermore specific to language.
In the same article, Chomsky casts the theme of a larger research program in terms of the following question: "How little can be attributed to UG while still accounting for the variety of 'I-languages' attained, relying on third factor principles?".
Chomsky has speculated that UG might be extremely simple and abstract, for example consisting only of a mechanism for combining symbols in a particular way, which he calls "merge". The following quote shows that Chomsky does not use the term "UG" in the narrow sense of UG* suggested above:
"The conclusion that merge falls within UG holds whether such recursive generation is unique to FL or is appropriated from other systems."
In other words, merge is seen as part of UG because it causes language to be the way it is, is universal, and is not part of the environment or of general properties independent of genetics and environment. Merge is part of universal grammar whether it is specific to language or whether, as Chomsky suggests, it is also used, for example, in mathematical thinking.
The distinction reflects the long history of argument about UG*: even among people working on language who agree that there is a universal grammar, many assume that Chomsky means UG* when he writes UG.
Some students of universal grammar study a variety of grammars to extract generalizations called linguistic universals, often in the form of "If X holds true, then Y occurs." For example, one of Greenberg's proposed universals states that languages with dominant verb–subject–object order are always prepositional. Such generalizations have been extended to a variety of traits, such as the phonemes found in languages, the word orders which different languages choose, and the reasons why children exhibit certain linguistic behaviors.
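To make the form of such implicational statements concrete, the following sketch checks a Greenberg-style universal against a toy typological table. It is only an illustration: the feature values below are simplified assumptions, not survey data, and real typological work tests such claims against hundreds of languages.

```python
# Check an implicational universal ("if X holds, then Y occurs") against a
# toy table of language features. The entries are illustrative assumptions.
languages = {
    "Irish":    {"order": "VSO", "adpositions": "prepositions"},
    "Welsh":    {"order": "VSO", "adpositions": "prepositions"},
    "Japanese": {"order": "SOV", "adpositions": "postpositions"},
    "English":  {"order": "SVO", "adpositions": "prepositions"},
}

def holds(universal, data):
    """Collect languages where the antecedent holds but the consequent fails."""
    antecedent, consequent = universal
    return [name for name, feats in data.items()
            if antecedent(feats) and not consequent(feats)]

# Greenberg's Universal 3: dominant-VSO languages are prepositional.
exceptions = holds((lambda f: f["order"] == "VSO",
                    lambda f: f["adpositions"] == "prepositions"), languages)
print("exceptions:", exceptions or "none")
```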
Other linguists who have influenced this theory include Richard Montague, who developed his version of the theory while considering the issues raised by the argument from the poverty of the stimulus to arise from the constructivist approach to linguistic theory. The application of the idea of universal grammar to the study of second language acquisition is represented mainly in the work of McGill linguist Lydia White.
Syntacticians generally hold that there are parametric points of variation between languages, although heated debate occurs over whether UG constraints are essentially universal because they are "hard-wired", because they are a logical consequence of a specific syntactic architecture, or because they result from functional constraints on communication.
Relation to the evolution of language
In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?", Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar. The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication, meaning that homologous aspects of the faculty of language exist in non-human animals.
The second hypothesis states that the FLb is a derived and uniquely human adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.
The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion evolved recently and solely in humans. This is the hypothesis that most closely aligns with the typical theory of universal grammar championed by Chomsky.
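For concreteness, the following minimal sketch illustrates the kind of recursion at issue: a single rule that embeds a clause inside another clause, so that sentences can nest without bound. The vocabulary and the one rule are illustrative assumptions, not a model of any particular language or of the FLn itself.

```python
# A toy recursive rule: a clause may contain another clause as its complement,
# so the same rule reapplies to its own output, yielding unbounded nesting.
def clause(depth: int) -> str:
    """Build a clause; while depth > 0, embed another clause inside it."""
    if depth == 0:
        return "the dog barked"
    # Recursive step: a clause embedded under a verb of saying.
    return f"the child said that {clause(depth - 1)}"

for d in range(3):
    print(clause(d))
# the dog barked
# the child said that the dog barked
# the child said that the child said that the dog barked
```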
History
The term "universal grammar" predates Noam Chomsky, but pre-Chomskyan ideas of universal grammar are different. For Chomsky, UG is " theory of the genetically based language faculty", which makes UG a theory of language acquisition, and part of the innateness hypothesis. Earlier grammarians and philosophers thought about universal grammar in the sense of a universally shared property or grammar of all languages. The closest analog to their understanding of universal grammar in the late 20th century are Greenberg's linguistic universals.The idea of a universal grammar can be traced back to Roger Bacon's observations in his Overview of Grammar and Greek Grammar that all languages are built upon a common grammar, even though it may undergo incidental variations; and the 13th century speculative grammarians who, following Bacon, postulated universal rules underlying all grammars. The concept of a universal grammar or language was at the core of the 17th century projects for philosophical languages. An influential work in that time was Grammaire générale by Claude Lancelot and Antoine Arnauld, who built on the works of René Descartes. They tried to describe a general grammar for languages, coming to the conclusion that grammar has to be universal. There is a Scottish school of universal grammarians from the 18th century, as distinguished from the philosophical language project, which included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith. The article on grammar in the first edition of the Encyclopædia Britannica contains an extensive section titled "Of Universal Grammar".
This tradition was continued in the late 19th century by Wilhelm Wundt and in the early 20th century by the linguist Otto Jespersen. Jespersen disagreed with the early grammarians' formulation of "universal grammar", arguing that they tried to derive too much from Latin and that a UG based on Latin was bound to fail given the breadth of worldwide linguistic variation. He did not fully dispense with the idea of a "universal grammar", but reduced it to universal syntactic categories or super-categories, such as number, tense, etc. Jespersen did not discuss whether these properties come from facts about general human cognition or from a language-specific endowment, and, as his work predates molecular genetics, he did not discuss the notion of a genetically conditioned universal grammar.
During the rise of behaviorism, the idea of a universal grammar was discarded. In the early 20th century, language was usually understood from a behaviorist perspective, suggesting that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success. In other words, children learned their mother tongue by simple imitation, through listening and repeating what adults said. For example, when a child says "milk", the mother smiles and gives her child milk as a result; the child finds this outcome rewarding, which enhances the child's language development. UG reemerged to prominence and influence in modern linguistics with the theories of Chomsky and Montague in the 1950s–1970s, as part of the "linguistics wars".
In 2016, Berwick and Chomsky co-wrote the book Why Only Us, in which they defined both the minimalist program and the strong minimalist thesis (SMT) and its implications, updating their approach to UG theory. According to Berwick and Chomsky, "The optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is... called the Strong Minimalist Thesis." The significance of the SMT is that it shifts the earlier emphasis on universal grammar to the operation that Chomsky and Berwick now call "merge". "Merge" is defined in their 2016 book: "Every computational system has embedded within it somewhere an operation that applies to two objects X and Y already formed, and constructs from them a new object Z. Call this operation Merge." The SMT dictates that "Merge will be as simple as possible: it will not modify X or Y or impose any arrangement on them; in particular, it will leave them unordered, an important fact... Merge is therefore just set formation: Merge of X and Y yields the set {X, Y}."
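Read this way, merge admits a very small formal illustration. The sketch below is an interpretation of the definition quoted above, not Berwick and Chomsky's own formalization: it treats merge as unordered set formation that can reapply to its own output, with frozenset and the example words as illustrative choices.

```python
# Merge as set formation, per the quoted definition: take two already-formed
# objects X and Y and return the unordered set {X, Y}, modifying neither.
def merge(x, y):
    """Combine two syntactic objects into a new, unordered object."""
    return frozenset({x, y})

# Building "read the book" bottom-up: merge is binary and can reapply to its
# own output, which is where unbounded hierarchical structure comes from.
dp = merge("the", "book")       # {the, book}
vp = merge("read", dp)          # {read, {the, book}}
print(vp == merge(dp, "read"))  # True: the result carries no order
```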
Chomsky's theory
Chomsky argued that the human brain contains a limited set of constraints for organizing language. This implies in turn that all languages have a common structural basis: the set of rules known as "universal grammar". Speakers proficient in a language know which expressions are acceptable in their language and which are unacceptable. The key puzzle is how speakers come to know these restrictions of their language, since expressions that violate those restrictions are not present in the input and are not indicated as such. Chomsky argued that this poverty of stimulus means that Skinner's behaviorist perspective cannot explain language acquisition. The absence of negative evidence—evidence that an expression is part of a class of ungrammatical sentences in a given language—is the core of his argument. For example, in English, an interrogative pronoun like what cannot be related to a predicate within a relative clause: *"What did John meet a man who sold?"
Such expressions are not available to language learners: they are, by hypothesis, ungrammatical, so speakers of the local language do not use them, nor do they point them out as unacceptable to language learners. Universal grammar offers an explanation for how acquisition succeeds despite this poverty of the stimulus: certain restrictions are universal characteristics of human languages, so language learners are never tempted to generalize in an illicit fashion.
Presence of creole languages
The presence of creole languages is sometimes cited as further support for this theory, especially in Bickerton's controversial language bioprogram theory. Creoles are languages that develop and form when disparate societies come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole. Unlike pidgins, creoles have native speakers and make use of a full, systematic grammar. According to Bickerton, the idea of universal grammar is supported by creole languages because certain features are shared by virtually all languages in the category. For example, their default point of reference in time is not the present moment but the past. Using pre-verbal auxiliaries, they uniformly express tense, aspect, and mood. Negative concord occurs, but it affects the verbal subject. Another similarity among creoles is that questions are created simply by changing the intonation of a declarative sentence, not its word order or content.
However, extensive work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, reproducing only the most frequent forms. In doing so, they tend to standardize the language that they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation, children systematize the language they hear based on the probability and frequency of forms, rather than on the basis of a universal grammar, as has been suggested. Further, it seems to follow that creoles would share features with the languages from which they are derived, and thus look similar in terms of grammar.
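The contrast Hudson-Kam and Newport draw can be illustrated with a toy simulation. The 70/30 split between competing forms and the form names below are assumptions made for the example, not their experimental materials:

```python
import random

random.seed(0)
# Inconsistent input: one construction appears 70% of the time, a variant 30%.
input_utterances = ["det-noun"] * 70 + ["noun"] * 30

def probability_matcher(data, n):
    """Produce forms at roughly the rates heard (adult-like behavior)."""
    return [random.choice(data) for _ in range(n)]

def regularizer(data, n):
    """Always produce the single most frequent form (child-like behavior)."""
    majority = max(set(data), key=data.count)
    return [majority] * n

for learner in (probability_matcher, regularizer):
    output = learner(input_utterances, 100)
    share = output.count("det-noun") / len(output)
    print(f"{learner.__name__}: {share:.0%} det-noun")
# The regularizer standardizes to 100% det-noun, as the children did.
```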
Many researchers of universal grammar argue against the concept of relexification, according to which a language replaces its lexicon almost entirely with that of another while retaining the original grammar. On that account, a creole's grammar would be inherited from an existing language rather than generated by an innate universal grammar, which conflicts with universalist ideas.
Criticisms
Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language. Similarly, Jeffrey Elman argues that the unlearnability of languages assumed by universal grammar rests on a too-strict, "worst-case" model of grammar that is not in keeping with any actual grammar. In keeping with these points, James Hurford argues that the postulate of a language acquisition device (LAD) essentially amounts to the trivial claim that languages are learnt by humans, and thus that the LAD is less a theory than an explanandum looking for theories. Morten H. Christiansen and Nick Chater have argued that the relatively fast-changing nature of language would prevent the slower-changing genetic structures from ever catching up, undermining the possibility of a genetically hard-wired universal grammar. Instead of an innate universal grammar, they claim, "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics".
Wolfram Hinzen summarizes the most common criticisms of universal grammar:
- Universal grammar has no coherent formulation and is indeed unnecessary.
- Universal grammar is in conflict with biology: it cannot have evolved by standardly accepted neo-Darwinian evolutionary principles.
- There are no linguistic universals: universal grammar is refuted by abundant variation at all levels of linguistic organization, which lies at the heart of the human faculty of language.
Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of that expectation serves as a form of implicit negative feedback that allows them to correct their errors over time, as when children correct over-generalizations like goed to went through repeated failure. This implies that word learning is a probabilistic, error-driven process, rather than a process of fast mapping as many nativists assume.
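A minimal sketch of this kind of error-driven correction might look as follows; the update rule, learning rate, and starting weights are illustrative assumptions, not Ramscar's actual model:

```python
# A learner keeps competing weights for "goed" and "went" as the past tense
# of "go". Each observation strengthens the form actually heard and weakens
# the expected form that failed to occur.
weights = {"goed": 0.8, "went": 0.2}  # early over-regularization bias
LEARNING_RATE = 0.1

def observe(observed_form: str) -> None:
    """Nudge each candidate toward 1 if observed, toward 0 if it failed."""
    for form in weights:
        target = 1.0 if form == observed_form else 0.0
        weights[form] += LEARNING_RATE * (target - weights[form])

for _ in range(30):  # the child only ever hears "went"
    observe("went")

print(max(weights, key=weights.get), weights)  # "went" now dominates
```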
In the domain of field research, the Pirahã language is claimed to be a counterexample to the basic tenets of universal grammar. This research has been led by Daniel Everett. Among other things, the language is alleged to lack all evidence for recursion, including embedded clauses, as well as quantifiers and colour terms. According to Everett, the Pirahã showed these linguistic gaps not because they were simple-minded, but because their culture—which emphasized concrete matters in the present and also lacked creation myths and traditions of art-making—did not necessitate them. Some other linguists have argued, however, that some of these properties have been misanalyzed, and that others are actually expected under current theories of universal grammar. Other linguists have attempted to reassess Pirahã to see whether it does indeed use recursion. In a corpus analysis of the Pirahã language, linguists failed to disprove Everett's arguments against universal grammar and the claimed lack of recursion in Pirahã; however, they also stated that there was "no strong evidence for the lack of recursion either", and they provided "suggestive evidence that Pirahã may have sentences with recursive structures".
Daniel Everett has argued that even if a universal grammar is not impossible in principle, it should not be accepted, because we have equally or more plausible theories that are simpler. In his words, "universal grammar doesn't seem to work, there doesn't seem to be much evidence for [it]. And what can we put in its place? A complex interplay of factors, of which culture, the values human beings share, plays a major role in structuring the way that we talk and the things that we talk about." Michael Tomasello, a developmental psychologist, also supports this claim, arguing that "although many aspects of human linguistic competence have indeed evolved biologically, specific grammatical principles and constructions have not. And universals in the grammatical structure of different languages have come from more general processes and constraints of human cognition, communication, and vocal-auditory processing, operating during the conventionalization and transmission of the particular grammatical constructions of particular linguistic communities."