Syntax

In linguistics, syntax[1][2] is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, agreement, the nature of crosslinguistic variation, and the relationship between form and meaning. There are numerous approaches to syntax which differ in their central assumptions and goals.

The word syntax comes from Ancient Greek: σύνταξις "coordination", which consists of σύν syn, "together", and τάξις táxis, "an ordering".

One basic description of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, these surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations.
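
As a toy illustration, the sketch below (in Python, using a role-tagging representation invented here) reads the basic order off clauses whose words are tagged for subject, verb, and object; English comes out SVO and Japanese SOV.

```python
# Minimal sketch with a hypothetical toy representation: each clause is a
# list of (word, role) pairs, where role is "S", "V", "O", or None for
# other material (determiners, particles, etc.).

def basic_order(tagged_clause):
    """Return the surface order of S, V, and O in a role-tagged clause."""
    return "".join(role for _, role in tagged_clause if role in ("S", "V", "O"))

# English: "the dog bit the man"
english = [("the", None), ("dog", "S"), ("bit", "V"), ("the", None), ("man", "O")]
# Japanese: "inu ga otoko o kanda" ("the dog bit the man")
japanese = [("inu", "S"), ("ga", None), ("otoko", "O"), ("o", None), ("kanda", "V")]

print(basic_order(english))   # -> SVO
print(basic_order(japanese))  # -> SOV
```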

The Aṣṭādhyāyī of Pāṇini (c. 4th century BC in Ancient India) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory, as works on grammar were written long before modern syntax came about.[3] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.

For centuries, a framework known as grammaire générale (first expounded in 1660 by Antoine Arnauld in a book of the same title) dominated work in syntax: its basic premise was the assumption that language is a direct reflection of thought processes, and that there is therefore a single, most natural way to express a thought.[citation needed]

However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.[citation needed]

The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale.[4]) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.

The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).[5])

There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton,[6] sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax as the study of an abstract formal system.[7] Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device for reaching broad generalizations across languages.

Syntacticians have attempted to explain the causes of word-order variation within individual languages and cross-linguistically. Much of this work has been done within frameworks of generative grammar, which assumes that the core of syntax depends on a genetic structure common to all of humankind. However, typological research on the languages of the world has found few absolute universals, leading some to conclude that no part of syntax needs to be directly genetic.

Alternative explanations have been sought in language processing. It has been suggested that the brain finds it easier to parse syntactic patterns that are either consistently right-branching or consistently left-branching, but not mixed. The most widely held such approach is the performance–grammar correspondence hypothesis of John A. Hawkins, who suggests that language is a non-innate adaptation to innate cognitive mechanisms. Cross-linguistic tendencies are taken to reflect language users' preference for grammars that are organized efficiently and their avoidance of word orderings that cause processing difficulty. Some languages, however, exhibit regular inefficient patterning: these include the VO languages Chinese, which places the adpositional phrase before the verb, and Finnish, which has postpositions; but there are few other profoundly exceptional languages.[8]
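
One way to make the efficiency intuition concrete is to measure how far each word sits from its head: orderings that keep dependents close to their heads are, on this rough view, easier to process. The sketch below uses summed dependency length as an illustrative proxy only, not Hawkins' actual metric; the sentences and head indices are invented.

```python
# Illustrative proxy (not Hawkins' actual metric): the summed distance
# between each word and its head, for two orderings of the same clause.
# heads[i] is the index of word i's head; -1 marks the root verb.

def total_dependency_length(heads):
    """Sum of linear distances between each word and its head."""
    return sum(abs(i - h) for i, h in enumerate(heads) if h != -1)

# "gave to Mary the book she wanted" (short constituent first)
short_first = [-1, 0, 1, 4, 0, 6, 4]  # gave, to, Mary, the, book, she, wanted
# "gave the book she wanted to Mary" (long constituent first)
long_first = [-1, 2, 0, 4, 2, 0, 5]   # gave, the, book, she, wanted, to, Mary

print(total_dependency_length(short_first))  # -> 10
print(total_dependency_length(long_first))   # -> 12: the costlier ordering
```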

Dependency grammar is an approach to sentence structure in which syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure, and all the other words in the clause are either directly or indirectly dependent on this root. Prominent dependency-based theories of syntax include meaning–text theory and word grammar.
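
A minimal sketch of such an analysis, using a dict-based encoding invented here for illustration: every word except the root verb depends, directly or indirectly, on that verb.

```python
# Toy dependency analysis of "The small dog barked loudly".
# Each word maps to its head; the root (finite) verb maps to None.

dependencies = {
    "The":    "dog",     # determiner depends on its noun
    "small":  "dog",     # attributive adjective depends on its noun
    "dog":    "barked",  # subject depends on the finite verb
    "loudly": "barked",  # adverb depends on the finite verb
    "barked": None,      # root of the clause
}

def depth(word):
    """Number of dependency links between a word and the root verb."""
    head = dependencies[word]
    return 0 if head is None else 1 + depth(head)

for word in dependencies:
    print(word, "->", dependencies[word], "depth", depth(word))
```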

Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued vehemently against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and that remains at the core of most phrase structure grammars. In place of this division, he positioned the verb as the root of all clause structure.[9]

Categorial grammar is an approach in which constituents combine as function and argument, according to combinatory possibilities specified in their syntactic categories. For example, where other approaches might posit a rule that combines a noun phrase (NP) and a verb phrase (VP), CG would posit a syntactic category NP and another, NP\S, read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)." The syntactic category for an intransitive verb is thus a complex formula representing the fact that the verb acts as a function word requiring an NP as an input and producing a sentence-level structure as an output. The complex category is notated as (NP\S) instead of V. The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as (NP\S)/NP, which means "a category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) of category NP\S, which in turn searches to the left for an NP and produces a sentence."
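
The function–argument combinatorics can be made concrete in a few lines of code. The sketch below is a toy encoding invented to match the notation above, not any particular CG implementation: categories are nested tuples, and two adjacent categories combine by forward or backward application.

```python
# Toy encoding of the categories in the text: (arg, "\\", result) seeks
# its argument to the left; (result, "/", arg) seeks it to the right.

NP, S = "NP", "S"
INTRANS = (NP, "\\", S)             # intransitive verb: NP\S
TRANS = ((NP, "\\", S), "/", NP)    # transitive verb: (NP\S)/NP

def combine(left, right):
    """Combine two adjacent categories, or return None if they don't fit."""
    # Forward application: (X, "/", Y) followed by Y yields X.
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    # Backward application: Y followed by (Y, "\", X) yields X.
    if isinstance(right, tuple) and right[1] == "\\" and right[0] == left:
        return right[2]
    return None

print(combine(NP, INTRANS))             # "dogs bark": NP + NP\S -> S
print(combine(NP, combine(TRANS, NP)))  # "dogs chase cats" -> S
```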

Tree-adjoining grammar is a categorial grammar that adds partial tree structures to the categories.

Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
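
For instance, a probabilistic context-free grammar, one simple kind of stochastic grammar, attaches a probability to each rewrite rule and scores a derivation by the product of the rules it uses. The toy grammar and probabilities below are invented for illustration.

```python
# A minimal probabilistic context-free grammar: each (lhs, rhs) rule
# carries a probability, and a derivation's probability is the product
# of the probabilities of the rules it uses.

pcfg = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("dogs",)): 0.5,
    ("NP", ("cats",)): 0.5,
    ("VP", ("bark",)): 0.4,
    ("VP", ("chase", "NP")): 0.6,
}

def derivation_probability(rules):
    """Multiply the probabilities of the rules used in a derivation."""
    p = 1.0
    for rule in rules:
        p *= pcfg[rule]
    return p

# "dogs chase cats": S -> NP VP, NP -> dogs, VP -> chase NP, NP -> cats
print(derivation_probability([
    ("S", ("NP", "VP")),
    ("NP", ("dogs",)),
    ("VP", ("chase", "NP")),
    ("NP", ("cats",)),
]))  # -> 1.0 * 0.5 * 0.6 * 0.5 = 0.15
```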

Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis.

Generative syntax is the study of syntax within the overarching framework of generative grammar. Generative theories of syntax typically propose analyses of grammatical patterns using formal tools such as phrase structure grammars augmented with additional operations such as syntactic movement. Their goal in analyzing a particular language is to specify rules which generate all and only the expressions which are well-formed in that language. In doing so, they seek to identify innate domain-specific principles of linguistic cognition, in line with the wider goals of the generative enterprise. Generative syntax is among the approaches that adopt the principle of the autonomy of syntax, assuming that meaning and communicative intent are determined by the syntax rather than the other way around.
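
The idea of a grammar as a device that generates all and only the well-formed strings of a language can be illustrated with a tiny phrase structure grammar. The rules and lexicon below are invented, and real generative analyses are of course far richer (movement, features, and so on); this is only a sketch of the generative idea itself.

```python
# A toy phrase structure grammar that enumerates exactly the strings it
# generates. Nonterminals are keys in `rules`; anything else is a word.

import itertools

rules = {
    "S": [["NP", "VP"]],
    "NP": [["dogs"], ["cats"]],
    "VP": [["sleep"], ["chase", "NP"]],
}

def generate(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in rules:  # terminal word
        yield [symbol]
        return
    for expansion in rules[symbol]:
        # Generate each daughter independently, then splice the results.
        for parts in itertools.product(*(generate(sym) for sym in expansion)):
            yield [w for part in parts for w in part]

for sentence in generate("S"):
    print(" ".join(sentence))
# dogs sleep / dogs chase dogs / dogs chase cats / cats sleep / ...
```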

Generative syntax was proposed in the late 1950s by Noam Chomsky, building on earlier work by Zellig Harris, Louis Hjelmslev, and others. Since then, numerous theories have been proposed under its umbrella, including transformational grammar, government and binding theory, and the minimalist program.

The Cognitive Linguistics framework stems from generative grammar but adheres to evolutionary rather than Chomskyan linguistics. Cognitive models often recognise the generative assumption that the object belongs to the verb phrase. Cognitive frameworks include cognitive grammar and construction grammar.