Syntax (linguistics)

This article is about syntax in linguistics. For other uses of the term syntax, please see syntax (disambiguation).

In linguistics, syntax[1] is the study of the rules, or 'patterned relations', that govern the way words combine to form phrases and phrases combine to form sentences. The combinatory behaviour of words is governed to a first approximation by their part of speech (noun, adjective, verb, etc.), a categorization that goes back in the Western tradition to the Greek grammarian Dionysios Thrax. Modern research into natural language syntax attempts to systematize descriptive grammar and, for many practitioners, to find general laws that govern the syntax of all languages. It is unconcerned with prescriptive grammar (see Prescription and description).

There are many theories of formal syntax, theories that have risen and fallen in influence over time. Most theories of syntax share at least two commonalities. First, they hierarchically group subunits into constituent units (phrases). Second, they provide some system of rules to explain patterns of acceptability/grammaticality and unacceptability/ungrammaticality. Most formal theories of syntax also offer explanations of the systematic relationships between syntactic form and semantic meaning. Within the study of signs, syntax is the first of three subfields: the study of the interrelation of signs. The second subfield is semantics (the study of the relation between signs and the objects to which they apply), and the third is pragmatics (the relation between the sign system and its users).

In the framework of transformational-generative grammar (of which Government and Binding Theory and Minimalism are recent developments), the structure of a sentence is represented by phrase structure trees, otherwise known as phrase markers or tree diagrams. Such trees provide information about the sentences they represent by showing the hierarchical relations between their component parts.
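
As a rough illustration (not taken from the article itself), the kind of hierarchical information a phrase marker encodes can be sketched in a few lines of Python; the sentence, node labels and the print_tree helper below are invented for the example.

  # Minimal sketch: a phrase marker for "the dog chased the cat" represented
  # as nested tuples of (label, children...), printed one constituent per line.

  def print_tree(node, depth=0):
      """Recursively print a phrase structure tree, indenting each level."""
      if isinstance(node, tuple):
          label, *children = node
          print("  " * depth + label)
          for child in children:
              print_tree(child, depth + 1)
      else:
          # A leaf node is just a word.
          print("  " * depth + node)

  # S -> NP VP; NP -> Det N; VP -> V NP
  sentence = ("S",
              ("NP", ("Det", "the"), ("N", "dog")),
              ("VP", ("V", "chased"),
                     ("NP", ("Det", "the"), ("N", "cat"))))

  print_tree(sentence)

Running the sketch prints the constituents in order of embedding, which is exactly the hierarchical relation a tree diagram displays graphically.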

There are various theories of how best to construct grammars so that, by systematic application of their rules, one can derive every phrase marker in a language (and hence every sentence in the language). The most common are phrase structure grammars and ID/LP grammars, the latter having a slight explanatory advantage over the former.[2] Dependency grammar is a class of syntactic theories, separate from generative grammar, in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar does not have phrasal categories. Algebraic syntax is a type of dependency grammar.
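
By way of illustration only, the idea of deriving sentences by systematic application of rewrite rules can be sketched with a toy phrase structure grammar in Python; the rules, vocabulary and generate function below are assumptions made for the example and stand for no particular published grammar.

  import random

  # Hypothetical toy phrase structure grammar: each nonterminal maps to a list
  # of possible right-hand sides (rewrite rules).
  GRAMMAR = {
      "S":   [["NP", "VP"]],
      "NP":  [["Det", "N"]],
      "VP":  [["V", "NP"]],
      "Det": [["the"], ["a"]],
      "N":   [["linguist"], ["sentence"]],
      "V":   [["parses"], ["analyses"]],
  }

  def generate(symbol):
      """Expand a symbol by systematically applying the rewrite rules."""
      if symbol not in GRAMMAR:          # terminal: an actual word
          return [symbol]
      expansion = random.choice(GRAMMAR[symbol])
      words = []
      for part in expansion:
          words.extend(generate(part))
      return words

  print(" ".join(generate("S")))  # e.g. "the linguist parses a sentence"

Each run expands S step by step until only words remain; the sequence of expansions corresponds to one phrase marker licensed by the grammar.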

A modern approach to combining accurate descriptions of the grammatical patterns of language with their function in context is that of systemic functional grammar, an approach originally developed by Michael A.K. Halliday in the 1960s and now pursued actively on all continents. Systemic-functional grammar is related both to feature-based approaches such as Head-driven phrase structure grammar and to the older functional traditions of European schools of linguistics such as British Contextualism and the Prague School.

Tree adjoining grammar is a grammar formalism with interesting mathematical properties that has sometimes been used as the basis for the syntactic description of natural language. In monotonic and monostratal frameworks, variants of unification grammar are often preferred formalisms.
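
As a loose sketch of the core operation behind unification grammars (the features, values and unify helper below are invented for illustration, not part of any particular formalism named here), two feature structures merge when their values are compatible and fail to unify when they conflict.

  # Hedged sketch: feature structures as nested dicts, unified recursively.

  def unify(fs1, fs2):
      """Return the unification of two feature structures, or None on conflict."""
      result = dict(fs1)
      for feature, value in fs2.items():
          if feature not in result:
              result[feature] = value
          elif isinstance(result[feature], dict) and isinstance(value, dict):
              sub = unify(result[feature], value)
              if sub is None:
                  return None
              result[feature] = sub
          elif result[feature] != value:
              return None              # conflicting atomic values: unification fails
      return result

  # A singular requirement unifies with a compatible structure...
  print(unify({"agr": {"num": "sg", "per": 3}}, {"agr": {"num": "sg"}}))
  # ...but fails against a conflicting (plural) one.
  print(unify({"agr": {"num": "sg"}}, {"agr": {"num": "pl"}}))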


Footnotes

  1. From the Greek συν (syn, meaning 'co-' or 'together') and τάξις (táxis, meaning 'sequence, order, arrangement').
  2. citation needed


See also

External links