Syntax (linguistics)
In linguistics, syntax, from the Greek words συν (syn, meaning "co-" or "together") and τάξις (táxis, meaning "sequence, order, arrangement"), is the study of the rules, or "patterned relations", that govern the way words combine to form phrases and phrases combine to form sentences. The combinatory behavior of words is governed to a first approximation by their part of speech (noun, adjective, verb, etc., a categorization that goes back in the Western tradition to the Greek grammarian Dionysios Thrax). Modern research into natural language syntax attempts to systematize descriptive grammar and, for many practitioners, to find general laws that govern the syntax of all languages. It is unconcerned with prescriptive grammar (see Prescription and description).
There are many theories of formal syntax, whose influence has risen and fallen over time. Most theories of syntax share at least two commonalities. First, they hierarchically group subunits into constituent units (phrases). Second, they provide some system of rules to explain patterns of acceptability/grammaticality and unacceptability/ungrammaticality. Most formal theories of syntax offer explanations of the systematic relationships between syntactic form and semantic meaning. Within the study of signs, syntax is the first of three subfields: the study of the interrelations of the signs themselves. The second subfield is semantics (the study of the relation between the signs and the objects to which they apply), and the third is pragmatics (the relationship between the sign system and the user).
In the framework of transformational-generative grammar (of which Government and Binding Theory and Minimalism are recent developments), the structure of a sentence is represented by phrase structure trees, otherwise known as phrase markers or tree diagrams. Such trees provide information about the sentences they represent by showing the hierarchical relations between their component parts.
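To make the notion of a phrase marker concrete, here is a minimal sketch in Python (the language is chosen purely for illustration): the sentence "the dog chased the cat" encoded as a nested structure, with a small helper that prints the hierarchy. The category labels S, NP, VP, Det, N and V follow common textbook conventions; the representation is not tied to any particular framework or library.

    # A phrase structure tree ("phrase marker") as nested tuples.
    # Each internal node is (label, children...); leaves are the words themselves.
    # Labels: S = sentence, NP = noun phrase, VP = verb phrase,
    #         Det = determiner, N = noun, V = verb.
    tree = ("S",
            ("NP", ("Det", "the"), ("N", "dog")),
            ("VP", ("V", "chased"),
                   ("NP", ("Det", "the"), ("N", "cat"))))

    def show(node, indent=0):
        """Print the tree so that indentation mirrors constituency."""
        if isinstance(node, str):          # a leaf: an actual word
            print(" " * indent + node)
        else:                              # an internal node
            label, *children = node
            print(" " * indent + label)
            for child in children:
                show(child, indent + 2)

    show(tree)

Each additional level of indentation in the output corresponds to one level of constituency in the tree.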
There are various theories about how best to construct grammars such that systematic application of their rules yields every phrase marker of a language (and hence every sentence in the language). The most common are phrase structure grammars and ID/LP grammars, the latter having a slight explanatory advantage over the former.[1] Dependency grammar is a class of syntactic theories, separate from generative grammar, in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar has no phrasal categories. Algebraic syntax is a type of dependency grammar.
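As a rough illustration of how systematic rule application can generate phrase markers and sentences, the following Python sketch encodes a toy phrase structure grammar as rewrite rules and expands the start symbol S. The rules and the tiny lexicon are invented for the example and stand in for no published grammar.

    import random

    # A toy phrase structure grammar: each category rewrites to one of
    # several right-hand sides.  Terminals (actual words) are the strings
    # that never appear as keys in the rule table.
    RULES = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"]],
        "VP":  [["V", "NP"], ["V"]],
        "Det": [["the"], ["a"]],
        "N":   [["dog"], ["cat"], ["linguist"]],
        "V":   [["chased"], ["slept"], ["saw"]],
    }

    def generate(symbol="S"):
        """Expand a category by systematic application of the rewrite rules."""
        if symbol not in RULES:            # a terminal: return the word itself
            return [symbol]
        words = []
        for part in random.choice(RULES[symbol]):
            words.extend(generate(part))
        return words

    print(" ".join(generate()))            # e.g. "the dog chased a linguist"

A dependency analysis of the same sentences would instead record head-dependent pairs, for example (chased, dog) and (chased, cat), with no NP or VP nodes at all.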
A modern approach that combines accurate description of the grammatical patterns of a language with an account of their function in context is systemic functional grammar, originally developed by Michael A.K. Halliday in the 1960s and now pursued actively on all continents. Systemic functional grammar is related both to feature-based approaches such as head-driven phrase structure grammar and to the older functional traditions of European schools of linguistics, such as British Contextualism and the Prague School.
Tree adjoining grammar is a grammar formalism with interesting mathematical properties that has sometimes been used as the basis for the syntactic description of natural language. In monotonic and monostratal frameworks, variants of unification grammar are often the preferred formalisms.
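In unification-based approaches, grammatical information is commonly encoded as feature structures that are combined by unification, an operation that fails when feature values clash. The Python sketch below implements that core operation over plain nested dictionaries; it is a simplified illustration, not the typed feature logic of any specific theory.

    def unify(fs1, fs2):
        """Unify two feature structures (nested dicts); return the merge, or None on clash."""
        result = dict(fs1)
        for feature, value in fs2.items():
            if feature not in result:
                result[feature] = value
            elif isinstance(result[feature], dict) and isinstance(value, dict):
                merged = unify(result[feature], value)
                if merged is None:
                    return None            # clash inside an embedded structure
                result[feature] = merged
            elif result[feature] != value:
                return None                # atomic values conflict
        return result

    # Subject-verb agreement as unification: the subject's agreement features
    # must be compatible with those the verb requires.
    subject = {"CAT": "NP", "AGR": {"NUM": "sg", "PER": 3}}
    verb    = {"CAT": "V",  "AGR": {"NUM": "sg"}}
    print(unify(subject["AGR"], verb["AGR"]))   # {'NUM': 'sg', 'PER': 3}

Changing the verb's NUM value to "pl" would make the call return None, modelling an agreement clash.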
Syntax in computer science
Another, related meaning of the term syntax has evolved in the field of computer science, especially in the subfield of programming languages, where the syntax of a language is the set of allowed reserved words and the permitted orderings of tokens in a program.
The syntax of computer languages is often at level 2 (i.e., a context-free grammar) in the Chomsky hierarchy. As such, the possible orderings of tokens are usually very restricted. The analysis of a program's syntax is usually performed by an automatically generated program known as a parser, which often builds an abstract syntax tree.
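As an illustration of this sense of syntax, the sketch below tokenizes and parses a tiny arithmetic language with a hand-written recursive-descent parser and builds an abstract syntax tree. The grammar is a standard textbook toy (expressions with +, -, *, / and parentheses), not the syntax of any real programming language, and production compilers usually rely on generated parsers rather than hand-written ones.

    import re

    # Grammar (context-free, hence level 2 in the Chomsky hierarchy):
    #   expr   -> term   (("+" | "-") term)*
    #   term   -> factor (("*" | "/") factor)*
    #   factor -> NUMBER | "(" expr ")"

    def tokenize(source):
        """Split the input into tokens: integer literals and punctuation."""
        return re.findall(r"\d+|[+\-*/()]", source)

    def parse(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def eat(expected=None):
            nonlocal pos
            token = tokens[pos]
            if expected is not None and token != expected:
                raise SyntaxError(f"expected {expected!r}, got {token!r}")
            pos += 1
            return token

        def factor():
            if peek() == "(":
                eat("(")
                node = expr()
                eat(")")
                return node
            return ("num", int(eat()))     # leaf of the syntax tree

        def term():
            node = factor()
            while peek() in ("*", "/"):
                node = (eat(), node, factor())
            return node

        def expr():
            node = term()
            while peek() in ("+", "-"):
                node = (eat(), node, term())
            return node

        tree = expr()
        if peek() is not None:
            raise SyntaxError(f"unexpected token {peek()!r}")
        return tree

    # The nested tuples are the abstract syntax tree of the expression.
    print(parse(tokenize("2 * (3 + 4)")))
    # -> ('*', ('num', 2), ('+', ('num', 3), ('num', 4)))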
Footnotes
1. citation needed
See also

- Grammar
- Linguistics
External links
- AllSyntax.com Programming Languages
- The syntax of natural language: An online introduction using the Trees program, by Beatrice Santorini & Anthony Kroch, University of Pennsylvania: http://www.ling.upenn.edu/~beatrice/syntax-textbook