Dependency Grammar


A Dependency Grammar is a Natural Language Grammar that is based on direct head–dependent relations between words rather than on phrasal constituents.



  • (Wikipedia, 2013) ⇒ Retrieved: 2013-12-03.
    • Dependency grammar (DG) is a class of modern syntactic theories that are all based on the dependency relation and that can be traced back primarily to the work of Lucien Tesnière. The dependency relation views the (finite) verb as the structural center of all clause structure. All other syntactic units (e.g. words) are either directly or indirectly dependent on the verb. DGs are distinct from phrase structure grammars (= constituency grammars), since DGs lack phrasal nodes. Structure is determined by the relation between a word (a head) and its dependents. Dependency structures are flatter than constituency structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech and Turkish.
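
The verb-centered structure described in the passage above can be sketched as a small data structure: each word stores the index of its head, and the finite verb, which directly or indirectly heads everything else, has no head. This is a hypothetical minimal encoding for illustration only, not the notation of any particular dependency formalism; the example sentence, the `dependents` helper, and all variable names are invented here.

```python
# Minimal sketch of a dependency structure: each word stores the index of
# its head; the finite verb is the structural center (root) and has no
# head (None). Hypothetical encoding, not any specific DG notation.

sentence = ["Peter", "reads", "a", "book"]

# heads[i] = index of the word that word i depends on (None = root).
# "reads" is the structural center: "Peter" and "book" depend on it
# directly, and "a" depends on "book" (so it depends on the verb
# only indirectly).
heads = [1, None, 3, 1]

def dependents(head_index, heads):
    """Return indices of all words directly dependent on head_index."""
    return [i for i, h in enumerate(heads) if h == head_index]

root = heads.index(None)
print(sentence[root])                                  # the finite verb
print([sentence[i] for i in dependents(root, heads)])  # its direct dependents
```

Note that, as the quoted passage says, no phrasal nodes appear anywhere: the structure relates words to words, which is what makes it flatter than a constituency tree.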


  • (Dickinson, 2005) ⇒ M. Dickinson. (2005). “Dependency Grammar." Lecture notes, Linguistics 564 - Computational Grammar Formalisms, Georgetown University. (slides.pdf)



  • (Schneider, 1998) ⇒ G. Schneider. (1998). “A Linguistic Comparison of Constituency, Dependency, and Link Grammar." Master's Thesis, University of Zurich. (thesis.pdf)
    • "Word Grammar [Hudson 1984, 1990, 1996] is a grammar system which encompasses both a syntactic and a semantic theory."
    • "Connection, which corresponds to dependency in dependency grammar, is the most basic relation between words [Weber 1997: 21 ff.]. The simple sentence Peter sleeps consists of the elements (i) Peter, (ii) sleeps and (iii) the connection between them: Peter <= sleeps (1). (1) shows one common way to express dependency. The arrow points from the head to the dependent. (The direction of dependencies will be discussed in 2.3.2.) … Assuming Boolean true for the following functions, f(A) expresses that A is an element of the set f, while A(f) expresses that f is an element of the set A. sleeps(Peter) means that there is a set of sleepers, and that Peter is one of them. Peter(sleeps) means that sleeping is one of the activities of the set of activities of Peter. It is difficult to assess which of these functions is “closer” to the meaning of the sentence Peter sleeps."
    • According to [Radford 1988: 545 ff] every construction is now assumed to have a head, an assumption expressed as 'The Endocentricity Constraint'.
    • Heads always subcategorise for dependents. A sentence element (a word, a compound, a nucleus or an idiomatic expression) may have several dependents, but usually only one head. Because the notion of head is so important, discussions on its status take a very prominent role in dependency theory.
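
The head/dependent bookkeeping described in the quoted passage (a sentence element may have several dependents, but usually only one head) can be checked mechanically. The sketch below is a hypothetical well-formedness check over the same head-index encoding used informally above; the function name and the encoding are assumptions made here for illustration, not part of any cited theory.

```python
# Hypothetical check of dependency well-formedness: each word has exactly
# one head except the root, which has none. With a head-index list the
# one-head property holds by construction, so the remaining checks are
# that there is exactly one root and that head links contain no cycles.

def is_well_formed(heads):
    """heads[i] is the head index of word i, or None for the root."""
    roots = [i for i, h in enumerate(heads) if h is None]
    if len(roots) != 1:
        return False  # exactly one root is required
    # Every word must reach the root by following head links (no cycles).
    for i in range(len(heads)):
        seen, node = set(), i
        while heads[node] is not None:
            if node in seen:
                return False  # cycle: a word indirectly heads itself
            seen.add(node)
            node = heads[node]
    return True

print(is_well_formed([1, None, 3, 1]))  # a valid tree
print(is_well_formed([1, 0]))           # no root, cyclic
```

A structure passing these checks is a tree rooted in a single element, which matches the centrality of the head in the discussion above.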


  • (Tapanainen and Järvinen, 1997) ⇒ P. Tapanainen and T. Järvinen. (1997). “A Non-Projective Dependency Parser." In: Proceedings of the 5th Conference on Applied Natural Language Processing (ANLP). (paper)


  • (Mel'čuk, 1988) ⇒ I. Mel'čuk. (1988). “Dependency Syntax: Theory and Practice." SUNY Press.
    • "PS-syntax's main working principle is CONSTITUENCY and CONSTITUENT CATEGORIES, which tends to insist on taxonomy, i.e. classification and distribution. Dependency (= D-) syntax is based on RELATIONS between ultimate syntactic units, and it therefore tends to be concerned with meaningful links, i.e. semantics."
    • "A D-tree reveals the structure of an expression in terms of hierarchical links between its actual elements."


  • (Hudson 1980: 196)
    • "I have also argued (...) that ’dependency’ is just another name for the relation between a ’frame’ and its ’slots’ – in other words, for the relation of 'strict subcategorisation', which in transformational grammar is relegated to the lexicon. If the claims of this paper are right, then, they constitute strong further support for the 'pan-lexical' model in which the grammar is virtually identified with the lexicon. … Like the traditional lexicon, the pan-lexicon refers only to words."