
The holy grail of NLP is to perform complex common-sense reasoning about the content of a text while reading it. Such reasoning is not possible without assumptions about what certain parts of the text are about. Global inference, where the interpretation of the whole text is assigned at once, does not support this. At the other end of the spectrum, incremental inference builds up the interpretation of atomic chunks of the text one by one. As of 2017, few researchers pursue an incremental approach, and papers about it are often motivated by practical considerations or by psychological plausibility.

TODO: http://www.coli.uni-saarland.de/~vera/incrementalitySeminar.html

This article is in an early stage. Action is needed to make it more useful.


Global learning and beam search

"a framework of online global discriminative learning and beam-search decoding for syntactic processing (Zhang and Clark, 2011b), which has recently been applied to a wide variety of natural language processing (NLP) tasks, including word segmentation (Zhang and Clark, 2007), dependency parsing (Zhang and Clark, 2008b; Huang and Sagae, 2010; Zhang and Nivre, 2011; Bohnet and Kuhn, 2012), context free grammar (CFG) parsing (Collins and Roark, 2004; Zhang and Clark, 2009; Zhu et al., 2013), combinational categorial grammar (CCG) parsing (Zhang and Clark, 2011a; Xu et al., 2014) and machine translation (Liu, 2013), achieving state- of-the-art accuracies and efficiencies. In addition, due to its high efficiencies, it has also been applied to a range of joint structural problems, such as joint segmentation and POS-tagging (Zhang and Clark, 2008a; Zhang and Clark, 2010), joint POS-tagging and dependency parsing (Hatori et al., 2011; Bohnet and Nivre, 2012), joint mor- phological analysis, POS-tagging and dependency parsing (Bohnet et al., 2013), and joint segmenta- tion, POS-tagging and parsing (Zhang et al., 2013; Zhang et al., 2014).

...

First, beam-search enables highly efficient decoding, which typically has linear time complexity, depending on the incremental process. Second, free from DP-style constraints and Markov-style independence assumptions, the framework allows arbitrary features to be defined to capture structural patterns. In addition to feature advantages, the high accuracies of this framework are also enabled by direct interactions between learning and search (Daumé III and Marcu, 2005; Huang et al., 2012; Zhang and Nivre, 2012)." [1]
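
To make the decoding side concrete, the sketch below shows generic beam-search decoding over a transition system with a linear scoring model. The action set (SHIFT/REDUCE), the scorer and all names are toy placeholders chosen for illustration, not Zhang and Clark's actual implementation. Note how the loop performs one pruning step per transition, which is what gives the roughly linear time complexity mentioned above.

# A minimal sketch of beam-search decoding over a transition system with a
# linear (structured-perceptron-style) scoring model. The action set and the
# scorer below are toy placeholders, not any particular parser's system.

from typing import Callable, List, Tuple

State = Tuple[tuple, float]   # (actions taken so far, cumulative model score)

def beam_search(n_steps: int,
                actions: List[str],
                score: Callable[[tuple, str], float],
                beam_size: int = 8) -> tuple:
    """Keep the beam_size best partial action sequences at every step."""
    beam: List[State] = [((), 0.0)]
    for _ in range(n_steps):                 # one pruning step per transition
        candidates: List[State] = []
        for history, total in beam:
            for a in actions:                # expand each beam item with each action
                candidates.append((history + (a,), total + score(history, a)))
        beam = sorted(candidates, key=lambda s: s[1], reverse=True)[:beam_size]
    return max(beam, key=lambda s: s[1])[0]  # best-scoring complete sequence

# Toy usage: a scorer that simply prefers SHIFT over REDUCE.
toy_score = lambda history, action: 1.0 if action == "SHIFT" else 0.5
print(beam_search(n_steps=4, actions=["SHIFT", "REDUCE"], score=toy_score))

Because the scorer sees the full action history rather than a fixed-order Markov window, arbitrary structural features can be plugged into it without changing the decoder, which is the feature advantage the quotation refers to.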

See also: [2]

  • Syntax: Zhang & Nivre (2012)[1]
  • Semantics: Das et al. (2014)?[2], Konstas et al. (2014)[3].

Note: Beam search can hurt performance, e.g. when a parser is trained on gold decisions (Webster and Curran, 2014, p. 2132[4]; Zhang and Nivre, 2012[1]).
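
One common way to align learning with beam search, and to avoid the gold-decision mismatch noted above, is early-update training (Collins and Roark, 2004): decode the training sentence with the beam and, as soon as the gold action sequence falls out of the beam, update the weights and stop. The sketch below only illustrates that idea with toy features and actions; it is not the training procedure of any specific parser cited here.

# Illustration of structured-perceptron training with early update
# (Collins and Roark, 2004). Features, actions and the gold sequence are
# toy placeholders; real systems use rich structural feature templates.

from collections import defaultdict

weights = defaultdict(float)
ACTIONS = ["SHIFT", "REDUCE"]

def features(history, action):
    prev = history[-1] if history else "<s>"   # toy feature: previous action + current action
    return [f"prev={prev}+act={action}"]

def score(history, action):
    return sum(weights[f] for f in features(history, action))

def train_early_update(gold_actions, beam_size=1):
    """Beam-decode the gold sequence; when the gold prefix drops out of the
    beam, update towards the gold prefix and away from the best prediction."""
    beam = [((), 0.0)]
    for step in range(len(gold_actions)):
        candidates = [(h + (a,), s + score(h, a)) for h, s in beam for a in ACTIONS]
        beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        gold_prefix = tuple(gold_actions[:step + 1])
        if all(h != gold_prefix for h, _ in beam):       # search error at this step
            predicted = beam[0][0]
            for i, a in enumerate(gold_prefix):          # reward gold decisions
                for f in features(gold_prefix[:i], a):
                    weights[f] += 1.0
            for i, a in enumerate(predicted):            # penalise predicted decisions
                for f in features(predicted[:i], a):
                    weights[f] -= 1.0
            return                                       # stop here: "early" update

train_early_update(["SHIFT", "SHIFT", "REDUCE"])

Training on gold decisions alone, by contrast, never exposes the model to the states that beam search actually reaches at test time, which is the mismatch the note above refers to.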

Tasks

Dependency parsing

Constituency parsing

Mi & Huang (2015)

Cross & Huang (2016)[5]

Semantic parsing

Zhao & Huang (2015), Ambati et al. (2015).

AMR parsing

[3]

Coreference resolution

Webster and Curran (2014)[4] managed to achieve good results (scores in the low 60s) using a shift-reduce algorithm. The stack contains clusters, and clusters that are linked are promoted to the top of the stack to encode (psychological) recency.
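
As an illustration of the data structure only (not of Webster and Curran's actual model), the sketch below keeps a stack of clusters, starts a new cluster for unresolved mentions, and promotes a cluster to the top of the stack whenever a new mention is linked to it. The resolve callback is a stand-in for the learned linking component.

# Sketch of the stack-of-clusters idea described above. Each stack entry is a
# cluster of mentions; a cluster that receives a new mention is promoted to the
# top of the stack to encode recency. The resolve callback is a toy stand-in
# for the learned linking model, not Webster and Curran's system.

from typing import Callable, List, Optional

def incremental_coref(mentions: List[str],
                      resolve: Callable[[str, List[List[str]]], Optional[int]]
                      ) -> List[List[str]]:
    stack: List[List[str]] = []          # each element is a cluster of mentions
    for mention in mentions:             # mentions arrive incrementally, left to right
        i = resolve(mention, stack)      # index of an existing cluster, or None
        if i is None:
            stack.insert(0, [mention])   # start a new singleton cluster on top
        else:
            cluster = stack.pop(i)       # link to an existing cluster ...
            cluster.append(mention)
            stack.insert(0, cluster)     # ... and promote it to the top (recency)
    return stack

# Toy resolver: link pronouns to the most recent (top-of-stack) cluster.
toy_resolve = lambda m, stack: 0 if stack and m.lower() in {"he", "she", "it"} else None
print(incremental_coref(["John", "he", "Mary", "she"], toy_resolve))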

Tuggener (2016)[6]: incremental coreference resolution for German

Joint (multitask) systems

Bohnet & Nivre, 2012[7]; Hatori et al., 2012[8]

References

  1. Zhang, Y., & Nivre, J. (2012). Analyzing the Effect of Global Learning and Beam-Search on Transition-Based Dependency Parsing. In COLING (Posters) (pp. 1391–1400).
  2. Das, D., Chen, D., Martins, A. F., Schneider, N., & Smith, N. A. (2014). Frame-semantic parsing. Computational Linguistics, 40(1), 9–56.
  3. Konstas, I., Keller, F., Demberg, V., & Lapata, M. (2014). Incremental Semantic Role Labeling with Tree Adjoining Grammar.
  4. Webster, K., & Curran, J. R. (2014). Limited memory incremental coreference resolution. In COLING (pp. 2129–2139).
  5. Cross, J., & Huang, L. (2016). Incremental parsing with minimal features using bi-directional LSTM. arXiv preprint arXiv:1606.06406.
  6. Tuggener, D. (2016). Incremental Coreference Resolution for German. Doctoral dissertation, University of Zurich.
  7. Bohnet, B., & Nivre, J. (2012). A Transition-Based System for Joint Part-of-Speech Tagging and Labeled Non-Projective Dependency Parsing. In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Jeju Island, Korea, July 12–14, 2012, pp. 1455–1465. Association for Computational Linguistics. (ISBN: 978-1-937284-43-5)
  8. Hatori, J., Matsuzaki, T., Miyao, Y., & Tsujii, J. (2012). Incremental Joint Approach to Word Segmentation, POS Tagging, and Dependency Parsing in Chinese. In Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Jeju Island, Korea, July 8–14, 2012, pp. 1216–1224. Association for Computational Linguistics. (ISBN: 978-1-937284-24-4)