

TODO: read Weiss et al. (2015) more carefully

Penn Treebank WSJ, Stanford dependencies, transition-based

| Model | Greedy UAS | Greedy LAS | Beam UAS | Beam LAS | Reference | Notes |
|---|---|---|---|---|---|---|
| Neural network, cube activation, 1 hidden layer, AdaGrad | 91.8 | 89.6 | | | Chen & Manning (2014)[1] | |
| Stack-LSTM | 93.1 | 90.9 | | | Dyer et al. (2015)[2] | |
| ReLU, 2 hidden layers, momentum | 93.19 | 91.18 | | | Weiss et al. (2015)[3] | |
| ReLU, 2 hidden layers, momentum + structured perceptron | | | 93.99 | 92.05 | Weiss et al. (2015)[3] | beam size = 8 |
| ReLU, 2 hidden layers, momentum + tri-training | 93.19 | 91.18 | | | Weiss et al. (2015)[3] | |
| ReLU, 2 hidden layers, momentum + structured perceptron + tri-training | | | 94.26 | 92.41 | Weiss et al. (2015)[3] | beam size = 8 |
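All systems in this table are transition-based: they build the tree with a sequence of SHIFT / LEFT-ARC / RIGHT-ARC actions over a stack and a buffer. A minimal arc-standard oracle illustrates the transition system (an illustrative sketch, not the exact system of any paper above; the function name is mine). The neural parsers in the table replace the gold-head oracle with a classifier that scores transitions from stack/buffer features.

```python
# Minimal arc-standard oracle: derives the transition sequence from gold
# head indices. Illustrative sketch only — the table's parsers learn to
# predict these transitions instead of reading them off the gold tree.

def arc_standard_oracle(heads):
    """heads maps token index (1..n) to its gold head (0 = artificial ROOT)."""
    stack, buffer = [0], sorted(heads)            # buffer holds tokens 1..n
    arcs, actions = set(), []
    nodes = [0] + sorted(heads)
    # a token may be reduced only after collecting all of its own dependents
    n_deps = {i: sum(1 for h in heads.values() if h == i) for i in nodes}
    attached = {i: 0 for i in nodes}
    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s1, s0 = stack[-2], stack[-1]
            if s1 != 0 and heads[s1] == s0:       # LEFT-ARC: s0 heads s1
                actions.append(("LEFT-ARC", s0, s1))
                arcs.add((s0, s1)); attached[s0] += 1
                stack.pop(-2); continue
            if heads[s0] == s1 and attached[s0] == n_deps[s0]:
                actions.append(("RIGHT-ARC", s1, s0))
                arcs.add((s1, s0)); attached[s1] += 1
                stack.pop(); continue
        if not buffer:                            # no transition applies
            raise ValueError("tree is not projective")
        actions.append(("SHIFT", None, buffer[0]))
        stack.append(buffer.pop(0))
    return actions, arcs

# "He eats food": heads of tokens 1..3 (token 2 attaches to ROOT)
actions, arcs = arc_standard_oracle({1: 2, 2: 0, 3: 2})
# arcs == {(2, 1), (0, 2), (2, 3)}, each pair being (head, dependent)
```

An arc-standard derivation for n tokens always has exactly 2n transitions (n shifts plus n reduces), which is why greedy decoding is fast and why beam search (as in Weiss et al.) explores alternatives over this fixed-length action sequence.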

Penn Treebank WSJ, Stanford dependencies, graph-based

| Model | UAS | LAS | Reference | Notes |
|---|---|---|---|---|
| First-order, tanh-cube | 92.14 | 90.92 | Pei et al. (2015)[4] | |
| First-order, tanh-cube, phrase vector | 92.59 | 91.37 | Pei et al. (2015)[4] | |
| Second-order, tanh-cube, phrase vector | 93.29 | 92.13 | Pei et al. (2015)[4] | |
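For reference, the UAS/LAS columns in both tables are per-token accuracies: UAS counts tokens whose predicted head is correct, and LAS additionally requires the dependency label to match. A small sketch (the helper name is mine):

```python
# UAS = % of tokens with the correct head;
# LAS = % of tokens with the correct head AND dependency label.

def attachment_scores(gold, pred):
    """gold, pred: lists of (head, label) pairs, one per token."""
    n = len(gold)
    uas = 100.0 * sum(g[0] == p[0] for g, p in zip(gold, pred)) / n
    las = 100.0 * sum(g == p for g, p in zip(gold, pred)) / n
    return uas, las

gold = [(2, "nsubj"), (0, "root"), (2, "dobj")]
pred = [(2, "nsubj"), (0, "root"), (2, "iobj")]  # token 3: right head, wrong label
uas, las = attachment_scores(gold, pred)         # uas = 100.0, las ≈ 66.7
```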

References

  1. Chen, D., & Manning, C. (2014). A Fast and Accurate Dependency Parser using Neural Networks. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 740–750). Doha, Qatar: Association for Computational Linguistics.
  2. Dyer, C., Ballesteros, M., Ling, W., Matthews, A., & Smith, N. A. (2015). Transition-Based Dependency Parsing with Stack Long Short-Term Memory. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 334–343). Association for Computational Linguistics. http://www.aclweb.org/anthology/P15-1033
  3. Weiss, D., Alberti, C., Collins, M., & Petrov, S. (2015). Structured Training for Neural Network Transition-Based Parsing. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 323–333). Association for Computational Linguistics.
  4. Pei, W., Ge, T., & Chang, B. (2015). An Effective Neural Network Model for Graph-based Dependency Parsing. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 313–322). Association for Computational Linguistics.