

TODO: read Weiss et al. (2015) more carefully

Penn Treebank WSJ, Stanford dependencies, transition-based

| Model | Greedy UAS | Greedy LAS | Beam UAS | Beam LAS | Reference | Notes |
|---|---|---|---|---|---|---|
| Neural network, cube activation, 1 hidden layer, AdaGrad | 91.8 | 89.6 | – | – | | |
| Stack-LSTM | 93.1 | 90.9 | – | – | Dyer et al. (2015)[1] | |
| ReLU, 2 hidden layers, momentum | 93.19 | 91.18 | – | – | Weiss et al. (2015)[2] | |
| ReLU, 2 hidden layers, momentum + structured perceptron | – | – | 93.99 | 92.05 | Weiss et al. (2015)[2] | beam size = 8 |
| ReLU, 2 hidden layers, momentum + tri-training | 93.19 | 91.18 | – | – | Weiss et al. (2015)[2] | |
| ReLU, 2 hidden layers, momentum + structured perceptron + tri-training | – | – | 94.26 | 92.41 | Weiss et al. (2015)[2] | beam size = 8 |
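The first row above describes a feedforward parser that scores transitions with a cube activation, h = (W₁x + b₁)³, and decodes greedily by taking the arg-max transition at each state. A minimal sketch of that scoring step (all dimensions and weights here are made up for illustration, not taken from any cited paper):

```python
import numpy as np

# Hypothetical, illustrative sizes: 48 feature embeddings of dimension 50,
# 200 hidden units, 3 transition types (shift, left-arc, right-arc).
rng = np.random.default_rng(0)
n_features, emb_dim, hidden, n_transitions = 48, 50, 200, 3
W1 = rng.normal(scale=0.01, size=(hidden, n_features * emb_dim))
b1 = rng.normal(scale=0.01, size=hidden)
W2 = rng.normal(scale=0.01, size=(n_transitions, hidden))

def score_transitions(x):
    """Score all transitions for one parser state.

    x: concatenated feature embeddings, shape (n_features * emb_dim,).
    The cube activation (W1 @ x + b1) ** 3 replaces the usual tanh/ReLU.
    """
    h = (W1 @ x + b1) ** 3          # cube activation
    return W2 @ h                   # unnormalised transition scores

x = rng.normal(size=n_features * emb_dim)
scores = score_transitions(x)
best = int(np.argmax(scores))       # greedy decoding: arg-max at each step
```

Beam-search decoding (the "beam size = 8" rows) instead keeps the 8 highest-scoring partial transition sequences at every step rather than committing to a single arg-max.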

Penn Treebank WSJ, Stanford dependencies, graph-based

| Model | UAS | LAS | Reference | Notes |
|---|---|---|---|---|
| First-order, tanh-cube | 92.14 | 90.92 | Pei et al. (2015)[3] | |
| First-order, tanh-cube, phrase vector | 92.59 | 91.37 | Pei et al. (2015)[3] | |
| Second-order, tanh-cube, phrase vector | 93.29 | 92.13 | Pei et al. (2015)[3] | |
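The tanh-cube activation used in these models is commonly stated as g(l) = tanh(l³ + l), i.e. a cubic term wrapped in a tanh so outputs stay bounded in (−1, 1). A minimal sketch (check Pei et al. (2015) for the exact form; this formula is an assumption):

```python
import numpy as np

def tanh_cube(l):
    # tanh-cube activation: cubic interaction term inside a tanh,
    # keeping outputs bounded in (-1, 1). Assumed form g(l) = tanh(l**3 + l).
    return np.tanh(l ** 3 + l)

l = np.linspace(-2.0, 2.0, 5)
a = tanh_cube(l)   # bounded, monotonically increasing in l
```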

References

  1. Dyer, C., Ballesteros, M., Ling, W., Matthews, A., & Smith, N. A. (2015). Transition-Based Dependency Parsing with Stack Long Short-Term Memory. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 334–343). Association for Computational Linguistics. Retrieved from http://www.aclweb.org/anthology/P15-1033
  2. Weiss, D., Alberti, C., Collins, M., & Petrov, S. (2015). Structured Training for Neural Network Transition-Based Parsing. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 323–333). Association for Computational Linguistics.
  3. Pei, W., Ge, T., & Chang, B. (2015). An Effective Neural Network Model for Graph-based Dependency Parsing. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers) (pp. 313–322). Association for Computational Linguistics.