A Novel Disambiguation Method For Unification-Based Grammars Using Probabilistic Context-Free Approximations
In Proceedings of the 19th International Conference on Computational Linguistics (COLING'02), August 24 - September 1, 2002, Taipei, Taiwan.
We present a novel disambiguation method for unification-based grammars (UBGs). In contrast to other methods, our approach obviates the need for probability models on the UBG side: it shifts the responsibility to simpler context-free models, indirectly obtained from the UBG. Our approach has three advantages: (i) training can be done effectively in practice, (ii) parsing and disambiguation of context-free readings requires only cubic time, and (iii) the probability distributions involved are mathematically clean. In an experiment with a midsize UBG, we show that our novel approach is feasible. Using unsupervised training, we achieve 88% accuracy on an exact-match task.
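The cubic-time disambiguation of context-free readings mentioned in point (ii) can be illustrated with a standard Viterbi CKY parser over a PCFG. The sketch below is not the paper's approximation grammar; the toy grammar, its probabilities, and all function names are invented for illustration, assuming a PCFG in Chomsky normal form whose most probable derivation serves to rank readings.

```python
import math
from collections import defaultdict

# Toy PCFG in Chomsky normal form (invented for illustration, not the
# context-free approximation from the paper). Probabilities are stored
# as log-probs to avoid underflow.
binary_rules = {          # (B, C) -> list of (A, log P(A -> B C))
    ("NP", "VP"): [("S", math.log(1.0))],
    ("Det", "N"): [("NP", math.log(0.6))],
    ("V", "NP"): [("VP", math.log(1.0))],
}
lexical_rules = {         # word -> list of (A, log P(A -> word))
    "the": [("Det", math.log(1.0))],
    "dog": [("N", math.log(0.5)), ("NP", math.log(0.4))],
    "cat": [("N", math.log(0.5))],
    "saw": [("V", math.log(1.0))],
}

def viterbi_cky(words):
    """Log-probability of the best parse rooted in S; O(n^3) in sentence length."""
    n = len(words)
    # chart[i][j] maps a nonterminal to its best log-prob over words[i:j]
    chart = [[defaultdict(lambda: float("-inf")) for _ in range(n + 1)]
             for _ in range(n + 1)]
    for i, w in enumerate(words):
        for a, lp in lexical_rules.get(w, []):
            chart[i][i + 1][a] = max(chart[i][i + 1][a], lp)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):   # split point: the third nested loop
                for b, lp_b in chart[i][k].items():
                    for c, lp_c in chart[k][j].items():
                        for a, lp_r in binary_rules.get((b, c), []):
                            cand = lp_r + lp_b + lp_c
                            if cand > chart[i][j][a]:
                                chart[i][j][a] = cand
    return chart[0][n]["S"]
```

For example, `math.exp(viterbi_cky("the dog saw the cat".split()))` evaluates the single S-reading of that sentence; in the paper's setting the best-scoring context-free reading would then select the corresponding UBG analysis.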
Files: BibTeX, Kiefer:2002:NDM.pdf