5 papers from 2017 in AllenNLP
    • Semi-supervised Sequence Tagging with Bidirectional Language Models (ACL 2017)
      Matthew E. Peters, Waleed Ammar, Chandra Bhagavatula, and Russell Power
      Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks. However, in most cases, the recurrent network that operates on word-level representations to produce context-sensitive representations is trained on relatively… (See Sketch 1 after the list.)
    • Learning a Neural Semantic Parser from User Feedback (ACL 2017)
      Srinivasan Iyer, Ioannis Konstas, Alvin Cheung, Jayant Krishnamurthy, and Luke Zettlemoyer
      We present an approach to rapidly and easily build natural language interfaces to databases for new domains, whose performance improves over time based on user feedback and which requires minimal intervention. To achieve this, we adapt neural sequence models to map utterances directly to SQL with its… (See Sketch 2 after the list.)
    • End-to-end Neural Coreference Resolution (EMNLP 2017)
      Kenton Lee, Luheng He, Mike Lewis, and Luke Zettlemoyer
      We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector. The key idea is to directly consider all spans in a document as potential mentions and learn distributions… (See Sketch 3 after the list.)
    • Neural Semantic Parsing with Type Constraints for Semi-Structured Tables (EMNLP 2017)
      Jayant Krishnamurthy, Pradeep Dasigi, and Matt Gardner
      We present a new semantic parsing model for answering compositional questions on semi-structured Wikipedia tables. Our parser is an encoder-decoder neural network with two key technical innovations: (1) a grammar for the decoder that only generates well-typed logical forms; and (2) an entity… (See Sketch 4 after the list.)
    • Deep Semantic Role Labeling: What Works and What's Next (ACL 2017)
      Luheng He, Kenton Lee, Mike Lewis, and Luke S. Zettlemoyer
      We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations. We use a deep highway BiLSTM architecture with constrained decoding, while observing a number of recent… (See Sketch 5 after the list.)
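Sketch 1. A minimal sketch of the idea the Peters et al. abstract describes: feeding a task RNN the concatenation of ordinary word embeddings and fixed representations from a pre-trained language model. All names and dimensions below are invented for illustration; this is not the paper's code.

```python
import torch
import torch.nn as nn

class LMAugmentedTagger(nn.Module):
    """Token tagger whose encoder sees [word embedding ; LM embedding]."""

    def __init__(self, vocab_size=1000, word_dim=50, lm_dim=128,
                 hidden_dim=64, num_tags=5):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # The task BiLSTM consumes the concatenated representation.
        self.encoder = nn.LSTM(word_dim + lm_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.tag_proj = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids, lm_states):
        # lm_states: (batch, seq, lm_dim) from a frozen pre-trained LM,
        # detached so the LM features act as constant inputs.
        words = self.word_emb(token_ids)
        combined = torch.cat([words, lm_states.detach()], dim=-1)
        encoded, _ = self.encoder(combined)
        return self.tag_proj(encoded)  # per-token tag scores

# Toy usage with random tensors standing in for real LM states.
tagger = LMAugmentedTagger()
tokens = torch.randint(0, 1000, (2, 7))
fake_lm_states = torch.randn(2, 7, 128)
print(tagger(tokens, fake_lm_states).shape)  # torch.Size([2, 7, 5])
```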
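Sketch 2. A bare-bones sketch of the setup the Iyer et al. abstract describes, assuming a plain encoder-decoder that emits SQL tokens one at a time. The user-feedback loop and everything else specific to the paper are omitted, and all names and sizes are hypothetical.

```python
import torch
import torch.nn as nn

class Seq2SQL(nn.Module):
    """Encoder-decoder mapping an utterance directly to SQL tokens."""

    def __init__(self, src_vocab=500, tgt_vocab=200, dim=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.decoder = nn.LSTMCell(dim, dim)
        self.out = nn.Linear(dim, tgt_vocab)

    @torch.no_grad()
    def generate(self, utterance_ids, start_id=0, max_len=30):
        # Encode the utterance; the final state seeds the decoder.
        _, (h, c) = self.encoder(self.src_emb(utterance_ids))
        h, c = h.squeeze(0), c.squeeze(0)
        token = torch.full((utterance_ids.size(0),), start_id,
                           dtype=torch.long)
        sql = []
        for _ in range(max_len):
            h, c = self.decoder(self.tgt_emb(token), (h, c))
            token = self.out(h).argmax(dim=-1)  # greedy next SQL token
            sql.append(token)
        return torch.stack(sql, dim=1)

model = Seq2SQL()
print(model.generate(torch.randint(0, 500, (1, 8))).shape)  # (1, 30)
```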
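Sketch 3. A toy rendering of the key idea in the Lee et al. abstract: enumerate every span up to a maximum width as a candidate mention, then prune by score. The scoring networks are replaced by random numbers, and the function names are illustrative.

```python
import random

def enumerate_spans(tokens, max_width=10):
    """Every candidate span up to max_width tokens wide; the model
    scores these directly instead of trusting a mention detector."""
    return [(i, j) for i in range(len(tokens))
            for j in range(i, min(i + max_width, len(tokens)))]

def prune(spans, score_fn, keep_ratio=0.4):
    """Keep only the top-scoring spans, mimicking the aggressive
    pruning that keeps the O(n^2) candidate space tractable."""
    ranked = sorted(spans, key=score_fn, reverse=True)
    return ranked[:max(1, int(keep_ratio * len(spans)))]

sentence = "John said he would arrive late".split()
spans = enumerate_spans(sentence, max_width=3)
random.seed(0)
scores = {s: random.random() for s in spans}  # stand-in mention scores
print(len(spans), prune(spans, scores.get)[:3])
```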
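Sketch 4. A small sketch of how a decoder grammar can guarantee well-typed logical forms, per the Krishnamurthy et al. abstract: productions are indexed by the type they produce, so only type-correct expansions are ever legal and every complete derivation is well-typed by construction. The miniature grammar below is made up, not the paper's.

```python
# Each production maps a requested type to (operator, argument types).
GRAMMAR = {
    "int":    [("count", ["set"]), ("literal_int", [])],
    "set":    [("filter_eq", ["column", "cell"]), ("all_rows", [])],
    "column": [("literal_column", [])],
    "cell":   [("literal_cell", [])],
}

def legal_actions(requested_type):
    """The decoder's action space at a step asking for this type."""
    return GRAMMAR[requested_type]

def derive(requested_type, choose=lambda options: options[0]):
    """Expand types depth-first; `choose` stands in for the neural
    decoder picking among the legal (i.e. well-typed) expansions."""
    op, arg_types = choose(legal_actions(requested_type))
    return [op] + [derive(t, choose) for t in arg_types]

# e.g. ['count', ['filter_eq', ['literal_column'], ['literal_cell']]]
print(derive("int"))
```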
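Sketch 5. A toy version of the constrained decoding mentioned in the He et al. abstract: a Viterbi search over BIO tags that forbids transitions violating the tagging scheme (an I- tag may only continue a matching B- or I- tag). The tag set and scores are invented for illustration.

```python
import random

TAGS = ["O", "B-ARG0", "I-ARG0", "B-ARG1", "I-ARG1"]

def allowed(prev, cur):
    """BIO constraint: I-X may only follow B-X or I-X."""
    if cur.startswith("I-"):
        return prev != "O" and prev[2:] == cur[2:]
    return True

def constrained_viterbi(emissions):
    """emissions: one {tag: log-score} dict per token. Returns the
    highest-scoring tag sequence that never violates the constraints."""
    best = {t: (emissions[0][t], [t]) for t in TAGS if allowed("O", t)}
    for scores in emissions[1:]:
        new_best = {}
        for cur in TAGS:
            candidates = [(s + scores[cur], path + [cur])
                          for prev, (s, path) in best.items()
                          if allowed(prev, cur)]
            new_best[cur] = max(candidates)
        best = new_best
    return max(best.values())[1]

random.seed(1)
emissions = [{t: random.random() for t in TAGS} for _ in range(5)]
print(constrained_viterbi(emissions))  # a valid BIO tag sequence
```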