Science Parse parses scientific papers (in PDF form) and returns them in structured form. It extracts the following fields: title; authors; abstract; sections (each with a heading and body text); bibliography entries (each with a title, authors, venue, and year); and mentions, i.e., the places in the paper where bibliography entries are cited.
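As a sketch of what working with this output might look like, the snippet below walks a parsed-paper dictionary with the fields listed above. The exact JSON schema and the sample data are assumptions for illustration, not Science Parse's actual output format.

```python
# Illustrative sketch: navigating the kind of structured output Science Parse
# produces. The field names (title, authors, sections, references) follow the
# description above; the schema details are assumptions.
def summarize(paper):
    """Return a short summary of a parsed-paper dict."""
    return {
        "title": paper.get("title"),
        "num_authors": len(paper.get("authors", [])),
        "num_sections": len(paper.get("sections", [])),
        "num_references": len(paper.get("references", [])),
    }

# Hypothetical example of a parsed paper:
parsed = {
    "title": "A Sample Paper",
    "authors": ["A. Author", "B. Author"],
    "abstract": "We study ...",
    "sections": [{"heading": "Introduction", "text": "..."}],
    "references": [
        {"title": "Prior Work", "authors": ["C. Author"],
         "venue": "ACL", "year": 2016},
    ],
}

print(summarize(parsed))
```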
Given a pair of sentences (a premise and a hypothesis), the Decomposed Graph Entailment Model (DGEM) predicts whether the premise can be used to infer the hypothesis. The model decomposes the support for a structured graph representation of the hypothesis into support for its individual nodes and edges. This model was designed for the SciTail dataset and is described in more detail in SciTail: A Textual Entailment Dataset from Science Question Answering (AAAI’18). This repository also contains two baseline textual entailment models built using our NLP library, AllenNLP.
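The decomposition idea can be sketched as follows. This is not the DGEM implementation: the lexical-overlap scoring functions below are placeholders standing in for the model's learned alignment components, and the graph format is invented for the example.

```python
# Illustrative sketch of DGEM-style decomposition: score the hypothesis graph
# by aggregating premise support for each of its nodes and edges.
def node_support(premise_tokens, node):
    # Placeholder scorer: 1.0 if the node's word appears in the premise,
    # else 0.0 (DGEM uses learned, soft alignment instead).
    return 1.0 if node in premise_tokens else 0.0

def edge_support(premise_tokens, edge):
    # Placeholder scorer: an edge (head, relation, tail) is supported only
    # if both of its endpoints are supported.
    head, _relation, tail = edge
    return min(node_support(premise_tokens, head),
               node_support(premise_tokens, tail))

def entailment_score(premise, nodes, edges):
    """Average node and edge support, mirroring the decomposition idea."""
    tokens = set(premise.lower().split())
    supports = ([node_support(tokens, n) for n in nodes]
                + [edge_support(tokens, e) for e in edges])
    return sum(supports) / len(supports)

premise = "plants use sunlight to make food"
nodes = ["plants", "sunlight", "food"]
edges = [("plants", "use", "sunlight"), ("plants", "make", "food")]
print(entailment_score(premise, nodes, edges))  # 1.0
```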
With alexafsm, developers can model dialog agents with first-class concepts such as states, attributes, transitions, and actions. alexafsm also provides visualization and other tools to help understand, test, debug, and maintain complex FSM conversations. Learn more about alexafsm in this article.
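To illustrate the states / attributes / transitions / actions concepts, here is a minimal finite-state dialog machine in plain Python. This is not the alexafsm API; the class, state names, and events are all invented for the example.

```python
# A minimal finite-state dialog sketch illustrating the concepts alexafsm
# builds on. All names here are hypothetical, not alexafsm's actual API.
class DialogFSM:
    # Transition table: (current state, event) -> (next state, action)
    TRANSITIONS = {
        ("initial", "launch"): ("greeting", "say_hello"),
        ("greeting", "ask_weather"): ("weather", "report_weather"),
        ("weather", "stop"): ("done", "say_goodbye"),
    }

    def __init__(self):
        self.state = "initial"          # current FSM state
        self.attributes = {"turns": 0}  # session attributes

    def handle(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            return None  # no valid transition from this state
        next_state, action = self.TRANSITIONS[key]
        self.state = next_state
        self.attributes["turns"] += 1
        return action

fsm = DialogFSM()
print(fsm.handle("launch"))       # say_hello
print(fsm.handle("ask_weather"))  # report_weather
print(fsm.state)                  # weather
```

Keeping the transition table as data is what makes FSM dialogs easy to visualize and test: the set of reachable states and valid events can be inspected without running the agent.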
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
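The bi-directional attention step can be sketched numerically as below. This is a simplified sketch, not the BiDAF implementation: a plain dot product stands in for BiDAF's trainable similarity function, and random vectors stand in for the learned context and query encodings.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
T, J, d = 5, 3, 4                # context length, query length, hidden size
H = rng.standard_normal((T, d))  # context encodings (stand-ins)
U = rng.standard_normal((J, d))  # query encodings (stand-ins)

# Similarity matrix between every context and query position, shape (T, J).
# (BiDAF uses a trainable similarity function; a dot product is used here.)
S = H @ U.T

# Context-to-query attention: each context word attends over query words.
a = softmax(S, axis=1)           # (T, J)
U_tilde = a @ U                  # query-aware vector per context word, (T, d)

# Query-to-context attention: attend over context words via max similarity.
b = softmax(S.max(axis=1))       # (T,)
h_tilde = b @ H                  # single attended context vector, (d,)

# Each context position keeps its own representation -- attention "flows"
# through without summarizing the context into a single fixed vector.
G = np.concatenate([H, U_tilde, H * U_tilde, H * h_tilde], axis=1)
print(G.shape)  # (5, 16)
```

Note that the output `G` has one row per context position; this per-position, query-aware representation is what "without early summarization" refers to.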