Lasting Impact Award

The Lasting Impact Award honors the most impactful paper published by AI2 researchers three years prior to the award date. Papers are nominated by our institute's research leaders, and the final winner is selected by our Scientific Advisory Board using several criteria, including a paper's Highly Influential Citations.


  • Bidirectional Attention Flow for Machine Comprehension

    Minjoon Seo, Aniruddha Kembhavi, Ali Farhadi, and Hannaneh Hajishirzi. ICLR 2017.
    Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query. Recently, attention mechanisms have been successfully extended to MC. Typically these methods use attention to focus on a small portion of the context and summarize it with a fixed-size vector, couple attentions temporally, and/or form a uni-directional attention. In this paper we introduce the Bi-Directional Attention Flow (BIDAF) network, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization. Our experimental evaluations show that our model achieves state-of-the-art results on the Stanford Question Answering Dataset (SQuAD) and the CNN/DailyMail cloze test.
  • 2020 Lasting Impact Award Winners
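The bi-directional attention described in the abstract above can be sketched in plain NumPy. The shapes, random encodings, and dot-product similarity below are simplifications for illustration; the paper learns a trainable similarity function and encoders, but the two attention directions and the query-aware output follow the same pattern:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy shapes (hypothetical): T context positions, J query positions, d dims.
rng = np.random.default_rng(0)
T, J, d = 5, 3, 4
H = rng.normal(size=(T, d))   # context encodings
U = rng.normal(size=(J, d))   # query encodings

# Similarity matrix between every context/query word pair
# (dot product here; BiDAF uses a trainable similarity function).
S = H @ U.T                   # (T, J)

# Context-to-query attention: each context word attends over the query.
a = softmax(S, axis=1)        # (T, J), rows sum to 1
U_tilde = a @ U               # (T, d): a query-aware vector per context word

# Query-to-context attention: which context words matter most to the query.
b = softmax(S.max(axis=1))    # (T,), sums to 1
h_tilde = b @ H               # (d,), then tiled across all T positions

# Query-aware context representation, with no early summarization:
# every context position keeps its own attended vectors.
G = np.concatenate([H, U_tilde, H * U_tilde, H * np.tile(h_tilde, (T, 1))], axis=1)
print(G.shape)                # (T, 4d)
```

Note that the query is never collapsed into a single fixed-size vector before this step; the attention "flows" to each context position individually, which is the point the abstract makes about avoiding early summarization.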


  • XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks

    Mohammad Rastegari, Vicente Ordonez, Joseph Redmon, and Ali Farhadi. ECCV 2016.
    We propose two efficient approximations to standard convolutional neural networks: Binary-Weight-Networks and XNOR-Networks. In Binary-Weight-Networks, the filters are approximated with binary values, resulting in 32x memory savings. In XNOR-Networks, both the filters and the input to convolutional layers are binary. XNOR-Networks approximate convolutions using primarily binary operations. This results in 58x faster convolutional operations (in terms of the number of high-precision operations) and 32x memory savings. XNOR-Nets offer the possibility of running state-of-the-art networks on CPUs (rather than GPUs) in real time. Our binary networks are simple, accurate, efficient, and work on challenging visual tasks. We evaluate our approach on the ImageNet classification task. The classification accuracy with a Binary-Weight-Network version of AlexNet is the same as the full-precision AlexNet. We compare our method with recent network binarization methods, BinaryConnect and BinaryNets, and outperform these methods by a large margin on ImageNet: more than 16% in top-1 accuracy.
  • 2019 Lasting Impact Award Winners
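The weight binarization behind the memory savings quoted above can be sketched in a few lines. This is an illustrative sketch, not the paper's full training procedure: a real-valued filter W is approximated as alpha * B, where B holds only +1/-1 entries and alpha is the mean absolute value of W (the scaling factor the paper derives as optimal):

```python
import numpy as np

def binarize(W):
    # Approximate a real-valued filter W as alpha * B, with B in {-1, +1}
    # and alpha the mean absolute value of W.
    alpha = np.abs(W).mean()
    B = np.where(W >= 0, 1.0, -1.0)
    return alpha, B

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))   # a toy 3x3 convolution filter
alpha, B = binarize(W)
W_hat = alpha * B             # binary approximation of W

# Storing B needs 1 bit per weight instead of 32 -> the 32x memory savings.
# Dot products with B reduce to additions and subtractions, and become
# XNOR + popcount when the inputs are binarized too (XNOR-Networks).
err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(err < 1.0)
```

The approximation error is nonzero, but because every channel keeps its own real-valued scale alpha, accuracy degrades far less than with naive sign-only binarization.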