EarthRanger's user conference, v2 of Tulu and Unified-IO, a better way to model climate, and more.

Subscribe

January 2024

AI2 Newsletter

A photograph of attendees of the 2023 EarthRanger User Conference standing together in one room.

Top Story

EarthRanger User Conference On Demand

2023's EarthRanger User Conference (ERUC) showcased passion, projects, and environmental visionaries in Cape Town, South Africa. In its eighth year, ERUC featured over 200 speakers and put collaboration at the forefront as like-minded individuals connected over their efforts to save the planet. Now, anyone can access 62 videos taken from the event via the link below. You can also read the inspiring recap of the event on EarthRanger's blog.

Watch here ➞

Enhancing LM Adaptation With Tulu v2

How far can we push the limits of open-source models? That's what AI2 researchers are testing with Tulu v2. Tulu 2 models achieve performance at or near the state of the art among openly released models on AlpacaEval and LMSYS's Chatbot Arena, and at the time of release were state-of-the-art on MT-Bench among all open models. The two core changes in v2 are a new, higher-quality dataset mixture and direct preference optimization (DPO) training on top of supervised fine-tuning (SFT). Access the full paper and more assets in the researchers' blog post below.
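
For readers curious what that second training stage looks like, here is a minimal PyTorch sketch of the DPO objective applied on top of an SFT checkpoint. It assumes you already have per-sequence log-probabilities from the policy being trained and from a frozen SFT reference model; the beta value and variable names are illustrative, not the exact Tulu 2 training configuration.

```python
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """Direct Preference Optimization loss (Rafailov et al., 2023).

    Each argument is a tensor of summed token log-probabilities for the
    chosen/rejected responses under the trained policy or the frozen SFT
    reference. `beta` controls how far the policy may drift from the reference.
    """
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Push the policy to prefer the human-chosen response over the rejected one.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()
```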

Read the blog ➞

The Unreasonable Effectiveness of Easy Training Data

Can LLMs generalize from easy to hard problems? New research from AI2's Aristo team demonstrates that models trained only on easy data, such as 3rd-grade questions, can solve college-level test questions nearly as well as models trained on the hard questions themselves. This surprising result could be great news for the scalability and cost-effectiveness of model training.

A model trained on easy data (e.g., 3rd-grade problems) does almost as well on college test problems as a model trained on college problems (Mixtral-8x7B prompted with k = 10 examples). Random accuracy is 25%.
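
The evaluation behind that figure is in-context learning: the model sees k easy examples and then must answer a harder, held-out question. Below is a rough sketch of how such a prompt might be assembled; the template and field names are assumptions for illustration, not the paper's exact code.

```python
def build_kshot_prompt(easy_examples, hard_question, k=10):
    """Assemble a k-shot prompt from easy (e.g., 3rd-grade) QA pairs and
    append a harder (e.g., college-level) test question.

    `easy_examples` is a list of dicts with 'question' and 'answer' keys;
    the formatting below is illustrative, not the paper's exact template.
    """
    shots = "\n\n".join(
        f"Question: {ex['question']}\nAnswer: {ex['answer']}"
        for ex in easy_examples[:k]
    )
    return f"{shots}\n\nQuestion: {hard_question}\nAnswer:"

# The completed prompt would then be sent to a model such as Mixtral-8x7B,
# and the generated answer scored against the college-level gold label.
```
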
Learn more ➞
Windmills in a large body of water.

ACE, the AI2 Climate Emulator

The last year has seen a revolution in the field of weather prediction as practitioners have adopted AI-driven forecasting tools. Climate modeling can similarly benefit from AI, but needs a longer timeframe than typical 14-day weather forecasts. The AI2 Climate Modeling team presents ACE, an emulator that can run a decade-long simulation in one hour, nearly 100 times faster and using 100 times less energy than an equivalent physics-based atmospheric model.
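
The speedup comes from replacing the physics solver's many small timesteps with a single learned step: the emulator maps the current atmospheric state to the state a few hours later, and a long simulation is just that step applied over and over. Here is a schematic rollout loop under that framing; the `emulator` object, forcing inputs, and step length are illustrative assumptions, not ACE's actual interface.

```python
def rollout(emulator, initial_state, forcings, n_steps):
    """Autoregressively roll an emulated climate simulation forward.

    `emulator` is assumed to map (state, forcing) -> next state, e.g. a
    trained neural network; `forcings` supplies prescribed inputs (such as
    incoming solar radiation) at each step. The physical length of one step
    is a property of the trained model, not of this loop.
    """
    state = initial_state
    trajectory = [state]
    for step in range(n_steps):
        state = emulator(state, forcings[step])  # one learned timestep
        trajectory.append(state)
    return trajectory
```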

Learn more ➞
A dock leading out into a lake, with a gray, cloudy sky and mountains in the background.

Unified-IO 2

Training a multimodal AI model from scratch across a massive range of input and output modalities is a tall order, but that's exactly what a team of researchers has achieved with Unified-IO 2. It is the first autoregressive multimodal model capable of understanding and generating image, text, audio, and action, and the model, inference code, and training code are all open source!

Learn more ➞

More from us

➞ Q&A: UW researchers answer common questions about language models like ChatGPT

➞ AI for Earth: 2023 in Review

➞ Designing Guiding Principles for NLP for Healthcare: A Case Study of Maternal Health

Learn more about the Allen Institute for Artificial Intelligence

Work with us

AI2 Newsletter archive

X
LinkedIn
YouTube
Website

Allen Institute for AI, 2157 N Northlake Way, Suite 110, Seattle, WA 98103, USA