Ai2

Latest research

February 12, 2026

Introducing AutoDiscovery: Automated scientific discovery, now in AstaLabs

AutoDiscovery explores data autonomously, generating its own hypotheses to surface surprising findings that researchers might never have thought to look for.
Read post
February 12, 2026

How researchers are using AutoDiscovery

Learn about how researchers are using AutoDiscovery, our scientific discovery tool, to make transformative impact across their fields.
Read post
February 11, 2026

MolmoSpaces, an open ecosystem for embodied AI

MolmoSpaces is our new open platform for embodied AI that provides physics-grounded scenes, objects, and grasp annotations to train and evaluate generalist robotic policies.
Read post
February 10, 2026

How2Everything: Mining the web to evaluate and improve LLMs on real-world procedures

How2Everything is an open framework for evaluating and improving how well LLMs generate step-by-step procedures.
Read post
February 4, 2026

Now in Nature: Synthesizing scientific literature with retrieval-augmented LMs

We're excited to share that our paper “Synthesizing scientific literature with retrieval-augmented language models” has been accepted to Nature.
Read post
January 28, 2026

Theorizer: Turning thousands of papers into scientific laws

Theorizer is a system that automatically reads scientific literature and synthesizes structured, testable theories.
Read post
January 27, 2026

Open Coding Agents: Fast, accessible coding agents that adapt to any repo

SERA is the first in our family of Open Coding Agents, achieving state-of-the-art performance at low cost.
Read post
January 21, 2026

HiRO-ACE: An accessible solution for kilometer-scale climate simulation

HiRO-ACE is an AI framework that makes kilometer-scale climate simulation dramatically more accessible, generating decades of precipitation data for any region of the globe.
Read post
December 15, 2025

Introducing Bolmo: Byteifying the next generation of language models

Bolmo is a new byte-level model family built by adapting Olmo 3 into a fast, flexible byte-based model with a short extra training run.
Read post