Podcast Guide

Guest
Jacob Buckman

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Recurrence and Attention for Long-Context Transformers with Jacob Buckman

Published
October 7, 2025
Duration
57:23
Summary source
description
Last updated
April 21, 2026

Discusses AI, focusing on techniques for achieving long context in transformers.

Show notes

Today, we're joined by Jacob Buckman, co-founder and CEO of Manifest AI, to discuss achieving long context in transformers. We discuss the bottlenecks of scaling context length and recent techniques to overcome them, including windowed attention, grouped query attention, and latent space attention. We explore the idea of weight-state balance and the weight-state FLOP ratio as a way of reasoning about the optimality of compute architectures, and we dig into the Power Retention architecture, which…
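
For listeners unfamiliar with windowed attention, one of the techniques named above, here is a minimal NumPy sketch of the idea (this is our own illustration, not code from the episode, from Manifest AI, or from the Power Retention architecture; the function name and shapes are hypothetical): each query position attends only to a fixed-size window of recent positions rather than the full sequence.

```python
import numpy as np

def windowed_attention(q, k, v, window):
    """Single-head causal attention where each position attends only to
    the `window` most recent positions. Illustrative sketch only; not
    code from the episode or from Manifest AI.

    q, k, v: arrays of shape (seq_len, d)
    """
    seq_len, d = q.shape
    scores = q @ k.T / np.sqrt(d)      # (seq_len, seq_len) attention logits
    i = np.arange(seq_len)[:, None]    # query positions
    j = np.arange(seq_len)[None, :]    # key positions
    # Position i may attend to position j only when i - window < j <= i.
    mask = (j <= i) & (j > i - window)
    scores = np.where(mask, scores, -np.inf)
    # Row-wise softmax; each row has at least one finite entry (j == i).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                 # (seq_len, d)
```

Capping the attention span this way replaces the quadratic cost of full attention with a cost linear in sequence length, which is the kind of context-scaling bottleneck the episode's other techniques approach from different angles.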

Themes

  • AI