
Build for the model six months out — the current model will eat your scaffolding

By Sherwin Wu · Head of Engineering, OpenAI API and Developer Platform · 2026-04-28 · Podcast: "Codex inside OpenAI, engineers as managers," Lenny's Podcast

Tier A · TL;DR
Build for the model six months out — the current model will eat your scaffolding

Claim

"This is the worst the models will ever be." Build product surfaces and harnesses that the next model will make work, not the current model's scaffolding. Customer requests for V1 scaffolding are valid, but the V2 model will obsolete most of those requests on arrival. Heavy investment in current-model plumbing is depreciating before you ship it.

Mechanism

The release cadence is now monthly to quarterly. Complex scaffolding built around current-model failure modes pays off for one cycle and then becomes overhead; lighter, more model-agnostic harnesses pay off across cycles. The discipline is to build the thinnest scaffolding that produces value today and degrades gracefully when the model improves.
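A minimal sketch of that discipline, assuming a hypothetical `call_model` stub rather than any specific API: the one repair step for a known current-model failure mode (fenced JSON output) is isolated behind a single flag, so removing it when the next model stops needing it is a one-commit deletion rather than a refactor.

```python
import json

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for a real model call, not a specific API.

    Simulates a known current-model failure mode: wrapping JSON output
    in a markdown code fence.
    """
    return '```json\n{"answer": 42}\n```'

# --- Thin, deletable scaffolding -------------------------------------
# One flag guards the entire repair layer. When the next model emits
# clean JSON natively, flip the flag off, watch nothing break, then
# delete this block in a single commit.
SCAFFOLD_JSON_REPAIR = True

def parse_with_repair(raw: str) -> dict:
    """Parse model output, applying the minimal repair only if needed."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        if not SCAFFOLD_JSON_REPAIR:
            raise
        # Minimal repair: strip the code fence today's model adds.
        cleaned = raw.strip().removeprefix("```json").removesuffix("```")
        return json.loads(cleaned.strip())

def answer(prompt: str) -> dict:
    return parse_with_repair(call_model(prompt))

if __name__ == "__main__":
    print(answer("What is six times seven?"))  # {'answer': 42}
```

The design choice is that the harness (prompt in, parsed dict out) is model-agnostic, while everything model-version-specific is confined to one small, flagged function that can degrade to a no-op.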

Conditions

Holds when: model releases keep landing monthly to quarterly and the scaffolding exists mainly to patch current-model failure modes.

Fails when: "the next model will fix it" becomes a permanent dodge and nothing ships on an explicit cadence.

Evidence

"This is the worst the models will ever be."

Kevin Weil's line, repeated by Sherwin. Echoed by Boris Cherny ("build for six months out") and Cat Wu ("build products that don't yet work"): three operators converging on the same principle.

"Listening to customers isn't always right in AI. The field and the models themselves change so quickly. They tend to disrupt themselves. The models will eat your scaffolding for breakfast."

— Sherwin Wu on Lenny's Podcast, 2026-04-28

Signals

Counter-evidence

Capability overhang (Amole Naik) cuts the same way from the growth side; both operators converge on building adaptive, not heavy. The opposite failure, perpetual deferral, is real: "the next model will fix it" can become a permanent dodge. Pair the principle with an explicit ship cadence.

Cross-references
