Planning for AGI and Beyond
AI Summary
"Planning for AGI and Beyond" (February 2023) is the closest thing OpenAI has published to a strategic doctrine for the AGI transition. Altman and OpenAI lay out their belief that AGI, defined as AI systems that can outperform humans at most economically valuable work, is a genuine near-term possibility, and that the mission is to ensure this development benefits all of humanity rather than any single company, country, or group. The document introduces what became OpenAI's canonical risk framing: the two failure modes are (1) AI that turns against humanity and (2) AI that concentrates unchallengeable power in a small group, including OpenAI itself. That acknowledgment, that OpenAI could itself be the problem if it accrues unchecked power, was remarkable for a corporate document. Altman describes the iterative deployment strategy: release incrementally, let society adapt, observe what goes wrong, adjust. This graduated approach is contrasted with the alternative, a single massive deployment at the capability threshold, which OpenAI argues would be both more dangerous and less correctable. The essay also prefigured the idea later formalized as "superalignment": the belief that AI itself will eventually be needed to help align future, more powerful AI systems. "Planning for AGI and Beyond" functions as the intellectual foundation for everything OpenAI did in 2023–2026: the GPT-4 release, the safety team structure, the board crisis, and the push toward reasoning models.
Original excerpt
OpenAI's strategic doctrine for the AGI transition. The document that defines their two-failure-mode framework, iterative deployment philosophy, and the acknowledgment that OpenAI itself could be the danger.
Frequently asked questions
What is "Planning for AGI and Beyond" about?
"Planning for AGI and Beyond" (February 2023) is the closest thing OpenAI has published to a strategic doctrine for the AGI transition. Altman and OpenAI lay out their belief that AGI, defined as AI systems that can outperform humans at most economically valuable work, is a genuine near-term possib…
Who wrote "Planning for AGI and Beyond"?
"Planning for AGI and Beyond" was written by Sam Altman. It is curated in the Sam Altman vault on Burn 451, which covers agi · openai strategy · the intelligence age.
How can I read more content from Sam Altman?
The complete Sam Altman reading list is available at burn451.cloud/vault/sam-altman. Each article includes an AI-generated summary so you can decide what to read in seconds. Connect the Burn 451 MCP server to Claude or Cursor to query all Sam Altman articles as live AI context.
Can I use "Planning for AGI and Beyond" with Claude or Cursor?
Yes. Install the burn-mcp-server npm package and connect it to Claude Desktop, Claude Code, or Cursor. Once connected, your AI can search and reference this article and the full Sam Altman vault in real time — no manual copy-paste required.
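As one illustration of the setup described above, Claude Desktop registers MCP servers in its `claude_desktop_config.json` file. A minimal entry for the burn-mcp-server package might look like the sketch below; the server key `"burn451"` is an arbitrary label, and the exact command-line invocation is an assumption, so consult the package's README for its actual options:

```json
{
  "mcpServers": {
    "burn451": {
      "command": "npx",
      "args": ["-y", "burn-mcp-server"]
    }
  }
}
```

After restarting Claude Desktop, the server's tools should appear in the client and the Sam Altman vault becomes queryable as live context.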
26 more articles in this vault.
Import the full Sam Altman vault to Burn 451 and build your own knowledge base.
Content attributed to the original author (Sam Altman). Burn 451 curates publicly available writing as a reading index. For removal requests, contact @hawking520.