Jin Daily AI Trivia: OpenAI finally released its AI IDE - Codex App
OpenAI and its confusing naming scheme strike again. This time, though, it’s the real deal: a proper, full-featured AI coding tool called the Codex App.
Seriously, how many products have been called “Codex” in the OpenAI lineup?
- 2021: Codex for VS Code (fine-tuned GPT-3)
- 2025: Codex CLI (OpenAI’s take on Claude Code)
- 2025: Codex Web (cloud-based Web VM coding agent)
- 2025: Codex IDE Extension (API-powered VS Code extension)
- 2026: Codex App (the full AI coding IDE)
And of course, there’s also the coding-fine-tuned LLM itself: GPT-5.2-Codex.
The Codex App is available starting today on macOS. Anyone with a ChatGPT Plus, Pro, Business, Enterprise, or Edu subscription can use Codex across the CLI, web, IDE extension, and the app using their ChatGPT login. Usage is included in ChatGPT subscriptions, with the option to buy additional credits if needed.
For a limited time (until March 2026), Codex will also be available to ChatGPT Free and Go users to help them build more with agents. OpenAI is also doubling rate limits for existing Codex users across all paid plans during this period.
Verdict:
After trying it briefly, it already works much better for me. Google’s Antigravity, by contrast, has seriously dumbed down the Gemini models lately, to the point where they feel almost unusable.
The Codex model itself is very capable, and its agentic abilities are genuinely SOTA. The CLI, however, was really holding it back. The Codex App basically fixes this by putting a proper GUI on top of the CLI and running tasks in parallel threads. You can launch multiple operations at once, and they actually work together to complete your task.
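The parallel-thread pattern described above, fanning several long-running agent operations out at once and collecting their results, can be sketched generically. This is an illustrative sketch only: the function and task names (`run_agent_task`, the task labels) are hypothetical stand-ins, not part of any OpenAI API or the Codex App's actual internals.

```python
# Generic concurrent-task sketch, assuming plain asyncio.
# Nothing here reflects how the Codex App is actually implemented.
import asyncio


async def run_agent_task(name: str, seconds: float) -> str:
    # Stand-in for a long-running agent operation
    # (e.g. "refactor module X", "write tests for Y").
    await asyncio.sleep(seconds)
    return f"{name}: done"


async def main() -> list[str]:
    # Launch several operations at once and wait for all of them,
    # the way the app runs tasks in parallel threads.
    tasks = [
        run_agent_task("write tests", 0.1),
        run_agent_task("update docs", 0.1),
        run_agent_task("fix lint errors", 0.1),
    ]
    return await asyncio.gather(*tasks)


results = asyncio.run(main())
print(results)
```

The point of the sketch is simply that concurrent dispatch plus a single join is what turns three sequential waits into one, which is the UX difference a GUI with parallel threads buys you over a single-threaded CLI session.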
This is genuinely good.
If you’re thinking about unsubscribing from OpenAI, maybe try this first. Claude is great, but it gets very expensive once you start working on serious projects.
As each coding LLM provider starts building its own end-user products, the real question is: what happens to tools like Cursor, Kiro, and Windsurf, which still have to pay those same LLM providers for API access in the first place?
