Figma's OpenAI Codex Integration Blurs the Line Between Designer and Developer
A week after partnering with Anthropic to bring Claude Code to its platform, Figma has integrated OpenAI's Codex—signaling a rapid push to make design-to-code workflows seamless for a new generation of design engineers.

Figma announced on Wednesday that it has integrated OpenAI's Codex coding assistant into its platform, enabling a bidirectional workflow between design canvases and code environments. The move comes just one week after Figma struck a similar deal with Anthropic for Claude Code integration, underscoring how aggressively the design tool company is courting the growing population of professionals who straddle the design-engineering divide.
How it works
The integration relies on Figma's MCP (Model Context Protocol) server to shuttle context between platforms. Users can copy a selection URL from a Figma Design, Figma Make, or FigJam file, paste it into OpenAI's Codex desktop app, and prompt the agent to generate code based on the design's layouts, styles, and component data.
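The design context the agent pulls over MCP can be thought of as structured layout and style data that it translates into frontend code. As a rough illustration only—the shapes and names below are assumptions for this sketch, not Figma's actual MCP payload, which is far richer—here is the kind of mapping step a coding agent performs when turning frame properties into CSS:

```typescript
// Hypothetical, simplified shape of the style data an agent might
// receive for a selected frame (the real MCP payload is richer).
interface FrameStyle {
  width: number;        // px
  height: number;       // px
  fill: string;         // hex color
  cornerRadius: number; // px
  padding: number;      // px
}

// Sketch of the translation step: mapping design properties
// onto equivalent CSS declarations.
function frameToCss(style: FrameStyle): Record<string, string> {
  return {
    width: `${style.width}px`,
    height: `${style.height}px`,
    background: style.fill,
    borderRadius: `${style.cornerRadius}px`,
    padding: `${style.padding}px`,
  };
}

// Example: a card frame selected in Figma becomes a CSS rule set.
const card: FrameStyle = {
  width: 320,
  height: 180,
  fill: "#1e1e1e",
  cornerRadius: 12,
  padding: 16,
};
const css = frameToCss(card);
```

In practice the agent handles much more than this (auto-layout, variants, design tokens, component hierarchy), but the core idea is the same: design data in, idiomatic frontend code out.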
Critically, the workflow also runs in reverse. After iterating in code, engineers can use Codex's generate_figma_design tool to render a live, running UI back into fully editable Figma frames—turning browser-rendered interfaces into layered design files in seconds. From there, designers can adjust layouts, swap in design system components, refine typography, and explore variations before sending updated context back to Codex for implementation.
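Conceptually, the reverse direction inverts that mapping: recovering editable, numeric design properties from a rendered interface's computed styles. The sketch below is purely illustrative—the type and function names are assumptions for this example, not Figma's or Codex's actual API:

```typescript
// Illustrative shape for design properties recovered from a
// rendered element; not a real Figma or Codex type.
interface RecoveredFrame {
  width: number;
  cornerRadius: number;
  fill: string;
}

// Parse a CSS pixel value ("320px") back into a number.
function px(value: string): number {
  return parseFloat(value.replace(/px$/, ""));
}

// Conceptual inverse of design-to-code: computed CSS in,
// editable design properties out.
function cssToFrame(computed: Record<string, string>): RecoveredFrame {
  return {
    width: px(computed.width),
    cornerRadius: px(computed.borderRadius),
    fill: computed.background,
  };
}
```

The real tool reportedly reconstructs full layer hierarchies, not just flat style values, but the round trip rests on this kind of invertibility between rendered output and design data.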
This round-trip capability is the integration's most consequential feature for design engineers—professionals who regularly toggle between visual exploration and code-level implementation.
What Figma and OpenAI are saying
Figma's chief design officer Loredana Crisan framed the partnership in terms of creative iteration: "With this integration, teams can build on their best ideas—not just their first idea—by combining the best of code with the creativity, collaboration, and craft that comes with Figma's infinite canvas."
OpenAI's Codex product lead Alexander Embiricos emphasized the breakdown of traditional role boundaries: "The integration makes Codex powerful for a much broader range of builders and businesses because it doesn't assume you're 'a designer' or 'an engineer' first. Engineers can iterate visually without leaving their flow, and designers can work closer to real implementation without becoming full-time coders."
Why it matters for design engineers
For years, the handoff between design and frontend code has been one of the most friction-laden steps in product development. Tools like Zeplin, Storybook, and Figma's own Dev Mode have chipped away at the problem, but the gap between a polished mockup and production-quality code has persisted.
Figma's dual integrations with both Codex and Claude Code suggest the company sees AI-powered coding agents—not traditional handoff tools—as the solution. The bidirectional MCP workflow effectively lets a single person prototype visually in Figma, generate working code via an AI agent, inspect the result in a browser, push it back to the canvas for refinement, and repeat. That loop compresses what used to require multiple specialists and multiple tools into a single iterative cycle.
The timing is notable. OpenAI's Codex macOS app was downloaded over a million times within its first week of release, and the company says more than a million users now engage with Codex weekly. With Figma already one of the first companies to launch an app inside ChatGPT last October, the two companies appear to be deepening a strategic relationship.
For design engineers specifically, the practical implication is clear: the toolchain is converging. Whether the industry settles on Codex, Claude Code, or both, the expectation that designers ship code—and that engineers make informed visual decisions—is being baked directly into the platforms they already use.