NVIDIA Touts Codex GPT-5.5 Gains

NVIDIA is integrating OpenAI's Codex, powered by GPT-5.5 and running on its own hardware, to accelerate complex engineering and research tasks.


NVIDIA is integrating OpenAI's Codex, now running on its own GB200 and GB300 infrastructure, to accelerate engineering and research. This advanced AI tool, powered by GPT-5.5, is enabling faster development cycles and more complex AI experiments.

Visual TL;DR

  1. NVIDIA Hardware: GB200 and GB300 infrastructure powering AI
  2. OpenAI Codex: AI tool for complex engineering and research
  3. GPT-5.5: Advanced AI model powering Codex capabilities
  4. Accelerated Tasks: Faster development cycles and complex AI experiments
  5. Engineering Workflows: Codex as primary tool for coding challenges
  6. Bug Detection: Surfaces bugs and gaps other models miss
  7. Autonomous Sessions: Identifies issues and suggests solutions beyond prompts
  8. Production Systems: Internal AI agents deploy tools across company

Engineers are using Codex as a primary tool for intricate coding challenges and to run comprehensive machine learning experiments. The system's ability to handle extended, autonomous sessions allows it to identify issues and suggest solutions beyond initial prompts.

"Codex is our go-to tool for complex engineering tasks, and with GPT-5.5, it surfaces bugs and gaps in my program that other models weren’t able to find," stated Dennis Hannusch, Senior Software Engineer.

Building Production Systems

NVIDIA's internal AI agents team is instrumental in deploying these tools across the company. Codex with GPT-5.5 has become their default for demanding engineering work.

Hannusch highlighted its increased autonomy: "I’m able to go for long sessions with multiple compactions and find that it still performs with top accuracy and manages to keep the work in context." He also noted its skill in selecting appropriate tools and capabilities.

Using Codex, Hannusch transformed an internal platform from an MVP to a production-ready system, enhancing scalability and reliability. The team also rapidly developed an internal podcast recording app, comparable to Riverside, in just hours.

This accelerated development was critical given internal privacy constraints. The Codex desktop app's computer interaction capabilities even allowed for autonomous testing of video and audio recording features as they were built.

"Codex has completely changed the threshold for what’s worth building," Hannusch added.

Automating Research Workflows

For NVIDIA's research divisions, Codex is streamlining the entire research loop. This includes identifying research areas, scripting experiments, and executing them on remote infrastructure.

"GPT-5.5 has been a massive unlock as a creative partner, especially when it comes to knowledge work," said Shaunak Joshi, an AI researcher. He described using Codex as a research agent, feeding it large volumes of relevant papers.

Joshi found GPT-5.5 to be particularly creative, helping him trace evidence across research papers and visualize connections between concepts through suggested knowledge graphs. This aided hypothesis generation.

Codex then generates the necessary scripts for training models on NVIDIA's machine learning infrastructure. The app's SSH support simplifies remote host login and setup, allowing researchers to manage extensive workloads from their laptops.

Furthermore, Codex excels at code modernization. "If you have an old codebase that isn’t that performant, Codex is really good at machine translation. So a lot of folks are taking their Python repository, sending it to GPT-5.5, and it’s rewriting it into Rust and making it like 20X more efficient," Joshi explained.
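Joshi's description is high-level, but the pattern is a familiar one: an interpreter-bound Python loop becomes a compiled, iterator-based Rust function. A minimal, hypothetical sketch of that kind of rewrite (the function, data, and comments are illustrative; only the quote above comes from the source):

```rust
// Hypothetical sketch of the Python-to-Rust modernization pattern
// Joshi describes. The names and numbers here are illustrative,
// not drawn from any NVIDIA codebase.
//
// A Python hot loop such as:
//     total = 0
//     for x in xs:
//         total += x * x
// might be rewritten into idiomatic, allocation-free Rust:
fn sum_of_squares(xs: &[i64]) -> i64 {
    // Iterator adapters compile down to a tight loop with no
    // per-element interpreter overhead.
    xs.iter().map(|x| x * x).sum()
}

fn main() {
    let data: Vec<i64> = (1..=4).collect();
    // 1 + 4 + 9 + 16 = 30
    println!("sum of squares = {}", sum_of_squares(&data));
}
```

The speedups quoted in the article come from this general shift, compiled code replacing interpreted code, rather than from any single transformation.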

"We’re just scratching the surface of what it can do," Hannusch concluded. "I’m really excited to keep building real systems and see how far it can go."

© 2026 StartupHub.ai. All rights reserved.