
EY and NVIDIA launch platform for faster “physical AI” adoption



The partnership between EY and NVIDIA brings physical AI to life

EY is working with NVIDIA to help businesses create, test, and use “physical AI” systems like robots, drones, and smart devices in the real world.
The project includes:

  • A new EY physical AI platform
  • A dedicated EY.ai Lab in Georgia
  • New leaders focused on robotics and automation on an industrial scale

The main goal of the partnership is to give businesses a safer, more predictable path from small pilots to company-wide use of AI-powered machines, without guesswork about how those machines will behave in live operations.

What is physical AI, and why does it matter for businesses

Physical AI refers to AI systems that do more than analyze data on a screen.
They act in the real world: robots on factory floors, drones in warehouses, inspection bots in energy plants, autonomous systems in hospitals and logistics hubs.
These systems combine sensing, decision-making, and movement, so every choice has an immediate, tangible effect on safety, costs, and productivity.

That’s where things get hard for a lot of businesses.
Building a software model is one thing; trusting a robot to navigate a messy warehouse, or a drone to fly near critical equipment every day, is another.
That is why structured testing, simulation, and governance, all central to this EY–NVIDIA project, are not just helpful but necessary.

Inside EY’s new physical AI platform

EY’s new platform is based on a stack of NVIDIA technologies, including NVIDIA Omniverse libraries, NVIDIA Isaac tools, and NVIDIA AI Enterprise software.
This makes it possible to design, simulate, and run physical AI solutions in one place.
The goal is to support businesses with AI-powered machines from the early strategy phase through deployment and long-term maintenance, across industries such as healthcare, energy, consumer products, and industrials.

EY says that the platform is built on three main pillars that show how physical AI systems work in the real world.

AI-ready data and synthetic scenarios

The first pillar is all about data: creating and using AI-ready data, with a focus on synthetic data that mimics a wide range of real-life situations.
Synthetic data lets teams create rare, dangerous, or complex conditions, like extreme weather, equipment failures, or unusual human behavior, without needing those events to actually occur on-site.

This approach makes it safer and easier to train and test AI models at scale, especially where real-world testing would be dangerous, costly, or disruptive to the business.
It also gives companies a way to probe edge cases that appear too rarely in real data.
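To make the idea concrete, here is a minimal Python sketch of scenario-level synthetic data generation. It is purely illustrative: the scenario fields, rates, and function names are assumptions invented for this example, and a real pipeline built on tools like NVIDIA Omniverse would generate far richer, sensor-level data.

```python
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    """One synthetic training/testing scenario for a warehouse robot (illustrative)."""
    lighting_lux: float            # ambient light level
    obstacle_count: int            # pallets, carts, debris in the aisle
    human_in_aisle: bool           # unscripted human movement
    equipment_failure: str | None  # e.g. a conveyor jam the robot must route around

def sample_scenario(rng: random.Random, edge_case_rate: float = 0.3) -> Scenario:
    """Sample a scenario, deliberately over-representing rare, risky conditions.

    In real fleet logs an equipment failure might appear in well under 1% of
    data; synthetic generation lets teams dial that frequency up at will.
    """
    edge_case = rng.random() < edge_case_rate
    return Scenario(
        lighting_lux=rng.uniform(5, 50) if edge_case else rng.uniform(200, 800),
        obstacle_count=rng.randint(6, 15) if edge_case else rng.randint(0, 5),
        human_in_aisle=edge_case and rng.random() < 0.5,
        equipment_failure=rng.choice(["conveyor_jam", "dropped_pallet"]) if edge_case else None,
    )

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed so test suites stay reproducible
    scenarios = [sample_scenario(rng) for _ in range(1000)]
    failures = sum(s.equipment_failure is not None for s in scenarios)
    print(f"{failures}/1000 scenarios include an equipment failure")
```

The design point is the `edge_case_rate` knob: synthetic generation lets teams raise rare, risky conditions to whatever frequency training and testing require, instead of waiting for them to occur on-site.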

Real-time monitoring, digital twins, and training for robots

The second pillar centers on digital twins and robot training.
Using NVIDIA Omniverse and NVIDIA Isaac, it creates highly detailed virtual replicas of factories, warehouses, and other work sites.
Companies can design and test robots, drones, and other devices in these simulated 3D environments before they ever turn on a real machine.

These tools support building AI-powered robots, validating navigation and task logic, and monitoring performance across both the virtual environment and live operations.
The new EY.ai Lab and the platform work hand in hand, letting teams build and test solutions for humanoid robots, quadrupeds, and next-generation robotic platforms.
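The simulate-first workflow is easiest to see in miniature. The toy Python below stands in for a real physics simulator such as NVIDIA Isaac Sim: a navigation policy runs through many randomized virtual episodes, and only a policy that clears a success threshold would be promoted to real hardware. The aisle model, policy, and threshold are all illustrative assumptions, not the platform's actual mechanics.

```python
import random

# A toy 1-D aisle: the robot must reach the far end without colliding.
# This stands in for the much richer 3-D physics simulation that tools like
# NVIDIA Isaac provide; the workflow (policy -> randomized virtual rollouts
# -> a metrics gate before hardware) is the point, not the physics.

def naive_policy(position: int, obstacles: set[int]) -> int:
    """Move forward unless the next cell is blocked; otherwise wait."""
    return position if position + 1 in obstacles else position + 1

def rollout(policy, aisle_length: int, rng: random.Random) -> bool:
    """Run one simulated episode; return True if the robot reaches the end."""
    obstacles = {rng.randint(2, aisle_length - 1) for _ in range(3)}
    position = 0
    for _ in range(aisle_length * 3):  # time budget for the episode
        # Obstacles drift occasionally, mimicking carts and people moving.
        if rng.random() < 0.2:
            obstacles = {max(0, min(o + rng.choice([-1, 1]), aisle_length - 1))
                         for o in obstacles}
        position = policy(position, obstacles)
        if position >= aisle_length:
            return True
    return False

if __name__ == "__main__":
    rng = random.Random(0)
    runs = 500
    successes = sum(rollout(naive_policy, aisle_length=20, rng=rng) for _ in range(runs))
    success_rate = successes / runs
    print(f"simulated success rate: {success_rate:.1%}")
    # The deployment gate: promote the policy to physical hardware only if
    # it clears a threshold across many randomized virtual runs.
    READY_THRESHOLD = 0.9  # illustrative; a real gate would be use-case specific
    print("promote to hardware" if success_rate >= READY_THRESHOLD
          else "keep iterating in simulation")
```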

Responsible physical AI, safety, and governance

The third pillar is “responsible physical AI,” which means having rules and controls that cover safety, ethics, and following the law.
Companies need frameworks to handle not only technical risks but also compliance and trust because physical AI decisions can affect workers, infrastructure, and even public spaces.

EY treats this governance layer as a baseline requirement, not an afterthought, especially as physical AI systems become more autonomous and more deeply embedded in daily operations.
That includes mechanisms for monitoring deployments, auditing behavior, and verifying compliance with both company standards and legal requirements.
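As a flavor of what such controls might look like at runtime, here is a small, hypothetical guardrail in Python that clamps a robot's motion commands to policy limits and records every decision in an audit trail. The limits, field names, and structure are assumptions for illustration, not EY's actual governance framework.

```python
import json
import time
from dataclasses import dataclass, asdict

# Illustrative only: a tiny runtime guardrail of the kind a governance layer
# might wrap around a robot's motion commands. Limits and field names are
# invented for this sketch.

@dataclass
class MotionCommand:
    robot_id: str
    speed_mps: float        # requested speed, metres per second
    near_humans: bool       # flag from the robot's perception stack

MAX_SPEED_MPS = 2.0
MAX_SPEED_NEAR_HUMANS_MPS = 0.5   # e.g. derived from a site safety standard

def enforce_policy(cmd: MotionCommand, audit_log: list[dict]) -> MotionCommand:
    """Clamp unsafe commands and record every decision for later audit."""
    limit = MAX_SPEED_NEAR_HUMANS_MPS if cmd.near_humans else MAX_SPEED_MPS
    approved = MotionCommand(cmd.robot_id, min(cmd.speed_mps, limit), cmd.near_humans)
    audit_log.append({
        "ts": time.time(),
        "requested": asdict(cmd),
        "approved": asdict(approved),
        "clamped": approved.speed_mps < cmd.speed_mps,
    })
    return approved

if __name__ == "__main__":
    log: list[dict] = []
    cmd = MotionCommand("amr-07", speed_mps=1.8, near_humans=True)
    safe = enforce_policy(cmd, log)
    print(f"approved speed: {safe.speed_mps} m/s")
    print(json.dumps(log, indent=2))   # the audit trail for compliance review
```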

NVIDIA Omniverse and Isaac power the platform

NVIDIA Omniverse and NVIDIA Isaac are the core technologies behind this physical AI stack.
Omniverse provides libraries and tools for building high-fidelity digital twins, letting different teams collaborate on virtual models of factories, logistics hubs, or city infrastructure.

NVIDIA Isaac offers open models and robotics simulation frameworks for designing, training, and testing AI-driven robots in 3D environments before they go to work.
NVIDIA AI Enterprise, meanwhile, supplies the computing and software infrastructure needed to run heavy AI workloads reliably at scale.

John Fanelli of NVIDIA says that more businesses are using robots and automation to deal with changes in the workforce and make things safer.
He describes the EY.ai Lab, which runs on NVIDIA infrastructure, as a way for companies to “simulate, optimize, and safely deploy robotics applications at enterprise scale.”

EY.ai Lab in Georgia: testbed for robots and digital twins

The new EY.ai Lab in Alpharetta, Georgia, is a big part of this project.
It is EY’s first site dedicated entirely to physical AI.
The Lab has robotics systems, sensors, and simulation tools that let businesses test out ideas and make prototypes before putting them into use in the real world.

In the Lab, companies can:

  • Design and test physical AI systems in a virtual testbed
  • Build solutions for humanoids, quadrupeds, and other emerging robot types
  • Use digital twins to improve logistics, manufacturing, and maintenance

The Lab is meant to be both a place for clients to learn and a place for developers to work.
Clients can see how AI-enabled robots act in real-life situations by using digital replicas and real tools at the same time.
From there, EY’s teams can help turn those tests into systems that run across entire networks of sites.

New leader: Dr. Youngjun Choi to lead physical AI

EY has named Dr. Youngjun Choi Global Physical AI Leader to head this effort.
He oversees the strategy and delivery of robotics and physical AI, runs the EY.ai Lab in Georgia, and helps shape EY’s advisory work in this area.

Choi has almost 20 years of experience in robotics and AI.
He led the UPS Robotics AI Lab, where he worked on digital twins, robotics projects, and AI tools to modernize the company’s network.
Before working for UPS, he was a research professor in Aerospace Engineering at the Georgia Institute of Technology, where he worked on aerial robotics and autonomous systems.

How EY views the opportunity in physical AI

Raj Sharma, EY Global Managing Partner for Growth and Innovation, says that physical AI is already changing how businesses work in important areas, especially by making things more automated and lowering costs.
He says that combining EY’s industry knowledge with NVIDIA’s infrastructure is a way to speed up the process of going from “experimentation to enterprise-scale deployment.”

Joe Depa, EY Global Chief Innovation Officer, says that clients want better ways to use technology to make decisions and improve performance.
He also says that physical AI needs strong data foundations and trust from the start.
Depa says that with Choi in charge of the Lab, EY teams are beginning to go “beyond the surface of what is possible” and set the stage for operations that can grow.

Physical AI use cases across industries

EY says that the physical AI platform and Lab are meant to serve many different industries, each with its own requirements but all needing safe, scalable automation.

Some areas of focus are:

  • Industrials: Factory robots, automated inspection systems, and maintenance drones that work around heavy machinery
  • Energy: Digital twins and robots for power plants, pipelines, and the grid
  • Consumer and retail: Warehouse robotics, automated order fulfillment, and in-store robots for tasks like shelf scanning
  • Health: Robots that help with logistics, support, and monitoring in hospitals and other clinical settings

Before such systems are trusted with critical tasks, they typically need to be validated in simulation first, for example a robot navigating a crowded warehouse aisle or a drone inspecting a complex industrial structure.

From small pilots to enterprise scale

The “pilot trap” is one of the biggest problems with physical AI.
A lot of companies can get one robot or a small trial working, but it’s much harder to scale those systems across multiple locations while keeping performance and governance consistent.
EY’s platform, built on NVIDIA technologies, is meant to close that gap by providing a single path from pilot testing to enterprise-wide deployment.

This includes consistent practices for creating and managing data, building and updating digital twins, and applying governance frameworks across different sites and regions.
Companies can then grow from a single test lab to a network of sites running the same robotic systems, without redesigning everything for each new deployment.
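One way to picture that consistency is a single, versioned deployment specification that every site validates identically before a rollout. The Python sketch below is an invented illustration; the fields, profiles, and limits are assumptions, not EY's real schema.

```python
from dataclasses import dataclass

# A sketch of the "same spec, every site" idea: one versioned deployment
# spec that each site must validate against before a rollout.

@dataclass(frozen=True)
class DeploymentSpec:
    fleet_model: str            # which robot platform the site runs
    twin_schema_version: str    # digital-twin data format all sites share
    governance_profile: str     # which audited control set applies
    max_fleet_size: int

APPROVED_PROFILES = {"warehouse-v3", "clinical-v1"}

def validate(spec: DeploymentSpec) -> list[str]:
    """Run the same checks at every site, so rollouts stay uniform."""
    problems = []
    if spec.governance_profile not in APPROVED_PROFILES:
        problems.append(f"unknown governance profile: {spec.governance_profile}")
    if not spec.twin_schema_version.startswith("2."):
        problems.append("site digital twin must be migrated to schema 2.x")
    if spec.max_fleet_size > 200:
        problems.append("fleet size exceeds the globally approved ceiling")
    return problems

if __name__ == "__main__":
    atlanta = DeploymentSpec("amr-gen2", "2.4", "warehouse-v3", max_fleet_size=40)
    rotterdam = DeploymentSpec("amr-gen2", "1.9", "warehouse-v3", max_fleet_size=40)
    for name, spec in [("atlanta", atlanta), ("rotterdam", rotterdam)]:
        issues = validate(spec)
        print(name, "OK" if not issues else issues)
```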

Safety, ethics, and rules in physical AI

Because physical AI systems interact with people, machines, and public spaces, EY puts responsible AI at the center of this effort, with particular attention to safety, ethics, and compliance.
This focus covers not only technical controls but also the process and governance structures that help businesses keep their physical AI deployments aligned with regulations and their own policies.

The platform is said to include governance and controls throughout the life cycle of physical AI systems, from design and testing to ongoing monitoring and maintenance.
This approach is meant to give businesses more confidence about deploying automation where mistakes could have serious consequences, such as sensitive or regulated environments.

Building on the existing EY–NVIDIA AI partnership

The physical AI platform and EY.ai Lab build on earlier joint work between EY and NVIDIA, such as the AI agent platform announced earlier this year.
The previous work focused mostly on AI agents performing complex digital tasks.
The new project, on the other hand, will involve robots, drones, and edge devices in the real world.

Both companies want to work together on AI in more areas, like energy, health, and smart cities.
They also want to support automation projects that aim to reduce waste and environmental harm.
This makes the partnership more than just a technical program; it also fits into larger digital and sustainability plans.

What’s next for industrial AI

EY and NVIDIA say that this is part of the “next phase” of industrial AI, when AI systems will no longer be limited to screens and dashboards but will also work through machines in the real world.
By providing a dedicated Lab, a structured platform, and leadership focused on robotics, the partnership aims to lower the barrier for businesses that want to adopt physical AI but lack the skills or testing environments in-house.

As more companies turn to automation to deal with changes in the workforce, performance demands, and sustainability goals, structured ecosystems like this one, which combine simulation, digital twins, data, and governance, are likely to become a key part of how physical AI is safely scaled.

Author: Truthupfront
Updated on: December 4, 2025