Texas Instruments has announced a collaboration with NVIDIA aimed at accelerating the development and safe deployment of next-generation humanoid robots. The partnership combines Texas Instruments’ real-time motor control, sensing, radar, and power technologies with NVIDIA’s advanced robotics computing and AI infrastructure, enabling robotics developers to move faster from simulation to real-world deployment.
As part of the collaboration, TI has integrated its mmWave radar technology with the NVIDIA Jetson Thor using the NVIDIA Holoscan ecosystem. The integration enables low-latency 3D perception and improved safety awareness for physical AI applications. By combining radar and camera data through sensor fusion, developers can achieve more reliable object detection, localization, and tracking in complex environments.
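The radar–camera pairing described above is commonly implemented as a late-fusion step: detections from each sensor are matched by bearing, so camera objects gain the range and velocity that radar measures directly. The sketch below is purely illustrative and assumes nothing about TI's or NVIDIA's actual APIs; the data classes and matching threshold are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    azimuth_deg: float    # bearing to the target
    range_m: float        # radial distance measured by radar
    velocity_mps: float   # radial velocity measured by radar

@dataclass
class CameraDetection:
    azimuth_deg: float    # bearing of the bounding-box center
    label: str            # classifier output, e.g. "person"
    confidence: float

def fuse(radar, camera, max_azimuth_gap_deg=5.0):
    """Late fusion sketch: pair each camera detection with the nearest
    radar detection in azimuth; attach range/velocity when within the gap."""
    fused = []
    for cam in camera:
        nearest = min(radar,
                      key=lambda r: abs(r.azimuth_deg - cam.azimuth_deg),
                      default=None)
        matched = (nearest is not None and
                   abs(nearest.azimuth_deg - cam.azimuth_deg) <= max_azimuth_gap_deg)
        fused.append({
            "label": cam.label,
            "confidence": cam.confidence,
            "range_m": nearest.range_m if matched else None,
            "velocity_mps": nearest.velocity_mps if matched else None,
        })
    return fused
```

In a real deployment the matching would run in calibrated 3D coordinates with temporal tracking, but the principle is the same: radar supplies distance and motion that a camera alone cannot, which is what makes the fused output more robust in glare, fog, or darkness.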
The solution helps address challenges that have historically limited humanoid robot deployment. Radar technology allows robots to detect transparent or reflective surfaces such as glass doors, while also maintaining performance in low-light, glare, fog, or dust conditions. This capability enables robots to navigate safely in environments such as offices, hospitals, and retail spaces.
Texas Instruments will demonstrate the technology during NVIDIA GTC 2026, taking place March 16–19 at the San Jose McEnery Convention Center. The demonstration will showcase real-time sensor fusion integrated with NVIDIA’s robotics ecosystem, highlighting how AI compute, sensing, networking, and power management work together to support functional safety and scalable humanoid robot systems.
According to TI and NVIDIA leaders, the collaboration aims to bridge the gap between AI simulation environments and real-world robotics, helping developers validate complete humanoid systems earlier and accelerate the transition from prototype development to commercially viable robots operating safely alongside humans.