The automotive industry has entered a global execution race where winners will be those who treat AI, software, and validation not as features, but as core vehicle infrastructure deployed at global scale.

CES 2026 marked a decisive inflection point for the global automotive industry. The narrative moved beyond “software-defined vehicles” (SDVs) toward AI-defined vehicles, where competitive advantage is determined by the ability to deploy, validate, monitor, update, and monetize AI safely at scale across the full vehicle lifecycle. The show evolved from Generative AI in earlier years to Physical AI, with NVIDIA’s announcement of Alpamayo 1 stealing the spotlight.


Source: Consumer Technology Association (CTA)®

Across OEMs, Tier-1 suppliers, silicon providers, hyperscalers, and software vendors, the messaging clearly shifted from experimentation to execution. Centralized compute architectures, cloud-to-edge continuity, AI-native development workflows, and production-ready autonomy and cockpit experiences dominated the show floor.

This report does not aim to be exhaustive but instead highlights some of the most important announcements and demonstrations we observed at CES 2026.

NVIDIA

NVIDIA used CES 2026 to establish itself as the central platform provider for Physical AI and autonomous driving. The company launched Alpamayo 1, the industry’s first open-source Vision Language Action (VLA) model, with a strong emphasis on explainability and safety assurance.

Mercedes-Benz was announced as the lead customer adopting the 10-billion-parameter Alpamayo 1 model in combination with NVIDIA Drive AV software. This stack will power Mercedes’ address-to-address L2++ feature, which is expected to launch on the new CLA platform in 2026.

Also on view was a tight coupling of simulation and driving datasets. To support broader ecosystem adoption, NVIDIA launched AlpaSim, an open-source simulation framework, along with Physical AI open datasets covering more than 1,700 hours of driving data. Together, these assets are intended to help companies develop their own autonomous driving stacks.



From an ecosystem perspective, NVIDIA’s VLA approach addresses the long-tail scenarios that have often slowed the transition from 99% system reliability to the remaining rare but critical edge cases.

The announcement encourages multi-partner autonomous vehicle (AV) ecosystems. NVIDIA effectively created an “Android moment” for autonomous driving by providing a scalable foundation that does not rely on LiDAR or HD maps. Instead, the system builds an online map dynamically, using navigation paths and real-time sensor data from cameras and radars.

NVIDIA CEO Jensen Huang summarized this shift while describing Alpamayo 1: “The ChatGPT moment for physical AI is here—when machines begin to understand, reason, and act in the real world. Alpamayo is the foundation for safe, scalable autonomy.”

Qualcomm

CES 2026 represented a major launchpad for Qualcomm’s push into Physical AI. While the company emphasized Physical AI broadly, its automotive announcements focused on expanding OEM adoption of its latest high-performance AI system-on-chips (SoCs).

Qualcomm announced a landmark partnership with Leapmotor for the D19 model, featuring the industry’s first commercial cross-domain compute platform. The platform pairs two SA8797P SoCs, one dedicated to cockpit functions such as infotainment and agentic AI applications, and the other handling advanced driver assistance systems (ADAS) and autonomous driving (AD) workloads, supporting VLA models with an overall 12X improvement in AI performance. A single high-performance system managing ADAS/AD, infotainment, and gateway functions is a first for the industry, and it is not surprising that a Chinese OEM achieved this milestone with Qualcomm’s support.
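As an illustration only (not Qualcomm’s actual software), the cockpit/drive split on such a cross-domain platform can be thought of as routing workloads by safety criticality; every name in this sketch is hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Domain(Enum):
    COCKPIT = auto()   # infotainment, agentic AI assistants (first SoC)
    ADAS_AD = auto()   # driver assistance / autonomous driving (second SoC)

@dataclass(frozen=True)
class Workload:
    name: str
    safety_critical: bool

def assign_domain(w: Workload) -> Domain:
    """Route safety-critical workloads to the ADAS/AD SoC and the rest
    to the cockpit SoC -- a simplified stand-in for real scheduling."""
    return Domain.ADAS_AD if w.safety_critical else Domain.COCKPIT

# Toy deployment plan for four illustrative workloads.
plan = {w.name: assign_domain(w) for w in [
    Workload("navigation_ui", False),
    Workload("voice_agent", False),
    Workload("lane_keeping", True),
    Workload("vla_planner", True),
]}
```

In practice the partition is fixed at the hardware and hypervisor level, but the sketch conveys why consolidating both domains on one platform still preserves the safety boundary between them.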

The Snapdragon Cockpit Elite and Ride Elite platforms also saw broader adoption across NIO, Zeekr, Li Auto, and Great Wall Motors. In parallel, Qualcomm signed a strategic agreement with Volkswagen to supply high-performance SoCs to the Rivian joint venture, with plans to eventually scale these platforms across all Volkswagen, Audi, and Porsche vehicles using the Scalable Systems Platform (SSP).

In terms of scale, Qualcomm’s partnership with Toyota may be its biggest OEM announcement. The Snapdragon Digital Chassis will power the 2026 Toyota RAV4, delivering next-generation cockpit experiences, AI capabilities, and Mapbox-based navigation. In addition, Bosch will support global ADAS deployment under Toyota Safety Sense.

From an OEM and ecosystem standpoint, Qualcomm is emerging as a leader in electronic control unit (ECU) consolidation through cross-domain computing. By reducing ECU count and bill-of-material (BOM) complexity, the company is helping OEMs simplify vehicle architectures while preparing for long-term software scalability. Qualcomm is also laying a strong foundation for GenAI- and VLA-powered infotainment and ADAS systems, enabling OEMs to future-proof ADAS and in-vehicle infotainment (IVI) experiences.

Qualcomm’s Group General Manager for Automotive and Executive Vice President summarized this positioning: “Qualcomm Technologies is leading the industry forward by enabling automakers to deliver more intelligent, personalized, and safer driving experiences through the power of AI and software-defined architectures. With the unmatched scale of our Snapdragon Digital Chassis footprint… we are well-positioned to accelerate innovation and help the industry advance toward the next generation of connected, automated mobility.”

MediaTek

MediaTek is steadily emerging as a key disruptor, differentiating itself through a strategic partnership with NVIDIA. At CES 2026, the company unveiled its Ultra lineup of cockpit processors, including the Dimensity Auto Cockpit S1 Ultra and P1 Ultra, integrating NVIDIA GPUs and Drive OS software to support advanced AI workloads.

MediaTek also announced the Dimensity Auto Connect T830 5G telematics module, designed to support eCall and satellite connectivity use cases. In addition, the company showcased a new Wi-Fi 8 chip for automotive use. MediaTek further spotlighted its collaboration with Denso on the development of a custom ADAS SoC.

From an ecosystem perspective, MediaTek is positioning itself as a mass-market alternative to NVIDIA and Qualcomm. With support for up to 7-billion-parameter Large Language Models (LLMs) in its new S1 and P1 chips, MediaTek is offering OEMs a more affordable path to power AI-enabled experiences.

Renesas

Renesas publicized the R-Car X5H, touted as the industry’s first 3-nanometer process node multi-domain SoC capable of supporting IVI, ADAS, and gateway functions. The chip features 32 ARM cores and delivers up to 400 TOPS of AI performance, enabled through a chiplet-based architecture for adding modular accelerators.

The company also demonstrated its commitment to shift-left development at OEMs by working with Microsoft Azure through the R-Car Open Access (RoX) Whitebox SDK.

Renesas highlighted ecosystem integrations with partners such as Bosch, ZF, StradVision, and Smart Eye. In particular, StradVision and Smart Eye demonstrated direct integration of perception and driver-drowsiness detection solutions out of the box.

For OEMs, Renesas’ chiplet-based architecture enables a scalable compute approach, allowing additional performance to be added as new use cases emerge. This flexibility represents a key differentiator as vehicle software requirements continue to evolve.

Cerence AI

Cerence showcased a vision for conversational, agentic in-vehicle AI experiences spanning navigation, comfort systems, and vehicle controls, operating seamlessly across cloud-based LLMs and edge-based small language model (SLM) environments. Central to this vision is xUI, a modular hybrid agentic AI architecture that allows OEMs to orchestrate proprietary, open, and third-party LLMs without vendor lock-in.

Demonstrated on a stunning BYD concept vehicle, xUI supports multi-step reasoning and orchestrates multiple agents running CaLLM, Cerence’s Automotive LLM. Importantly, OEMs retain the flexibility to swap CaLLM with other LLMs. Geely Auto was announced as the first OEM to adopt a cloud-heavy version of xUI.

Cerence also demonstrated a productivity-focused agent with Microsoft. This solution leverages Intune, Copilot, and Azure, using conversation to enable safe handling of work and productivity tasks while driving.

From an ecosystem perspective, Cerence positions itself as a critical orchestrator across agents, enabling OEMs to scale AI functionality across multiple domains. The platform extends AI value beyond the cabin into ownership lifecycle management and dealer operations, while repositioning conversational AI as a core vehicle agent rather than a simple infotainment feature.

As Chief Revenue Officer Christian Mentz explained, “Cerence xUI is purpose-built to help automakers deliver better, AI-powered user experiences… [it] transforms in-car assistants from reactive systems into proactive, context-aware companions.”

Microsoft

Microsoft continued to reinforce its role as a foundational partner for digital engineering, infrastructure, and AI enablement across the automotive ecosystem. Building on its collaboration with Cerence, Microsoft emphasized its work in delivering a productivity-focused in-vehicle agent with seamless work integration.

In addition, Microsoft unveiled a similar productivity experience with Bosch, featuring Microsoft Foundry and the Microsoft 365 productivity suite on Bosch’s AI Extension Platform. This in-vehicle solution allows users to seamlessly continue Microsoft Teams calls while on the move, with Bosch managing in-vehicle AI agents through Microsoft Foundry.

Microsoft also emphasized accelerating shift-left development by partnering with ETAS to host its complete testing suite on Azure. This approach enables OEMs to begin testing and validating software stacks much earlier in the development cycle.

Further showcasing its partner-led strategy, Microsoft demonstrated collaboration with TomTom to create an agentic experience around navigation and urban L2+ driving. By leveraging Microsoft AI tools, the companies illustrated how AI-driven navigation and driving assistance can be delivered through ecosystem partnerships rather than vertically integrated platforms.

From an OEM and customer perspective, Microsoft’s focus is on accelerating SDV development velocity while improving compliance. At the same time, Microsoft is positioning itself as a key enabler of agentic AI experiences in vehicles through its cloud, AI, and infrastructure offerings.

A defining quote captures how Microsoft prioritizes empowering partners rather than doing everything itself: “At CES 2026, we are particularly focused on the transformation within digital engineering, helping our automotive and mobility customers establish strong software capabilities. The time for AI experiments and pilots is over. Digitalization is now about AI transformation.”

Amazon

Amazon used CES 2026 to expand its presence in automotive AI through both Alexa and AWS. The company launched Alexa+, an advanced AI assistant designed to power agentic experiences across multiple domains. BMW was announced as the first OEM customer to adopt Alexa+ on the iX3, which will be the first vehicle built on BMW’s Neue Klasse SDV platform.

In addition to agentic capabilities across booking and reservations with partners such as OpenTable and Expedia, Alexa+ brings conversational reasoning that supports complex, multi-part interactions with users.

On the AWS side, Amazon announced a close partnership with Aumovio, positioning AWS as the preferred cloud provider for the company’s AD development. This partnership includes support for synthetic data generation, enabling more scalable training of AD systems.

Amazon also highlighted the growing adoption of Alexa Custom Assistant, which is now integrated with HERE and TomTom navigation platforms. This integration allows OEMs to easily add agentic navigation experiences.

From an OEM and customer standpoint, Amazon continues to lead as the primary non-Google choice for in-vehicle voice assistants. Its ability to combine agentic AI with smart home and ecosystem integrations remains a critical differentiator for automakers seeking flexibility and ecosystem breadth.

TomTom

TomTom’s presence at CES 2026 centered on scaling AI-enabled maps to support an expanding set of automotive use cases. The company launched Orbis maps with lane-level detail and complete coverage across Germany, with a stated focus on scaling this capability across Europe and North America.

Building on this foundation, TomTom introduced an ADAS SDK that leverages Orbis map data to generate a Most Probable Path (MPP). This path incorporates attributes such as traffic signs, speed limits, road curvature, lane connectivity, and gradient. TomTom also announced a partnership with CARIAD, which will adopt Orbis lane-level maps to power its automated driving systems. In parallel, the company confirmed integrations with Amazon Alexa+ and SoundHound AI, enabling conversational AI assistants to operate within the TomTom navigation experience.
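To make the MPP concept more concrete, here is a minimal, hypothetical sketch of how such a path and its attributes might be modeled and queried. The types, field names, and function are invented for illustration and are not TomTom’s actual SDK:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PathSegment:
    """One segment of a Most Probable Path, with ADAS-relevant attributes."""
    length_m: float
    speed_limit_kph: Optional[float] = None   # None = no new limit starts here
    curvature_1pm: float = 0.0                # 1/metres; 0.0 means straight
    gradient_pct: float = 0.0                 # positive = uphill
    traffic_signs: List[str] = field(default_factory=list)
    successor_lanes: List[int] = field(default_factory=list)

def next_speed_limit(mpp: List[PathSegment]) -> Optional[Tuple[float, float]]:
    """Return (distance_m, limit_kph) of the first upcoming speed limit,
    or None if the path carries no speed-limit attribute."""
    dist = 0.0
    for seg in mpp:
        if seg.speed_limit_kph is not None:
            return dist, seg.speed_limit_kph
        dist += seg.length_m
    return None
```

A predictive ADAS function (e.g. curve speed warning or intelligent speed assistance) consumes exactly this kind of look-ahead query over the MPP rather than reacting only to what the cameras currently see.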

The company further showcased continued cooperation with Microsoft, enhancing AI-driven navigation and agent experiences through Azure OpenAI in Microsoft Foundry Models, Azure Cosmos DB, and Azure Kubernetes Service.

For OEMs, TomTom’s Orbis platform offers a scalable solution for obtaining comprehensive lane-level data across road networks. This capability supports navigation, ADAS, and L2++ automated driving use cases, positioning TomTom as a critical mapping partner as automation requirements increase.

dSPACE

dSPACE continued to strengthen its leadership in hardware-in-the-loop (HIL) and software-in-the-loop (SIL) validation while expanding its support for shift-left development strategies. At CES 2026, the company showcased new HIL and SIL solutions targeted at battery management systems (BMS) and charging infrastructure, built around its SCALEXIO HIL and VEOS SIL platforms.

The company demonstrated the use of generative AI in virtual ECU generation, integrating GitHub Copilot Extension Agent into developer CI/CD pipelines. This enables faster creation of ECU code and accelerates early-stage development and validation.

dSPACE also spotlighted automated human-machine interface (HMI) testing conducted in the cloud, leveraging AWS Graviton high-performance computing (HPC) hardware alongside QNX Cabin OS to support HIL/SIL validation.

For OEMs, dSPACE’s AI-enabled virtualization and simulation capabilities support faster testing and validation of new software features by reducing dependence on physical hardware prototypes. The company’s CI/CT pipelines act as stress-test environments, validating code across large volumes of simulated miles using dSPACE’s HIL infrastructure.

Synopsys

Synopsys used CES 2026 to further advance the shift-left agenda through increased virtualization and digital twin capabilities. A key announcement was AgentEngineer, an agentic AI framework integrated with the Synopsys.ai suite that can reason, plan, and execute complex engineering tasks autonomously.

The company expanded its digital twin portfolio to support leading automotive compute platforms, including ARM’s Zena Compute, Texas Instruments’ TDA5, and NXP’s latest HPC, the S32N7. Following the acquisition and integration of Ansys, Synopsys also expanded its high-fidelity sensor simulation capabilities. The Ansys AVxcelerate software now supports Samsung’s latest image sensors, further improving realism in perception system validation.

Synopsys also highlighted a partnership with IPG Automotive, showcasing a mini-ECU prototype that allows the entire vehicle network to be digitally twinned, helping avoid integration issues after the start of production (SOP).

For OEMs, Synopsys’ SoC virtualization leadership reduces overall testing and validation costs while accelerating time to market. As software complexity increases, digital twins are becoming a critical enabler of scalable automotive development.

Sonatus

Sonatus focused on enabling AI at the edge, presenting a foundational software platform that supports data capture, AI model deployment and scaling, vehicle software updates, and operation of cloud-based models.

At CES 2026, Sonatus showcased AI Director, an orchestration layer that enables OEMs to deploy AI models on edge ECUs, including widely used platforms such as the NXP S32 series. Demonstrated use cases included sensor virtualization for tire health monitoring in collaboration with Michelin.

The company also demonstrated AI Technician, a solution designed to identify complex diagnostic issues and perform root-cause analysis, helping fleets minimize downtime and improve operational efficiency.

In addition, Sonatus displayed its Updater platform in collaboration with Bosch, highlighting a features-on-demand (FoD) use case.

From an ecosystem perspective, Sonatus accelerates FoD rollout and monetization opportunities for OEMs across the vehicle lifecycle, while improving fleet uptime and reducing operational costs through predictive maintenance and prognostics alerts delivered by features such as AI Technician. The platform provides OEMs with an end-to-end solution covering the entire lifecycle, from data capture and model training to deployment, scaling, and new feature updates, positioning Sonatus as a unique one-stop partner.

CEO Jeff Chou summarized Sonatus’ AI orchestration efforts: “It’s going to be a moving data center on wheels. AI is unlocking benefits such as vehicle optimization, anomaly detection, personalization, and predictive maintenance.”

QNX

QNX underscored its role as the foundational infrastructure, the plumbing required to make OEM SDV programs successful. While less visible than application-layer innovations, QNX remains critical to reducing cost and complexity across SDV architectures.

At CES 2026, QNX launched Alloy Kore in partnership with Vector. The solution combines Vector’s middleware layer with QNX 8.0 and its Hypervisor, enabling OEMs to accelerate software development by letting them focus on the app/UX layers rather than the core plumbing. Mercedes-Benz was announced as the first customer adopting Alloy Kore.

QNX Sound also continued to impress, showcasing a Dolby Atmos solution running entirely on a separate Qualcomm Snapdragon SoC rather than on traditional digital amplifiers and digital signal processors (DSPs). This approach promises nearly $100 in BOM cost reduction for OEMs without compromising audio quality. With a Chinese OEM win and another customer announcement, QNX Sound will be a key player to watch in 2026.

From an OEM and ecosystem standpoint, QNX accelerates software development by letting automakers focus on differentiating features. It remains the most entrenched RTOS/hypervisor solution, now reinforced by Alloy Kore, and is also driving adoption of software-defined audio across entry- and mid-tier vehicle segments.

Moter

Moter continued to push insurance technology directly to the vehicle edge. At CES 2026, the company showcased the expansion of its Moter scores by incorporating ADAS data alongside traditional telematics, enabling more nuanced risk analytics.

The company announced a partnership with NEC that integrates biometric technologies to identify drivers and detect signs of impairment or fatigue that will impact their scores, further enriching risk assessment models.

Moter also presented an improved version of its DriveSage mobile application, with a stronger focus on commercial fleet safety and risk management. In parallel, the Moter SDK was shown integrated into middleware stacks, enabling OEMs to offer embedded insurance products at the point of sale or throughout the vehicle lifecycle.

Additionally, Moter demonstrated integration with Sonatus AI Director, allowing insurance analytics to run directly at the edge within the vehicle.

Moter offers OEMs the ability to turn vehicle data into a monetizable insurance product at the edge. At present, no other insurance-focused solution offers this level of edge-based integration and scalability.

HERE

HERE continued to demonstrate its role in powering next-generation digital cockpits and automated driving systems. The company showcased its tenth-generation navigation engine, which uses a unified map architecture supporting both navigation and ADAS and provides lane-level guidance and visualization for navigation and automated driving.

The navigation platform also supports Alexa Custom Assistant, enabling conversational navigation experiences powered by multiple AI agents. This capability was demonstrated through HERE’s partnership with Lucid Motors.

HERE further expanded its automated driving footprint by integrating its AI map into the Snapdragon Ride Pilot system, an L2+ platform developed by Qualcomm and BMW.

In collaboration with AWS, HERE launched an SDV accelerator platform that provides a cloud-based digital cockpit environment for faster testing and validation. An interesting showcase was SceneXtract, an AI-powered simulation tool that uses generative AI to convert map data into testable driving scenarios.

From an OEM perspective, HERE enables the development of branded L2+ type address-to-address automated driving features while also allowing automakers to use behavioral maneuver data layers to more closely mimic human driving behavior.

Bosch

Bosch’s focus at CES 2026 was the introduction of its AI Extension Platform, an HPC unit with integrated agentic AI capabilities. The platform is built around an NVIDIA AGX Orin SoC and leverages the NVIDIA CUDA ecosystem, allowing for easy integration of a wide range of AI models.

The platform also features deep integration with Microsoft productivity agents.

Tensor

Tensor showcased what may be the first L4 autonomous vehicle designed for personal ownership rather than fleet deployment. The vehicle architecture is based on a dual-system approach, switching between System 1, described as instant and reflexive AI, and System 2, a VLA-based reasoning model designed to handle rare and edge cases.

The vehicle, expected to be available in late 2026, features eight NVIDIA Drive Thor chips delivering more than 8,000 TOPS of processing power and features a VLA model, OpenTau, which the company has open-sourced.
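The dual-system split above can be pictured as a confidence-gated handoff. This is a toy sketch only; Tensor has not published its arbitration logic, and the names, heuristic, and threshold below are all invented:

```python
def system1_confidence(scene: dict) -> float:
    """Hypothetical confidence of the fast, reflexive policy (System 1)
    for the current scene. Real systems would derive this from the
    policy's own uncertainty estimates, not a lookup."""
    # Toy heuristic: familiar scenes score high, rare/odd scenes score low.
    return 0.95 if scene.get("familiar", True) else 0.30

def select_system(scene: dict, threshold: float = 0.8) -> str:
    """Escalate to the slower VLA reasoning model (System 2) whenever the
    reflexive policy is not confident enough about the current scene."""
    return "system1" if system1_confidence(scene) >= threshold else "system2"
```

The design intuition is that the cheap reflexive path handles the common case at low latency, while the expensive reasoning model is reserved for the long-tail edge cases it was built for.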

Hyundai Motor Company

Hyundai Motor Company foregrounded its move into Physical AI and robotics. The company showcased the production version of the electric Atlas humanoid robot from Boston Dynamics, which will be deployed in 2026 at Hyundai’s Metaplant America facility in Georgia. These robots will perform parts sequencing and logistics tasks and are capable of autonomous battery swapping, allowing continuous 24/7 operation without human intervention.

Hyundai also announced a partnership with Google DeepMind to provide the cognitive foundation for these humanoid robots. The “brains” will leverage Gemini Robotics foundation models, enabling contextual awareness.

From an ecosystem standpoint, Hyundai’s approach represents one of the most ambitious humanoid robot deployments on automotive manufacturing floors outside of Tesla.

Hyundai Mobis

Hyundai Mobis announced a contract to supply high-performance actuators for the mass production of Boston Dynamics’ next-generation humanoid robot, Atlas.

Hyundai Mobis also highlighted a partnership with Qualcomm to use Ride Flex cockpit and ADAS SoCs to develop new solutions for markets such as India. Enhancements to its flagship M.VICS 7.0 platform were also showcased, including a holographic windshield display and a large 18.1-inch screen that can be adjusted based on driving or parked conditions. In addition, Hyundai Mobis demonstrated an X-by-wire system that integrates steering and braking into a single controller.

These developments open new premium differentiation opportunities in high-end cockpit systems while also creating growth opportunities for Tier 1 suppliers in physical AI and robotics markets.

Neumo

Imagine a company with technology for contactless brain-wave mapping that can surface advanced insights, including driver fatigue, distraction, and stress, before they occur. Neumo is a Michigan startup fast commercializing this space, and at CES 2026 it announced entry into the commercial fleet driver monitoring systems market, integrating its technology into the driver’s seat headrest. The company intends to develop its own application-specific integrated circuit (ASIC) to embed its neuro-monitoring capabilities and is pushing into the long-haul trucking market, where most monitoring today relies on video cameras. Hailed by John McElroy at the show for its impressive technology, Neumo is an interesting startup to watch in the driver monitoring and wellness market.

Conclusions and Key Takeaways

CES 2026 delivered one of the most compelling demonstrations to date of Physical AI operating at scale, particularly in manufacturing, with Hyundai’s commitment to deploy thousands of humanoid robots at its factories over the next three years. With partnerships supplying the advanced reasoning models that will serve as the robots’ “brains,” this will be the next big productivity lever in automotive manufacturing.

In autonomous driving, reasoning-based VLA models such as Alpamayo 1 signaled an “Android moment” for the autonomous driving world. These models are expected to accelerate the development of address-to-address autonomous driving, while OEMs relying on traditional rule-based approaches will likely remain confined to limited operational design domains (ODDs).

In the software development space, the focus has moved beyond purely virtualized ECUs toward the use of Generative AI and Agentic AI to generate test cases and push development further to the left. This, alongside developments like Alloy Kore from QNX and Vector, is expected to support accelerated software development for OEMs and thereby significantly reduce overall vehicle development timelines.

Agentic AI is emerging as a defining in-vehicle paradigm, and orchestrators will be the next buzzword in the automotive AI space. The spotlight will be on orchestrators that use conversational AI to bring together multiple agents across domains to provide a seamless cloud and edge experience to users.

Cross-domain computing is now moving from concept to production reality. ECU consolidation continues to be important for OEMs for managing lifecycle software. Qualcomm’s Leapmotor deployment demonstrated how ECU consolidation can simplify architectures while supporting advanced AI workloads. This trend is expected to expand across several other OEMs.

Finally, Chinese OEMs continued to demonstrate leadership in software velocity and system integration. They led with a strong emphasis on vertical integration across batteries, centralized compute, ADAS software, and cockpit AI. This also included a clean-sheet SDV architecture with over-the-air (OTA)-first design. Production-scale ADAS and AI cockpit features positioned as standard, rather than premium, offerings continue to set them apart globally.



Final Synthesis

CES 2026 confirmed that the automotive industry has entered a global execution race. AI-defined vehicles are no longer aspirational concepts; they are being shipped, updated, and monetized in real-world markets. Across compute platforms (Qualcomm, Arm, NVIDIA), software and lifecycle tooling (Sonatus, dSPACE, Microsoft), safety foundations (QNX), maps (HERE, TomTom), UX layers (Cerence AI), and aggressive Chinese OEM execution, one conclusion stands out clearly: winners will be those who treat AI, software, and validation not as features, but as core vehicle infrastructure deployed at global scale.
