Mobileye told journalists in Shanghai that its ambition now extends beyond vehicle autonomy to humanoid robots, setting a timetable that moves from on‑site validation in 2026 to production partnerships in 2027 and scaled deployment in 2028. Chen Yunxiang, head of business development, strategy and localization for Mobileye China, framed the move as an evolution of the company’s capabilities — from EyeQ perception chips and driver assistance systems to what she called "Mobileye 3.0": a full‑spectrum physical‑AI business that spans driving automation and embodied robotics.
Founded in 1999, Mobileye built its reputation on chips and perception stacks for advanced driver assistance systems (ADAS). The company says more than 230 million vehicles had been equipped with its EyeQ chips by the end of 2025, and that its workforce has grown to over 3,500 staff, with R&D employees accounting for more than 80 percent of the total. The acquisition of Mentee Robotics, announced in January 2026, is the concrete step that Mobileye says will let it apply its sensing, AI and simulation expertise to humanoid platforms.
Chen laid out a staged rollout. Mobileye plans customer on‑site validations in 2026, followed by a manufacturing collaboration in 2027 with a company spun out of Continental AG's automotive division, and then scaled production and commercial deployment beginning in 2028. The first target markets will be industrial settings, chiefly warehouse and manufacturing roles that involve repetitive tasks and where the safety and uptime economics are clearest, with a later push into home use.
The robot push sits alongside Mobileye’s continuing work on higher‑level driving automation. Chen highlighted the company’s Robotaxi program with Volkswagen Group, where Mobileye Drive has been integrated into the ID.Buzz and is undergoing road tests in several cities across Europe and the United States. Mobileye expects one city to reach so‑called "driverless" operation without a safety operator in the third quarter of this year, underscoring that the firm views driving autonomy and humanoid robotics as complementary arms of a single physical‑AI strategy.
Technical synergies are the rationale. Mentee brings advances in vision‑language‑action (VLA) models and large‑scale Sim2Real simulation methods that, Chen argued, pair naturally with Mobileye's driving automation stack. Those combined capabilities, she said, should improve generalization to long‑tail scenarios, speed adaptation to new environments and shorten expensive real‑world verification cycles for both cars and robots.
The company also weighed in on a parallel industry debate in China over so‑called "cockpit‑drive convergence" — whether automakers should integrate infotainment and driving compute in one module (One Box) or compress both functions onto a single silicon die (One Chip). Chen emphasized user experience and cost effectiveness over architectural dogma, warning that a single box can sacrifice flexibility even if it appears cheaper on paper, while separate modules allow OEMs to tailor configurations to different needs.
For global observers, Mobileye’s plans illustrate two converging trends: the industrialization of embodied AI and the blurring of lines between automotive autonomy and general‑purpose robotics. The company’s move also signals the growing importance of partnerships — with suppliers that can scale manufacturing and with robotics teams that provide task‑level learning and simulation expertise — as capital‑intensive hardware efforts shift from lab prototypes to repeatable production.
If Mobileye achieves volume humanoid production by 2028, the ripple effects would be wide: it would accelerate competition with established robotics firms and the Chinese startups racing to commercialize humanoids, reshape labour economics in logistics and light manufacturing, and pose new questions for safety standards and regulation. For now, the timetable remains ambitious and execution risk is high, but the announcement sharpens expectations for when humanoid robots may move from demos into commercial service.
