Nexar Challenges Mobileye and Tesla with an AI Model for Accident Prediction

Israeli startup Nexar has unveiled a new artificial intelligence model called BADAS—short for Beyond ADAS—designed as a foundational layer for next-generation safety and autonomous driving systems. The model draws on tens of billions of real-world driving kilometers captured over years by Nexar’s dashcam network, in an effort to solve one of the toughest problems in mobility: how to make AI systems understand human behavior on the road, not just react to pre-programmed situations.

Unlike traditional models trained mainly on simulations or curated datasets, BADAS was trained on a massive trove of dashcam recordings gathered from private vehicles, commercial fleets, and municipal monitoring systems worldwide. The data come not from lab conditions but from the unpredictable chaos of everyday driving—changing weather, human errors, and near-miss events that never make it into formal crash reports. This gives the model an unprecedented ability to learn from authentic behavioral patterns and real-world context.

Real-World Data Meets Tesla-Style Scale

Nexar’s approach doesn’t replace simulations—it complements them. While simulations reproduce rare or dangerous scenarios, real-world footage provides the probabilistic texture of everyday driving. On this foundation, BADAS can anticipate what might happen seconds ahead—for instance, when a car subtly drifts toward another lane or when traffic dynamics around an intersection shift unexpectedly. The result is an evolutionary step from reactive alerts to probabilistic prediction based on learned behavior.

The strategy naturally recalls Tesla’s vision-based AI, which also relies on data from millions of vehicle cameras. Both companies see large-scale visual data as the key to autonomous learning, yet their roles differ sharply. Tesla builds a closed, vertically integrated system: data, software, and vehicle. Nexar, in contrast, is a data and AI infrastructure provider, not an automaker. It collects and processes global video data, then offers predictive models as a plug-in layer for others—automakers, fleet operators, insurers, and cities. Its ambition is to create a kind of “AI roadway infrastructure,” a shared computational foundation for safety and prediction.

Predicting a Crash 4.9 Seconds Ahead

Alongside the launch, Nexar published a research paper titled “BADAS: Context-Aware Collision Prediction Using Real-World Dashcam Data” on arXiv, detailing the model’s scientific principles. The paper redefines accident prediction as a task centered on the ego vehicle—the driver’s own car—rather than external incidents captured by chance.

Two model versions were introduced: BADAS-Open, trained on about 1,500 public videos, and BADAS 1.0, trained on roughly 40,000 proprietary clips from Nexar’s dataset. The study found that in many existing datasets, up to 90% of labeled “accidents” are irrelevant to the camera vehicle—necessitating new annotation work using Nexar’s richer data. The outcome was striking: the model predicted collisions an average of 4.9 seconds before they occurred, a major improvement over earlier visual prediction systems. Integrating real-world data with a new learning architecture known as V-JEPA2 further enhanced both accuracy and stability.
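To make the setup concrete, here is a minimal sketch of what an ego-centric collision-prediction head could look like, with a generic frozen video encoder standing in for the V-JEPA2 backbone the paper cites. All names, shapes, and thresholds are hypothetical, not Nexar’s published code.

```python
# Illustrative sketch only: a temporal head that turns per-frame video
# embeddings (assumed to come from a frozen V-JEPA2-style encoder) into a
# per-frame collision probability. Shapes and thresholds are hypothetical.
import torch
import torch.nn as nn

class CollisionPredictor(nn.Module):
    def __init__(self, embed_dim: int = 768):
        super().__init__()
        self.temporal = nn.GRU(embed_dim, 256, batch_first=True)  # temporal context
        self.head = nn.Linear(256, 1)                             # risk logit per frame

    def forward(self, frame_embeddings: torch.Tensor) -> torch.Tensor:
        # frame_embeddings: (batch, time, embed_dim)
        context, _ = self.temporal(frame_embeddings)
        return torch.sigmoid(self.head(context)).squeeze(-1)      # (batch, time)

# Toy usage: one clip of 150 frames (~5 s at 30 fps) with 768-d embeddings.
model = CollisionPredictor()
probs = model(torch.randn(1, 150, 768))

# "Anticipation time" in the spirit of the paper's headline metric: seconds
# between the first above-threshold frame and the labeled collision frame
# (here assumed to be the last frame, 149).
fps, threshold = 30, 0.5
alerts = (probs[0] > threshold).nonzero()
if len(alerts):
    print(f"anticipation: {(149 - alerts[0].item()) / fps:.1f} s")
```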

From Dashcams to a Global Data Engine

Until now, Nexar’s business revolved around consumer dashcams, but the company has spent years quietly building a global road-data network—tens of millions of driving hours from hundreds of thousands of cars. BADAS marks the transformation of that dataset into a commercial product.

The model is expected to serve multiple industries: carmakers could license it for driver-assistance systems (ADAS); insurers might use it for context-based risk assessment; and municipalities could deploy it for real-time detection of hazards, crashes, or congestion. Nexar plans to make BADAS accessible through API and SDK interfaces, enabling partners to build custom services around it. Its greatest advantage lies in scale: every new vehicle connected to Nexar’s network enhances the model’s predictive power—a classic flywheel effect.
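Nexar has not published the interface yet, so as a rough sense of what a partner integration might involve, here is a hedged sketch of a client call; the URL, fields, and response shape below are invented for illustration.

```python
# Hypothetical client for a BADAS-style prediction API. Nexar's real endpoint,
# authentication, and payload format are not public; everything here is invented.
import requests

API_URL = "https://api.example.com/v1/collision-risk"  # placeholder, not a real Nexar URL

def score_clip(video_path: str, api_key: str) -> dict:
    """Upload a dashcam clip and return per-second collision-risk scores."""
    with open(video_path, "rb") as f:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"video": f},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()  # e.g. {"risk": [0.02, 0.03, 0.41, ...]}
```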

A Bridge Between Sensing and Intelligence

By launching BADAS, Nexar positions itself between hardware-driven vision firms like Mobileye and cloud-AI platforms that power advanced models. It aims to be the bridge connecting ground-truth data with the AI systems that interpret complex human motion and decision-making on the road. The move also reflects a broader automotive trend—shifting from costly sensors like LiDAR to scalable, cloud-based visual intelligence.

If Nexar’s model delivers on its promise—accurate real-time hazard prediction across diverse environments—it could become a core AI layer for global road safety, underpinning not only autonomous vehicles but also commercial fleets and smart-city systems worldwide.

Mobileye Begins Developing the EyeQ8 Chip for “Mind-Off” Driving

Mobileye Global Inc. (NASDAQ: MBLY) reported third-quarter 2025 revenue of $504 million, up about 4% year-over-year, with a non-GAAP net profit of $76 million and a GAAP net loss of $96 million.
The company raised its full-year outlook, guiding for $1.85–1.88 billion in revenue and up to $286 million in adjusted operating income, roughly 2% and 11% above its previous forecasts, respectively.

According to the company, the revision stems from stronger-than-expected performance in China and Europe. Demand for its EyeQ6-based ADAS systems continues to rise among Chinese automakers preparing for new model launches. In Europe, Mobileye announced a new collaboration with Bentley Motors, integrating advanced driver-assistance systems into upcoming luxury models. It marks one of Mobileye’s first deployments within a high-end European brand under the Volkswagen Group, serving as a template for additional VW marques.

China as a Challenge, India as the Next Frontier

During the earnings call, CEO Amnon Shashua highlighted that results were “better than expected in China, both from shipments to Chinese OEMs and from the performance of Western OEMs operating in the country.”
Still, the company faces pricing pressure in that market. CFO Moran Shemesh noted that “the average selling price of EyeQ chips declined by about $0.50 year-over-year, primarily due to higher Chinese OEM volumes, where pricing remains a significant headwind.”

At the same time, India is emerging as a major growth engine. “The growth potential in India is becoming increasingly clear — driven by stronger adoption trends and a supportive regulatory environment,” Shashua said. EVP Nimrod Nehushtan added that India will soon join the company’s REM network, which crowdsources road data from more than seven million vehicles worldwide.

Mobileye emphasized that it is now transitioning from advanced driver-assistance systems (ADAS) to fully autonomous capabilities, led by its EyeQ6 High and Surround ADAS platforms.
“Demand for higher performance at lower cost is intensifying,” said Shashua. “The EyeQ6 High delivers performance comparable — and in many cases superior — to Nvidia’s Orin X, at less than one-quarter of the price.”
The system combines multiple cameras and radar sensors to enable hands-free highway driving, and the company recently announced a second major Western OEM win for the technology — underscoring its growing appeal in mass-market vehicles.

EyeQ8 and the Era of “Mind-Off” Autonomy

Beyond near-term numbers, Shashua used the call to outline Mobileye’s longer-term technological vision. The company has begun development of its EyeQ7 and EyeQ8 chips — designed to push autonomy from “eyes-off” (where a human or teleoperator still serves as backup) to “mind-off,” where no human intervention is required.

“In mind-off driving, the driver can sleep — the robotaxi no longer needs a teleoperator,” he explained. “The EyeQ7 and EyeQ8 don’t replace the EyeQ6; they add a new layer on top of it. We need AI that can understand a scene like a human being — to perceive context, not just objects.”

According to Shashua, EyeQ chips follow a two-year development cadence. The upcoming EyeQ8, now in design, will be three to four times more powerful than the EyeQ7 and form the backbone of Mobileye’s mind-off systems targeted for 2029–2030.

Robotaxis and the German Testbed

Commercially, the company is preparing to remove safety drivers from its first U.S. robotaxi fleet in the first half of 2026, in partnership with Lyft, Volkswagen, and HOLON (a Benteler division). In Europe, Mobileye is working with Volkswagen to secure homologation in Germany — a key regulatory milestone. “Germany’s government has made clear it wants to lead Europe in autonomous driving,” said Nehushtan, describing strong public and political support for the initiative.

Mobileye Achieves First Commercial Win for Its Radar in Autonomous Driving System

After seven years of development, Mobileye has secured the first commercial win for its imaging radar system: a major global automaker has selected the radar to serve as a core component in its autonomous driving platform. The decision follows over a year of comparative evaluations, in which the system competed head-to-head with rival technologies. The customer plans to integrate the radar into an SAE Level 3 autonomous driving system starting in 2028. This system will support hands-free driving on highways, with the ability to detect vehicles, objects, obstacles, and pedestrians.

Mobileye originally began developing the radar in 2018 with the goal of providing redundancy for its camera-based autonomous driving system. Most traditional automotive radars offer data about object distance, relative velocity, and horizontal positioning. However, Mobileye’s radar belongs to a new class of 4D imaging radars, which capture that same data in both horizontal and vertical planes, enabling a three-dimensional understanding of the environment over time.

The system is built around a radar-on-chip (SoC) processor developed entirely in-house by Mobileye, capable of delivering up to 11 TOPS of compute. It features a Massive MIMO-based transmit-and-receive architecture implemented using proprietary RFIC components. These components handle signal transmission and reception, convert the analog signals into digital form, and send the data to the radar’s main processor. The system supports more than 1,500 virtual channels and operates at a rate of 20 frames per second. The radar antenna provides a wide 170-degree field of view and sub-0.5-degree angular resolution.
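The virtual-channel figure follows directly from MIMO geometry: every transmit/receive antenna pair acts as one virtual element. Mobileye does not disclose its antenna counts, so the split in this back-of-envelope check is an assumption chosen only to be consistent with the published numbers.

```python
# Back-of-envelope check on the published radar figures; the 48x32 antenna
# split is an assumption, not a Mobileye spec.
import math

n_tx, n_rx = 48, 32
print(n_tx * n_rx)  # 1536 -> consistent with "more than 1,500 virtual channels"

# For a uniform linear array with half-wavelength spacing, angular resolution
# is roughly lambda / aperture = 2 / (N - 1) radians for N virtual elements.
def elements_needed(resolution_deg: float) -> int:
    return math.ceil(2 / math.radians(resolution_deg) + 1)

print(elements_needed(0.5))  # ~231 azimuth elements for sub-0.5-degree resolution
```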

From Backup Sensor to Central Sensing System

Mobileye’s radar is designed to serve three core functions in the vehicle: ensuring reliable sensing in poor environmental conditions that impair camera performance, enriching the scene understanding provided by cameras, and acting as a full fallback system in case of a camera failure. In effect, the radar is capable of replicating all camera-based functions to ensure uninterrupted autonomous driving.

According to the company, the radar can detect small objects at safe distances even when the vehicle is traveling at speeds of up to 130 km/h (about 81 mph). In such scenarios, the radar can identify pedestrians and cyclists at a range of around 315 meters, and even smaller hazardous obstacles at distances of approximately 250 meters.
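A quick back-of-envelope calculation shows how much reaction time those detection ranges buy at the quoted speed; the comfortable-braking deceleration used below is an assumption, not a Mobileye figure.

```python
# Reaction-time margin at 130 km/h for the quoted detection ranges.
# The 3.5 m/s^2 deceleration is an assumed comfortable-braking value.
v = 130 / 3.6                  # 130 km/h ~= 36.1 m/s
brake_dist = v**2 / (2 * 3.5)  # stopping distance ~= 186 m

for label, rng in [("pedestrian/cyclist", 315), ("small obstacle", 250)]:
    print(f"{label}: {rng / v:.1f} s until reached, "
          f"{rng - brake_dist:.0f} m margin beyond braking distance")
```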

Mobileye is currently traded on Nasdaq with a market capitalization of roughly $12 billion.

Mobileye Revenue Decreased 23% in Q4 2024

Mobileye announced that revenue decreased 23% year-over-year to $490 million in the fourth quarter of 2024. The main reason for the weak quarter was a 20% reduction in EyeQ SoC volumes, primarily related to the previously disclosed meaningful build-up of inventory at its Tier 1 customers, including in the fourth quarter of 2023. Average System Price was $50.0 in the fourth quarter of 2024, compared with $52.7 in the prior-year period, primarily due to a lower percentage of SuperVision-related revenue.

Operating margin was negative 18% in the fourth quarter of 2024, down 29 percentage points from the prior-year period, due to higher operating expenses on a lower revenue base as well as a lower gross margin. Mobileye’s annual 2024 sales totaled $1.65 billion, compared with $2.08 billion in 2023. The company generated net cash of $400 million in 2024, and its balance sheet is strong, with $1.4 billion of cash and cash equivalents and zero debt. Mobileye expects to return to growth this year, guiding for $1.69–1.81 billion in sales in 2025.

Mobileye President and CEO Prof. Amnon Shashua, said the company achieved major technological milestones: “EyeQ6 High System-on-Chip (SoC) is on-track for series production launch and achieves 10x the frame-per-second processing in comparison to EyeQ5 High. We look forward to a robust cadence of EyeQ6 High-based product launches beginning in 2026”.

He revealed that Mobileye imaging radar B-samples achieved outstanding performance across hundreds of OEM tests. “Most importantly, we progressed significantly on the SuperVision, Chauffeur, and Drive projects for VW Group, achieving milestones on the path to start-of-production.”

Mobileye to Use Innoviz LiDAR for Its AV Platform

Above: Mobileye’s robotaxi in its HQ campus in Jerusalem

Mobileye and Innoviz Technologies announced that Mobileye will use Innoviz’s LiDARs in Mobileye Drive, its AV platform. Mobileye Drive is a comprehensive driverless system that enables fully autonomous robotaxis, ride-pooling, public transport, and goods delivery. It is now undergoing comprehensive testing in Europe, North America, and Asia. Innoviz’s LiDAR technology will join the cameras, radars, and imaging radars already in the platform. The agreement builds on joint work between the two companies over the past few months, with start of production (SOP) beginning in 2026.

“The integration of our imaging radars and high-resolution cameras in combination with the Innoviz LiDARs will play a key role in delivering Mobileye Drive,” said Prof. Amnon Shashua, President and CEO of Mobileye. Innoviz’s InnovizTwo product platform was specifically engineered for Mobileye Drive, providing the L4 autonomous platform with a complete set of LiDARs.

“Better-than-expected cost reduction”

The agreement was signed shortly after Mobileye ended internal development of Frequency-Modulated Continuous Wave (FMCW) LiDARs in September 2024. Mobileye explained the decision: “We now believe that the availability of next-generation FMCW lidar is less essential to our roadmap for eyes-off systems. This decision was based on a variety of factors, including substantial progress on our EyeQ6-based computer vision perception, increased clarity on the performance of our internally developed imaging radar, and continued better-than-expected cost reductions in third-party lidar units.”

The lidar R&D unit will be wound down by the end of 2024, affecting about 100 employees. Operating expenses for the lidar R&D unit are expected to total approximately $60 million in 2024 (including approximately $5 million related to share-based compensation expenses). While this action is not expected to have a material impact on Mobileye’s results in 2024, it will result in the avoidance of lidar development spending in the future.

P3 to Use Mobileye Drive for Its Robotaxis

Mobileye and Croatia-based Project 3 Mobility (P3) announced a collaboration to explore a new mobility service utilizing Mobileye’s scalable self-driving technology, Mobileye Drive. The first P3 service is slated to launch in Zagreb in 2026, with testing and validation of Mobileye’s AV solution on the streets of the Croatian capital targeted to start in 2024.

Project 3 Mobility is developing a fully autonomous electric vehicle for an urban mobility ecosystem, along with the specialized infrastructure and mobility services it requires. The vehicle is built on a completely new platform designed for fully autonomous driving. The project will create a new mobility service in the wider Zagreb area based on the concept of “Mobility as a Service” (MaaS). Project 3 Mobility also plans to establish a production facility in Croatia for the large-scale production of autonomous vehicles that will be deployed worldwide.

Mobileye technology will be integrated into the P3 vehicle, which will use the Mobileye Drive autonomous driving solution. Project 3 Mobility currently has a team of more than 240 people, with experts from more than 20 industries and nationalities across two offices, in Croatia and the UK. P3 has already signed agreements with nine cities across the EU, UK, and Gulf Cooperation Council to provide its urban autonomous service.

Project 3 Mobility plans to invest at least €350 million before going to market. In May 2023 it secured a €179.5 million grant from the European Commission, and in early February 2024 it closed a €100 million Series A investment round. Current investors include Kia, SiteGround, and Rimac Group.

Jerusalem-based Mobileye develops autonomous driving and driver-assistance technologies. Today, more than 170 million vehicles worldwide have been built with Mobileye technology inside. Its 2023 annual revenue totaled $2.08 billion, an 11.24% increase year-over-year. The company’s future business backlog continues to grow, with 2023 design wins projected to generate future revenue of $7.4 billion across 61 million units.

Mobileye’s EyeQ Ultra Targets Consumer AVs

Mobileye’s new EyeQ Ultra system-on-chip (SoC) for autonomous driving is optimized for low-cost vehicles. The company said that at only 176 TOPS, it can handle all the needs and applications of Level 4 (L4) autonomous driving without the power consumption and cost of integrating multiple SoCs. “Consumer AV is the end game for the industry,” said Prof. Amnon Shashua, Mobileye president and CEO. “By developing the entire self-driving solution – from hardware and software to mapping and service models – we can reach the performance-and-cost optimization that will make consumer AVs a reality.”

EyeQ Ultra packs the performance of 10 EyeQ5 SoCs into a single package, leveraging 5-nanometer process technology. Like its EyeQ predecessors, EyeQ Ultra has been engineered in tandem with Mobileye software, enabling extreme power efficiency with zero performance sacrifices.

First silicon is expected at the end of 2023

EyeQ Ultra utilizes an array of four classes of proprietary accelerators, each built for a specific task. These are paired with additional CPU cores, ISPs (image signal processors), and GPUs, and the chip can process input from two sensing subsystems – one camera-only, the other combining radar and lidar – as well as the vehicle’s central computing system, the high-definition map, and the driving-policy software.
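As a rough illustration of that dual-subsystem idea, the sketch below cross-confirms object tracks from two independent channels before trusting them; Mobileye’s actual fusion policy is not described here, and every type and threshold is a hypothetical simplification.

```python
# Hypothetical simplification: keep only objects that both independent sensing
# subsystems (camera-only vs. radar+lidar) report at roughly the same position.
from dataclasses import dataclass

@dataclass(frozen=True)
class Track:
    obj_id: int
    position: tuple  # (x, y) in meters, ego frame

def cross_confirm(camera: list, radar_lidar: list, tol_m: float = 1.0) -> list:
    confirmed = []
    for c in camera:
        for r in radar_lidar:
            dx, dy = c.position[0] - r.position[0], c.position[1] - r.position[1]
            if (dx * dx + dy * dy) ** 0.5 <= tol_m:
                confirmed.append(c)
                break
    return confirmed

cam = [Track(1, (12.0, 0.5)), Track(2, (40.0, -3.2))]
rl = [Track(7, (12.3, 0.6))]
print(cross_confirm(cam, rl))  # only the object seen by both channels survives
```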

First silicon for the EyeQ Ultra SoC is expected at the end of 2023, with full automotive-grade production in 2025. The new AV solution is supported by a roughly 200-petabyte dataset that helps the AV and computer-vision systems handle edge cases and thereby achieve the very high mean time between failures (MTBF) needed in self-driving vehicles.

The compute engine relies on 500,000 peak CPU cores in the AWS cloud to crunch 50 million datasets monthly – the equivalent of 100 petabytes processed every month, corresponding to some 500,000 hours of driving. According to the company, the sheer size of Mobileye’s dataset makes it one of AWS’s largest customers by volume stored globally.
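Those figures are mutually consistent, as a quick check shows (decimal petabytes assumed):

```python
# Consistency check of the stated AWS processing figures.
datasets_per_month = 50_000_000
bytes_per_month = 100e15   # 100 PB, decimal
driving_hours = 500_000

print(bytes_per_month / datasets_per_month / 1e9)  # ~2 GB per dataset
print(bytes_per_month / driving_hours / 1e9)       # ~200 GB per driving hour
```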