Mobileye Begins Autonomous Fleet Tests

18 May, 2018

The first cars of the Intel Mobileye 100-car fleet hit the road in Jerusalem in May 2018. “If you can drive in Jerusalem, you can drive anywhere.” The radar/LiDAR layer will be added in the coming weeks.

The first phase of the Intel and Mobileye 100-car autonomous vehicle (AV) fleet has begun operating in the challenging and aggressive traffic conditions of Jerusalem. This pilot aims to demonstrate Mobileye’s Responsibility-Sensitive Safety (RSS) model and to integrate key learnings into the product. In the coming months, the fleet will expand to the U.S. and other regions.

“We target a vehicle that gets from point A to point B faster, smoother and less expensively than a human-driven vehicle; can operate in any geography; and achieves a verifiable, transparent 1,000 times safety improvement over a human-driven vehicle without the need for billions of miles of validation testing on public roads,” stated Professor Amnon Shashua, senior VP at Intel and the CEO and CTO of Mobileye.

Why Jerusalem?

“Mobileye is based in Israel, and Jerusalem is notorious for aggressive driving. There aren’t perfectly marked roads. And there are complicated merges. People don’t always use crosswalks. You can’t have an autonomous car traveling at an overly cautious speed, congesting traffic or potentially causing an accident. You must drive assertively and make quick decisions like a local driver.

“This environment has allowed us to test the cars and technology while refining the driving policy as we go. Driving policy is where the most challenging aspects of designing AVs come together: being extremely safe without being overly cautious, and driving with a human-like style (so as not to surprise other drivers) but without making human errors. To achieve this delicate balance, the Mobileye AV fleet separates the system that proposes driving actions from the system that approves (or rejects) those actions. Both systems are fully operational in the current fleet.”
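A minimal sketch of that separation in Python (the `policy` and `safety_checker` interfaces are hypothetical, invented for illustration; they are not Mobileye's software):

```python
# Illustrative only: one system proposes driving actions, an independent
# system approves or rejects them. All names here are hypothetical.

def plan_step(world_state, policy, safety_checker):
    """Pick the best proposed action that the safety layer accepts."""
    # The learned driving policy proposes candidate actions, ranked
    # from most preferred (e.g. an assertive merge) downward.
    for action in policy.propose_actions(world_state):
        # The independent safety layer (RSS, in Mobileye's design)
        # vetoes any action that violates a formal safety rule.
        if safety_checker.approves(world_state, action):
            return action
    # If every proposal is rejected, fall back to an action that is
    # guaranteed to stay inside the safety envelope (e.g. braking).
    return safety_checker.safe_fallback(world_state)
```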

The Golden Path between Safety and Assertiveness

“Our driving policy system is trained offline to optimize an assertive, smooth and human-like driving style. It is proprietary software developed using artificial-intelligence-based learning techniques, and it is the greatest advancement demonstrated in the fleet. But in order to feel confident enough to drive assertively, this ‘driver’ needs to understand the boundary where assertive driving becomes unsafe. To enable this important understanding, the AI system is governed by a formal safety envelope that we call Responsibility-Sensitive Safety.”

The Intel Mobileye autonomous car is equipped with 12 cameras to create a comprehensive end-to-end solution

“RSS is a model that formalizes the common-sense principles of what it means to drive safely into a set of mathematical formulas that a machine can understand (safe following/merging distances, right of way, and caution around obstructed objects, for example). If the AI-based software proposes an action that would violate one of these common-sense principles, the RSS layer rejects the decision. The AI-based driving policy is how the AV gets from point A to point B, while RSS is what prevents the AV from causing dangerous situations along the way.”
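One concrete example of such a formula: the RSS paper (Shalev-Shwartz, Shammah and Shashua, 2017, arXiv:1708.06374) derives a closed-form minimum safe following distance. A sketch of that single rule in Python (variable names and the sample parameter values are ours, not Mobileye's calibration):

```python
# Minimum safe longitudinal gap behind a lead vehicle, per the RSS
# paper (arXiv:1708.06374). Parameter values below are illustrative.

def rss_min_following_distance(v_rear, v_front, rho=0.5,
                               a_max_accel=3.0, a_min_brake=4.0,
                               a_max_brake=8.0):
    """Minimum safe gap in meters.

    v_rear, v_front -- speeds (m/s) of the following and lead vehicles
    rho             -- response time (s) before the rear car brakes
    a_max_accel     -- worst-case rear-car acceleration during rho
    a_min_brake     -- braking the rear car is guaranteed to apply
    a_max_brake     -- hardest braking the lead car might apply
    """
    v_rear_after_rho = v_rear + rho * a_max_accel
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2
         + v_rear_after_rho ** 2 / (2 * a_min_brake)
         - v_front ** 2 / (2 * a_max_brake))
    return max(0.0, d)

# Both cars at 50 km/h (~13.9 m/s): the rule demands a gap of about
# 25 m with these illustrative parameters.
```

If the actual gap falls below this value, the situation counts as dangerous, and RSS accepts only proposals that restore a safe distance, such as braking at least at `a_min_brake`.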

Camera-based Redundancy

“During this initial phase, the fleet is powered only by cameras. In a 360-degree configuration, each vehicle uses 12 cameras, with eight providing long-range surround view and four used for parking. The goal is to prove that we can create a comprehensive end-to-end solution from processing only the camera data. We characterize an end-to-end AV solution as consisting of a surround-view sensing state capable of detecting road users, drivable paths and the semantic meaning of traffic signs/lights; the real-time creation of HD maps, along with the ability to localize the AV with centimeter-level accuracy; and path planning and vehicle control.”
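In outline, the phase-one camera configuration and pipeline stages described above look like this (the camera counts and stage names follow the quote; the Python structure is ours):

```python
from dataclasses import dataclass

@dataclass
class CameraRig:
    """Phase-one, camera-only sensor suite described above."""
    surround_long_range: int = 8   # long-range 360-degree coverage
    parking: int = 4               # short-range parking cameras

    @property
    def total(self) -> int:
        return self.surround_long_range + self.parking  # 12 in all

# The end-to-end stages the quote enumerates, in order:
PIPELINE = (
    "surround-view sensing",        # road users, drivable paths,
                                    # traffic sign/light semantics
    "real-time HD-map creation",
    "centimeter-level localization",
    "path planning",
    "vehicle control",
)
```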

Professor Amnon Shashua, senior VP at Intel and the CEO and CTO of Mobileye

“The camera-only phase is our strategy for achieving ‘true redundancy’ of sensing: multiple independently engineered sensing systems, each of which can support fully autonomous driving on its own. True redundancy provides two major advantages: the amount of data required to validate the perception system is massively lower, and in the case of a failure of one of the independent systems the vehicle can continue operating safely, in contrast to a vehicle with a low-level fused system that must cease driving immediately. The radar/LiDAR layer will be added in the coming weeks as a second phase of our development.”
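A sketch of how true redundancy differs from low-level fusion at the software level, assuming a hypothetical channel interface with `is_healthy()` and `environment_model()` methods (not Mobileye's API):

```python
# True redundancy: each sensing channel (cameras alone, radar/LiDAR
# alone) produces a complete environment model by itself, so losing
# one channel does not force the vehicle to stop.

def pick_environment_model(channels):
    """Return a full environment model from any healthy channel."""
    healthy = [ch for ch in channels if ch.is_healthy()]
    if not healthy:
        # Only if *every* independent system fails must the vehicle
        # perform a minimal-risk stop. A low-level fused system, by
        # contrast, loses its single combined model on one failure.
        raise RuntimeError("all sensing channels failed")
    # Any surviving channel suffices; agreement between several
    # healthy channels can serve as an additional cross-check.
    return healthy[0].environment_model()
```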

Next Generation of Hardware on the Road

Shashua announced that the end-to-end compute system in the AV fleet is powered by four Mobileye EyeQ4 SoCs. “An EyeQ4 SoC delivers 2.5 tera operations per second (TOPS) for deep networks with an 8-bit representation while running at 6 watts of power. Produced in 2018, the EyeQ4 is Mobileye’s latest SoC, and this year will see four production launches, with an additional 12 slated for 2019.

“The SoC targeting fully autonomous driving is the Mobileye EyeQ5, whose engineering samples are due later this year. An EyeQ5 delivers 24 TOPS and is roughly 10 times more powerful than an EyeQ4. In production we plan for three EyeQ5s to power a full L4/L5 AV. Therefore, the current system on roads today has approximately one-tenth of the computing power that will be available in our next-gen EyeQ5-based compute system beginning in early 2019.”
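The arithmetic behind that comparison, using only the figures quoted above:

```python
# Back-of-envelope check of the quoted compute figures.
eyeq4_tops, eyeq4_count = 2.5, 4     # current fleet: four EyeQ4 SoCs
eyeq5_tops, eyeq5_count = 24.0, 3    # planned L4/L5 system: three EyeQ5s

today    = eyeq4_tops * eyeq4_count  # 10 TOPS on the road now
next_gen = eyeq5_tops * eyeq5_count  # 72 TOPS planned

print(today / next_gen)  # ~0.14 -- on the order of one-tenth
```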
