Intel and Mobileye develop LiDAR on a Chip

Mobileye announced during CES 2021 this week that it has produced the first prototype of an automotive LiDAR sensor based on Intel’s silicon photonics technology. The new sensor is planned to be installed in Mobileye’s autonomous driving systems in 2025, when the market is expected to be ready for mass deployment of AVs. According to Prof. Amnon Shashua, President and CEO of Mobileye, the new sensor is based on Frequency-Modulated Continuous Wave (FMCW) technology and Doppler-style algorithms, as opposed to the current Time-of-Flight sensors.

The FMCW sensor directly measures relative velocity in addition to range (often described as 4D sensing) at distances of up to 300 meters. Its high-resolution detection capability reaches 600 points per degree, created by 2 million laser pulses per second (2M PPS). The rationale: current approaches and available sensors are too expensive for consumer AVs, and Mobileye needs radars and LiDARs that are both better and cheaper. To reach L5 autonomy, Mobileye proposes three levels of redundancy in the forward-facing field of view (FoV), and two levels of redundancy for the rest of the FoV.
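The key advantage of FMCW over time-of-flight is that mixing the returning light with the outgoing frequency chirp yields a beat tone that encodes both range and Doppler shift. A minimal sketch of the standard up/down-chirp recovery (illustrative numbers and a textbook formulation, not Mobileye's implementation):

```python
# Textbook FMCW range/velocity recovery from up- and down-chirp beat
# frequencies. All parameter values below are illustrative only.
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_velocity(f_beat_up, f_beat_down, chirp_slope, wavelength):
    """Recover range and radial velocity from the two beat tones.

    chirp_slope: optical frequency sweep rate, Hz/s.
    wavelength:  laser wavelength, m.
    """
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler-induced component
    distance = C * f_range / (2.0 * chirp_slope)
    velocity = wavelength * f_doppler / 2.0      # > 0: target approaching
    return distance, velocity

# Example: target 150 m away, closing at 10 m/s; 1550 nm laser and a
# 1 PHz/s chirp slope (assumed values for the sketch).
slope, lam = 1e15, 1.55e-6
f_r = 2 * slope * 150.0 / C   # beat tone due to range alone
f_d = 2 * 10.0 / lam          # Doppler shift for 10 m/s closing speed
d, v = fmcw_range_velocity(f_r - f_d, f_r + f_d, slope, lam)
```

Because velocity falls out of the same measurement, no frame-to-frame differencing is needed, which is what makes the "Doppler-style algorithms" mentioned above possible.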

In this scenario, the AV system consists of 360° camera coverage, a 360° radar cocoon and one forward-facing LiDAR sensor. Notably, Intel owns a unique fab capable of putting active and passive optical elements together on a chip, including lasers and optical amplifiers, integrated onto a photonic integrated circuit (PIC).

The goals for the future radar chip are also very aggressive: it will be a software-defined imaging radar equipped with 2,304 virtual channels based on 48 transmitters and 48 receivers. This radar will be able to detect a motorcycle beyond 200 meters, an old tire on the road at 140 meters, and other low, small hazards on the road.
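The 2,304 figure follows from the MIMO radar "virtual array" principle: each transmitter/receiver pair acts like one virtual antenna element, so 48 × 48 physical elements behave like 2,304 channels. A minimal sketch (element spacings are illustrative, not Mobileye's design):

```python
# MIMO virtual array: each virtual element sits at the sum of one Tx
# position and one Rx position. Spacings below are a classic textbook
# arrangement, assumed for illustration only.

def virtual_array(tx_positions, rx_positions):
    # One virtual element per (Tx, Rx) pair.
    return sorted(tx + rx for tx in tx_positions for rx in rx_positions)

N_TX = N_RX = 48
rx = [j * 0.5 for j in range(N_RX)]         # half-wavelength Rx pitch
tx = [i * N_RX * 0.5 for i in range(N_TX)]  # Tx pitch = whole Rx aperture
va = virtual_array(tx, rx)
n_channels = len(va)                        # 48 * 48 = 2304
```

With the Tx pitch set to the full Rx aperture, the 2,304 virtual elements tile one dense uniform array, which is what gives a software-defined imaging radar its angular resolution without 2,304 physical antennas.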

Seoul Robotics to employ Cognata LiDAR Simulator

Above: Winter driving in difficult visibility conditions – in Cognata’s synthetic simulator

Cognata was chosen to provide Seoul Robotics with a simulator of LiDAR sensor signals. Seoul Robotics develops software that analyzes data coming from the sensors in order to extract information about the vehicle’s environment. The collaboration strengthens Cognata’s foothold in the ADAS systems market. Founded in 2016 by CEO Danny Atsmon, the Rehovot-based (near Tel Aviv) Cognata has developed a virtual platform used to train and test autonomous vehicles before they hit the road for field tests.

The system is based on several layers: a static environment, a dynamic environment, sensors and a cloud interface. The static environment is built from realistic imaging of entire cities, including streets, trees, road defects, etc. The dynamic layer mimics the behavior of other drivers on the road, and the sensor layer mimics the information coming from each of the 40 different sensors found today in autonomous vehicles.

Innoviz’s chosen imaging software

Cognata is well acquainted with the field of LiDAR. In December 2019 it was selected by Innoviz to test Innoviz’s LiDAR technology. Cognata’s software can simulate how Innoviz’s LiDAR signals are reflected from different surfaces and materials, and how the sensors will function under different road conditions. A few days later it was also chosen by the Rehovot-based Foresight to test its QuadSight system, which uses two infrared cameras and two visible-light cameras to produce a stereoscopic (three-dimensional) machine vision capability.

The agreement with Seoul Robotics is Cognata’s second major deal in Korea. In August 2020, it was selected by Hyundai MOBIS to supply a simulator for the development of ADAS systems and autonomous vehicles. Hyundai MOBIS is a Tier 1 supplier of the Korean automotive industry and manufactures auto parts for Hyundai, Kia and Genesis Motors.

Innoviz Introduced Next Generation LiDAR

Innoviz Technologies announced the launch of InnovizTwo – the next generation of its solid-state automotive-grade LiDAR sensor. The company declined to provide technical details but said the new sensor’s design achieves a cost reduction of over 70% compared to InnovizOne, along with a significant performance improvement. Samples of InnovizTwo will be available in Q3 2021, with full production expected to begin in 2022.

The first-generation LiDAR sensor, InnovizOne, was chosen by BMW for its upcoming autonomous vehicle production platforms. It is expected to be installed in BMW’s electric SUV iNEXT, planned to reach the market in late 2021 or early 2022. This 905 nm laser-wavelength sensor provides a detection range of 20 cm to 600 m with an accuracy of 3 cm, a field of view (H×V) of 73°×20° and a frame rate of 10 FPS.

A migration path for Fully Autonomous Vehicles

Omer Keilaf, CEO and co-founder of Innoviz, told Techtime that the rapid cost-performance improvement is the result of several improvements made inside the three main chips that run the sensor: its MEMS scanner, the ASIC computing unit and the detector of the returning light. “But,” he added, “the importance of InnovizTwo lies elsewhere.

“We know how difficult it is to introduce a fully validated L3 platform to the market. Only a few car manufacturers can take such a big step. InnovizTwo solves a significant bottleneck in the industry: it will allow more car makers to offer safe L2+, while paving the path to full L3 automation in an efficient and safe way.”

In fact, Keilaf explains that it is all about a migration path. “To allow a safe and smooth adoption of automation, the industry needs to introduce L2+ functionality – with hardware that supports L3 and L4.

“With more vehicles effectively equipped with advanced sensors, car manufacturers will be able to collect roadway data and release more advanced functions to those same vehicles via over-the-air updates. To pursue this, hardware must fulfill more advanced technical requirements, while also coming under the price point of today’s L2 technology.”

What are L2, L3, L4?

Autonomy levels are defined by SAE as Levels 0 through 5; the relevant ones here are Levels 2 to 4. Level 2 (L2) means Partial Automation: the vehicle can control both steering and accelerating/decelerating, but human supervision is required at all times. Level 3 (L3) is Conditional Automation: the vehicle can perform all driving functions under certain circumstances without human supervision, but the driver must be able to regain control when requested. Level 4 (L4) is High Automation: the vehicle can perform all driving tasks within specific geographical areas without requiring human presence.

TowerJazz and Lumotive Demonstrate Solid-State Beam Steering for LiDAR

TowerJazz and Lumotive, from Bellevue, Washington, announced the successful demonstration of the first beam-steering ICs for automotive LiDAR systems that are fully solid-state (without any moving parts). The steering concept is based on Lumotive’s Liquid Crystal Metasurface (LCM) technology, which controls the laser beam direction by applying electric fields. The idea was originally developed by Dr. David Smith, Director of the Center for Metamaterials and Integrated Plasmonics at Duke University.

He had developed a concept called Holographic Beam Forming. By building miniature metal structures on a surface (called metamaterials, or a metasurface), he could change the refractive index of that surface. When these structures are small enough to act like a tiny array of antennas, they respond to electric fields, allowing the refractive index to be controlled by applying electric signals. But to implement the idea, Lumotive needed a special IC.
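The steering itself follows the same math as a phased array: imposing a linear phase gradient across the surface tilts the outgoing wavefront. A minimal sketch of that principle (illustrative numbers, not Lumotive's actual element geometry):

```python
# Phased-array / grating-equation view of metasurface beam steering:
# a uniform per-element phase step tilts the beam by
# sin(theta) = wavelength * phase_step / (2*pi * element_pitch).
import math

def steering_angle_deg(wavelength, element_pitch, phase_step_rad):
    s = wavelength * phase_step_rad / (2.0 * math.pi * element_pitch)
    return math.degrees(math.asin(s))

# Assumed example values: 905 nm laser, 450 nm element pitch,
# pi/4 (45 degrees) of phase added per element.
angle = steering_angle_deg(905e-9, 450e-9, math.pi / 4)
```

Changing the applied electric signals changes the phase step, and therefore the angle, with no moving parts, which is what "fully solid-state" beam steering means here.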

Here TowerJazz came to help: Lumotive’s beam-steering ICs use TowerJazz’s 130 nm Cu back-end-of-line technology, customized to meet specific optical performance requirements with optimized lithography and custom dielectrics. Lumotive’s complete LiDAR system, based on this beam-steering chip coupled with a custom SiPM (Silicon Photomultiplier) sensor utilizing TowerJazz’s SPAD (Single Photon Avalanche Diode) technology, will be available for prototype testing in late 2019.

“Our partnership with TowerJazz enabled us to achieve this important milestone and will allow us to bring our revolutionary technology to production,” said Bill Colleran, President & CEO, Lumotive. Research firm Yole Développement estimates that the ADAS and autonomous vehicle LiDAR markets will grow dramatically in the coming years, increasing from $721 million in 2018 to $6.3 billion in 2024, with a CAGR of nearly 45% during that period.

Innoviz: $170M Funding Round for Solid-State LiDAR

Innoviz Technologies, from Rosh Ha’ayin, Israel, has closed its Series C funding round with $170 million secured – a much bigger round than the company’s expectation of $132M (March 2019). Innoviz provides solid-state LiDAR sensors and perception software to enable mass production of autonomous vehicles, and has been chosen by BMW to supply the LiDAR sensors for its future autonomous car. This brings Innoviz’s total funding to $252 million.

The company announced that the funding will support several key initiatives, including the enhancement of Innoviz’s perception software. This software supports the sensors by providing vehicles with a deep understanding of the 3D driving scene, including object detection, classification, segmentation and tracking. In addition, Innoviz has secured two computer vision industry experts, Dr. Raja Giryes and Or Shimshi, to serve as strategic collaborators on the software.

Dr. Giryes’ experience includes senior academic and research roles and a doctorate in computer science. Shimshi brings significant private-sector experience managing teams and serving in advisory roles for computer vision, deep learning, machine learning and AI with Samsung Semiconductors, Intel, Qualcomm and Citibank. Innoviz will also accelerate its path to mass production and commercialization.

The company expects its InnovizOne automotive-grade LiDAR solution to enter series production in 2021 for global automakers. The company’s high-performance solid-state LiDAR InnovizPro is available now for mapping and other applications.