Tower Semiconductor Announces Ultra-Fast RF Switch

Tower Semiconductor announced a new radio frequency (RF) switch technology with a record figure of merit, targeting the 5G and high-performance RF switch markets. The company is engaged with multiple customers and partners to bring this technology to market for next-generation products.

The new switch technology demonstrates a record RF device figure of merit: an On/Off product (RON×COFF) shorter than 10 femtoseconds, versus the 70-100 femtoseconds in use today for the most advanced applications. The switch performs over a wide range of frequencies spanning MHz to mmWave, including the frequency bands discussed for 5G.

The switch is also non-volatile, so it consumes no energy in either the on-state or the off-state, making it attractive for IoT and other power- and battery-sensitive applications. Tower has demonstrated the versatility of this patented technology by integrating it with other process platforms such as SiGe BiCMOS and Power CMOS.

Tower Semiconductor will be offering multi-project wafer runs (MPWs) in 2021 for select customers. This model enables new customers to experience the technology at lower cost, by sharing a production wafer with other interested parties. The new RF switch will be presented at IMS 2020 (International Microwave Symposium).

The abstract of Tower Semiconductor’s presentation at IMS 2020 reveals more details about the new technology: two different-sized layouts of four-terminal phase-change material (PCM) RF switches were fabricated in a 200 mm silicon high-volume manufacturing environment. Both layouts achieve a record-high cutoff frequency (FCO) of 25 THz: Layout-A has an RON×COFF value of 6.2 fs, and Layout-B a value of 6.3 fs.
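For context, an RF switch’s cutoff frequency follows from its RON×COFF product as FCO = 1/(2π·RON·COFF), so the reported values can be sanity-checked with a few lines of arithmetic (a minimal sketch, not code from Tower):

```python
import math

def cutoff_frequency_thz(ron_coff_fs: float) -> float:
    """Standard RF-switch figure of merit: FCO = 1 / (2 * pi * RON * COFF)."""
    ron_coff_seconds = ron_coff_fs * 1e-15  # femtoseconds -> seconds
    return 1.0 / (2 * math.pi * ron_coff_seconds) / 1e12  # Hz -> THz

print(cutoff_frequency_thz(6.2))  # Layout-A: ~25.7 THz
print(cutoff_frequency_thz(6.3))  # Layout-B: ~25.3 THz
```

Both values round to the record 25 THz figure quoted in the abstract.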

Both layouts show minimal changes to RON or actuation voltage when cycled 10 million times. In addition, a Layout-A device was cycled 1 billion times, demonstrating the ability of this RF switch to be used in high-endurance applications.

Intel may Outsource 7-nm Production

Above: Bob Swan, Intel CEO. “We have invested in contingency plans”

Intel took the market by surprise when it revealed last week a plan to intensify the outsourcing of production and to move some of its future 7-nanometer device production to third parties. Immediately following the announcement, Intel’s shares on NASDAQ lost 16%. In fact, Intel published very good Q2 2020 results: revenues of $19.7 billion, compared to $16.6 billion a year earlier. It also expects annual revenues of $75 billion in 2020, compared to $72 billion in 2019.

But Intel’s production difficulties overshadowed everything else. Intel CEO Bob Swan addressed the issue in prepared remarks: “We are seeing an approximate six-month shift in our 7nm-based CPU product timing relative to prior expectations. Our 7nm process is now trending approximately twelve months behind our internal target. We have identified a defect mode in our 7nm process that resulted in yield degradation.”

“Contingency Plan” means Outsourcing Production

“We’ve root-caused the issue and believe there are no fundamental roadblocks, but we have also invested in contingency plans to hedge against further schedule uncertainty.” Trey Campbell, Director of Investor Relations, gave context during the earnings call: “Our priorities in the ideal world is leadership products on our process technology. But the focus will be leadership products. So to the extent that we need to view somebody else’s process technology, and we call those contingency plans, we will be prepared to do that.”

In an answer to an analyst on the call, Swan explained that if the company decides to keep all of its production in-house, it will invest “a little more (in) 10-nanometer and less (in) 7-nanometer. In the event we decide that we’re going to leverage third-party foundries more effectively, we would have a little more 10 and a lot less seven. In the event we’re not there and there’s a better alternative, be prepared to take advantage of it.”

The conclusion is shocking: Intel no longer leads the process race, and it also does not believe in its ability to provide full-scale 7 nm production for its own roadmap. In that case it has no option but to outsource to TSMC and Samsung, the global leaders in 7 nm processes.

Nvidia and Mellanox built a Supercomputer in just a Month

Photo above: Mellanox’s AI platform protects supercomputers from hacking and inappropriate use

In their first joint announcement, Nvidia and Mellanox unveiled a reference design for the rapid building of supercomputers, along with a new cyber-protection platform for supercomputers. Mellanox has expanded its offering of Unified Fabric Manager (UFM) products, adding a new appliance called the UFM Cyber-AI Platform.

It provides cyber protection to supercomputers and big data centers, using artificial intelligence software that learns the behavioral characteristics of the computing systems in order to identify malfunctions and detect abnormal activity that points to hacking or unauthorized use.

Originally, UFM technology was developed a decade ago by Mellanox in order to manage InfiniBand-based communications systems by providing network telemetry data, monitoring the activity of all the related devices, and managing the software updates across the network’s components.

The new solution comes either as a software package or as a complete appliance based on a dedicated Nvidia server. It is focused on characterizing computer operation and identifying unusual activity. According to Nvidia and Mellanox, the system significantly reduces data center downtime, the damages of which are estimated to reach $300,000 per hour.

Supercomputers are open and unprotected platforms

According to Mellanox’s VP of Marketing, Gil Shainer, the integration of Mellanox’s InfiniBand with Nvidia’s GPUs changes the rules of the game in the supercomputer market, bringing it unprecedented cyber security and preventive maintenance capabilities. Shainer: “Supercomputers are managed differently from organizational computer centers. Usually it is an open platform that needs to provide easy access to many researchers around the world.”

To illustrate the dilemma he recalled an event that took place several years ago at an American university. “The administrator of the computer center told me how they caught a student using a computer for crypto mining. The suspicion emerged when they found out that the computer’s power consumption was not declining during the annual vacation, a period in which the computer is usually not active. Our solution allows you to detect such a situation right away – and not have to wait for the computer’s power bill.”
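The kind of detection Shainer describes can be illustrated with a simple statistical baseline over power telemetry. The sketch below is purely hypothetical (the internals of the UFM Cyber-AI platform are not public): it flags hours whose power draw deviates sharply from the learned norm, as with a machine that keeps running at full power through a vacation period.

```python
import statistics

def flag_power_anomalies(hourly_watts, threshold_sigma=3.0):
    """Flag hours whose power draw deviates sharply from the baseline.
    A toy stand-in for the behavioral learning described above."""
    mean = statistics.mean(hourly_watts)
    stdev = statistics.stdev(hourly_watts)
    return [i for i, watts in enumerate(hourly_watts)
            if abs(watts - mean) > threshold_sigma * stdev]

# Example: a node idles near 2 kW off-season; a crypto-mining job holds it at 9 kW.
readings = [2000 + i % 50 for i in range(200)] + [9000] * 5
print(flag_power_anomalies(readings))  # -> [200, 201, 202, 203, 204]
```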

Reference Design for the Rapid Construction of Supercomputers

Alongside the joint announcement, Nvidia unveiled a new supercomputer called Selene (photo above), which is considered the strongest industrial supercomputer in the United States, with a peak performance of 27.5 petaflops. The computer is based on the new A100-model GPUs announced this week and was built for internal research conducted at Nvidia. During a press briefing last week, Shainer revealed that the new computer was built in just one month, a record-breaking time for the construction of a supercomputer.

Shainer: “The ability to build a supercomputer in a month is based on expertise in communication and expertise in processors. We have developed a reference design that allows anyone to build a supercomputer, based on ready-made blocks of Nvidia’s processors and Mellanox’s communication. Because the processors are fully compatible with the communications cards, the computer can be set up in no time. In fact, we have jointly developed a reference design that allows for the construction of computers of any size – not just supercomputers.”

BMW-Mercedes Break-up is bad news for Intel/Mobileye

Photo above: BMW impression of highway autonomous driving

Less than a year after the German automotive giants BMW Group and Mercedes-Benz AG agreed to work together on a joint development program for next-generation driver assistance and automated driving technologies, they have decided to halt the cooperation and take different paths. Last week they announced that they are putting their cooperation on automated driving “temporarily on hold”.

The original agreement raised many expectations: in July 2019, the two parties announced a long-term strategic cooperation that was to include joint development of driver assistance systems, automated driving on highways and automated parking (up to SAE Level 4). They planned to bring together more than 1,200 specialists from both companies, often in mixed teams, to develop a scalable architecture for driver assistance systems, including sensors, as well as a joint data centre for data storage, administration and processing, and functions and software.

Intel/BMW vs Mercedes/NVIDIA

For Intel and Mobileye (owned by Intel) it was a great opportunity: both have a long and deep cooperation with BMW Group in all aspects of autonomous driving, and the agreement could have secured their dominant position in the German car industry. “We have systematically further developed our technology and scalable platform with partners like Intel, Mobileye, FCA and Ansys,” said Klaus Fröhlich, member of the Board of Management of BMW. “Our current technology, with extremely powerful sensors and computing power, puts us in an excellent position.”

But those hopes were short lived: “Digitalization is a major strategic pillar for Mercedes-Benz. To prepare for the future challenges of a rapidly changing environment, we are currently also sounding out other possibilities with partners outside the automotive sector,” said Markus Schäfer, Board Member of Daimler AG and Mercedes-Benz.

And it turned out that one of these “partners” is NVIDIA – a bitter competitor of Intel and Mobileye. On Tuesday, June 23, Mercedes-Benz and NVIDIA announced a cooperation to create a revolutionary in-vehicle computing system and AI computing infrastructure. Starting in 2024, it will be rolled out across the fleet of next-generation Mercedes-Benz vehicles.

The new software-defined architecture will be built on the NVIDIA DRIVE platform and will be standard in Mercedes-Benz’s next-generation fleet. But there is a twist: NVIDIA and Mercedes-Benz will jointly develop the AI and automated vehicle applications for SAE level 2 and 3 – far below the ambitious goal of the original BMW/Mercedes coalition.

Weebit Nano raised $4.5 million to commercialize ReRAM technology

Hod Hasharon (near Tel Aviv)-based Weebit Nano, which is developing a new type of non-volatile ReRAM technology, has raised US$4.5 million through a private placement of shares on the Australian Securities Exchange (ASX). The company is now trying to raise an additional AUD 0.5 million through a public offering. Weebit Nano’s CEO, Coby Hanoch, told Techtime that this financing round “will allow us to move towards commercialization, and hopefully, within a year we’ll already be engaging in serious interactions with potential customers.”

According to the report supplied by the company to the ASX, about half of the money raised will be allocated for the development of a dedicated module for embedded systems, the company’s first target market for its ReRAM technology. “Our technology has already been proven and tested by customers. We are now developing a specific module of the memory, in order to make it suitable for the embedded systems market.”

The Best of all Possible Worlds

Approximately 20% of the amount will be allocated to the development of a component called a ‘selector’, which is designed to minimize leakage currents between the memory cells, and about 15% to transferring the technology to production at standard fab manufacturing facilities. Weebit Nano is developing a new Resistive Random Access Memory (ReRAM), based on materials that change their electrical resistance in response to an applied voltage and thus “remember” their state after being disconnected from the power source.

It combines the non-volatility of flash memory with the speed, low power and long life cycle of volatile DRAM memory technologies. The company estimates that its prototype is 1,000 times faster and uses 1,000 times less power than flash memory, traits which make it a perfect candidate for IoT, artificial intelligence, data centers and more.
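As a rough illustration of the principle (a toy model only; real ReRAM switching involves filament formation and far more complex physics), a cell can be sketched as a bistable resistor whose state is written by voltage pulses and retained without power:

```python
class ToyReRAMCell:
    """Toy bistable ReRAM cell: a SET pulse switches it to low resistance,
    a RESET pulse to high resistance, and the state persists with no supply
    voltage (non-volatility). All thresholds and values are illustrative."""
    V_SET, V_RESET = 2.0, -2.0   # illustrative switching thresholds (volts)
    R_LOW, R_HIGH = 1e3, 1e6     # illustrative on/off resistances (ohms)

    def __init__(self):
        self.resistance = self.R_HIGH     # start in the high-resistance state

    def apply_pulse(self, volts: float) -> None:
        if volts >= self.V_SET:
            self.resistance = self.R_LOW      # SET: write a '1'
        elif volts <= self.V_RESET:
            self.resistance = self.R_HIGH     # RESET: write a '0'
        # Sub-threshold voltages change nothing, so reads are non-destructive.

    def read_bit(self) -> int:
        return 1 if self.resistance == self.R_LOW else 0

cell = ToyReRAMCell()
cell.apply_pulse(2.5)    # SET pulse
print(cell.read_bit())   # 1, retained even with the power removed
```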

Recently, Weebit Nano announced its first commercial collaborations, with the Chinese semiconductor companies XTX and SiEn. Together, they will examine the integration of Weebit Nano’s memory component into their products. “China is the largest chip consumer in the world, and is determined to build an independent semiconductor industry,” said Hanoch.

Altair’s new business: AI DSP Engines

Photo above: Sony intelligent vision sensors IMX500 (left) and IMX501. Both include Altair’s DSP processor

Hod Hasharon-based Altair Semiconductor (owned by Sony) has secretly expanded its operations beyond the IoT sector and entered the artificial intelligence (AI) chips market. This came to light last month, when Sony announced new image sensors for smart control systems. The component is built of two chips stacked inside a single package (a multi-chip module): a Sony image sensor and a DSP processor developed by Altair, which is responsible for neural network inference operations.

This new family of smart image sensors currently consists of two components: the IMX500 and the IMX501. When installed in a security camera, street camera, or other IoT device, the logic circuit processes the image and sends only the inference itself to the network center. Thus, it saves considerable processing and communication resources and enables a given device to function as a smart sensor without compromising the privacy of the people being photographed.

A smart camera equipped with the visual-logic sensor can count the number of people in a store and transmit the information without having to send their images to the cloud. It can discern congestion patterns in various complexes, and even track customer behavior in the store – based only on analyzing their movements – without having to identify the customers themselves.
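Conceptually, the privacy-preserving flow looks like the sketch below. It is hypothetical (the capture and inference functions stand in for the sensor’s internal pipeline and are not Sony’s actual API): the raw frame never leaves the device, only the inference result does.

```python
import json

def process_on_sensor(capture_frame, run_inference, send_to_cloud):
    """Hypothetical on-sensor loop: pixels stay local, metadata is transmitted."""
    frame = capture_frame()              # raw image stays on the sensor
    detections = run_inference(frame)    # the DSP runs the network locally
    payload = json.dumps({
        # Only the aggregate result leaves the device: no pixels, no identities.
        "person_count": sum(1 for d in detections if d["label"] == "person"),
    })
    send_to_cloud(payload)
```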

The images are sent back in a variety of configurations: pure decoded information without visual elements, an image in various formats, or only the relevant visual area. From Sony’s point of view, this constitutes an entrance into a major market characterized by very strong growth. As far as Altair is concerned, this is a very surprising development, since until now the company has focused on communication solutions for IoT devices and not on the development of DSP or artificial intelligence processors.

Altair’s core activity is focused on IoT connectivity chips, with its flagship product being the ALT1250 chipset, which includes a modulator and a modem supporting the Cat-M1 and NB-IoT standards. It features an RF front-end circuit that supports all LTE bands, an RFIC circuit, a power management unit (PMU), memory, amplifier circuits, filters, an antenna switch, a global navigation satellite system (GNSS) receiver, hardware-based security, an eSIM circuit and an internal microcontroller unit (MCU) that allows customers to develop unique applications.

A new strategy for both Altair and Sony

Sony’s announcement positions it in a massive market and transforms it into a hybrid IoT/image-sensor player. The move can secure very large orders for Altair. However, it can also hint at a new Altair strategy that could develop in two interesting directions: the first is the integration of ALT1250 technologies into Sony’s future image sensors – alongside the recently unveiled AI processor.

The other direction is independent: integrating the artificial intelligence processor into Altair’s next-generation connectivity chip – a kind of ALT1250 reinforced with artificial intelligence. An IoT connectivity chip with embedded artificial intelligence has many advantages: it could provide artificial intelligence to ‘dumb’ cameras, allow enhanced communication management capabilities, and even enhance the security system of the current-generation ALT1250.

Connected Devices in an Era of Pandemics

By: Igor Tovberg, Director of Product Marketing at Altair Semiconductor, a Sony Group Company

Technology has a history of helping to track and treat viruses. And, with the World Health Organization (WHO) declaring COVID-19 a global pandemic, people are rightly asking themselves how new technologies such as the Internet of Things (IoT), AI, and Big Data can be employed to slow down the proliferation of pandemics and avoid a future global health crisis. In this article, I describe how connected medical devices could help.

Monitoring trends with Wearables

Millions of wearable devices have been deployed globally. Activity and heart-rate sensing are becoming a baseline feature in every fitness band and smartwatch, with data being continuously sensed and uploaded into the cloud. Would this data be useful in predicting a spreading epidemic?

Indeed, a recently published study by the Scripps Research Translational Institute in The Lancet Digital Health analyzed such data and found that resting heart rate and sleep-duration data collected from wearable devices could help inform timely and accurate models of population-level influenza trends. Sensing and analyzing more physiological factors would further improve the speed and accuracy of epidemic detection.
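As a toy illustration of the idea (not the study’s actual methodology), a population-level signal can be built from how many users show resting heart rates elevated above their own personal baselines:

```python
import statistics

def elevated_rhr_fraction(user_histories, window=7, sigma=2.0):
    """Toy population signal: the fraction of users whose recent resting
    heart rate is elevated relative to their own baseline. Hypothetical
    parameters; real trend models combine more signals and covariates."""
    elevated = 0
    for daily_rhr in user_histories:          # one list of daily values per user
        baseline, recent = daily_rhr[:-window], daily_rhr[-window:]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0
        if statistics.mean(recent) > mean + sigma * stdev:
            elevated += 1
    return elevated / len(user_histories)
```

Tracked over time and by region, a rise in this fraction would be the kind of early-warning signal the study associates with influenza activity.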

Changes in patient care habits

Isolation is one of the preventive actions being taken to stop the virus spread, as exposure to an infected carrier could prove fatal for people with a weakened immune system. Now, more than ever, health stats relating to virus symptoms can be sent to health care providers without patients having to visit their clinic and risking exposure.

mHealth

Connected devices such as thermometers, blood pressure meters, inhalers, glucose meters, or other personal health monitoring devices will play a significant role in protecting people’s lives.

Cellular connectivity through the CAT-M or NB-IoT network can ensure a secure and reliable countrywide link for the delivery of patients’ stats to their health care provider from any location, regardless of WiFi/BLE coverage.
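Because the cellular modem presents a standard IP link, delivering those stats can use ordinary IoT messaging. Below is a minimal sketch using MQTT; the broker address and topic are hypothetical placeholders, and a real deployment would add TLS and per-device authentication:

```python
import json
import paho.mqtt.publish as publish   # pip install paho-mqtt

BROKER = "mqtt.example-health-provider.com"   # hypothetical endpoint
TOPIC = "patients/device42/vitals"            # hypothetical topic

def publish_vitals(temperature_c: float, heart_rate_bpm: int) -> None:
    payload = json.dumps({"temp_c": temperature_c, "hr_bpm": heart_rate_bpm})
    # QoS 1 gives at-least-once delivery over the CAT-M / NB-IoT IP link.
    publish.single(TOPIC, payload, qos=1, hostname=BROKER)

publish_vitals(37.8, 92)
```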

Connected, out-of-the-box cellular devices free doctors from relying on a patient’s ability to set up a LAN/PAN connection by themselves.

Quarantine compliance with smart cellular IoT wristbands

The general population can wear smart wristbands as health monitors. With an emphasis on small size and long battery life, cellular IoT offers reliable connectivity for smart wristbands, with autonomy from paired smartphones. Recently, the Hong Kong government deployed smart wristbands to monitor city residents quarantined inside their homes.

Accelerating the speed of reaction

Monitoring is vital in the detection chain, and reaction time is critical for prevention. Enterprises, airports, and cities would surely benefit from monitoring devices for citizens, and healthcare facilities would benefit from the ability to monitor remote patients. Timely discovery of outbreaks could prevent the spread of many dangerous new viruses in the future.

Solution

For personal, medical, or environmental monitoring, Altair’s ALT1250 ultra-low power, compact, secure, and highly integrated cellular IoT chipset enables slimmer devices with long battery life, which can remain continuously connected – reliably connecting people in ways previously unobtainable. All without the need for a smartphone or home WiFi network.

Conclusion

According to Bill Gates, in any crisis, leaders have two equally important responsibilities: Solving the immediate problem and keeping it from happening again. It’s clear that IoT technology, and specifically medical devices, have an important role to play in the containment and treatment of outbreaks like COVID-19. I genuinely believe that IoT can be fully harnessed to control and potentially prevent the next global pandemic.