Sponsored Content

2019 began with the annual Consumer Electronics Show (CES), where thousands of global technology companies gathered to showcase their latest developments and innovations.

BrainChip’s (ASX:BRN) Senior Vice President, Marketing and Business Development, Bob Beachler, was in attendance and noted many trends emerging at this year’s show, with one in particular taking centre stage: autonomous vehicles and advanced driver-assistance systems (ADAS).

It is now widely accepted that autonomous vehicles could change the future of our society. They are expected to bring unprecedented mobility, and as the technology improves it is hoped our roads will become safer, less congested and perhaps even less prone to road rage.

Beachler noted that the key take-away from CES is that for driverless cars to provide a safe and efficient mode of travel, the technology needs to be advanced, fool-proof and fail-safe. Reliable sensing, fast processing and communication will form the cornerstone of a safe and successful autonomous vehicle.

The ADAS forms the ‘brain’ of the vehicle. It consists of machine-learning algorithms and artificial intelligence running on hardware connected to the sensors and other systems within the vehicle.

It includes features such as collision warning and avoidance, blind-spot monitoring, lane-departure warning and more. The ADAS doesn’t completely replace the driver, but assists them with the task of driving.

Currently, the sensors and hardware that form part of the ADAS must send the data they collect back to the car’s central hub for processing. It is this to-and-fro that poses a major challenge for autonomous vehicles, for one main reason: it slows down reaction time.

But what if that processing could be done there and then?

 

The technology enabling this is edge computing: the ability to process data at the source, even when that data is large and complex.

BrainChip’s artificial intelligence software can be deployed on something as small as a chip, making it ideal for edge computing applications. Its technology is dubbed the Akida™ Neuromorphic System-on-Chip (NSoC).

The NSoC is a new breed of neural network accelerator. Neuromorphic computing works like a human brain: it emulates the biological function of neurons that communicate using spikes.
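
To give a sense of what “communicating using spikes” means in practice, below is a minimal sketch of a leaky integrate-and-fire neuron, one of the standard textbook models of spiking behaviour. It is purely illustrative: the function name, weight, leak and threshold values are hypothetical and do not describe how the Akida NSoC is actually implemented. The neuron accumulates weighted input, leaks charge over time, and only emits a spike (a 1 in the output train) when its membrane potential crosses a threshold.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) spiking neuron.
# Illustrative only; parameter names and values are hypothetical and
# do not describe the Akida NSoC.

def lif_neuron(inputs, threshold=1.0, leak=0.9, weight=0.3):
    """Return a list of 0/1 spikes, one per input sample."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = potential * leak + weight * x  # integrate input, leak charge
        if potential >= threshold:                 # fire when the threshold is crossed
            spikes.append(1)
            potential = 0.0                        # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Example: a burst of sensor activity produces a sparse train of spikes.
print(lif_neuron([0, 1, 1, 1, 0, 0, 1, 1, 1, 1]))
```

The appeal of this event-driven style is that neurons sit idle between spikes, which is the basis of the low power consumption associated with neuromorphic hardware.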

BrainChip’s Beachler said that “whilst there were a plethora of tech companies focused on driverless vehicles and ADAS, there weren’t any firms as advanced as BrainChip in developing and deploying a chip that would be able to process the sensor information quickly and with minimal power consumption – rapidly speeding up the ability of the ADAS to process data and enable real-time decision making.”

Speeding up processing times is key to real-time decision-making

 

Autonomous vehicles represent an enormous computing challenge – data captured by the car’s cameras and LIDAR scanners needs to be processed by the vehicle’s onboard computers.

The latency currently involved in sending this data back and forth is too high for any ADAS to make decisions in real time, and it drains power. This is where the Akida NSoC comes in.

Its next-generation AI capabilities eliminate the requirement to send and share data with a central hub: the chip is able to process and act upon data as it is fed in, in real time.

It is this form of computing at the edge that will be the driving force behind autonomous vehicles.

It will become more integral than the cloud, which is how most data is currently shared for processing. But even today, a decent network connection is never guaranteed.

If the ability exists to process and analyse the data locally in an on-board computing system, the driverless car can edge closer to becoming a reality at scale.

A further benefit of the BrainChip NSoC is that it will be an off-the-shelf product that can be programmed, making it applicable not only in autonomous vehicles but also in vision-guided technology such as drones and more.

Adding more intelligence to these vehicles is only likely to improve their capabilities and accelerate them into mass production.

 

This content is produced by Star Investing in commercial partnership with BrainChip. This content does not constitute financial product advice. You should consider obtaining independent advice before making any financial decisions.