ADAS Sensing Architectures – Distributed or Central?
With driver assistance and driving automation requiring multiple sensors and real-time processing, ADAS electrical and electronic system architectures face a growing scalability challenge. Indie Semiconductor is committed to supporting the industry with an innovative ADAS solutions roadmap, empowering system integrators and OEMs to make the architectural choices appropriate to their platform needs.
The societal benefits of greater driver, passenger and road user safety are well understood and documented, and governments are increasingly mandating advanced driver assistance systems (ADAS) in new vehicles (see example). Beyond the obvious human impact of safety systems, a recent McKinsey report estimates that ADAS already represents a $55-70 billion revenue opportunity today and, in combination with autonomous driving (AD), could collectively represent a $300-400 billion revenue opportunity by 2035.
To support the functionality and safety goals of higher levels of automation, both the number of sensors on the vehicle and the capacity to process their data in real time will need to increase greatly (refer to our blog In Pursuit of the Uncrashable Car). Both ADAS and AD leverage various sensor modalities (including computer vision, ultrasonic, radar and LiDAR; see also our blog discussing these) to gather and process data about the vehicle's exterior and interior environment.