Sensing Solutions: The Hidden Key to Autonomous Driving Reliability & Maturity
Intro
Autonomous driving technology has stepped out of laboratories and test tracks and onto real roads. Although the market briefly stalled due to regulatory and safety concerns, it has now entered a new phase, spurred by the launch of Conditional Driving Automation from major Original Equipment Manufacturers (OEMs) and institutional adjustments. From urban robotaxis to hands-off driving, technologies are moving beyond the realm of possibility into the sphere of experience.
Automakers, parts suppliers, and platform companies are in a heated race over who will be the first to hit the accelerator on commercialization. Expectations are rising among consumers, the logistics sector, and the shared mobility industry, while regulators and urban infrastructure developers are focusing their efforts on ensuring a safe introduction. In this transitional phase, the spotlight has turned to sensing solutions — the eyes of the vehicle — as the decisive factor in driving market transformation. This article discusses the role of strategic technology partnerships in driving advancements in sensing solutions, which are critical to realizing future mobility as autonomous vehicles move toward global commercialization.
The Reignited Race for Autonomous Driving: Beyond the Transitional Stage
Once regarded as a field of research and testing, autonomous driving is now entering a new chapter. Global OEMs and mobility technology companies are pushing commercialization forward and expanding consumer adoption, reshaping the market landscape while accelerating its realization. Level 3 Conditional Automation technologies — such as Mercedes-Benz’s DRIVE PILOT and Stellantis’s STLA AutoDrive — are entering the market. Meanwhile, Ford has expanded its BlueCruise service to 17 countries across North America and Europe, bringing hands-free highway driving to more than 420,000 equipped vehicles.
Tesla has launched robotaxi operations within geofence zones in Austin, Texas, while Waymo is expanding its services to more than ten cities across the United States, signaling a shift from pilot demonstrations to city-scale commercialization. Amazon’s autonomous driving startup Zoox has also announced the opening of a facility in Hayward, California, capable of producing 10,000 robotaxis annually, with plans to begin commercial operations in Las Vegas by the end of this year. Backed by an industry-wide consensus on the goal of commercialization, the sector is pushing through the transitional phase head-on and reigniting the race for autonomous driving.
Accurate Sensing: The Prerequisite for Autonomous Driving
At the core of autonomous driving lies accurate sensing. Just as human eyes and ears perceive their surroundings, autonomous vehicles must deliver precise and reliable sensing and decision-making across diverse environments. Sensors have become one of the indispensable components underpinning autonomous driving technologies, serving as the foundation for safety and reliability.
This trend is also evident at major global mobility events. From expert forums on sensor fusion to showcases at CES and Auto Shanghai, numerous advanced autonomous driving technologies and sensor-equipped vehicles have been unveiled. At IAA Mobility 2025, the world’s largest mobility exhibition, high-performance sensor technology was highlighted as part of an innovative approach shaping the future of mobility alongside Software Defined Vehicles (SDVs), Artificial Intelligence (AI), and big data.
▲ Data Flow in Autonomous Vehicle Systems
As autonomous driving levels advance, the number, types, and performance requirements of sensors continue to increase. Just as crucial is how quickly data can be interpreted and linked to vehicle control, which defines the maturity of the technology. In the autonomous driving system — comprising perception–decision–control — sensing solutions form the starting point that drives a vehicle’s physical movements and dynamic responses. AI algorithms and programming, built on data accumulated through extensive testing and driving experience, are also indispensable for enabling real-time decision-making and response. Ultimately, only when each technological element operates in precise coordination can the system function properly.
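As a rough illustration of how the perception–decision–control stages hand off to one another, the toy loop below sketches one tick of such a system. All class and function names here are hypothetical, and the logic is deliberately simplified; a real stack fuses many sensors and runs far richer planning.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One perceived object: distance ahead (m) and closing speed (m/s)."""
    distance_m: float
    closing_speed_mps: float

def perceive(raw_frames: list[dict]) -> list[Detection]:
    """Perception: turn raw sensor frames into object detections."""
    return [Detection(f["distance_m"], f["closing_speed_mps"]) for f in raw_frames]

def decide(detections: list[Detection], safe_gap_m: float = 30.0) -> str:
    """Decision: choose a maneuver from the perceived scene."""
    if any(d.distance_m < safe_gap_m and d.closing_speed_mps > 0 for d in detections):
        return "brake"
    return "cruise"

def control(maneuver: str) -> float:
    """Control: map the maneuver to an actuator command (acceleration in m/s^2)."""
    return -4.0 if maneuver == "brake" else 0.0

# One tick of the loop: a slower car 20 m ahead, closing at 5 m/s.
frames = [{"distance_m": 20.0, "closing_speed_mps": 5.0}]
command = control(decide(perceive(frames)))
```

Even in this skeletal form, the point in the text is visible: the quality of what `perceive` emits bounds everything downstream, and the latency of the whole chain determines how quickly the vehicle can respond.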
Sensor conditions also change with vehicle aging and varying driving environments, including vibration, cargo load, passenger shifts, or unexpected impacts. These factors reduce sensing accuracy and undermine the performance of Advanced Driver Assistance Systems (ADAS). To address this, diverse data points such as speed, temperature, time, altitude, and weather must be collected, followed by calibration that leverages cloud-based analysis to calculate and correct errors. This ensures that sensor deviations are compensated for in real time, maintaining optimal performance under ever-changing road conditions.
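The correction step can be pictured with a minimal sketch: compare a drifted sensor's readings against trusted reference measurements, estimate the average bias, and subtract it from new readings. The numbers and function names below are illustrative only; production calibration models many error sources, not a single constant offset.

```python
def estimate_bias(readings: list[float], references: list[float]) -> float:
    """Mean deviation between sensor readings and ground-truth references."""
    deviations = [r - ref for r, ref in zip(readings, references)]
    return sum(deviations) / len(deviations)

def calibrate(raw: float, bias: float) -> float:
    """Apply the estimated correction to a new raw reading."""
    return raw - bias

# A range sensor that has drifted: it reads ~0.5 m long against known targets.
readings   = [10.5, 20.4, 30.6]   # what the aged sensor reported (m)
references = [10.0, 20.0, 30.0]   # surveyed ground-truth distances (m)
bias = estimate_bias(readings, references)
corrected = calibrate(25.5, bias)
```

In the cloud-assisted scheme the article describes, the bias estimation would run server-side over fleet data gathered under varied conditions, with the resulting correction pushed back to the vehicle.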
Bridging Regulation and Technology: Towards Coexistence
Increasingly complex safety regulations are setting new standards and challenges for autonomous driving technologies. With the U.S. National Highway Traffic Safety Administration (NHTSA) mandating Automatic Emergency Braking (AEB) in all new vehicles starting in 2029, sensor-driven vehicle design has effectively become the industry standard for regulatory compliance. Vehicles must be capable of detecting obstacles accurately under conditions such as nighttime, high speeds, and harsh weather, while actively monitoring blind spots regardless of distance.
Alongside exterior components, industry attention is also turning to In-Cabin cameras that monitor passengers. The European Union plans to mandate Driver Monitoring Systems (DMS) starting in 2026, and similar measures are under review in the United States, Japan, and other major markets. High-value products are emerging that can serve multiple purposes inside the vehicle — ranging from videoconferencing and entertainment to infant monitoring — expanding the lineup of sensing solutions. Meeting complex safety regulations while integrating multiple functions into a single camera offers differentiated value by enhancing space efficiency and design flexibility. Here, regulation does not simply constrain technology; rather, it coexists with it, prompting adaptation that generates new value.
Autonomous Sensing Strategy – ① Sensor Fusion vs. Vision Only: A Defining Crossroad
▲ Functional distinctions across sensors
With increasing demand for sensors, the industry faces a strategic question: how best to combine the roles and strengths of each technology. The core autonomous driving sensors — cameras, radar, and LiDAR — operate on different physical principles to deliver accurate perception of the driving environment. Camera modules provide high-resolution imaging for fine-grained recognition tasks such as object classification and text reading. Radar, based on radio waves, reliably measures distance, direction, and speed even in adverse weather or low-visibility conditions. LiDAR, using laser pulses, captures three-dimensional spatial data and shapes, offering superior spatial awareness in complex urban settings.
Global OEMs are deploying distinct market strategies as they seek differentiated technological advantages. Approaches to sensor utilization largely fall into two categories: Sensor Fusion, which integrates multiple sensors, and single-sensor solutions. The industry is calling for a new standard: modular sensor architectures tailored to each OEM, service model, and driving environment, while making the most of different sensor types.
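One classic way sensor fusion exploits complementary strengths is inverse-variance weighting: each sensor's estimate contributes in proportion to its confidence, so a precise radar range pulls the fused value toward itself when the camera's estimate is noisy. The sketch below is a minimal illustration with made-up numbers, not any OEM's actual fusion algorithm.

```python
def fuse(estimates: list[tuple[float, float]]) -> float:
    """Inverse-variance weighted fusion of (value, variance) estimates.

    Less-certain sensors (larger variance) contribute less to the result.
    """
    weights = [1.0 / var for _, var in estimates]
    return sum(w * val for w, (val, _) in zip(weights, estimates)) / sum(weights)

# Hypothetical range estimates to the same object:
# the camera is noisier at long range; the radar measures distance precisely.
camera_est = (52.0, 4.0)   # (metres, variance)
radar_est  = (50.0, 1.0)
fused_range = fuse([camera_est, radar_est])  # pulled toward the radar value
```

The same principle generalizes: in fog, the camera's variance grows and radar dominates; at close range in daylight, the weighting shifts back. Vision-only strategies instead bet that a single modality, backed by enough data and compute, can keep its variance low across all of these conditions.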
Autonomous Sensing Strategy – ② SDV Transition: Redefining Sensor Competitiveness
▲ Evolution of vehicle E/E architecture
As autonomous driving advances and the number of sensors increases, the traditional Electronic Control Unit (ECU)-based distributed processing structure has begun to show its limits — causing communication delays, wiring complexity, and higher power consumption. In response, OEMs are shifting to centralized architectures built on high-performance domain controllers, laying the foundation for SDVs. More recently, the focus has moved beyond domain-based integration toward the Zone Controller (ZCU) model, which consolidates data by the vehicle’s physical zones. By enabling localized pre-processing of sensor data and reducing wiring, this approach is gaining traction as an SDV-optimized architecture. It enhances the accuracy of sensor fusion and real-time data processing, while also meeting requirements for functional safety and high-performance computing.
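The bandwidth argument for zonal architectures can be made concrete with a toy sketch: each zone controller reduces its local raw sensor streams to a compact summary and forwards only that to the central computer. Zone names and fields below are hypothetical placeholders.

```python
# Raw per-zone sensor data (illustrative): many samples per sensor per tick.
RAW = {
    "front_zone": {"radar_range_m": [49.8, 50.1, 50.0], "camera_objects": 3},
    "rear_zone":  {"radar_range_m": [12.2, 12.1, 12.3], "camera_objects": 1},
}

def zone_preprocess(zone_data: dict) -> dict:
    """Local pre-processing in the zone controller: reduce raw samples
    to the summary statistics the central computer actually needs."""
    return {
        "nearest_m": min(zone_data["radar_range_m"]),
        "objects": zone_data["camera_objects"],
    }

# The central computer receives one small record per physical zone,
# instead of every raw reading from every sensor in the vehicle.
central_view = {zone: zone_preprocess(data) for zone, data in RAW.items()}
```

The wiring benefit follows the same logic: sensors connect over short runs to the nearest zone controller, and only a few high-speed backbone links carry the condensed data to the center.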
This structural shift is also redefining what makes sensing solutions competitive. Beyond the precision of individual sensors, the decisive factor is how flexibly they integrate within the control architecture and how efficiently they interact with the system. Since sensing data must be fused and processed in real time, minimizing transmission latency and ensuring compatibility with communication standards will increasingly determine product competitiveness. In short, sensing solutions are moving toward integration and system-level optimization — closely aligned with the transition to SDVs.
Autonomous Sensing Strategy – ③ Compact and Integrated: Redefining Design Innovation
With sensor placement restricted by vehicle design and the need to minimize aerodynamic drag, demand for hardware design optimization is increasing. Sensors are typically placed in unobtrusive areas such as inside bumpers, behind rearview mirrors, or around headlamps. This has made compact, high-integration designs that deliver strong performance within smaller volumes more critical than ever.
Design capabilities that address physical constraints — such as heat dissipation, durability, spatial efficiency, and placement flexibility — are now essential. Moreover, the scope of competitiveness is expanding to include system-level optimization that manages the heat, interference, and power consumption challenges accompanying higher integration.
Leading the Next Mobility Era: A Total Solution Provider Ahead of Market Needs
The race in technology and infrastructure is no longer about whether it can be done. Once feasibility is proven, competitiveness depends on how advanced technology is and whether it can scale to mass production. As commercialization of autonomous driving comes into view, sensing solutions are increasingly expected to feature designs and manufacturing processes optimized for mass production, while maintaining stable quality at accessible price ranges. On the path to full autonomy, system integration capabilities that accommodate diverse sensor strategies and deliver high technical maturity have become critical. Thus, companies with strong technological expertise and robust product optimization will gain an edge as strategic partners. As camera, radar, and LiDAR technologies continue to evolve, the ability to combine them in ways that maximize their strengths will directly shape vehicle quality and determine competitiveness in SDV mass production.
Partners enabling customers to achieve their vision of future mobility must go beyond supplying parts and help shape that vision through technological innovation. Leading global suppliers are already working with OEMs from the earliest stages of vehicle development, building finely tuned processes that deliver both operational flexibility and stability in mass production. In the era of autonomous driving, mass-production optimization goes beyond hardware — it requires software interoperability, regulatory compliance, and the full capabilities of a total solution provider. This evolution will position such companies as indispensable partners in driving industrial transformation and reigniting the autonomous driving race.
Building on its expertise in smartphone camera modules and automotive components, LG Innotek is creating a sophisticated, combined sensor supply chain for the autonomous era. Its full sensing portfolio — covering ADAS and In-Cabin camera modules, radar, and LiDAR — provides the flexibility to meet diverse customer needs and sensor strategies. In particular, its advanced sensor fusion capabilities, which combine hardware and software to maximize synergies across sensor types, offer strong competitiveness in the autonomous vehicle market. Companies with wide-ranging portfolios, stable mass-production capabilities, and strong design expertise are set to accelerate the vision of future mobility as trusted partners alongside their customers.
▶Writer profile
Lee Jong-beom (Partner) Deloitte Korea
Lee Jong-beom, a Partner in the M&A Group within the Strategy, Risk & Transactions practice at Deloitte Anjin LLC, advises domestic and international clients — including automakers and parts suppliers — on mergers, acquisitions, investments, and transaction execution such as financial due diligence. He is also a member of the Automotive Supplier specialist team within the M&A Group and the Deloitte Asia Pacific Automotive Supplier Sector team, supporting clients in tackling complex challenges facing the industry.
※ This editorial reflects the author's opinion and does not represent the views or strategies of LG Innotek.