Are Self-Driving Cars Safe?
Are self-driving cars safe? The software in modern cars contains more than 100 million lines of code, enabling features such as cruise control, speed assistance, and parking cameras. And the code within these embedded systems only gets more complex.
This trend will continue as cars of the future become more connected. Cars are increasingly dependent on technology, and they will progressively become more autonomous and ultimately self-driving. For this reason, it's important to learn about the security concerns with self-driving cars so you can answer the questions "are self-driving cars safe?" and "are self-driving cars safer than humans?"
Read along or jump ahead to the section that interests you the most:
Table of Contents
- But, Are Autonomous Cars Safe? And Are Self-Driving Cars Safer Than Humans?
- How Are Autonomous Cars Safe?
- Security Concerns: Are Autonomous Cars Secure?
- Why Are Automotive Compliance Standards Important?
- Are Self-Driving Cars Legal?
- Automotive Developers Will Always Need Compliance Tools
- Perforce Static Analysis Tools Effectively Manage Security and Safety Concerns with Self-Driving Cars
➡️ ensure self-driving cars are safe with Klocwork
But, Are Autonomous Cars Safe? And Are Self-Driving Cars Safer Than Humans?
Self-driving cars have the potential to be safer than human drivers, according to experts.
And autonomous vehicles are improving: A recent study published in Nature Communications compared accident data from 2,100 autonomous driving systems and 25,113 human-driven vehicles. The study found that self-driving cars had fewer accidents than human-driven vehicles in most comparable scenarios.
The data to support the use of self-driving cars makes sense — after all, human drivers get into millions of accidents each year. In 2022, there were around six million accidents in vehicles with human drivers. Less than one percent of those accidents were fatal.
However, other data from the study shows that self-driving cars have a long way to go before they become the norm.
For example, when the Nature study analyzed accident rates in other scenarios, it found that accidents involving advanced driver assistance systems (ADAS) were more frequent at dawn and dusk and when turning.
Furthermore, the Nature study acknowledges that because autonomous vehicles are still scarce, the pool of accident data is small. Meanwhile, the accidents that do occur and make headlines can erode public trust in fully autonomous vehicles. A Forbes legal survey found that 93% of respondents have concerns about self-driving cars, for example.
Infamously, in March 2018, an Uber self-driving test vehicle struck and killed a pedestrian who was walking a bicycle across the road in Tempe, Arizona. The vehicle failed to classify her as a person until it was too late to stop, and the human safety driver in the vehicle did not notice her in time to intervene.
It's likely to take some time before we see widespread use of self-driving cars.
In the meantime, vehicles driven by humans are using more and more ADAS technology.
How Are Autonomous Cars Safe?
Today, most new cars are equipped with ADAS.
ADAS includes:
- Lane tracking
- Autonomous emergency braking
- Enhanced vision systems
The systems delivering these functions rely on sensors and actuators that communicate over local in-vehicle networks and are controlled by microcontrollers.
Cars also communicate with each other. This is known as Vehicle to Vehicle Communication (V2V). They also communicate to the infrastructure — such as traffic lights, road signs, or satellites. This is known as Vehicle to Infrastructure (V2I) or V2X.
Enabling all of this is, of course, software. In addition to the application code there are operating systems and middleware — such as network communication stacks — as well as sensor, actuator, and display interfaces.
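To make this concrete, here is a minimal sketch, in C, of the kind of control loop that ties a sensor reading to an actuator command. Every type and function name in it (read_wheel_speed, set_brake_torque, and so on) is a hypothetical placeholder rather than a real vendor or AUTOSAR API; a production system would add timing, diagnostics, and redundancy on top.

```c
#include <stdint.h>
#include <stdbool.h>

#define WHEEL_SPEED_MAX_KPH  300U

/* Placeholder: reads the latest wheel-speed value from the in-vehicle network. */
extern bool read_wheel_speed(uint16_t *speed_kph);
/* Placeholder: requests braking torque from the brake actuator interface. */
extern void set_brake_torque(uint16_t torque_nm);

/* One step of a (greatly simplified) autonomous emergency braking function. */
void emergency_brake_step(uint16_t obstacle_distance_m)
{
    uint16_t speed_kph = 0U;

    /* Only act on data that actually arrived and passes a plausibility check. */
    if (read_wheel_speed(&speed_kph) && (speed_kph <= WHEEL_SPEED_MAX_KPH)) {
        /* Toy decision rule: brake hard if the obstacle is close at speed. */
        if ((obstacle_distance_m < 10U) && (speed_kph > 30U)) {
            set_brake_torque(2000U);
        }
    }
}
```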
For more information on ADAS, read our article on advanced driver assistance systems.
Security Concerns: Are Autonomous Cars Secure?
Vehicles with ADAS face more security concerns than ever. With the growth of V2X communication, cars are vulnerable to malicious attacks, and there have been reports of hackers taking control of cars and overriding the driver.
Most car manufacturers use On-Board Diagnostics (OBD). OBD provides access to various engine parameters for fault-finding and diagnostics at servicing.
Technical details of the connector interface — OBD II — are publicly available. There are a number of Bluetooth OBD connectors that enable anyone to access engine parameters using just a cell phone.
Clearly, this could expose the engine control system to anyone with good or bad intentions.
The University of Michigan recently put this to the test. They found that a direct laptop connection to the OBD interface could be used to override driver instructions.
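To illustrate how low the barrier is, the sketch below shows roughly how an engine parameter can be requested through an ELM327-style serial OBD-II adapter. The request "010C" and the RPM formula come from the public SAE J1979 parameter definitions; the serial port setup and error handling are stripped to the essentials, and the function name is illustrative only.

```c
#include <stdio.h>
#include <unistd.h>

/* Reads engine RPM through an already-opened ELM327-style serial port `fd`.
 * "010C" is the SAE J1979 request for mode 01, PID 0x0C (engine RPM).
 * The adapter replies with something like "41 0C 1A F8", and
 * RPM = ((A * 256) + B) / 4. Returns -1 on any failure. */
int read_engine_rpm(int fd)
{
    char buf[64] = {0};
    unsigned int a = 0U;
    unsigned int b = 0U;

    if (write(fd, "010C\r", 5) != 5) {
        return -1;
    }
    if (read(fd, buf, sizeof(buf) - 1) <= 0) {
        return -1;
    }
    if (sscanf(buf, "41 0C %x %x", &a, &b) != 2) {
        return -1;
    }
    return (int)(((a * 256U) + b) / 4U);
}
```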
Why Are Automotive Compliance Standards Important?
As ADAS grows and self-driving cars move toward becoming the norm, automotive compliance will remain important, and automotive standards will need to be followed.
ISO 26262 is perhaps the most important safety standard for vehicles. It focuses on the functional safety of electrical and electronic systems. And it applies to all activities within the lifecycle of safety-related systems. This includes requirements applicable to the quality of software.
Get an example using automotive hypervisors >>
The standard uses Automotive Safety Integrity Levels (ASILs) to provide a measure of risk. These range from A, the lowest safety integrity level, to D, the highest, which carries the most requirements.
The ASIL is determined from three risk parameters: severity, probability of exposure, and controllability. (A sketch of how these parameters combine appears after the list below.)
Controllability assumes that the driver:
- Is in an appropriate condition to drive.
- Has the appropriate driver training (a driver’s license).
- Complies with all applicable legal regulations.
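As noted above, the three parameters combine into an ASIL. The sketch below relies on a commonly cited shorthand for the ISO 26262-3 determination table: rate severity S1 to S3, exposure E1 to E4, and controllability C1 to C3, add them up, and map a sum of 10 to D, 9 to C, 8 to B, 7 to A, and anything lower to QM (quality managed, no ASIL). Treat it as an illustration, not a substitute for the standard itself.

```c
/* Shorthand reproduction of the ISO 26262-3 ASIL determination table.
 * Inputs are the numeric parts of S1-S3, E1-E4, C1-C3; S0/E0/C0 or
 * out-of-range values fall back to QM. */
typedef enum { ASIL_QM, ASIL_A, ASIL_B, ASIL_C, ASIL_D } asil_t;

asil_t determine_asil(unsigned severity, unsigned exposure, unsigned controllability)
{
    if ((severity == 0U) || (severity > 3U) ||
        (exposure == 0U) || (exposure > 4U) ||
        (controllability == 0U) || (controllability > 3U)) {
        return ASIL_QM;
    }

    switch (severity + exposure + controllability) {
        case 10U: return ASIL_D;   /* S3, E4, C3 */
        case 9U:  return ASIL_C;
        case 8U:  return ASIL_B;
        case 7U:  return ASIL_A;
        default:  return ASIL_QM;
    }
}
```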
📕 Related Resource: How To Comply With The ISO 26262 Standard
Are Self-Driving Cars Legal?
Self-driving cars can be legally tested in some countries. But there are more legal barriers that self-driving cars will need to overcome.
Laws will need to adapt to accommodate ADAS and self-driving cars. It’s critical that legislation accounts for what should happen in the event of an ADAS failure. ADAS will need to notify drivers and fall back to human control in these cases.
If the notification fails, the human driver may not be paying attention and won't be able to avoid harm, as in the case of the Uber self-driving car. If the fallback fails, the system may stay in control instead of allowing the driver to intervene and avoid harm.
Software Design Standards
The Society of Automotive Engineers (SAE) standard J3016 breaks driving automation into six levels, from no automation to full automation.
ADAS at SAE level three or higher rely on software to:
- Gather data from sensors.
- Create a model of the environment.
- Decide how to assist the driver or control the vehicle.
ADAS at these levels also determine whether sensors are functioning correctly, when to alert the driver, and when to trigger a fallback to human control.
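A minimal sketch of that alert-and-fallback logic might look like the state machine below. Every function in it is a hypothetical placeholder; real systems layer redundancy, timing constraints, and a minimal-risk manoeuvre on top, but it shows the shape of the decision the software has to make.

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { MODE_AUTOMATED, MODE_ALERTING, MODE_MANUAL } drive_mode_t;

extern bool sensors_healthy(void);       /* placeholder plausibility check   */
extern bool driver_has_taken_over(void); /* placeholder steering/brake input */
extern void warn_driver(void);           /* placeholder HMI alert            */

/* One periodic step: if sensor data is no longer trustworthy, warn the
 * driver; once the driver takes over (or a takeover timeout expires),
 * leave automated mode. */
drive_mode_t fallback_step(drive_mode_t mode, uint32_t *alert_ticks)
{
    switch (mode) {
    case MODE_AUTOMATED:
        if (!sensors_healthy()) {
            warn_driver();
            *alert_ticks = 0U;
            mode = MODE_ALERTING;
        }
        break;
    case MODE_ALERTING:
        if (driver_has_taken_over() || (++(*alert_ticks) > 100U)) {
            mode = MODE_MANUAL;  /* hand control back / trigger a safe stop */
        }
        break;
    case MODE_MANUAL:
    default:
        break;
    }
    return mode;
}
```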
Traffic Laws
Traffic laws will need to change to accommodate ADAS, particularly in the areas of liability and privacy. Every country has its own traffic laws.
North America
In the U.S., the National Highway Traffic Safety Administration has proposed a formal classification system that defines five levels of automation. At the lowest level, the driver must be in complete control of the vehicle at all times. At the highest level, the vehicle performs all safety-critical functions for the entire trip — and the driver isn’t expected to control the vehicle at any time.
At a state level, it varies. In 2011, Nevada was the first state to authorize self-driving car tests on public roads. Today, 29 states allow self-driving cars to be tested. In seven states — Nevada, Florida, Georgia, West Virginia, Utah, North Carolina, and North Dakota — fully autonomous driving is legal if the vehicle's AI is capable of SAE level 4 or 5.
Europe
A European research project, Automated Driving Applications & Technologies for Intelligent Vehicles, began in January 2014. It developed various automated driving functions for daily traffic, dynamically adapting the level of automation to the situation and driver status.
The project also addresses legal issues that might impact successful market introduction.
Vehicle & Road Automation (VRA) is a support action funded by the European Union. It aims to create a collaboration network of experts and stakeholders working on deployment of automated vehicles and related infrastructure.
Asia Pacific
The Japanese government is perhaps the closest to making self-driving cars a reality. It passed new legislation in time for the Tokyo 2020 Olympics so that self-driving transportation systems could operate under limited conditions in the vicinity of the Olympic village. However, a minor accident during the Games showed that safety improvements are still needed and that autonomous vehicles are not yet ready for normal roads and everyday use.
In China, autonomous driving legislation is making its way through major cities, including Shanghai and Beijing. China's legislation is quite flexible, so the government has more power to put the required changes in place. However, it will still need to deal with the same complex issues as other countries.
India
India is also thinking about autonomous driving but faces major challenges. One of them is the slow-moving legislation and the difficulty in imposing the expected rules because of its infrastructure.
Automotive Developers Will Always Need Compliance Tools
It is possible to develop safe and secure systems for vehicles. To comply with legislation and compliance regulations, automotive developers will always need smart tools, such as static code analyzers like Klocwork and Helix QAC.
Software Design
System security starts with how features are designed.
This may include using firewalls to maintain separation between safety-critical applications (such as steering and brakes) and less critical applications. This is especially important for those that communicate with the outside world (such as infotainment).
It also includes reducing or limiting communication, as well as checking and validating any data that is communicated.
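The "check and validate" part can be as simple as an allow-list plus range checks before any externally received data reaches safety-critical code. The frame layout and message ID below are invented purely for illustration.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical layout of a received frame (e.g., from a V2X or CAN gateway). */
typedef struct {
    uint32_t id;
    uint8_t  len;
    uint8_t  data[8];
} vehicle_frame_t;

#define MSG_ID_SPEED_LIMIT  0x120U   /* invented V2I speed-limit message ID */

/* Returns true only for frames the safety domain explicitly expects and
 * whose payload passes a plausibility check. */
bool accept_frame(const vehicle_frame_t *frame)
{
    if ((frame == NULL) || (frame->id != MSG_ID_SPEED_LIMIT)) {
        return false;   /* not on the allow-list */
    }
    if ((frame->len < 1U) || (frame->data[0] > 130U)) {
        return false;   /* missing or implausible speed-limit value (km/h) */
    }
    return true;
}
```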
Safe Embedded Code
Most automotive embedded software is written in C or C++.
You can use a coding standard to help ensure that this embedded code is safe and secure. MISRA offers guidelines for C and C++, and AUTOSAR offers additional guidelines for modern C++.
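As a flavor of what such guidelines ask for, the sketch below contrasts a casual conversion with a MISRA-style rewrite that uses fixed-width types, a named constant, and an explicit cast to document the intended narrowing. It illustrates the style only; it does not quote any specific rule.

```c
#include <stdint.h>

uint8_t scale_raw(uint16_t raw)
{
    /* Non-compliant style: implicit narrowing and a magic number.
     *     return raw / 257;
     * MISRA-style rewrite: fixed-width types, a named constant, and an
     * explicit cast that makes the intended narrowing visible. */
    const uint16_t divisor = 257U;
    return (uint8_t)(raw / divisor);
}
```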
📕 Related Resource: How to ensure safe code with AUTOSAR.
Perforce Static Analysis Tools Effectively Manage Security and Safety Concerns with Self-Driving Cars
Security is a growing concern among automotive developers, surpassing safety as the top concern in the latest State of Automotive Software Development report by Perforce.
Klocwork static application security testing (SAST) for C, C++, C#, Java, JavaScript, Python, and Kotlin identifies software security, quality, and reliability issues to safeguard your software against potential cybersecurity vulnerabilities.
See the difference that Klocwork can have on the security of your code.
➡️ register for a Klocwork free trial
Helix QAC for C and C++ keeps code compliant to coding standards like MISRA and AUTOSAR and functional safety standards like ISO 26262.
See the difference that Helix QAC can have on the safety and compliance of your code.
➡️ register for a Helix QAC free trial