How to apply ISO 26262 to the development of self-driving cars?

May 12th, 2021 – Reading time: 7 minutes

3 main challenges and how to solve them

Nowadays, everybody is talking about self-driving cars as the future. But how can self-driving cars be developed safely when there are still so many challenges that the current safety standard, ISO 26262, does not answer? I want to present the three main challenges and suggest how to approach them.

1. Challenge: Defining the relevant work packages when implementing machine learning algorithms

To identify the first challenge, ask yourself how the logic inside a self-driving car can learn to react to manifold situations and driving challenges when there is no fallback driver any more. The answer is machine learning: a deep neural network (DNN) is trained to act as the driver, so that the vehicle can take over the responsibility for making its own driving decisions.

However, ISO 26262 does not give specific guidance on what should be done to ensure safety when machine learning algorithms are implemented for a self-driving car. What are the relevant work packages that should be provided if machine learning algorithms make up a major part of the software development?

Solution: ISO/TR 4804 provides assistance

A suggestion was made in ISO/TR 4804, a technical report on safety and cybersecurity for automated driving systems. Its Annex B lists different safety artefacts that should be created (see Table 1). They are structured into four phases: ‘Define’, ‘Specify’, ‘Develop and Evaluate’ and ‘Deploy and Monitor’. These work products should be included when building the safety argument for the safety case.

Table 1: Example safety artefacts for DNN development steps (ISO/TR 4804, Annex B)

2. Challenge: Ensuring a self-driving car can still be considered safe after being updated

According to ISO 26262, a safety case must be created to provide sufficient evidence that a vehicle is reasonably safe to be released for production. But what if the producer improves the software of the self-driving car based on feedback obtained from field monitoring? The OEM will then want to upload the software ‘over the air’ to self-driving cars that are already in the field. How can it be guaranteed that the self-driving car can still be considered reasonably safe? ISO 26262 does not yet give answers to that.

Solution: Dynamic safety cases can better deal with results from field monitoring

One approach to that question is the creation of dynamic safety cases. UL 4600 gives guidance on how to create a dynamic safety case and what content it should address. Requirements are formulated for, e.g., autonomy functions, interactions with other road users, and lifecycle concerns. UL 4600 also emphasizes the use of the Goal Structuring Notation (GSN) to build a well-formed, yet dynamic, safety case.

Since the EU requires the use of event data recorders in motor vehicles from July 2022 (GSR (EU) 2019/2144), field monitoring will play an even bigger role in the development of self-driving cars. This can also be seen in the released ISO/DIS 21448 (SOTIF), which includes a clause on operation phase activities. If a risk identified in the field requires a modification of the safety concept (ultimately resulting in hardware/software updates of the vehicle fleet in the field), this may be easier to incorporate into a dynamic safety case. An updated dynamic safety case then requires a delta assessment of whether the affected safety goals are still achieved by the updated safety arguments and evidence. In any case, this is a topic that could be included in a future update of ISO 26262.

3. Challenge: Missing target values for false positive rates

The third challenge deals with the necessity to find a compromise between false positive and false negative rates. An example of a false positive event would be a plastic bag flying in front of a self-driving car: the internal logic interprets the object as an imminent threat, causing emergency braking with the possibility of a rear-end collision. An example of a false negative incident: certain vehicles did not brake when approaching a stationary fire truck and finally crashed into it without any speed reduction. The internal logic may have interpreted the fire truck as part of the environment with no impact on the driving path, like billboards, bridges or tree branches hanging over the lane. The problem is: the more you try to avoid false positives, the more frequently you will obtain false negative events, and vice versa. However, quantitative target values for acceptable false positive rates are not published in ISO 26262 or in any other automotive standard.
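The trade-off described above can be sketched with a toy detector: shifting the decision threshold only exchanges false positives for false negatives, it cannot reduce both at once. The detection scores below are invented purely for illustration and do not come from any real system.

```python
# Toy illustration of the false positive / false negative trade-off.
# Each sample is a (detector_score, is_real_hazard) pair; the scores are
# made-up numbers, not data from any real perception stack.
samples = [(0.1, False), (0.3, False), (0.45, True), (0.5, False),
           (0.6, True), (0.7, False), (0.8, True), (0.95, True)]

def rates(threshold: float):
    """Count false positives and false negatives at a given threshold."""
    fp = sum(1 for score, hazard in samples if score >= threshold and not hazard)
    fn = sum(1 for score, hazard in samples if score < threshold and hazard)
    return fp, fn

# Raising the threshold trades false positives for false negatives:
for t in (0.4, 0.55, 0.75):
    print(t, rates(t))  # (2, 0) -> (1, 1) -> (0, 2)
```

Whatever threshold is chosen, one error type is bought at the price of the other, which is exactly why agreed target values for acceptable false positive rates would be so valuable.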

Solution: Set a benchmark for false positive rate target values in the third edition of ISO 26262

A publication of such target values, or a guideline on how to approach them, would be highly appreciated. A first starting point for deriving such target values could be the FIT rates stated in Part 5 of ISO 26262 for random hardware failure targets. Of course, these values relate only to the failure of a hardware part. A false positive, in contrast, is a unique misinterpretation of a driving situation and thus rather a software algorithm failure.
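To make the FIT-rate starting point concrete, here is a back-of-the-envelope sketch of what adopting such a budget for false positives would imply at fleet scale. The ASIL figures are the random hardware failure targets commonly cited from ISO 26262-5; applying them to false positives is this article's speculation, not part of the standard, and the fleet parameters are invented assumptions.

```python
# FIT = failures per 10^9 operating hours. The ASIL budgets below are the
# random hardware failure targets commonly cited from ISO 26262-5; reusing
# them as false-positive budgets is speculative, not taken from the standard.
FIT_TARGETS = {"ASIL B": 100, "ASIL C": 100, "ASIL D": 10}

def expected_events(fit: float, fleet_size: int, hours_per_vehicle: float) -> float:
    """Expected number of events across a fleet for a given FIT budget."""
    total_hours = fleet_size * hours_per_vehicle
    return fit * total_hours / 1e9

# Assumed fleet: 100,000 vehicles, 400 driving hours each per year.
for asil, fit in FIT_TARGETS.items():
    print(asil, expected_events(fit, 100_000, 400))
```

Even a strict 10 FIT budget would still permit roughly 0.4 false positive events per year in this hypothetical fleet, which shows why such targets must be debated industry-wide rather than set by a single OEM.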

Additionally, false positive rates may be linked to robustness requirements. For an object recognition algorithm, for example, the requirement could state that a false positive is acceptable if it appears in only a limited number of subsequent camera frames. Only if something is classified as a potentially hazardous object for longer than the fault tolerant time interval may reactions like emergency braking be permitted. This makes the algorithm more robust against single false positive detections.
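The debouncing idea above can be sketched in a few lines: a braking reaction is released only once a detection has persisted for longer than the fault tolerant time interval. The frame rate and interval values are illustrative assumptions, not figures from any standard.

```python
# Sketch of the frame-debouncing robustness requirement described above.
# Parameter values are illustrative assumptions, not from any standard.
FRAME_PERIOD_S = 0.05   # assumed 20 Hz camera
FTTI_S = 0.2            # assumed fault tolerant time interval

class HazardDebouncer:
    """Release a braking reaction only after persistent detections."""

    def __init__(self, ftti_s: float = FTTI_S, frame_period_s: float = FRAME_PERIOD_S):
        # Detection must persist strictly longer than the FTTI.
        self.required_frames = int(ftti_s / frame_period_s) + 1
        self.consecutive = 0

    def update(self, hazard_detected: bool) -> bool:
        """Feed one frame's detection result; return True when braking is allowed."""
        self.consecutive = self.consecutive + 1 if hazard_detected else 0
        return self.consecutive >= self.required_frames

# A single-frame glitch (e.g. a flying plastic bag) resets the counter
# and never triggers braking; only a persistent hazard does:
d = HazardDebouncer()
print([d.update(x) for x in [True, False, True, True, True, True, True]])
```

With these assumed values, five consecutive hazard frames are required, so the isolated first detection is suppressed and braking is released only at the end of the persistent run.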

Conclusion: There is more than one standard to consider when developing self-driving cars safely

We can see that, apart from ISO 26262, there are additional guidelines and standards, like ISO/DIS 21448, ISO/TR 4804 and UL 4600, which will affect and regulate the development of self-driving cars in the future. Annex D of ISO/TR 4804 lists proposed ISO standards that apply to different aspects of automated driving systems. Since the publication of a third edition of ISO 26262 is not expected before 2026, the industry will need to watch upcoming publications to keep track of the most recent developments in the field of self-driving cars.

Author

Sascha Hackmann
Safety Management Expert

Core Competences

  • Functional Safety Management
  • System Safety Development
  • Market Research & Development
  • Reliability Management


INVENSITY Competencies

CONSULTING

Accelerate your development

CAREER

Let’s make things better

© Copyright 2007 – 2020   |   All Rights Reserved 
