Victor Haydin
Blog post

Three Things Automotive Industry Should Learn from Recent Boeing 737 MAX Crash

Discover how aerospace challenges can contribute to the development of the automotive industry

March 18, 2019

6 mins read


On March 10, 2019, Ethiopian Airlines Flight 302 disappeared from radars just six minutes after takeoff. Within a couple of hours, the worst expectations were confirmed: the aircraft had crashed into a field 62 kilometres east of the departure airport. All 157 people on board died. It was a new aircraft, just four months in service, piloted by a well-trained and very experienced captain. And it represented the latest and most sophisticated generation of the legendary Boeing 737 series: the MAX. It was the same aircraft type as the one that crashed into the Indonesian sea five months earlier, killing all 189 people on board.

The investigation is still underway as I write these lines, but multiple industry professionals are pointing in the same direction while looking for the cause: flight control software. Think ADAS, just in an aircraft rather than an automobile.

One may argue that autopilot systems in an aircraft are very different from ADAS/AD systems in a car, and that the usage patterns are very different as well. But I still firmly believe this is a case the automotive industry should learn from.

Read more: Learn which advanced safety features increase safety and convince people to adopt self-driving vehicles

Some technical details of what actually happened with that Boeing (still to be confirmed by the official investigation)

The 737 MAX is the fourth and newest generation of the iconic Boeing 737 series: flying since the late 1960s and produced in more than 10,000 units, more than any other passenger airliner in history. To make the new generation as fuel efficient as possible, Boeing had to install new, larger engines. As a result, they had to be mounted a bit differently than on the previous generation.

Check out how much bigger the engine is and how it covers the wing’s edge:

This advancement significantly changed the aerodynamics of the aircraft.

During flight testing in 2016, Boeing found that the aerodynamic changes could cause the aircraft to stall in certain conditions. Since it was too late and too complicated to alter the aircraft’s aerodynamics, Boeing decided to introduce MCAS, a system that helps pilots recover the plane from a stall. It’s very similar to the ADAS features in a vehicle that brake, steer, or keep the car on the road. MCAS automatically activates when speed is low and the angle of attack is above a specific value, and it points the aircraft’s nose down so that the plane starts to accelerate.
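In rough terms, the activation condition described above boils down to a simple check. Here is a minimal sketch; the threshold values and function names are my own illustrative assumptions, not Boeing’s actual parameters:

```python
# Hypothetical sketch of an MCAS-style activation check.
# Thresholds are invented for illustration; the real system uses
# certified flight-test values and actuates the horizontal stabilizer.

AOA_THRESHOLD_DEG = 14.0   # assumed angle-of-attack limit
LOW_AIRSPEED_KTS = 230.0   # assumed "low speed" boundary

def should_push_nose_down(airspeed_kts: float, aoa_deg: float) -> bool:
    """Return True when the aircraft looks close to a stall:
    low airspeed combined with a high angle of attack."""
    return airspeed_kts < LOW_AIRSPEED_KTS and aoa_deg > AOA_THRESHOLD_DEG
```

Notice that a single faulty angle-of-attack reading is enough to flip this condition to True in perfectly normal flight, which is exactly the kind of failure suspected in these crashes.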

In both crashes, the aircraft had just left the airport, and such a manoeuvre effectively sent it into the ground. The pilots got warnings and tried to bring the plane back to the right trajectory, but didn’t manage to do so.

In the first case, the angle-of-attack sensor was to blame: logs showed that two redundant sensors were reporting different values, and MCAS decided the aircraft was stalling. In such situations, pilots have to turn the automation off. It seems, though, that they didn’t identify the cause correctly, despite the captain and the first officer reporting that the two sensors showed different values.

I will stop here, since I am writing about the automotive, not the aerospace, industry. If you are interested in more details, check one of the Bloomberg articles, for example.
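A basic defensive pattern for this situation is to cross-check the redundant sensors and hand control back to the human when they disagree, rather than trusting either reading. A minimal sketch, with the tolerance value and function names invented for illustration:

```python
def sensors_disagree(aoa_left_deg: float, aoa_right_deg: float,
                     tolerance_deg: float = 5.5) -> bool:
    """True when two redundant angle-of-attack vanes differ by more
    than the allowed tolerance, i.e. at least one of them is suspect."""
    return abs(aoa_left_deg - aoa_right_deg) > tolerance_deg

def safe_aoa_or_disengage(aoa_left_deg: float, aoa_right_deg: float):
    """Return an averaged reading, or None to signal that the
    automation should disengage and alert the crew."""
    if sensors_disagree(aoa_left_deg, aoa_right_deg):
        return None  # can't trust the data: disengage, don't act on it
    return (aoa_left_deg + aoa_right_deg) / 2.0
```

The point of the sketch is the None branch: when redundant inputs diverge, the safe behaviour is to stop acting on them, not to pick one and keep trimming.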

If you’re searching for turnkey solutions that will advance autonomous driving, contact Intellias
Get in touch

So, what precisely can the automotive industry learn from what happened?

Autonomous systems might already be way too complex to make them 100% reliable by design.

Mind you, MCAS worked exactly as it was designed to. There was no bug in the code itself. There wasn’t even a bug in the system design per se. It was a complex combination of aerodynamics, software, hardware, human factors, Boeing’s marketing strategy, and the pilot training process that led to these unfortunate events.

Boeing is not unique here. Any company building a product of that complexity and scale under today’s market pressure could have been in its place. Can one imagine Airbus, Audi, Toyota, or GM in this situation? Absolutely! And aeroplane avionics systems are much simpler than autonomous driving systems. There are approximately 6.5 million lines of code in the Boeing 787 Dreamliner (which is more complicated than the 737). What about a Waymo car? 100 million lines of code, according to some sources.

Can you imagine a human dealing with this level of complexity?

Let’s face it: humans are not a 100% reliable backup for automation.

If well-trained professional pilots can’t take control of an aircraft after the automation fails, why do we think regular drivers will be able to do the same with their cars? This is where the whole concept of Level 3 autonomous driving falls apart, and Level 4 starts to crack.

Imagine your 60-year-old aunt behind the steering wheel of a Level 3 car moving on a highway and staring at a message on the dashboard: “Sensors can’t detect road markings ahead. Take over control in 10… 9… 8…“ Can you make drivers pass formal training and an exam before they are allowed to activate the Level 3 autopilot? Yes. Can you make sure they remember what to do a year later, especially if they have never been in such a situation before? Forget about it. Waymo might be right to build Level 5 vehicles and skip Levels 3 and 4: human drivers may cause more problems than they solve, especially at scale. But even with Level 5, humans are in the car and all around it. Which leads us to the following.

Simulation should take into account the imperfect human in the loop.

The aerospace industry is a long-standing leader in using simulation to test its products. Still, despite millions of simulated flights, this scenario either wasn’t detected or wasn’t taken seriously. Technically, we can classify this case as human error: after the first crash, Boeing reminded pilots of the previously issued instruction to turn the automation off if the angle-of-attack sensors work improperly. But no one cares about that today, after 346 people have died. Now, in your simulated drives, do you expect a human driver to panic during disengagement, do something illogical, or forget what they have to do? You should, because that’s what humans do every time.
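One way to bake this into simulation is to model the driver’s reaction as a random variable with heavy tails, panic included, instead of assuming a fixed “takes over in two seconds”. A toy Monte Carlo sketch; every distribution and number here is an assumption I made up for illustration:

```python
import random

def takeover_failure_rate(trials: int = 100_000, warning_s: float = 10.0,
                          panic_prob: float = 0.2, seed: int = 42) -> float:
    """Fraction of simulated disengagements in which the driver fails
    to take over before the warning countdown expires."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        # Assumed baseline reaction time: roughly normal, floored at 0.5 s.
        reaction_s = max(0.5, rng.gauss(4.0, 2.0))
        # A fraction of drivers panic, which adds a long extra delay.
        if rng.random() < panic_prob:
            reaction_s += rng.uniform(3.0, 8.0)
        if reaction_s > warning_s:
            failures += 1
    return failures / trials
```

Even with these made-up numbers, a noticeable fraction of simulated drivers misses a ten-second window, while a simulator that assumes a perfectly attentive driver would report zero failures.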

Large-scale data collection from real-world situations might be the only way to model this behavior. And when I say “large-scale”, I seriously doubt that one OEM can do this alone, even one as large as GM or VW. The whole industry should create an ecosystem that collects data from every possible source and shares it.

The bottom line

Of course, this is no reason to stop thinking about building Level 3 and above systems. Market demand is rising, and technology finally seems advanced enough to keep up with it. So the race is on, and eventually we will have autonomous vehicles on the road. Much like aircraft autopilots, ADAS/AD saves more lives than it takes. Still, this is probably the first time in history that we intend to roll out an automation product with this level of potential impact on human lives. Statistically, people will die, and every time we will ask: is automation or the human to blame here? The answer will quite often be “both”, and we will have to learn how to deal with that.


Contact us to speak with one of our automotive experts to get more information about ADAS software development and insights on how to implement ADAS features and algorithms in your projects.
