Fancy that: Hacking airliner systems doesn't make them magically fall out of the sky


slfool
4th Mar 2020, 11:01
Fancy that: Hacking airliner systems doesn't make them magically fall out of the sky

Study finds most A320 pilots shrug, ignore dodgy systems and land safely

https://www.theregister.co.uk/2020/03/04/aviation_infosec_study_a320_systems_hack/
https://www.ndss-symposium.org/ndss-paper/a-view-from-the-cockpit-exploring-pilot-reactions-to-attacks-on-avionic-systems/

.Scott
4th Mar 2020, 12:01
They don't fall out of the sky - but the attacks did cause disruption.
Also, the pilots in the study said that the simulator scenarios were useful training exercises.

DaveReidUK
4th Mar 2020, 12:13
A cynic, or a student of recent history, might point out that you don't need to be a hacker to write code that makes an airliner magically fall out of the sky.

Airbubba
4th Mar 2020, 12:15
Here's a link to the original paper cited in the article above:

https://www.ndss-symposium.org/wp-content/uploads/2020/02/23022.pdf

fergusd
4th Mar 2020, 12:29
And in the latest news: Paper reveals that the entirely expected result is what happened, shocker (and uses 'hacking' clickbait wording completely unnecessarily, because nobody hacked anything) . . .

Superpilot
4th Mar 2020, 13:19
Some of you are missing the point and being a tad facile. The researchers behind the study are on a journey to help the aviation industry develop the next generation of airborne and ground equipment in order to secure aircraft-to-ground and aircraft-to-aircraft communications - something that is completely open to abuse/compromise today.

I attended a session and was given the TCAS hack twice.

The first time, I naturally did what I had to do. But, upon seeing that I was being directed towards the other aircraft, I reacted on instinct.
The second time, having learned from the first, I reacted against the commands more readily. However, it dawned on me that the first event had given me a lot more time to react compared to the second. The researcher could easily have made it more difficult and caused a collision, but what would have been the point of that? There is currently nothing in the Airbus or Boeing manuals for such a scenario, yet the TCAS "hack" can be done with equipment costing less than a couple of grand.

This is food for thought.

safetypee
4th Mar 2020, 14:39
Way back in the early days of TCAS, most of the airborne certification tests were flown against false targets - 'a hacked system' - external to the test aircraft in flight. I decline to explain how in an open forum, but it was very simple.
Even if a spurious flight deviation generated a hazardous conflict with a real aircraft, the TCAS logic has further protections.

Similar tests for EGPWS, but it was much more fun flying low level in Wales on a clear day - planning to miss the mountain tops.

Also, research flying into the vulnerability of VHF transmissions - ILS etc. - demonstrated the difficulty of generating inputs sufficiently confusing that both the technology and the human would be deceived simultaneously.
The test conclusions were 'fortuitously' confirmed when the aircraft experienced VOR interference from an illegal radio transmission near Sheffield - hot-wired / repeated from remote transmitters; the audio content could be heard on the intercom 'in tune' with the bearing deflection. Several instances / overflights were able to identify the location, which the police successfully raided.
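
For illustration only, a rough sketch of the kind of cross-bearing fix that several overflights give you: take the observer position and the measured bearing on each pass, and solve for the point that best fits all the bearing lines in a least-squares sense. The positions and bearings below are invented, not the Sheffield data.

# Illustrative only: least-squares cross-fix from bearing cuts taken on several passes.
# Flat-earth x/y frame in km; the observation values below are invented.
import numpy as np

def cross_fix(positions, bearings_deg):
    """Estimate the transmitter position from bearing lines.

    positions    -- (N, 2) observer positions, x = east, y = north
    bearings_deg -- N measured bearings to the source, degrees clockwise from north
    """
    positions = np.asarray(positions, dtype=float)
    theta = np.radians(bearings_deg)
    # A point P on bearing line i satisfies n_i . P = n_i . p_i,
    # where n_i is the unit normal to the bearing direction.
    normals = np.column_stack((np.cos(theta), -np.sin(theta)))
    rhs = np.sum(normals * positions, axis=1)
    fix, *_ = np.linalg.lstsq(normals, rhs, rcond=None)
    return fix  # best-fit (x, y) of the source

# Three made-up observations that roughly point at a source near (6, 6) km
obs_pos = [(0.0, 0.0), (10.0, 2.0), (5.0, 12.0)]
obs_brg = [45.0, 315.0, 170.5]
print(cross_fix(obs_pos, obs_brg))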

Lonewolf_50
4th Mar 2020, 15:46
For Safetypee: will the airline companies fund the kind of training that Superpilot mentions? Seems to be required in this modern day signals-saturated environment.

Gauges and Dials
4th Mar 2020, 15:59
There is a well-known electronic warfare technique (well enough known that it's not an issue to discuss it in an open forum) that involves persistent, sporadic, low-level disruption. The objective is not to cause systems to fail outright, or create catastrophic errors (e.g. to specifically cause two airplanes to fly into each other) but rather to erode operator confidence in the sensors, displays, and systems, making the whole works a lot less effective and resilient.

safetypee
4th Mar 2020, 16:17
Funding - need for training; depends on the threat. Theoretical or practical, where is the evidence-based assessment of the likely outcome? TCAS most likely, but the report appears not to have considered the complete system or actual / realistic conflicts.
Dual TCAS; anyone recall 'Two TCAS North of Darwin' mentality.

The industry resorts to training far too often - the universal solution, often without the required thought or justification. Focus safety on the overall operational environment, start with the hardware; only use training for proven high-risk threats without a technical solution.

Considering the 'big picture', what type of deviation could occur; context, situation, location.
What other aircraft systems might mitigate any threat.
What did the industry do before GPWS, TCAS; there were accidents, but check/compare risk, context, situation, etc.

Training; no, not justified.
Better to inform pilots and regulators that “The only thing we have to fear is fear itself.” - Avoiding danger is no safer in the long run than outright exposure.
How many instances of crews failing to follow valid alerts; and that's after training !

Denti
4th Mar 2020, 18:35
Well, in a world where we are trained to let the automatics do their things, including reacting to TCAS, a credible hack would be concerning of course.

tdracer
4th Mar 2020, 19:21
What Superpilot describes is quite scary - recall that one of the most serious mid-air collisions, the 2002 Überlingen accident over Germany, happened because one of the crews received conflicting instructions from air traffic control and from TCAS and chose to ignore TCAS.
Given the time-criticality of many TCAS alerts, forcing the pilot to decide whether TCAS is telling them the right thing or not could prove catastrophic.

Lonewolf_50
4th Mar 2020, 19:23
"Funding - need for training; depends on the threat. Theoretical or practical, where is the evidence-based assessment of the likely outcome?"
How many instances, and how many dead, will satisfy you that there is a need? Most of the lessons learned, and improvements made, in aviation were written in blood (CRM, for one). So far, it would seem that no instances are on record as a primary cause.

"TCAS most likely, but the report appears not to have considered the complete system or actual / realistic conflicts. Dual TCAS; anyone recall 'Two TCAS North of Darwin' mentality."
Which seems to be the system under discussion, yes.

"The industry resorts to training far too often - the universal solution, often without the required thought or justification. Focus safety on the overall operational environment, start with the hardware; only use training for proven high-risk threats without a technical solution."
I don't think that this is an either / or situation. (If you feel that the threat is overstated, that's fine.)

"Considering the 'big picture', what type of deviation could occur; context, situation, location. What other aircraft systems might mitigate any threat. What did the industry do before GPWS, TCAS; there were accidents, but check/compare risk, context, situation, etc. Training; no, not justified."
Really? What you just mentioned is stuff that calls for awareness, updating and, yes, training. (At what periodicity? No idea.)

"Better to inform pilots and regulators that “The only thing we have to fear is fear itself.” - Avoiding danger is no safer in the long run than outright exposure. How many instances of crews failing to follow valid alerts; and that's after training !"
Getting the word out and raising awareness is a form of training. I am not restricting the term "training" to "get in the sim."

Mixing such things into sim sessions would not hurt, in terms of the CRM/crew response to suspect inputs from the TCAS.

Denti
4th Mar 2020, 19:50
Mixing such things into sim sessions would not hurt, in terms of the CRM/crew response to suspect inputs from the TCAS.

That would be a serious change in philosophy regarding TCAS. Since Überlingen the training has been quite clear: always follow an RA. Nowadays it is not even the pilots doing that anymore, it is the autoflight system on its own, monitored by the pilots - and monitored only to intervene if the autoflight system does not follow the RA commands well enough.

Introducing ambiguity into the TCAS response has already killed quite a few people; re-introducing it would be quite hard to do at this point. I believe the current focus is more on making the systems harder to hack or to spoof, which would include things like certificate-based challenge-response, for example, but that is mainly a protocol issue.
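
To make that concrete, here is a minimal, purely illustrative sketch of a signed challenge-response exchange at the message level, using Ed25519 signatures from the Python 'cryptography' package; the message fields and framing are invented and are not any proposed avionics standard.

# Purely illustrative: a signed challenge-response so a spoofed or replayed reply fails verification.
# Requires the third-party 'cryptography' package; the reply fields below are invented.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Aircraft side: a key pair whose public half would be distributed via a certificate.
aircraft_key = Ed25519PrivateKey.generate()
aircraft_pub = aircraft_key.public_key()

# Interrogator side: send a fresh random challenge (nonce) with the interrogation.
challenge = os.urandom(16)

# Aircraft side: reply with its data plus a signature over challenge + data.
reply_data = b"ALT=35000;ID=ABC123"
signature = aircraft_key.sign(challenge + reply_data)

# Interrogator side: verify the signature before trusting the reply.
try:
    aircraft_pub.verify(signature, challenge + reply_data)
    print("reply authenticated")
except InvalidSignature:
    print("reply rejected")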

neilki
4th Mar 2020, 20:32
2 of the 3 scenarios require the immediate application of Memory Items (TCAS RA & EGPWS); for the third, a GPWS warning below 1000' is (in my company at least) an automatic Go Around.
What surprised me was that 100% of Pilots DIDN'T follow them, or took significant time to do so...
Failing to immediately execute a manufacturer or company memory item here is the beginning of a very bad day. FOQA trigger and a ride back to HQ. No need for cookies. You won't be eating....

triploss
5th Mar 2020, 06:18
What a hilariously misleading headline.
Sending misleading external radio signals is not "hacking airliner systems". That's providing bad inputs to aircraft systems.

If someone actually manages to hack into the avionics, in the conventional sense of gaining direct control of the system on board the aircraft, all bets are off, and you won't see pilots shrugging it off. That's a lot harder to achieve of course, and fortunately hasn't happened as far as we know. Nevertheless, the risk is always there with networked aircraft that can speak to the internet.

Flying Hi
5th Mar 2020, 09:06
What a hilariously misleading headline.
Sending misleading external radio signals is not "hacking airliner systems". That's providing bad inputs to aircraft systems.

If someone actually manages to hack into the avionics, in the conventional sense of gaining direct control of the system on board the aircraft, all bets are off, and you won't see pilots shrugging it off. That's a lot harder to achieve of course, and fortunately hasn't happened as far as we know. Nevertheless, the risk is always there with networked aircraft that can speak to the internet.

Thread starters that are clearly written as clickbait should be immediately pulled.

Denti
5th Mar 2020, 09:31
What a hilariously misleading headline.
Sending misleading external radio signals is not "hacking airliner systems". That's providing bad inputs to aircraft systems.

If someone actually manages to hack into the avionics, in the conventional sense of gaining direct control of the system on board the aircraft, all bets are off, and you won't see pilots shrugging it off. That's a lot harder to achieve of course, and fortunately hasn't happened as far as we know. Nevertheless, the risk is always there with networked aircraft that can speak to the internet.
Considering that "hacking" started with feeding an erroneous input into a phone system in order to use it for free, that is actually exactly the meaning of hacking. Nowadays it is more commonly understood as direct access to and modification of internal software, but in reality that is only a part of hacking. Successful hacking usually has quite a few other components, including social engineering, bad inputs and of course software agents.

Lonewolf_50
5th Mar 2020, 12:32
Introducing ambiguity into the TCAS response has already killed quite a few people; re-introducing it would be quite hard to do at this point. I believe the current focus is more on making the systems harder to hack or to spoof, which would include things like certificate-based challenge-response, for example, but that is mainly a protocol issue.

Thanks for spelling that out.