
View Full Version : Aviation Experts Urge Caution on Releasing Self-Driving Cars - WSJ July 27, 2016


airman1900
29th Jul 2016, 00:13
Interesting article from Wall Street Journal Interactive Edition, July 27, 2016 titled, "Aviation Experts Urge Caution on Releasing Self-Driving Cars"

Aviation Experts Urge Caution on Releasing Self-Driving Cars - WSJ (http://on.wsj.com/2aBYPTd)

Capn Bloggs
29th Jul 2016, 00:36
Airbus lecturing cars on automatics... Good one...

compressor stall
29th Jul 2016, 01:10
Probably because they are the company that leads the world in the human-automation interface, and they have ridden the automation journey into safety and out the other side into over-dependence on it - which they actively acknowledge.

They are probably the single best company to comment.

HamishMcBush
29th Jul 2016, 07:20
Bearing in mind that at the moment, the most dangerous parts of any journey by plane are the connecting road trips at the start and end of the trip...

Capn Bloggs
29th Jul 2016, 07:33
which they actively acknowledge.
while being dragged, kicking and screaming, into the real world where pilots were crashing their aeroplanes because of a lack of flying skill after a system went haywire...

I think the best "single" entity to comment is the hundreds of thousands of pilots around the world who know their skills have degraded badly but were/are not being listened to. "Out the other side", IMO, will come much more quickly with the typical joe-blow on the road.

But Tesla advises drivers to stay alert and keep their hands on the wheel in case they need to take over unexpectedly.
Living in cloud-cuckoo land, I'm afraid.

badgerh
29th Jul 2016, 09:55
I would have thought that the example of the aviation industry is a good one; automation has dramatically reduced the accident rate over the last decades.

Let's hope that legislators accept that "perfection" is impossible and that (semi) autonomous driving will do exactly the same for car travel; reduce deaths and serious injuries by an order of magnitude.

compressor stall
29th Jul 2016, 10:48
It will be interesting when self-driving cars are commonplace and the accident rate is shown to be lower.

Then insurance companies will start refusing cover when an accident happens in manual drive and is a case of driver error...

slast
29th Jul 2016, 11:05
The big lesson car makers need to learn from aviation is surely that humans are extremely bad at monitoring automatic systems that are doing routine things. It's easy now to automate simple things like keeping a distance. But if Tesla thinks there can be a permanent solution involving "advising drivers to stay alert and keep their hands on the wheel in case they need to take over unexpectedly" then they simply haven't got the message.

What could constitute a "need to take over unexpectedly"? The whole selling point of these systems is surely that the person behind the wheel (I hesitate to call him/her the "driver") can travel more safely while paying less attention. A system failure or encountering circumstances not covered by the sensor>reaction programming will indeed become extremely rare, but this rarity in itself will create further doubt/confusion and hence delay in the "driver's" mind before they can take over effectively - just as we have seen with aircraft automation.

With aircraft the best advice in this situation is often "don't do ANYTHING right away". The machine is probably in a relatively stable situation and you have time to look around and assess the big picture - probably what went wrong in the AF event. I can't imagine how that could apply to a road vehicle where routinely adequate separation is measured in inches and centimetres. So near-instant reaction is what Tesla seems to need, while the system has actually made that less feasible even for trained humans. And if anyone thinks all drivers will accept additional training over today's minimal standards in order to NOT drive their car, they are indeed living in cloud-cuckoo land.

In the interim these vehicles will for decades share a real world where 99.999% of other vehicles do not have this technology to collaborate with, and where roads and weather are not like the sunny days and highways of southern California.

So IMHO, with car automation responsibility will have to be all or nothing. Car autopilot on: manufacturer is responsible for all events, however "unfair" that may seem to the technology. And with the autopilot off, situation is as now.

Dont Hang Up
29th Jul 2016, 11:35
The thing that worries me most about car automation is that the time from automation dropout to accident could be literally one second. That is such a short window to stay alert for that one may as well be driving the car oneself anyway.

There are few aviation scenarios I can think of where one would not have tens of seconds or even minutes to work the problem. Autoland is a special case, but even there, there is no failure mode that could kill you within one second.
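For a sense of scale, here is a back-of-envelope sketch in Python; the speed, reaction time and deceleration figures are illustrative assumptions, not data from the thread.

```python
# Rough sketch: distance covered during a one-second takeover at motorway speed.
speed_mph = 70
speed_ms = speed_mph * 0.44704            # mph to m/s (~31.3 m/s)
reaction_s = 1.0                          # the "one second" window above
decel_ms2 = 7.0                           # hard braking on a dry road, approx.

reaction_dist = speed_ms * reaction_s             # travelled before braking starts
braking_dist = speed_ms ** 2 / (2 * decel_ms2)    # v^2 / 2a to a full stop
print(f"reaction: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {reaction_dist + braking_dist:.0f} m")
# ~31 m passes before the driver even acts, then ~70 m more to stop.
```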

Teevee
29th Jul 2016, 12:27
Haven't there already been one or two accidents? I would imagine that, irrespective of degrading skills, there is something else a pilot would know but a motorist will blissfully ignore: anything on 'auto' must still be monitored. Some of the quotes I've seen about these cars indicate that a lot of the 'drivers' seem to think taking a nap while the autopilot is on is perfectly OK.

RAT 5
29th Jul 2016, 12:36
One wonders about:
- the time between failures and the rate of failure per 1,000,000 hrs of driving;
- what component will be considered a critical failure;
- what systems will require backups, and what the activation time of the backup system will be following a failure;
- whether the cars will be fail-passive or fail-operational;
- what warning systems will be necessary;
- what type of training and licence will be required;
- what happens as technology marches ahead and you buy a new car after 10 years which is 2 generations more advanced and operates in a whole new manner;
- how granny (who doesn't even own a computer) will manage;
- maintenance and component changes - on condition, or lifed so as to maintain the certified failure rate?
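To put rough numbers on the failure-rate question, a minimal sketch in Python; the rate and the fleet exposure are invented for illustration.

```python
import math

failure_rate_per_hr = 1e-6    # assumed: one critical failure per million driving hours
fleet_hours_per_year = 4e9    # assumed annual driving hours across a national fleet

expected_failures = failure_rate_per_hr * fleet_hours_per_year
p_one_hour_trip = 1 - math.exp(-failure_rate_per_hr * 1.0)   # Poisson assumption

print(f"expected critical failures per year: {expected_failures:,.0f}")
print(f"chance of one on a one-hour drive:   {p_one_hour_trip:.1e}")
# Even "one per million hours" means thousands of events a year at fleet
# scale, which is why fail-passive vs fail-operational design and backup
# activation time matter.
```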
We know that once the techno/engineer boys get a head of steam up the rate of progress can be astonishing.
It does disturb me, watching outside my house, how poor parking technique is, and that's with sensors as well. One wonders about the basic teaching and the test. The worst offenders are the ladies who spend most of their driving in suburbia and who, in theory, have the most practice. Perhaps they are of the 'drive around to find a bigger space to drive into - never reverse' tribe. Solution? Not better training and testing, but auto-parking systems. Oh dear. Sounds familiar.
I have friends who often drive through the night for up to 14 hours on the motorways. They may even swap drivers every few hours. No need now; you can go to sleep after programming TomTom. Scary.
The deepening muppet mode in the competence & ability of drivers is scary. All this automated driving would take the fun out of whizzing round Paris or Rome in a mini spam-can.

Ian W
29th Jul 2016, 12:46
The deepening muppet mode in the competence & ability of drivers is scary. All this automated driving would take the fun out of whizzing round Paris or Rome in a mini spam-can.

Nothing like the Périphérique on a wet night :ok:

I wonder how long those remaining drivers with their cars in 'Direct Law' will take to work out that the automata will avoid them if they drive close enough. They will then start using the foibles of the automated systems to get ahead. Back to the Périph - it could get more exciting :eek:.

badgerh
29th Jul 2016, 12:50
slast, I think you have it right - in auto mode the manufacturer will be responsible. Volvo have already committed to that.

Failure modes and having the driver take over are not as critical as in aviation - after all you are already on the ground which does somewhat help. At least initially, cars will fail stopped. When they do not know what to do they will first slow up urgently and then, if the driver does not take control, stop. This might be irritating, to say the least, to other drivers but at least it is relatively safe.
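As a sketch of that degrade-then-stop behaviour: the states, handover window and logic below are invented for illustration, not any manufacturer's actual design.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTO = auto()       # normal automated driving
    DEGRADED = auto()   # fault detected: slow urgently, ask the driver to take over
    MANUAL = auto()     # driver has taken control
    STOPPED = auto()    # no takeover within the window: come to a halt

HANDOVER_WINDOW_S = 10.0   # assumed time allowed for the driver to respond

def next_mode(mode, fault_detected, driver_took_over, seconds_in_degraded):
    if mode is Mode.AUTO and fault_detected:
        return Mode.DEGRADED
    if mode is Mode.DEGRADED:
        if driver_took_over:
            return Mode.MANUAL
        if seconds_in_degraded >= HANDOVER_WINDOW_S:
            return Mode.STOPPED    # "fail stopped": halt rather than guess
    return mode

print(next_mode(Mode.AUTO, fault_detected=True,
                driver_took_over=False, seconds_in_degraded=0.0))  # Mode.DEGRADED
```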

MG23
29th Jul 2016, 13:45
Failure modes and having the driver take over are not as critical as in aviation - after all you are already on the ground which does somewhat help.

You usually don't get tailgaters in the air, or kids running out into the road chasing footballs at 40,000 feet. There will be very, very little time to react and take over in a car, which is why the whole idea is laughable. In the fatal Tesla crash, it appears the car didn't 'see' a truck turning across the road; the driver would have had to figure out that the car hadn't seen it, take control back, and brake to a stop before impact, with only a few seconds to act.

That's not gonna happen.

At least initially, cars will fail stopped.

Because stopping in the middle of the highway is just so safe.

BTW, even Google have said that actual self-driving cars are probably decades away. And they've been the biggest hypers so far.

Tourist
29th Jul 2016, 14:10
MG23

There are actual self-driving cars on the roads of the U.K. right now. They are currently required to have a driver ready to take over, but that won't last long.

MG23
29th Jul 2016, 14:19
There are actual self-driving cars on the roads of the U.K. right now.

If it has a steering wheel, it's not an actual self-driving car. It's just fancy cruise control.

Tourist
29th Jul 2016, 14:30
Britain leads the world in putting driverless vehicles on the roads (http://www.telegraph.co.uk/technology/2016/04/10/britain-is-head-and-shoulders-above-rivals-in-putting-driverless/)

All these people are idiots, obviously, whereas you are cleverer.

The Elon chap, total prat. What has he ever achieved?

Oh, wait a minute....

https://youtu.be/_ZXu_rYF51M

Watch after the 8-minute point; after all, rocket science going up is easy, even NASA can do that. Coming down, however...

Tourist
29th Jul 2016, 14:32
MG, you will note that your comment about Google saying they are decades away is total cr@p, incidentally. They reckon fully online by 2020.

CONSO
29th Jul 2016, 14:54
MG23

There are actual self-driving cars on the roads of the U.K. right now. They are currently required to have a driver ready to take over, but that won't last long.
" ....Please sit back and relax- nothing can possibly go wrong go wrong go wrong..."

That comment was in vogue at least by the 1960s, as we were working on the SST :-P

dogsridewith
30th Jul 2016, 13:36
It would be more nerve-wracking to truly monitor well enough to catch a Tesla Autopilot lane-tracking error than to just drive the darn car.
(Noting that one of the reported Autopilot wrecks seemed to be a lane tracking error on the Pennsylvania (USA) Turnpike.)
(Where maybe the most published error was not seeing a semi-trailer while the human was watching a Harry Potter movie...his last entertainment.)

Pace
30th Jul 2016, 14:24
Until a child runs out between 2 parked cars and gets killed or the automatics can't deal with sheet ice
Oh well, I will be able to go to the pub and tell the car to drive me home! When the police stop to breathalyse me I will say "Breathalyse George, not me, I ain't driving" ))

handsfree
30th Jul 2016, 14:42
or kids running out into the road chasing footballs at 40,000 feet

Don't be too sure about that

http://i829.photobucket.com/albums/zz216/poodlejumpy/index_zpsl1v85pth.jpg

Hempy
30th Jul 2016, 14:57
while being dragged, kicking and screaming, into the real world where pilots were crashing their aeroplanes because of a lack of flying skill after a system went haywire...

It's not Airbus's fault that airlines lower their training and skill requirements simply because the aircraft is 'safer to fly' via automation. Whilst RPT jet pilots 'monitor' more than they 'fly' these days, the fact is that no system is fail-proof. Pilots should be employed on their ability to handle systems going 'haywire' and to deal with the problem accordingly.

Sadly, instead, we see some airlines who are prepared to trust the 'systems' and pay peanuts to get monkeys. As we've seen, these people can't even fly a simple visual approach, let alone deal with the computer breaking, e.g. Asiana 214 (albeit a Boeing, but the principle still stands).

Dont Hang Up
1st Aug 2016, 11:15
Until a child runs out between 2 parked cars and gets killed or the automatics can't deal with sheet ice

Actually those are two situations that the automatics can potentially perform much better than the human driver.

Far quicker reactions for the former, and stability control able to apply differential braking to each wheel for the latter.

No, the risk comes when the automatics decide to say "you have control" for whatever reason, and the driver has perhaps one second to get fully back in the control loop and react to what will, almost by definition, be an unusual situation.

VP959
1st Aug 2016, 12:01
No, the risk comes when the automatics decide to say "you have control" for whatever reason, and the driver has perhaps one second to get fully back in the control loop and react to what will, almost by definition, be an unusual situation.

Reminiscent of me learning to fly (gliders) around 1978. Not long before going solo I was taken up in a Blanik whilst wearing a black blindfold. The glider was put through some fairly severe manoeuvres when we came off the tow and the instructor then leant forward, removed my blindfold and said "you have control". At a guess it took me around three to four seconds to work out we were in a spin***, and then maybe another second or two to think through the correct recovery action for the direction of the spin. I did get the "recovery from unusual attitudes" bit ticked off, though................

As said before, in an aircraft you pretty much always have a few seconds, maybe minutes, to resolve a problem. In a car you will have less than a second in many cases.

To work, car autopilot systems either have to require constant and conscious monitoring and input from the driver, or they have to be significantly more reliable than anything we've had in the way of autopilot systems before.

Tesla may be a leader in the field, but they are a long, long way from achieving the degree of reliability needed to make car autopilots work under all possible driving scenarios. The fact that there are only a relatively small number of Teslas on the road, for just a few years, and there have already been accidents shows that the technology is just not mature enough to rely on yet.

I've no doubt that, given time, the technology to deal with all driving scenarios may well achieve the reliability level needed, but I'd guess that we're a long way away from that at the moment, probably decades.


***For those that know that a Blanik will usually self-recover from a spin with the controls released, I should add that a tail weight had been added to ensure it kept spinning. Not something that's done now, but back then deliberately putting the aircraft outside the CofG limit for this part of the syllabus was pretty normal.

Cazalet33
1st Aug 2016, 12:30
It would be more nerve-wracking to truly monitor well enough to catch a Tesla Autopilot lane-tracking error than to just drive the darn car.

Your experience of the Tesla Autopilot is clearly very different to mine.

There is nothing "nerve-wracking" about it, other than the first few times you use the system.

When you run into road conditions with which the autopilot has difficulty, e.g. cones or scrubbed-out lines, you simply take over and drive in the normal way until you are in more benign conditions.

Never tried to watch a Harry Potter movie in the car though. Don't think I'll ever be that stupid.

Dont Hang Up
1st Aug 2016, 14:54
Your experience of the Tesla Autopilot is clearly very different to mine.

There is nothing "nerve-wracking" about it, other than the first few times you use the system.

When you run into road conditions with which the autopilot has difficulty, e.g. cones or scrubbed-out lines, you simply take over and drive in the normal way until you are in more benign conditions.

I'm afraid I still don't get it.

The computer is performing the control function whilst the human is having to monitor second by second for an abnormal situation. That is a complete reversal of the natural "skill set" of the human-machine interface.

Surely better for the human to stay in control with the computer providing a warning (such as vibration) for those moments of distraction or loss of concentration?

Cazalet33
1st Aug 2016, 19:42
I'm afraid I still don't get it.

Do you get the reason why we have autopilots in aircraft?

Do you get the reason why we sometimes fly coupled approaches?

I won't ask about Cat III Autoland because I've never used one, but I do get the idea.

Do you get the idea why many cars have cruise control? If so, you are well on your way to understanding why the Tesla autosteer is such a boon.

It's just an aid, but it's a bloody useful one and it significantly reduces fatigue when used properly.

VP959
1st Aug 2016, 20:20
I think the main issue is terminology. If Tesla promoted it actively as a safe steering AID, rather than an "autopilot", then there might be less confusion as to its capability and role.

It's a bit like flying a helo with heading hold, where you keep your feet off the pedals and concentrate on the other controls, or having a FADEC that means you don't have to worry about watching the torque or rotor rpm, as they're controlled for you.

These are really useful aids that reduce workload, much as Tesla autosteer reduces workload so that you can put more of the "little grey cells" into improving situational awareness, say at speed on a multi-lane road with a lot of other traffic around.

Cazalet33
1st Aug 2016, 21:07
VP, you've got it.

Teslas are not self-driving and they are nothing like autonomous.

Dont Hang Up
2nd Aug 2016, 05:01
I'm afraid I still don't get it.

Do you get the reason why we have autopilots in aircraft?

Yes I do get autopilots for aircraft.

Not only can one take one's hands off the controls, one can stop continuous monitoring of attitude, altitude and heading. One can then legitimately get on with other tasks and, on hearing autopilot-disengage, expect not to look up straight into the headlights of an oncoming truck.

And the difference is basically why I don't "get" lane control.

Krystal n chips
2nd Aug 2016, 05:54
" Not long before going solo I was taken up in a Blanik whilst wearing a black blindfold. The glider was put through some fairly severe manoeuvres when we came off the tow and the instructor then leant forward, removed my blindfold and said "you have control". At a guess it took me around three to four seconds to work out we were in a spin***, and then maybe another second or two to think through the correct recovery action for the direction of the spin. I did get the "recovery from unusual attitudes" bit ticked off, though "

That is "priceless"! ....You wouldn't care to name the location of this epic flight of lunacy, or the instructor, perchance?......Having done plenty of spins and recoveries from unusual attitudes, I have never heard of anything so potentially lethal and with no training validity whatsoever.

I can think of one contender however, given we both knew certain people from the past....initials are R.N.

VP959
2nd Aug 2016, 10:08
I think you are close - and yes, it was potentially dangerous and yet fairly common practice at that particular club.

The instructor was someone I did not get on with at all, probably the only instructor I've ever had, flying anything, where I can honestly say that I dreaded flying with him. The initials R and N do indeed come into the equation...................

I suppose that you have to bear in mind that light aircraft training also included spins, and demonstrating a recovery from one, at that time. It wasn't until a few years later that someone realised that there were more spinning accidents during spin recovery training than there were from any other cause. I believe that nowadays all that's allowed is a wing-low stall into an incipient spin, with the student recovering from that. No more are students subjected to three or four full rotations then given control and asked to recover the aircraft into level flight.

Cazalet33
2nd Aug 2016, 10:40
on hearing autopilot-disengage, expect not to look up straight into the headlights of an oncoming truck.

Let's leave aside the fact that Tesla recommend that their autopilot be used only on dual carriageways.

And the difference is basically why I don't "get" lane control.

The analogy I would use is that of an aircraft autopilot holding track. It is constantly calculating cross-track distance; rate of change of cross-track distance; wind correction angle; and magnetic variation. Of course the pilot can do that, but letting the automatics do the donkey work massively frees up the pilot's capacity to absorb the bigger picture that we call SA. The lane control of a Tesla does a similar thing. It massively reduces workload and very noticeably reduces fatigue.
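The donkey work in that analogy boils down to something like the toy controller below; the gains, clamp, units and sign convention are invented for illustration, not any real system's values.

```python
def steer_correction(xte_m, xte_rate_ms, k_p=0.08, k_d=0.35, limit_rad=0.1):
    """Toy PD law: steer to null cross-track error and its rate of change."""
    cmd = -(k_p * xte_m + k_d * xte_rate_ms)     # negative = steer left (assumed)
    return max(-limit_rad, min(limit_rad, cmd))  # clamp to a gentle correction

# 0.5 m right of the centreline and drifting further right at 0.2 m/s:
print(steer_correction(0.5, 0.2))   # small steering input back to the left
```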

The safety advantage is not hypothetical. Last time I looked the fleet had accumulated over 136,000,000 miles of autopilot experience in the wild. The stats show that you are significantly safer with the autopilot on than with it off.

dogsridewith
2nd Aug 2016, 14:01
The aircraft-automatics analogy fails because a Tesla Autopilot lane-tracking error would have to be noticed by the human and corrected in a fraction of a second if the car starts climbing one of the road-colored movable concrete barriers/dividers we enjoy very closely for miles during road repairs on interstates/dual-carriageways. (You see black scuffs on those things. I've always wondered what it is like to contact one at speed... might vary with steering dynamics, tire pressure, Hal's mood?)

Cazalet33
2nd Aug 2016, 14:55
That's the difference between an autopilot and an autonomous vehicle/aircraft.

[Some] Teslas have an Autopilot. None is autonomous. Huge difference there.

radeng
2nd Aug 2016, 15:37
I wonder how RFI-immune the cars will be after, say, 5 years of vibration etc.? Some of you may remember many years back when Volvo started using electronic ignition and the number of cars that stopped suddenly around Daventry and Rugby? (Probably around Rampisham and Woofferton too, but those areas are more rural.)

VP959
2nd Aug 2016, 16:11
I wonder how RFI-immune the cars will be after, say, 5 years of vibration etc.? Some of you may remember many years back when Volvo started using electronic ignition and the number of cars that stopped suddenly around Daventry and Rugby? (Probably around Rampisham and Woofferton too, but those areas are more rural.)
That's a very good point. I remember a BMW I had many years ago that emitted enough RFI to stop reception of Radio 4 in Cornwall, traced to the R4 frequency being a multiple of the engine management system clock frequency, IIRC. The same model BMW was also susceptible to cutting out near certain high-power transmitters, something that, I think, may have been the subject of a recall.
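That kind of coincidence is easy to check for; a minimal sketch, with both frequencies invented for illustration rather than taken from the BMW case.

```python
def nearest_harmonic(clock_hz, rx_hz):
    """Which harmonic of the clock lands nearest the receiver frequency,
    and by how much it misses."""
    n = round(rx_hz / clock_hz)
    return n, rx_hz - n * clock_hz

# Illustrative numbers only: a 16 MHz processor clock against a 96 MHz
# FM broadcast frequency.
n, offset_hz = nearest_harmonic(16.0e6, 96.0e6)
print(f"harmonic {n} misses by {offset_hz / 1e3:.0f} kHz")
# Here the 6th harmonic lands dead on 96 MHz, so any leakage at that
# harmonic sits right in the receiver's passband.
```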

Far more recently, Toyota have had significant problems with EMI from the systems on their hybrids. Two that I owned suffered from exceptionally poor FM radio reception, and it was a common complaint both to Toyota GB and on the Toyota forums. It's notable that the one I have now has lots of additional screening all over the place, with braided cables and screened connectors replacing what looked to be plain cables on the earlier models. They have also added a diversity front end to the radio, with a second antenna built in to one of the rear quarter lights, to try and improve radio performance (still not great though).

My worry would be over the software integrity, though. I remember having significant problems with a commercial turboshaft engine FADEC years ago, because the code had been written in C and there were no certified compilers at that time. The only way to verify that the code was safe was to spend a lot of time and money on getting someone to do a static code walkthrough.

Car manufacturers seem to have side-stepped many of the software reliability testing problems by using lots of separate processors, all running fairly simple code that can be tested by using all possible combinations of inputs and outputs to the processor sub-unit. Even then they get it wrong fairly regularly. My current car has been in for two recalls for software updates, one serious (as in don't drive until it's been done) and one routine (at the next service). That's in under three years, and it's a mature design (most of the car's components have been around for at least 6 years).
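That exhaustive approach only works because the input space of each sub-unit is tiny; a minimal sketch, with the function and its input ranges invented for illustration.

```python
from itertools import product

def brake_light(pedal_pressed, regen_level):
    """Toy sub-unit logic: lamp on when braking or regen is strong."""
    return pedal_pressed or regen_level >= 2

# With 2 x 4 = 8 possible inputs, every combination can be checked
# against an independently written reference model.
for pedal, regen in product([False, True], range(4)):
    assert brake_light(pedal, regen) == (pedal or regen >= 2)
print("all 8 input combinations verified")
```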

When you look at the work that goes into certifying safety-critical software on aircraft, then compare that with the approach taken with car software, there seems to be a worrying discrepancy in standards. This might be OK when the hardware can over-ride the software (brakes and steering for example), but drive-by-wire is becoming more common, and hardware over-rides are beginning to disappear (many cars now have no physical throttle control, for example).

Cazalet33
2nd Aug 2016, 16:53
My worry would be over the software integrity, though.

The Tesla Model S has already been comprehensively hacked.

http://www.youtube.com/watch?v=KX_0c9R4Fng

Capn Bloggs
2nd Aug 2016, 23:17
It seems somebody just thought of how to keep these driverless cars on track... smart roads, coming soon to a bitumen strip near you! $1000 to convert my Corolla: where do I sign?
No more driving by 2030: Telstra’s chief scientist

All vehicles on Australian roads will be driverless by 2030 and road builders must begin work to create smart roads that interact with them, says Telstra chief scientist Hugh Bradlow.

Dr Bradlow said his conservative and realistic forecast was based on the rate of autonomous car development, where 14 trials were under way in California, and falling costs of retrofitting driverless systems to existing cars, which would soon be in the $US1000 ($1310) range.

“My expectation is that governments will very quickly realise that they need to make them mandatory to help overcome the statistic that 90 per cent of road accidents are caused by human error,’’ he said.

Self-driving vehicles and smart roads, where surfaces are embedded with transponders and rechargeable batteries to relay information about road conditions, would likely move Australia’s annual road death toll from 1200 towards zero, he said, but governments needed to agree on safety and communications standards before building an intelligent road network.

Dr Bradlow, who spoke in Perth on Friday as part of a series of Australian Asphalt and Pavement Association workshops around Australia this month, said the combination of driverless vehicles and use of ride-share services such as Uber and GoGet would remove vehicle numbers from Australian roads and reduce road capacity.

Association chief executive Michael Caltabiano said autonomous cars would be a plus for governments spending $6 billion to $8bn a year on maintaining road surfaces.

“Everyone’s been talking about how driverless cars interact with the road but no one has talked about what condition the road needs to be in to service a driverless car,’’ he said.

“What I can say is the road construction industry will be ready when the technology is ready.’’

In April, South Australia was the first state to legalise controlled testing of autonomous vehicles on the state’s roads after hosting driverless car trials last year, while an inquiry launched by the NSW parliament is working to research what a “driverless vehicles regulatory framework” would involve for Australia.

Cazalet33
2nd Aug 2016, 23:27
Do they have roo bars on driverless vehicles in Oz?

Or would that be regarded as ostentatious?

VP959
3rd Aug 2016, 07:16
The above idea sounds as if it might be cheaper to turn the roads into railways. It's pretty easy to run autonomous trains; we've had a railway line here that runs with no driver for years, the Docklands Light Railway.

NutLoose
8th Aug 2016, 16:03
Well, this Tesla did well: it drove the driver to hospital, lol

Tesla car drives owner to hospital after he suffers pulmonary embolism - BBC Newsbeat (http://www.bbc.co.uk/newsbeat/article/37009696/tesla-car-drives-owner-to-hospital-after-he-suffers-pulmonary-embolism)

Cazalet33
8th Aug 2016, 16:56
Rather than call an ambulance, the lawyer decided to find a hospital using his car's self-driving mode.

I guess he has the little known Follow-Ambulance Mode option on his Model X.