Big Brother is watching - News Item
Many of us are convinced that our smartphones and other connected devices listen to our every word and can serve us ads based on the keywords we utter. Now, a similar eavesdropping concept could be coming to the flight deck as airlines look at artificial intelligence-enabled technologies that can listen to pilot conversations, store the data for analysis and report “high-stress” events in real time to alert airline personnel on the ground to an unfolding emergency even if the pilots do not communicate with them.
One company developing such flight deck tools is NIIT Technologies, an India-based IT services company with offices in Princeton, New Jersey. The company, with 10,000 employees globally, has about 100 airline customers, including many major airlines in the United States. Madan Mohan, global head of travel and transportation for NIIT Technologies, explained that the technologies the company has created can be used by airlines for a host of purposes: leveraging AI to predict whether a crew will be delayed on the way to the airport when reporting for duty, determining whether a particular pilot is the right "fit" for the job, and monitoring pilot conversations to improve safety through flight operational quality assurance (FOQA) and real-time monitoring.

“Using our data technology, we can acquire the voice of the pilot while they are flying and use AI to differentiate between what is normal and expected conversation or determine if there is increased stress in the pilot’s voice,” Mohan said. Airlines, he said, are already testing the concept. The eventual goal is to monitor, store and perhaps relay cockpit conversations in real time, with AI algorithms used to sort the massive amount of data and pick out conversations that are problematic. That could include non-flying-related conversations pilots may be having in violation of sterile cockpit guidelines, Mohan said. “Pilots may need to be more mindful of the conversations they are having,” he said of the potential for such technology to land a flight crew in hot water with their employer.

Pilot unions undoubtedly will have major issues with the seeming invasion of privacy that such technologies present, and aviation regulators will have to grapple with the idea of inviting Big Brother into the cockpit.
Bring it on, I say; they'd need a big HR department to deal with all the AI-induced staff issues from "unnecessary" flight deck banter. :E
Well, that's frightening but no real surprise that these people think they actually gain something from such intrusiveness: wave the safety flag and all that stuff to justify it and convince Joe Six Pack the government goons have his best interest at heart. :ugh:
"...and aviation regulators will have to grapple with the idea of inviting Big Brother into the cockpit." Aviation regulators ARE Big Brother and they'd be more than happy to be there every minute...and recognize no stopping point in their lust for control. This is where unions earn their dues money by negotiating regulatory protections from these tyrants.
AI is going to learn a whole lot of new words and er...positions. Going to be even funnier when Pilots start talking about how to Disassemble the AI machines.
Going to be even funnier when Pilots start talking about how to Disassemble the AI machines.
Yet another reason to be glad I'm long retired
D4S
Closer to home than you might imagine - Data4Safety https://www.easa.europa.eu/newsroom-...nalysis-europe
Originally Posted by Herod
(Post 10377511)
"I'm sorry Dave, I can't allow that"
Yet another reason to be glad I'm long retired
:rolleyes:
The practicalities of grounding aircraft due to lack of crew etc. mean current airlines won't ever take this technology seriously. If a pilot wants some time off, they can just talk gibberish for a few minutes and wait for the disqualification message. Any game like that can be played two ways... what happens when both pilots in a two-pilot crew have been deemed unfit by the software? And how can it tell you will be late for work?
There's that buzzword again - AI - now overly used to float any crap idea.
Can't wait for F.A.B. to become standard phraseology.
I don't think WE have anything to fear this side of the border at present... other than perhaps raised frustration levels.
Just look for "Scottish voice recognition" & "Burnistoun" on U-Toob.
Seems like a solution looking for a problem. What is the documented SAFETY problem they are trying to solve? There isn't one.
What they can offer managers at airlines with this astroturfing product is the illusion of enhanced control of employees. The pathological need to control subordinates and turn humans into robots is in the DNA of managers everywhere; this is just a better tool to satiate that impulse. There are apps and software for office workers that monitor toilet breaks, time spent away from the desk, keystrokes, screen recording, etc. This is just an extension of that concept. To get the sales pitch over the line, they add the "stressed voice" safety case. Just a bogus-argument Trojan horse to get the technology into the flight deck; then mission creep will deliver the manager's nirvana.
deliver the manager's nirvana
CT, :ok: New tools (AI) encourage people to look for a problem where the tool can be applied, irrespective of relevance to the end objective: safety. Focusing on human conversation assumes that the human is a problem, and that because there is a problem the relevant contributing factors can be identified. Extensive research so far has yet to find a satisfactory understanding of human behaviour (for safety purposes); adding more 'big data' will not necessarily identify anything better, although it could provide an alternative view. The critical issue is who looks at that alternative view; if this involves the original assumption, then those people will find what they were looking for - human error - due to their own human bias. It may be more beneficial for aviation safety to record the assumptions made at management and regulatory levels, assess those conversations for patterns, and see if they match what actually happens in operation - how people at the sharp end create safety, vs those who think that they regulate safety.

megan, :ok: but managers may have the skill to let hot bricks cool, then use them to build a higher defensive wall.

D479, :ok: nothing to fear except fear itself.
Actually, it's not a dumb idea - as an example, measuring stress statistically airport by airport during landing might tell operators - and safety officials - which approaches and procedures are problematic and might need revising.
The ugly part, as usual, is the privacy invasion: recordings will get made that never get wiped, and in the end what should be a statistical quality-control tool will get turned into Big Brother. But then, if your phone and your car are snitches, why not your plane? Edmund
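To be clear about what I mean by the statistical use: something like the sketch below. This is purely hypothetical - no such per-flight "stress score" exists today, and the airport codes, numbers and threshold are all made up - but it shows how aggregated scores, rather than individual recordings, could flag a problematic approach:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-flight records: (destination airport, stress score in 0..1).
# All values invented for illustration.
flights = [
    ("INN", 0.81), ("INN", 0.74), ("INN", 0.69),
    ("AMS", 0.22), ("AMS", 0.31),
    ("LCY", 0.55), ("LCY", 0.62),
]

# Group the scores by airport.
by_airport = defaultdict(list)
for airport, score in flights:
    by_airport[airport].append(score)

# Flag airports whose average stress exceeds a (made-up) threshold;
# these approaches might warrant a procedure review.
THRESHOLD = 0.6
flagged = {a: round(mean(s), 2)
           for a, s in by_airport.items() if mean(s) > THRESHOLD}
print(flagged)  # only the tricky approach stands out
```

The point being that only the aggregate per airport leaves the system; the individual recordings never need to.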
2 hour sector recording:
20 minutes 'operational', 100 mins flaming the company.
Originally Posted by Oriana
(Post 10379756)
2 hour sector recording:
20 minutes 'operational', 100 mins flaming the company. Edmund
Actually, it's not a dumb idea - as an example, measuring stress statistically airport by airport during landing might tell operators - and safety officials - which approaches and procedures are problematic and might need revising.
Originally Posted by 601
(Post 10380512)
The PF's heart rate is normal and the PM's is through the roof. Which data set do you use for collection and research, or do you have a third crew member on board to determine which data to use?
I can see the burqa becoming popular, as with CCTV facial recognition it kills it stone dead.