Clear 2 X, #1387. An investigation should not fall back on blaming the human at any level of organisation, regulation, or design. The phrase ‘I do not understand’ should trigger the need to look elsewhere: at the process, the interactions, and the assumptions.
A significant safety fallacy is that because a complex situation has been managed on previous flights, it will be managed on subsequent ones; yet that judgement rests on assumptions, including a covert ‘I do not understand’. (Compare AF447, the 737 at AMS, the Asiana 777, and the CRJ in Sweden.)
Hindsight is a powerful bias. Regarding your view above, it might be better to ask whether the operator, maintenance, or customer support had any greater knowledge of the system than the pilots did.
It would be unfortunate if the manufacturer’s design philosophy of keeping the ‘pilot in control’ had in this instance been translated as ‘the pilot will always manage’. If so, that suggests a poorly applied assumption or a weak human-factors assessment, a drift from the original safety concepts.