Yes, for sure.
And to your point, computers/software are not and cannot be anticipatory. They are forever and always reactive to present input, even if the interval between that input and the reaction to the "present" is a billionth of a second. As you observe, the mere appearance (as in, "mimics or looks like, but isn't") of anticipatory behaviour is not truly anticipatory.
In terms of software engineering and the above, the question must be asked, "What is behaviour?"
To be humanly anticipatory is a philosophy-of-mind* notion, and until one understands and comprehends this, autonomous flight will always and foremost be dependent in some way: trapped in "the present" and therefore reliant on it.
It is these questions which we have not even seen posed in the discourse on autonomous flight, let alone a set of proposed solutions/answers which address such matters.
This doesn't mean that autonomous flight can't be done in the ways proposed, nor that these are qualifying questions which, once apparently solved, would allow autonomous flight to occur. We don't know what the qualifying conditions are yet.
It simply means that such questions, which involve an understanding of the mind, must be satisfactorily anticipated and addressed.
* Conditions for Fully Autonomous Anticipation
This paper, by John Collier, may be useful in understanding what is meant by this sentence. There are a few typos in the PDF, likely due to OCR errors.