My understanding is that the encryption key length is chosen to be as long as necessary to ensure that any attempt to break the encryption would require either massive computing power or so much time that the decrypted results would be unusable.
In the case of massive computing power that could quickly decode this encrypted data, it is generally known where such computers exist and what they are used for.
As for the time taken to decrypt the data, this is a useful protection in itself. Say you are protecting a conversation between two computers, and you know it takes 20 hours of processing on the highest-spec computer available to Joe Public to recover the key: then keeping your session on that conversation shorter than 20 hours means the key is retired before an attacker can brute-force it and read or hijack the traffic.
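The arithmetic behind that "20 hours" reasoning is easy to sketch. The snippet below is a back-of-envelope estimate, not a benchmark: the guess rate is a purely hypothetical figure I picked for illustration, and real attacks rarely need to search the full keyspace.

```python
# Hypothetical attacker throughput (guesses per second) -- an assumed
# figure for illustration, not a measurement of any real machine.
GUESSES_PER_SECOND = 1e12

def brute_force_hours(key_bits: int,
                      guesses_per_second: float = GUESSES_PER_SECOND) -> float:
    """Worst-case hours to try every key of the given length."""
    return (2 ** key_bits) / guesses_per_second / 3600

# Every extra bit doubles the attacker's work, so short keys fall fast
# while modern key lengths are far beyond any session's lifetime.
print(f"40-bit key:  {brute_force_hours(40):.6f} hours")
print(f"128-bit key: {brute_force_hours(128):.3e} hours")
```

The rule of thumb follows directly: if the estimated crack time for your key is N hours, the session (and its key) should live well under N hours.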
No security is foolproof, but staying on paper-based systems just to guard against the risk of quick theft of large amounts of data is not a good justification for delaying the introduction of new security technologies. What is needed is an architecture that allows the encryption level to be raised dynamically, rapid propagation of new keys in the event of key theft, and a public that is willing to help. I fear the latter is the most difficult part to implement.