When does AI make understanding optional?

AI keeps shaving time off the front of the work, which is handy until nobody can explain why a decision was made. I'm seeing more cases where the output looks fine, but the person who used it can't really defend it later.

Has anyone found a good way to tell when that’s just efficient workflow and when it’s a sign people are skipping actual understanding?