The Sirens’ call

We see ourselves as sane, rational beings who can critically assess information. But we are flawed. Not only do we suffer from cognitive dissonance, we also fall under the spell of automation bias. Because their output is clean-cut, instant, and seemingly objective and neutral, machines can be perceived as truth-tellers, even when evidence shows the contrary.

Just like cognitive dissonance, automation bias sneaks up on us despite our critical thinking skills. Our brains have a natural tendency to save energy, and thereby effort. We rely on shortcuts to make sense of our surroundings, and these shortcuts grow stronger with familiarity and fluency. In other words, the more accustomed we become to a source of information, the more we tend to trust it, despite any disclaimer at the bottom of the page.

Generative AI output should be approached critically, as it confabulates (hallucinates) and contains biases. However, the more we use Generative AI, the more familiar we become with it, and the faster we create shortcuts. Its smooth, quick responses add fuel to the fire of automation bias. We know that, in general, AI output should not be trusted at face value, but surely this specific output must be true (and there is a deadline to be met).

Exposing children and teenagers to Generative AI makes them familiar with AI from an early age and thereby reinforces automation bias later in life. Critical thinking skills are not the saving grace. Consistent skepticism will help, but people have a natural tendency to skip effort and save energy. Just as with cognitive dissonance, we must admit that we are fallible and that our brains trick us into chickening out of deeper thinking. We need to remind ourselves and each other, continuously, that automation bias exists and that effort is required to overcome it. If not, the Sirens’ call will ultimately crash our cognition.