Sussex study reveals how ‘blind insight’ confounds logic
By: Jacqui Bealing
Last updated: Friday, 14 November 2014

People can gauge the accuracy of their decisions, even if their decision-making performance itself is no better than chance, according to new University of Sussex research.
In a study, people who showed chance-level decision making still reported greater confidence about decisions that turned out to be accurate and less confidence about decisions that turned out to be inaccurate.
The findings, published in Psychological Science, suggest that the participants must have had some unconscious insight into their decision making, even though they failed to use the knowledge in making their original decision, a phenomenon the researchers call “blind insight.”
Lead author, psychologist Dr Ryan Scott, says: “The existence of blind insight tells us that our knowledge of the likely accuracy of our decisions – our ‘metacognition’ – does not always derive directly from the same information used to make those decisions. It appears our confidence can confound logic.”
Metacognition, the ability to think about and evaluate our own mental processes, plays a fundamental role in memory, learning, self-regulation and social interaction, and is markedly altered in certain mental illnesses and states of consciousness.
Consciousness research reveals many instances in which people are able to make accurate decisions without knowing it, that is, in the absence of metacognition. The most famous example of this is blindsight, in which people are able to discriminate visual stimuli even though they report that they can’t see the stimuli and that their discrimination judgments are mere guesses.
Dr Scott and colleagues at the University’s Sackler Centre for Consciousness Science wanted to know whether the opposite scenario to blindsight could also occur. He says: “We wondered: Can a person lack accuracy in their decisions but still be more confident when their decision is right than when it’s wrong?”
The researchers looked at the data of 450 participants performing a simple decision task. The participants first viewed a set of letter strings which, unknown to the participants, followed a complex set of rules specifying the order of letters.
They were then told of the existence of these rules and asked to classify a new set of strings according to whether or not they obeyed the same rules, answering yes or no. After each decision they had to indicate whether or not they had any confidence in their answer.
The researchers found that, while the majority of the participants were able to classify the strings with some accuracy, a large subset performed no better than if they had selected yes or no at random. However, looking at the confidence ratings for that subset of ‘random responders’ revealed that they were more likely to express confidence in their right decisions than in their wrong ones.
In other words, the participants had some sense of whether they were right or wrong, even though their classification decisions themselves were no better than chance.
“An everyday example might be trying to decide which of two routes to take on the tube,” says Dr Scott. “You pick what you think is the quickest route but the moment you get on the train you are sure you’ve made a wrong decision. How could that happen? Perhaps your original decision was largely influenced by the number of stops along the different routes, with fewer stops being favoured. But without you being aware, your subsequent confidence draws on something more, perhaps a forgotten previous experience with stoppages on one of those lines. That additional unconscious knowledge could mean that your confidence is often right despite your original decision being no better than chance.”
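The pattern Dr Scott describes can be illustrated with a minimal simulation (this is not the study’s actual analysis, and all numbers are hypothetical): a simulated participant bases each yes/no decision on an uninformative cue, while confidence also draws on a second cue that weakly tracks the correct answer. Accuracy stays at chance, yet confidence still discriminates correct from incorrect decisions.

```python
# Illustrative sketch only: decisions use an uninformative cue, confidence
# additionally taps a weakly valid "unconscious" cue. Parameters are hypothetical.
import random

random.seed(0)
n_trials = 20_000

confident_correct = confident_total = 0
guess_correct = guess_total = 0

for _ in range(n_trials):
    truth = random.choice([0, 1])             # does the string really obey the rules?

    decision = random.choice([0, 1])          # uninformative cue -> chance-level decisions

    # A second cue that matches the truth 60% of the time; confidence is
    # expressed when it agrees with the decision just made.
    extra_cue = truth if random.random() < 0.6 else 1 - truth
    confident = decision == extra_cue

    correct = decision == truth
    if confident:
        confident_total += 1
        confident_correct += correct
    else:
        guess_total += 1
        guess_correct += correct

overall = (confident_correct + guess_correct) / n_trials
print(f"overall accuracy:            {overall:.3f}")                               # ~0.50
print(f"accuracy when confident:     {confident_correct / confident_total:.3f}")   # ~0.60
print(f"accuracy when just guessing: {guess_correct / guess_total:.3f}")           # ~0.40
```

Under these assumptions the simulated participant answers correctly only about half the time overall, yet is right noticeably more often on trials where confidence was expressed – the same dissociation the researchers observed in their chance-level responders.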
Professor Anil Seth, co-author of the study, says: “Neuroscientists have known for a long time that people can make accurate decisions without metacognition, for example, without knowing when they are right or wrong. Blind insight reveals the opposite, that people can know whether or not they’ve made a correct decision, despite being unable to make the right one. This means that existing theories of metacognition are wrong, since they assume that our decisions and our confidence in them are based on the same information. Our results show this can’t be true.”
Notes for editors
- ‘Blind Insight: Metacognitive Discrimination Despite Chance Task Performance’, by Ryan B Scott, Zoltan Dienes, Adam B Barrett, Daniel Bor and Anil K Seth, is published in Psychological Science
- University of Sussex Press Office: Jacqui Bealing and James Hakner 01273 678888, press@sussex.ac.uk
- All data and materials have been made publicly available via Open Science Framework and can be accessed at https://osf.io/ivdk4/files/. The complete Open Practices Disclosure for this article can be found at http://pss.sagepub.com/content/by/supplemental-data. This article has received badges for Open Data and Open Materials. More information about the Open Practices badges can be found at https://osf.io/tvyxz/wiki/view/ and http://pss.sagepub.com/content/25/1/3.full.
- This work was supported by the Economic and Social Research Council (Grant No. RES-062-23-1975), an Engineering and Physical Sciences Research Council Leadership Fellowship to A. K. Seth (Grant No. EP/G007543/1), an Engineering and Physical Sciences Research Council Fellowship to A. B. Barrett (Grant No. EP/L005131/1), the European Research Council Collective Experience of Empathic Data Systems project (Grant No. 258749; FP7-ICT-2009-5), and a donation from the Dr. Mortimer and Theresa Sackler Foundation via the Sackler Centre for Consciousness Science.