ConsciousComputation: Computational components of conscious awareness

The MetaLab, led by PI Steve Fleming, has been awarded an ERC Consolidator Grant, “ConsciousComputation”. This five-year project is supported by UKRI under the UK government’s Horizon Europe funding guarantee and runs from 2023 to 2028.

Background

At any one moment, we are aware of only a fraction of the information processed by our brains. We may be aware of the detail of a visual scene, a painting, or the words on a page, and yet remain unaware of our digestion, our breathing, or the feel of the clothes on our skin.

Human conscious awareness can vary independently of behavioural performance, as in neurological conditions such as blindsight, or in healthy observers near the perceptual threshold.

What all these examples suggest is that even when conscious level is held constant, and even when we are awake and attentive, the contents of awareness may vary, and do so in ways that are independent of other aspects of mental function. This capacity for awareness of mental states is central to human subjective experience, and understanding its neural and computational underpinnings is a fundamental challenge for 21st-century science. In recent decades, new tools and techniques have been developed to characterise dissociations between behavioural performance and awareness in laboratory studies. However, the psychological, computational and neural mechanisms that enable conscious awareness of mental contents remain poorly understood.

Aims

ConsciousComputation aims to reverse-engineer how human awareness judgments are formed. We will test a new model that proposes that awareness depends on global, abstract representations of the presence or absence of mental contents. To test this theory, we will combine psychophysics, computational model simulations and recent advances in human neuroimaging (including 7 Tesla fMRI and optically pumped MEG) to visualise the neurocomputational components supporting awareness judgments at an unprecedented level of detail.
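
As an informal illustration of the kind of computational model simulation described above, the minimal Python sketch below sets up a Bayesian observer in which a global inference about the presence or absence of a stimulus is computed separately from the inference about its content. It is not the project’s actual model; the generative model, priors, parameter values and variable names are all placeholder assumptions.

    # Minimal illustrative sketch (not the project's actual model): a Bayesian
    # observer that forms a global presence/absence judgment separately from a
    # content judgment. All distributions and parameter values are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    N_CONTENTS = 4          # hypothetical number of possible stimulus identities
    SIGNAL_STRENGTH = 1.0   # mean evidence added when a stimulus is present
    NOISE_SD = 1.0          # standard deviation of sensory noise
    P_PRESENT = 0.5         # prior probability that any stimulus is present

    def observe(present, content):
        """Return a noisy evidence vector with one channel per possible content."""
        x = rng.normal(0.0, NOISE_SD, N_CONTENTS)
        if present:
            x[content] += SIGNAL_STRENGTH
        return x

    def infer(x):
        """Return (posterior probability of presence, best-guess content)."""
        # Log-likelihood of the evidence under "absent": pure noise in every channel.
        log_absent = np.sum(-0.5 * (x / NOISE_SD) ** 2)
        # Log-likelihood under "present", for each possible content.
        means = SIGNAL_STRENGTH * np.eye(N_CONTENTS)
        log_present_c = np.array(
            [np.sum(-0.5 * ((x - means[c]) / NOISE_SD) ** 2) for c in range(N_CONTENTS)]
        )
        # Marginalise over content (uniform prior) to get the "present" likelihood.
        log_present = np.log(np.mean(np.exp(log_present_c)))
        # Global presence/absence posterior: the "awareness" read-out in this sketch.
        odds = np.exp(log_present - log_absent) * P_PRESENT / (1.0 - P_PRESENT)
        p_present = odds / (1.0 + odds)
        # Content judgment: most likely identity, computed regardless of awareness.
        best_content = int(np.argmax(log_present_c))
        return p_present, best_content

    # One simulated trial containing a weak stimulus with content index 2.
    p_aware, guess = infer(observe(present=True, content=2))
    print(f"P(present | evidence) = {p_aware:.2f}, content guess = {guess}")

In this toy observer, the “awareness” read-out is the posterior probability that anything was present at all, obtained by marginalising over content, while the content judgment is made from the same evidence regardless of that posterior.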

Project components

We plan to investigate:

  • How to formalise the distinction between unconscious and conscious processing within powerful models of perception-as-inference (see the toy sketch after this list)

  • Whether and how inferences on awareness and content are distinct at computational and neural levels

  • How perceptual absences are computed and (neurally) represented

  • The relationships between imagination, reality monitoring and conscious experience

  • Whether and how conscious experience constrains planning and goal-directed control
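
As a toy illustration of the first of these questions, the sketch below simulates near-threshold trials in which forced-choice discrimination stays well above chance even though a conservative “seen / not seen” criterion is rarely exceeded, i.e. the kind of dissociation between behavioural performance and awareness described above. The task, signal strength and report criterion are arbitrary placeholder choices, not project results.

    # Toy simulation (illustration only; all numbers are arbitrary placeholders):
    # near-threshold stimuli can support above-chance discrimination even when a
    # conservative "seen" criterion is rarely exceeded.
    import numpy as np

    rng = np.random.default_rng(1)

    N_TRIALS = 10_000
    SIGNAL = 0.8       # weak, near-threshold signal strength
    CRITERION = 2.0    # conservative criterion for reporting "seen"

    # Each trial: a stimulus tilted left (-1) or right (+1) plus unit-variance
    # Gaussian noise. Discrimination uses the sign of the evidence; the
    # detection ("seen") report uses its magnitude.
    tilt = rng.choice([-1, 1], size=N_TRIALS)
    evidence = SIGNAL * tilt + rng.normal(0.0, 1.0, N_TRIALS)

    correct = np.sign(evidence) == tilt      # forced-choice left/right accuracy
    seen = np.abs(evidence) > CRITERION      # "seen" vs "not seen" report

    print(f"discrimination accuracy:      {correct.mean():.2f}")   # ~0.79, above chance
    print(f"proportion of 'seen' reports: {seen.mean():.2f}")      # ~0.12, rarely seen
    print(f"accuracy on 'unseen' trials:  {correct[~seen].mean():.2f}")

Raising or lowering the report criterion changes the proportion of “seen” trials without changing discrimination accuracy, which is one simple way of making precise why awareness reports and behavioural performance call for separate computational treatment.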

Impact

This project addresses a foundational challenge for 21st-century science: how best to characterise the distinction between conscious and unconscious processes in the human brain. An answer to this question will inform how we understand the difference between aware and unaware mental states, with widespread ramifications for psychological theory, mental health treatments, and our understanding of personal-level autonomy and conscious thought. We hope our findings will open up new research frontiers and help advance a cognitive computational neuroscience of conscious awareness.

Researchers contributing to ConsciousComputation

Publications

Barnett, B., Andersen, L. M., Fleming, S. M., & Dijkstra, N. (2024). Identifying content-invariant neural signatures of perceptual vividness. PNAS Nexus [link]