I'm a PhD student in the Consciousness and Cognition Lab at the University of Cambridge, supervised by Dr. Daniel Bor (Queen Mary, University of London) and Dr. Pedro Mediano (Imperial College London).
My work mostly revolves around mathematical theories of consciousness and information theory, with applications to understanding altered information dynamics in the brain in Alzheimer's Disease (AD).
My undergraduate degree was an MMath in pure mathematics from the University of Warwick (UK). I now quite like brains.
What does it mean for three or more systems to have the same information? My current work explores measure-theoretic perspectives on Shannon entropy and how such decompositions can be used to study multivariate information dynamics.
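For intuition (a generic textbook illustration, not the decomposition from my own work): the quantity that inclusion–exclusion assigns to the three-way "overlap" of variables, the co-information, can be negative — which is why a *signed* measure is needed. A minimal Python sketch, with X and Y independent fair bits and Z = X XOR Y:

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of a list of equiprobable outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# X, Y uniform independent bits; Z = X XOR Y.
triples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

xs = [t[0] for t in triples]
ys = [t[1] for t in triples]
zs = [t[2] for t in triples]
xy = [(t[0], t[1]) for t in triples]
xz = [(t[0], t[2]) for t in triples]
yz = [(t[1], t[2]) for t in triples]

# Co-information via inclusion-exclusion over entropies:
# the value the signed measure assigns to the triple intersection.
co_info = (entropy(xs) + entropy(ys) + entropy(zs)
           - entropy(xy) - entropy(xz) - entropy(yz)
           + entropy(triples))

print(co_info)  # -1.0 bit: the three-way overlap is negative
```

Each pair of variables is independent, yet any two jointly determine the third, so the "shared" atom comes out at −1 bit — a quantity no ordinary (unsigned) measure could represent.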
Recent research suggests that Alzheimer's Disease can be studied as a disorder of consciousness (DOC), a view some information-theoretic work has also adopted. I am currently exploring how information-theoretic tools can be used to examine altered information dynamics, via a partial information decomposition, in the brains of Alzheimer's patients.
Publications and Talks
- Keenan J A Down and Pedro A M Mediano. "A Logarithmic Decomposition for Information." Proceedings of the 2023 IEEE International Symposium on Information Theory. (Proofs available on the arXiv!)
(This paper was a finalist for the ISIT 2023 Student Paper Award).
- Partial Information Decomposition as a tool for understanding Disorders of Consciousness (Jan 2023)
Queen Mary, University of London.
Partial Information Decomposition (PID) is an information-theoretic framework for capturing information dynamics and exchange between variables. How can we use PID and similar information-theoretic tools to explore phenomenal experience in disorders of consciousness?
- A Signed Measure Space for Information and its Implications (May 2023)
Consciousness and Cognition Lab, University of Cambridge
What is information really about? In 1991, Yeung, building on the work of Hu, demonstrated that there exists a signed measure space for information. In this kind of space, information has a quality (a location in the space) as well as a quantity. We present recent work on a logarithmic decomposition: a powerful refinement of this space which gives both a qualitative and a quantitative basis for disentangling information.
- A Logarithmic Decomposition for Information (Jun 2023)
2023 International Symposium on Information Theory
The Shannon entropy of a random variable X behaves in many ways like a signed measure. We demonstrate that there exists a decomposition with intuitive properties, which we call the logarithmic decomposition (LD). We show that this signed measure space has the useful property that its logarithmic atoms are easily characterised as having positive or negative entropy. We then highlight that our geometric refinement accounts for an entire class of information quantities, which we call logarithmically decomposable quantities.
- Logarithmic Decomposition: A Signed Measure Space for Information and a Qualitative Tool for Consciousness Science (Sep 2023)
Models of Consciousness 2023, University of Oxford
Could information be the true substrate of consciousness? Many major theories of consciousness, such as Integrated Information Theory (IIT), are built upon the language of information theory. We present results on a new framework called Logarithmic Decomposition (LD), which provides not only a PID with many desirable properties and a refinement of the classical signed measure space for Shannon entropy, but also a unique mathematical perspective on the nature of shared and bound information. We explore potential experimental implementations of the logarithmic decomposition in computational neuroscience.