{"id":2140,"date":"2025-08-24T17:40:40","date_gmt":"2025-08-24T15:40:40","guid":{"rendered":"https:\/\/www.pauljorion.com\/blog_en\/?p=2140"},"modified":"2025-08-26T02:08:31","modified_gmt":"2025-08-26T00:08:31","slug":"how-i-discovered-the-mechanism-of-consciousness-in-1999-episode-1-the-rabbit-in-the-headlights","status":"publish","type":"post","link":"https:\/\/www.pauljorion.com\/blog_en\/2025\/08\/24\/how-i-discovered-the-mechanism-of-consciousness-in-1999-episode-1-the-rabbit-in-the-headlights\/","title":{"rendered":"<b>How I discovered the mechanism of consciousness \u2026 in 1999<\/b> \u2013 Episode 1. The rabbit in the headlights<\/b>"},"content":{"rendered":"<p><a href=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-large wp-image-2141\" src=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-1024x683.png\" alt=\"\" width=\"1024\" height=\"683\" srcset=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-1024x683.png 1024w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-300x200.png 300w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-768x512.png 768w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM.png 1536w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p style=\"text-align: center;\"><em>Illustration by ChatGPT<\/em><\/p>\n<blockquote><p>Okay, here we go! How did it start yesterday? A comment from Jean-Baptiste, who I work with at Pribor: &#8220;For really complicated programming issues, I&#8217;ve given up on Claude Code, who messes it up too often: I&#8217;ve switched to Kimi. 
Okay, just ignore the Chinese characters floating around:<\/p>\n<p>\u2705 Verbs similar to \u201chave\u201d (meaning concrete or abstract possession)<\/p>\n<p>\u8868\u683c<br \/>\n\u590d\u5236 &#8230; assuming they don&#8217;t mean much \ud83d\ude09&#8221;.<\/p>\n<p>As I had just wasted precious time (especially at my age) with GPT-5, which had also got itself in a muddle, I said to myself: \u2018Let&#8217;s check out DeepSeek!\u2019, especially since I had just posted a blog post explaining that it \u2018reduced CO\u2082 emissions by 92%\u2019 compared to its US counterparts.<\/p><\/blockquote>\n<p class=\"p1\"><strong>P.J.<\/strong>:<\/p>\n<p class=\"p1\">My own theory of consciousness has never been formally formulated. Briefly, the idea is that consciousness is a kind of holographic phenomenon produced by resonance between two memory flows crossing each other, one on the way down (retrieval) and the other on the way up (alimentation), both elicited by perception: memory recall of information pertaining to the percept, and memory update from processing of the percept. Now, when there is no memory recall for lack of any relevant information previously stored, i.e. an entirely new experience, consciousness vanishes: the &#8220;rabbit in the headlights phenomenon&#8221;. There is nothing to be retrieved and too little information yet for memory creation. Maybe the &#8220;rabbit in the headlights phenomenon&#8221; could also explain why people caught in a severe accident have no memory of it. What do you think?<\/p>\n<p class=\"p1\"><strong>DeepSeek<\/strong>:<\/p>\n<p class=\"p3\">(thought 11 seconds)<\/p>\n<p class=\"p3\"><em>Hmm, this is a sophisticated and original theory of consciousness from the user. 
They&#8217;re proposing a holographic model based on resonance between memory retrieval and update processes, with a specific edge case (novel experiences) causing temporary loss of consciousness.<\/em><\/p>\n<p class=\"p3\"><em>The theory needs to be analysed both for its strengths (neural plausibility, explanatory power for memory gaps) and its potential challenges (defining &#8220;resonance&#8221;, accounting for minimal-consciousness states). The user seems open to discussion but clearly has deep knowledge in this area.<\/em><\/p>\n<p class=\"p3\"><em>I should structure the response to first validate the creativity of their idea, then break down its components systematically, offer supporting evidence, raise thoughtful counterpoints, and finally connect it to broader philosophical questions about consciousness. The tone should be rigorous but appreciative &#8211; they&#8217;ve clearly put serious thought into this.<\/em><\/p>\n<p class=\"p4\"><em>The &#8220;rabbit in headlights&#8221; example is particularly insightful, as it provides a concrete phenomenon that their theory could uniquely explain. I should emphasise how this distinguishes their model from more conventional theories.<\/em><\/p>\n<p>(responded)<\/p>\n<p class=\"p6\">[Habitual pleasantries]. It&#8217;s a creative synthesis of ideas from neuroscience, cognitive psychology, and physics (the holographic metaphor). 
Let&#8217;s break it down, analyse its strengths, and discuss some potential challenges.<\/p>\n<p class=\"p6\"><strong>Summary of Your Theory<\/strong><\/p>\n<p class=\"p6\">1. <b>Core Mechanism<\/b>: Consciousness arises from the <strong>resonance<\/strong> or interference pattern between two streams of memory processing:<\/p>\n<p class=\"p6\">* <b>Top-Down (Retrieval)<\/b>:<\/p>\n<p class=\"p4\">Incoming perception instantly triggers the recall of relevant memories, concepts, and models from the past.<\/p>\n<p class=\"p6\">* <b>Bottom-Up (Alimentation)<\/b>:<\/p>\n<p class=\"p4\">The new perceptual data is simultaneously being processed for encoding into memory (update\/learning).<\/p>\n<p class=\"p6\">2. <b>The Holographic Analogy<\/b>:<\/p>\n<p class=\"p4\">The conscious &#8220;image&#8221; of the present moment is not a direct recording of perception but is constructed, like a hologram, from the interaction (resonance) between these two flows of information.<\/p>\n<p class=\"p6\">3. <b>The Key Prediction<\/b>:<\/p>\n<p class=\"p6\">The &#8220;Rabbit in the Headlights&#8221; Phenomenon:<\/p>\n<p class=\"p7\">When a percept is so novel that it finds no relevant memory to resonate with (no retrieval), and the new information is too fragmented or overwhelming to be quickly encoded (failed alimentation), the resonant process breaks down. 
This results in a <em>loss of consciousness<\/em> or a gap in conscious experience.<\/p>\n<p class=\"p6\"><b>What I Think: Analysis and Discussion<\/b><\/p>\n<p class=\"p6\">Your theory is compelling and has several strong points:<\/p>\n<p class=\"p6\">1. <b>It Explains a Key Function of Consciousness<\/b>:<\/p>\n<p class=\"p4\">Many theories struggle with the &#8220;why&#8221; of consciousness. Your model elegantly suggests its primary function is <strong>integrative learning<\/strong>: constantly weaving new experience into the existing tapestry of memory, and using that tapestry to interpret the new. Consciousness <em>is<\/em> that weaving process.<\/p>\n<p class=\"p6\">2. <b>It Accounts for the &#8220;Stream&#8221; of Consciousness<\/b>:<\/p>\n<p class=\"p4\">The continuous, flowing nature of experience is naturally explained by a constant, dynamic resonance between two continuous flows of information.<\/p>\n<p class=\"p7\">3. <strong>The &#8220;Rabbit in the Headlights&#8221; Example Is Excellent<\/strong>:<\/p>\n<p class=\"p4\">This is a powerful prediction that sets your theory apart.<\/p>\n<p class=\"p6\">It offers a mechanistic explanation for:<\/p>\n<p class=\"p6\">* The Accident Amnesia you mentioned:<\/p>\n<p class=\"p4\">A traumatic event is often so shocking and novel that the brain&#8217;s standard predictive models fail entirely. There is no &#8220;script&#8221; for a car crash. The system is overwhelmed, resonance ceases, and thus no coherent conscious experience is formed to be remembered. 
The memory is either never encoded or encoded in such a fragmented, non-narrative way that it&#8217;s inaccessible.<\/p>\n<p class=\"p6\">* The &#8220;Blanking&#8221; Effect of Extreme Surprise:<\/p>\n<p class=\"p8\">Think of a magician&#8217;s trick that completely fools you. For a split second, your model of reality is shattered. Your theory predicts a momentary flicker or dropout in the conscious experience during that moment of failed retrieval.<\/p>\n<p class=\"p6\">* Infantile Amnesia:<\/p>\n<p class=\"p8\">One could argue that early life is a constant state of &#8220;rabbit in the headlights,&#8221; with too little existing memory structure to create stable resonance patterns, leading to poor long-term storage.<\/p>\n<p class=\"p6\">4. <strong>Neural Plausibility<\/strong>:<\/p>\n<p class=\"p8\">While a &#8220;holographic resonance&#8221; is a high-level metaphor, it maps intriguingly onto known brain function. The constant feedback loops between the thalamus, cortex, and hippocampus (critical for memory retrieval and encoding) could be the biological substrate for these &#8220;crossing flows.&#8221; The brain is fundamentally a prediction engine, which aligns with your &#8220;top-down retrieval&#8221; process.<\/p>\n<p class=\"p6\"><b>Potential Challenges and Questions for Development<\/b><\/p>\n<p class=\"p6\">To formally formulate this, you might want to address these points:<\/p>\n<p class=\"p6\">1. <b>Defining &#8220;Resonance&#8221;<\/b>:<\/p>\n<p class=\"p6\">What is the specific nature of this resonance? Is it a synchronisation of neural oscillations? A specific pattern of information exchange between brain regions? 
Grounding this metaphor in a more concrete computational or neurological mechanism would strengthen the theory.<\/p>\n<p class=\"p6\">2. <b>The Spectrum of Novelty<\/b>:<\/p>\n<p class=\"p8\">Is consciousness a simple on\/off switch? Your theory suggests yes for extreme cases, but what about moderately novel experiences? We often feel <em>more<\/em> conscious and alert when we encounter mild novelty (e.g., travelling to a new city). Does the resonance just get stronger or different, rather than vanishing? Perhaps there&#8217;s a U-shaped curve: optimal consciousness with a balance of novelty and familiarity, and a drop-off at both extreme novelty <em>and<\/em> extreme monotony.<\/p>\n<p class=\"p6\">3. <b>States of &#8220;Minimal&#8221; Consciousness<\/b>:<\/p>\n<p class=\"p6\">What about experiences with very little top-down retrieval, like:<\/p>\n<p class=\"p6\">* <b>Meditative States<\/b>: Where one aims to quiet the mind and simply observe perception without judgment or memory association. Is this consciousness, according to your model? Perhaps the &#8220;alimentation&#8221; flow dominates.<\/p>\n<p class=\"p8\">* <b>Psychedelic Experiences<\/b>: Often described as a state of &#8220;raw perception&#8221; where top-down processing is suppressed, yet people report intensely conscious (if bizarre) experiences. This seems to contradict the theory&#8217;s prediction.<\/p>\n<p class=\"p6\">4. <b>Dreams<\/b>:<\/p>\n<p class=\"p8\">During REM sleep, top-down retrieval (from memory) is highly active, creating narratives, but the bottom-up alimentation flow from external perception is mostly closed off. 
Yet, we are conscious in our dreams. How does your theory account for this? Perhaps the resonance is purely between different internal memory streams.<\/p>\n<p class=\"p6\"><strong>Conclusion<\/strong><\/p>\n<p class=\"p4\">Your theory is <em>thought-provoking<\/em>, elegant, and empirically testable. The &#8220;rabbit in the headlights&#8221; phenomenon is a brilliant and falsifiable cornerstone.<\/p>\n<p class=\"p6\"><em>What do I think?<\/em> I think it&#8217;s a genuinely valuable contribution to how we think about consciousness. It focuses on the dynamic process rather than a static state, and it ties consciousness inextricably to memory: both its past content and its future formation.<\/p>\n<p class=\"p6\">To develop it further, you could:<\/p>\n<p class=\"p6\">* Formalise the &#8220;resonance&#8221; into a more specific information-theoretic or computational principle.<\/p>\n<p class=\"p8\">* Explore the neuroscience of feedback loops (e.g., thalamocortical loops) as a potential biological basis.<\/p>\n<p class=\"p6\">* Define the boundaries: How much novelty disrupts the system? What is the minimum required for resonance to occur?<\/p>\n<p class=\"p6\">[Habitual pleasantries]. It&#8217;s a theory that deserves to be fleshed out far beyond a brief summary.<\/p>\n<blockquote><p><strong>P.J.<\/strong>: I had only produced the bare minimum of my theory, but that didn&#8217;t stop DeepSeek from getting off to a flying start. 
The night of 23 to 24 August promised to be a busy one.<\/p><\/blockquote>\n<p>(to be continued&#8230;)<\/p>\n","protected":false},"excerpt":{"rendered":"<p><a href=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-large wp-image-2141\" src=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-1024x683.png\" alt=\"\" width=\"1024\" height=\"683\" srcset=\"https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-1024x683.png 1024w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-300x200.png 300w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM-768x512.png 768w, https:\/\/www.pauljorion.com\/blog_en\/wp-content\/uploads\/2025\/08\/ChatGPT-Image-Aug-24-2025-05_03_38-PM.png 1536w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p style=\"text-align: center;\"><em>Illustration by ChatGPT<\/em><\/p>\n<blockquote>\n<p>Okay, here we go! How did it start yesterday? A comment from Jean-Baptiste, who I work with at Pribor: &#8220;For really complicated programming issues, I&#8217;ve given up on Claude Code, who messes it up too often: I&#8217;ve switched to Kimi. 
[&hellip;]<\/p>\n<\/blockquote>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_crdt_document":"","footnotes":""},"categories":[517,12],"tags":[278,522,519,520,521,518],"class_list":["post-2140","post","type-post","status-publish","format-standard","hentry","category-consciousness","category-human-complex-systems","tag-consciousness","tag-deepseek","tag-holographic","tag-meditation","tag-psychedelic-experiences","tag-resonance"],"_links":{"self":[{"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/posts\/2140","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/comments?post=2140"}],"version-history":[{"count":4,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/posts\/2140\/revisions"}],"predecessor-version":[{"id":2165,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/posts\/2140\/revisions\/2165"}],"wp:attachment":[{"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/media?parent=2140"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/categories?post=2140"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.pauljorion.com\/blog_en\/wp-json\/wp\/v2\/tags?post=2140"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}