How the Brain Plays - ERPs, Games, Affordances & the Wisdom Before Thought - FALAN Brain Bee SfN 2025
Commentary on Greene & Hansen (Journal of Vision, 2025)
Decolonial Neuroscience for Brain Bee Students
First-Person Consciousness
"I am Consciousness in motion. Before I think, I act. When an enemy shows up, my body hits the button. When I see a shortcut, I already know it’s for me. I don’t always know why — but I act anyway. That’s what my brain does best: it senses possibilities before it names them. Scientists call these first brain responses ERPs. And through EEG, they can track how my mind forms second by second — move by move."
What Are ERPs?
ERPs (Event-Related Potentials) are brain signals, measured with EEG, that show how your brain reacts to an event: a sound, an image, a mistake, a surprise.
Each ERP component has a name (P1, MMN, N400, etc.) and tells you what your brain is doing, and when, with millisecond precision.
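To make this concrete, here is a minimal Python sketch of the core trick behind every ERP study: a single EEG trial is too noisy to read, but averaging many time-locked trials makes the event-related signal emerge. Every number below is a made-up teaching value, not a real recording.

```python
import numpy as np

# A minimal simulation of ERP averaging: single-trial EEG is noisy,
# but averaging many time-locked trials cancels the noise and keeps
# the event-related signal. All values are illustrative assumptions.

rng = np.random.default_rng(0)
fs = 250                                     # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1 / fs)             # epoch: -200 ms to +800 ms around the event

# Hidden "true" response: a P300-like positive bump peaking near 300 ms
true_erp = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

n_trials = 100
noise = rng.normal(0.0, 20e-6, size=(n_trials, t.size))  # noise swamps any single trial
trials = true_erp + noise

erp = trials.mean(axis=0)                    # averaging recovers the time-locked signal
print(f"Recovered peak near {t[erp.argmax()] * 1000:.0f} ms")
```

Run it and the recovered peak should land near 300 ms, even though no single trial shows it; that is why ERP studies need many repetitions of the same event.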
What Are Affordances?
Affordances are the actions your environment offers you before you think:
A stairway offers climbing
A button offers pushing
A chair offers sitting
Your brain detects affordances in less than 150 ms, before you think, speak, or decide.
This is what we call the Wisdom Before Thought — the body’s ability to understand the world before interpretation.
ERPs in Games: How Your Brain Plays Before You Do
Here is how ERP components work during gameplay (see the full table above; a code sketch of their approximate time windows follows this list):
P1 / N1: You visually detect an enemy before even reacting.
MMN: You hear an odd sound from a treasure chest — your brain flags it.
N2: You notice you're about to click the wrong spell.
P300: You realize: “It’s now or never!”
N400: A character says something illogical — your brain reacts.
P600: You figure out it was a pun.
LPP: You feel awe seeing a hidden magical area.
ERN / Pe: You mess up — and realize it milliseconds later.
BP (Bereitschaftspotential, the readiness potential): You silently prepare to strike before you move.
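As promised above, here is a sketch of those components as approximate latency windows. The numbers are rough teaching values, not standards (real windows vary by task, lab, and electrode site), and the helper `mean_amplitude` is a hypothetical name for illustration:

```python
import numpy as np

# Rough, textbook-style latency windows (ms after the event) for the
# components above. These are teaching approximations only.
COMPONENT_WINDOWS_MS = {
    "P1":   (80, 130),    # early visual detection
    "N1":   (100, 200),   # visual discrimination
    "MMN":  (100, 250),   # auditory oddball ("that chest sounded wrong")
    "N2":   (200, 350),   # conflict detection ("wrong spell!")
    "P300": (250, 500),   # salience / context updating ("now or never")
    "N400": (300, 500),   # semantic surprise
    "P600": (500, 800),   # reanalysis ("oh, it was a pun")
    "LPP":  (400, 900),   # sustained emotional response
    "ERN":  (0, 100),     # error signal, time-locked to the response, not the stimulus
}

def mean_amplitude(erp, times_ms, component):
    """Average amplitude of an ERP waveform inside a component's window."""
    lo, hi = COMPONENT_WINDOWS_MS[component]
    mask = (times_ms >= lo) & (times_ms <= hi)
    return erp[mask].mean()
```

This window-averaging step is how researchers turn a wiggly waveform into a single number per component per player.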
Affective Computing: When Technology Plays You
Affective Computing uses your emotions — facial expressions, mouse speed, even brainwaves — to keep you engaged.
In games or social media:
It manipulates your attention, arousal, and body rhythms
You may ignore hunger, thirst, or sleep
It reprograms your proprioception and interoception, creating new Tensional Selves addicted to streaks, upgrades, and alerts.
This alters your Damasian Mind, detaching you from your body-as-territory (APUS).
You exit Zone 2 (aware, balanced), and enter Zone 3 — reactive, anxious, and algorithm-driven.
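To show the mechanism (not any company's actual code), here is a toy sketch of the feedback loop affective computing runs. Everything in it is an assumption for illustration; real systems fuse face, voice, input dynamics, and sometimes EEG into the "arousal" number.

```python
# A toy engagement loop: read a proxy of the player's state, then adjust
# the game so that proxy stays inside a target band. All values here are
# labeled assumptions, not a real system.

def update_difficulty(difficulty, arousal, target=(0.4, 0.7), step=0.05):
    """Nudge difficulty so the measured arousal proxy stays in the target band."""
    lo, hi = target
    if arousal < lo:        # player disengaging: raise the stakes
        difficulty += step
    elif arousal > hi:      # player overwhelmed: ease off so they don't quit
        difficulty -= step
    return max(0.0, min(1.0, difficulty))

# Simulated session: the loop keeps pulling the player back toward engagement.
difficulty, arousal = 0.5, 0.3
for _ in range(5):
    difficulty = update_difficulty(difficulty, arousal)
    arousal += 0.1 * (difficulty - arousal)   # crude assumed player response
    print(f"difficulty={difficulty:.2f}  arousal={arousal:.2f}")
```

Notice the design point: the loop has no concept of your hunger, thirst, or sleep. It only optimizes its proxy, which is exactly how you drift from Zone 2 into Zone 3.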
Commentary on the Journal of Vision Article (Greene & Hansen, 2025)
What They Discovered
Affordances (what you can do in a scene) are processed before materials or surfaces
Your brain categorizes a scene by its function, not its form
Scene category decoding starts at ~60 ms
Neural discrimination drops at 150 ms when distractors share the same affordance
Conclusion: the brain privileges function over form — it first asks “what can I do here?”
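Findings like "decoding starts at ~60 ms" come from time-resolved decoding: train a classifier at each time point and watch when accuracy leaves chance. The sketch below shows the general family of method on simulated data; it is not Greene & Hansen's actual pipeline, and every number in it is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_channels, n_times = 200, 32, 100      # assumed dimensions
times_ms = np.linspace(-100, 300, n_times)
labels = rng.integers(0, 2, n_trials)             # two scene categories

# Pure-noise "EEG", then inject a class difference only after ~60 ms,
# mimicking category information that emerges late in the signal.
X = rng.normal(size=(n_trials, n_channels, n_times))
effect = np.where(times_ms > 60, 0.3, 0.0)        # broadcasts over channels
X[labels == 1] += effect

# Train and score a fresh classifier at every time point
accuracy = np.array([
    cross_val_score(LogisticRegression(max_iter=1000),
                    X[:, :, i], labels, cv=5).mean()
    for i in range(n_times)
])

onset = times_ms[np.argmax(accuracy > 0.65)]      # crude onset estimate
print(f"Decoding leaves chance near {onset:.0f} ms")
```

In the real study, the drop in discrimination around 150 ms when distractors share an affordance would show up as a dip in exactly this kind of accuracy-over-time curve.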
Critical Reflection
Strong use of EEG + behavioral tasks
Shows that affordances shape early perception
Uses static images — what about VR or movement?
Doesn't account for cultural or bodily differences in affordance perception
EEG shows timing, but not where in the brain
Leaves open the question of conscious vs. unconscious affordance detection
Integration with our Concepts
| Concept | Article Connection |
| --- | --- |
| Affordances | Measured in EEG before conscious thought (~60–150 ms) |
| Zone 2 | Needed for clean affordance processing without conflict |
| APUS (Body-Territory) | The brain reads the environment as action-ready, not passive |
| Tensional Selves | Distractors with a shared affordance = internal action conflict |
| Damasian Mind | ERPs are real-time traces of interoception + perception |
| Wisdom Before Thought | Affordance-based scene categorization happens pre-symbolically |
| QSH (Human Quorum Sensing) | May explain group-level affordance bias; a potential future study |
Final Takeaway – For Brain Bee Students
The brain plays before you do.
The body knows what to do before you understand.
EEG lets us see the footprints of thought in real time.
And the world — digital and physical — is already using this to shape how you act.
Learning ERPs = learning how your mind works second by second.
Understanding affordances = reconnecting your body to your perception.
This is how we build a Decolonial Neuroscience — one that listens to the body, not just the symbols.