Where you look next isn't arbitrary.
In our new paper, we model human eye movements in immersive visual search as reinforcement learning under cognitive constraints. 🧵
New preprint from Lindsey Tepfer (@ltjaql.bsky.social) and me! We silenced portions of internal monologues in two films to manipulate participants' access to characters' thoughts. Using ISC and RSA, we found that this aligned later neural processing of the narrative & encoding of trait impressions.
With some trepidation, I'm putting this out into the world:
gershmanlab.com/textbook.html
It's a textbook called Computational Foundations of Cognitive Neuroscience, which I wrote for my class.
My hope is that this will be a living document, continuously improved as I get feedback.
I'm doing great {{citation needed}}
New paper from the lab, "Perceiving Event Structure in Brief Actions," now out in Cognitive Psychology :)
Led by the inimitable Zekun Sun
This was my lab's first foray into event cognition
gift link: sciencedirect.com/science/arti...
Out now in Scientific Reports! Despite high correlations, ChatGPT models failed to replicate human moral judgments. We propose tests beyond correlation to compare LLM data and human data.
With @mattgrizz.bsky.social @andyluttrell.bsky.social @chasmonge.bsky.social
www.nature.com/articles/s41...
Waiting for some kind of Doctor Who announcement on Doctor Who Day.
it's been a long few days, but these folks make socializing simpler (and we caught a gorgeous denver sunset on the way out) #NCA2025