
Jolande Fooken

@ookenfooken

Postdoctoral fellow at TU Darmstadt. Lived a previous life in Vancouver and Kingston, Canada 🍁 Vision Science | Eye Movements | Motor Control | Eye-hand coordination

556
Followers
261
Following
10
Posts
09.02.2024
Joined

Latest posts by Jolande Fooken @ookenfooken


Why is touch perceived as weaker during movement?

In our new preprint 📝, we examine tactile suppression during reaching.

Using optimal control theory, we show that tactile suppression reflects dynamic, uncertainty-dependent integration of forward model predictions and sensory feedback.

27.02.2026 10:57 👍 36 🔁 9 💬 2 📌 3
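The "uncertainty-dependent integration of forward model predictions and sensory feedback" in the post above can be sketched as minimum-variance (reliability-weighted) cue fusion. This is a generic illustrative model, not the preprint's actual implementation, and all numbers are made up:

```python
def integrate(prediction, feedback, var_pred, var_fb):
    """Minimum-variance fusion of a forward-model prediction and sensory
    feedback: each cue is weighted by the other cue's uncertainty."""
    w_pred = var_fb / (var_pred + var_fb)          # weight on the prediction
    estimate = w_pred * prediction + (1 - w_pred) * feedback
    var_est = (var_pred * var_fb) / (var_pred + var_fb)
    return estimate, var_est

# During movement the forward model predicts a weak tactile consequence and
# is comparatively precise, so the percept is pulled toward the prediction
# and the felt intensity is attenuated (tactile suppression).
moving, _ = integrate(prediction=0.2, feedback=1.0, var_pred=0.1, var_fb=0.4)
resting, _ = integrate(prediction=0.2, feedback=1.0, var_pred=1.0, var_fb=0.4)
# moving < resting: identical feedback feels weaker during movement
```

The key property is that the weighting is dynamic: as prediction uncertainty grows (e.g. late in a movement), the same feedback signal regains its perceived strength.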
Encrypted Rich text CryptPad: end-to-end encrypted collaboration suite

+++ Share and sign our open letter to the Hessian state government! +++ We will not accept the impending catastrophic deterioration of our higher-education system and demand a renegotiation of the Hochschulpakt! #nocuts #nocutsinhessen cryptpad.fr/pad/#/2/pad/...

03.02.2026 07:13 👍 57 🔁 40 💬 1 📌 12
A teaser figure showing the process of metamers rendered differentially (MRD). Target scene parameters are used to render a target scene. A new scene is initialized from some starting point, and renders are created from this scene. The loss between the initial and target scenes is measured. MRD allows the gradients wrt the loss to be propagated to the scene parameters (e.g. lighting, geometry or material) for gradient-based optimization.

Legit super excited about this work coming out. My amazing doctoral student @ben.graphics has been working on an idea to use physically based differentiable rendering (PBDR) to probe visual understanding. Here, we generate physically-grounded metamers for vision models. 1/4

arxiv.org/abs/2512.12307

17.12.2025 21:17 👍 53 🔁 15 💬 4 📌 3
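The optimization loop described in the figure caption (render a target, render a candidate scene, measure the loss, propagate gradients to the scene parameters) can be sketched with a toy one-pixel "renderer". Everything below — the shading model and parameter names — is an illustrative stand-in for a physically based differentiable renderer, not the actual MRD pipeline:

```python
import numpy as np

def render(params):
    """Toy one-pixel 'renderer': light intensity times surface albedo.
    Stands in for a physically based differentiable renderer."""
    light, albedo = params
    return light * albedo

target_params = np.array([2.0, 0.5])   # scene parameters of the target
target_image = render(target_params)   # render the target scene

params = np.array([0.5, 0.3])          # initialize a new scene from a starting point
lr = 0.05
for _ in range(500):
    err = render(params) - target_image
    # analytic gradient of the squared loss w.r.t. (light, albedo)
    grad = 2.0 * err * np.array([params[1], params[0]])
    params = params - lr * grad        # gradient step on scene parameters

# params now reproduces the target image with *different* scene
# parameters: a physically grounded metamer of the target render.
```

Because many (light, albedo) pairs produce the same pixel value, the optimizer converges to a scene that matches the target image without recovering the target parameters — which is exactly what makes the result a metamer rather than a reconstruction.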
Fully funded PhD position via imprs-is

A mobile EEG/EyeTracking setup

EEG system photographed with backlight and glowing electrodes

S-CCS Lab PhD Position

3+2 year 100% TVL-13 position in '26 - open topic at the intersection of combined EEG-EyeTracking, Statistical Methods, Cognitive Modelling, VR/Mobile EEG, Vision ...

Apply via Max-Planck IMPRS-IS program until 2025-11-16 imprs.is.mpg.de

Read: www.s-ccs.de/philosophy

23.09.2025 07:56 👍 16 🔁 11 💬 0 📌 2
An image showing the presentation times and places of the members of the Perception Lab at TU Darmstadt.

I won't get to join you all at #ecvp2025 @ecvp.bsky.social this year because I'm on holiday in Australia... but my lab is there in force! Please do go say hi, and check out my students' and postdocs' amazing work!

@ben.graphics @lschmittwilken.bsky.social @lreining.bsky.social

25.08.2025 10:56 👍 32 🔁 5 💬 0 📌 0

Looking forward to meeting you at #ECVP2025 in Mainz this week, including collaborative work with @tobnie.bsky.social @dominikstrb.bsky.social @ookenfooken.bsky.social @fatatai.bsky.social @tsawallis.bsky.social @mamassian.bsky.social @guidomaiello.bsky.social @mariaeckstein.bsky.social and many others

25.08.2025 11:58 👍 16 🔁 5 💬 0 📌 1
About SIGHT Trial University of East Anglia

🎉 The SIGHT trial has HRA & HCRW ethics approval! We're set to launch the largest ever trial of therapy for spatial inattention post-stroke.

Recruitment starts now – learn more at 🔗 www.uea.ac.uk/about/school...

#SIGHTtrial #StrokeRecovery #UEA

11.06.2025 19:34 👍 11 🔁 4 💬 0 📌 0
Two Clusters of Excellence for TU Darmstadt – Great success for the Technische Universität Darmstadt: two of its research projects will in future be funded as Clusters of Excellence. The Excellence Commission in the competition for the prestigious Excell...

Reasonable Artificial Intelligence and The Adaptive Mind: TU Darmstadt has been awarded two funded cluster projects under the federal and state governments' Excellence Strategy. A milestone for our university! www.tu-darmstadt.de/universitaet...

22.05.2025 16:20 👍 32 🔁 10 💬 0 📌 2
Next Generation Ophthalmic Testing: "Continuous Psychophysics" for Rapid and User-Friendly Assessment of Visual, Movement, and Cognitive Disorders. Discover more about our research project at the U...

I'm excited to announce that I'm recruiting for a fully-funded 3.5-year PhD position in my lab in Southampton (UK). For details and to apply, see here: www.southampton.ac.uk/study/postgr...

15.05.2025 16:30 👍 17 🔁 14 💬 0 📌 0
registration | IICCSSS International Interdisciplinary Computational Cognitive Science Summer School

Registration for IICCSSS 2025 in Darmstadt is open! 🥳 Sign up now for a week of exciting talks, hands-on projects and inspiring discussions! www.iiccsss.org/registration/
As always, IICCSSS is free, and open to all students who are excited about computational cognitive science 💡🧠

09.05.2025 20:25 👍 4 🔁 3 💬 0 📌 0

Western University is seeking applications for Canada Excellence Research Chairs. Please reach out if you are interested in Theme 2: Neuroscience. Western has extraordinary strengths in cognitive, molecular and systems neuroscience across species (rodents, NHPs, humans) uwo.ca/research/cer...

05.05.2025 22:47 👍 72 🔁 57 💬 0 📌 5

Interested in motor control or cerebellar function? We have two openings for graduate students for Fall 25. Join the sensorimotor superlab – our interdisciplinary research group with Paul Gribble and Andrew Pruszynski. Application instructions at diedrichsenlab.org. Please repost 🙏

18.11.2024 23:49 👍 11 🔁 15 💬 0 📌 0

Congratulations to Matthias Schultheis for defending his PhD thesis 'Inverse reinforcement learning for human decision-making under uncertainty' with distinction. Significant contributions to understanding bounded actors with inverse POMDPs for partial observability and non-stationary behavior.

17.04.2025 13:37 👍 12 🔁 2 💬 1 📌 0
Submissions – 47th ECVP 2025 Mainz

πŸ₯ #ECVP early-bird registration ends this Sunday, April 6 🦜

********************* heads up **************************

Register now & save your early-bird rate: ecvp2025.uni-mainz.de/submissions

03.04.2025 08:37 👍 7 🔁 5 💬 1 📌 0

EyeRepost

11.03.2025 21:32 👍 1 🔁 0 💬 0 📌 0

Congratulations to @dominikstrb.bsky.social for defending his PhD thesis 'Inverse normative modeling of continuous perception and action' with distinction, with significant contributions to understanding bounded actors with inverse models, reconciling normative and descriptive models of behavior

07.03.2025 10:12 👍 23 🔁 3 💬 4 📌 1

My great collaborators and I published a project on rapid eye and hand responses in an interception task.
jov.arvojournals.org/article.aspx...
I like to summarize my papers in Dr. Seuss style 🟩🥚🥚🐔

19.11.2024 10:58 👍 8 🔁 4 💬 0 📌 0

Happy to announce!
Come join us for a great meeting!
Check philandneuro.com and join the society :-)
#neuroscience #philosophy

13.11.2024 23:30 👍 44 🔁 20 💬 0 📌 2

Unfortunately reviewer 3 disagrees

27.09.2024 17:23 👍 1 🔁 0 💬 0 📌 0

I have 2:
Whatever doesn’t add subtracts
&
Every paragraph advances the story

17.09.2024 10:45 👍 1 🔁 0 💬 0 📌 0

I think when publishing we should keep the public in mind. If I read an Rxiv paper outside my own field, I don't feel confident in judging the quality of the work. Open review is one option that may work. But at some point time is a constraint for readers as well…

12.09.2024 18:03 👍 2 🔁 0 💬 0 📌 0

Thank you #ECVP2024 it was a blast! Great science and lots of food for thought (and little actual food). See y'all in Mainz 🇩🇪🎭🍷

30.08.2024 09:55 👍 4 🔁 0 💬 0 📌 0
Agent-environment loop. a) The agent receives an observation, which may be a noisy or partial version of the environment state. Based on this observation and potentially an internal state, they perform an action, which affects the state at the next time step. b) In online control, the action is directly a function of the observation. c) In model-based control, the agent acts based on an internal state in the form of a belief.

If it looks like online control, it is probably model-based control.

New preprint on #PsyArXiv, accepted at #CogSci2024

osf.io/preprints/ps...

13.06.2024 15:08 👍 3 🔁 1 💬 0 📌 1
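The contrast between panels b) and c) of the figure can be sketched directly: an online policy maps the current observation straight to an action, while a model-based agent filters observations into a belief and acts on that. The gains and update rule below are arbitrary illustrative choices, not the preprint's model:

```python
# b) Online control: the action is a direct function of the observation.
def online_policy(obs, target=0.0):
    return 0.5 * (target - obs)

# c) Model-based control: the agent keeps an internal belief, corrects it
#    with each observation, and forward-predicts the effect of its action.
class BeliefAgent:
    def __init__(self, gain=0.3):
        self.belief = 0.0   # internal estimate of the environment state
        self.gain = gain    # how strongly observations correct the belief
    def act(self, obs, target=0.0):
        self.belief += self.gain * (obs - self.belief)  # observation correction
        action = 0.5 * (target - self.belief)
        self.belief += action        # forward-model prediction of the action's effect
        return action

# A single noisy observation moves the online controller much more than the
# belief-based one, yet from the outside both look observation-driven.
a_online = online_policy(2.0)        # -1.0
a_belief = BeliefAgent().act(2.0)    # -0.3
```

This is why "if it looks like online control, it is probably model-based control": the belief-based agent's actions still correlate with observations, so distinguishing the two requires probing how responses depend on observation history, not just the current input.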

Honestly, the fit thing won't be obvious until you've actually started working on a project. But I think it's worth asking about ambitions and interests in and outside the lab to get a sense of personality.

25.04.2024 23:35 👍 0 🔁 0 💬 0 📌 0

Finished reading "The Dispossessed" by Ursula Le Guin.
Such a deep book.

"No man earns punishment, no man earns reward. Free your mind of the idea of deserving, the idea of earning, and you will begin to be able to think."

15.04.2024 19:24 👍 0 🔁 0 💬 0 📌 0

I am recruiting two PhD students to work on proprioception in an FWO funded project:
www.kuleuven.be/personeel/jo...
#sensorimotor

Spread the word

11.04.2024 13:10 👍 5 🔁 5 💬 0 📌 0

In case this is helpful to anyone, I shared this data quality indicator list on OSF: osf.io/t8upr

If you are looking for a standardized data cleaning checklist for creating quality data, I've also provided one example here: datamgmtinedresearch.com/clean#clean-...

28.03.2024 13:04 👍 16 🔁 11 💬 0 📌 0
Debates on the nature of artificial general intelligence The term "artificial general intelligence" (AGI) has become ubiquitous in current discourse around AI. OpenAI states that its mission is "to ensure that artificial general intelligence benefits all of...

New column by me in Science: "Debates on the nature of artificial general intelligence"
www.science.org/doi/10.1126/...

21.03.2024 18:50 👍 37 🔁 12 💬 1 📌 2
Realizing the full potential of behavioural science for climate change mitigation - Nature Climate Change Behavioural science offers valuable insights for mitigating climate change, but existing work focuses mostly on consumption and lacks coordination across disciplines. In this Perspective, the authors ...

🚨 It's finally out!! 🤩

In Nature Climate Change, we present a highly ambitious vision for how to realize the full potential of behavioral science for climate change mitigation 🌍

I hope you'll like it!

#BehSci #ClimateChange #BehSciSky #GreenSky #ClimateAction
www.nature.com/articles/s41...

15.03.2024 11:31 👍 85 🔁 34 💬 3 📌 1
THE GRAND PLAN | Nicole Rust

Brain and mind researchers of all types: I hope you'll join this conversation at Cognitive Computational Neuroscience (August 6-9, Boston).

I'm envisioning a community-centered conversation unlike any I've seen before; because it's unusual, I unpack it here:
www.nicolerust.com/grandplan

08.03.2024 13:34 👍 39 🔁 23 💬 3 📌 5