Data Science Influencer: “Alright everyone, it’s pride 🌈🌈month✨✨✨, so you know what that means: ranking the machine learning algorithms by how gay they are”
Found something akin to reward history in a macaque playing a gambling task (Fig 2G).
www.biorxiv.org/content/10.1...
I’m just waiting for the next W.H. announcement to be “It's got what plants crave. It's got electrolytes”
2/2 A note: these are full Bayesian models. Hierarchical priors, etc.
Congrats and well deserved!
#neuroskyence GLMs/GAMs are used for neural tuning estimation. I made updates to an easy-to-use Python lib. Supports splines, regularizers, and likelihood combos not readily available elsewhere (ARD, Bayesian group lasso, etc.; Poisson, negative binomial, Gaussian, zero-inflated). JAX/NumPyro for high-dimensional Bayes. Linked below.
Nice. I recommend PyTorch or JAX for such problems, to compute your gradients. Minimize the negative log-posterior to fold in some priors with ease. It then effectively ends up being a Lagrangian problem. Let me know if you want more info; I love this stuff.
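To make the "minimize the negative log-posterior" point concrete, here's a minimal sketch (my own toy data, not anyone's actual model): a Gaussian likelihood plus a Gaussian prior on the weights gives MAP estimation, which is exactly ridge regression with the prior precision playing the role of the penalty (Lagrangian) weight.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

lam = 1.0  # prior precision; equivalently the L2 penalty weight

def neg_log_posterior(w):
    # Gaussian likelihood + Gaussian prior => least squares + L2 penalty
    resid = y - X @ w
    return 0.5 * resid @ resid + 0.5 * lam * w @ w

def grad(w):
    # hand-coded here; PyTorch/JAX would give you this via autodiff
    return -X.T @ (y - X @ w) + lam * w

w_map = minimize(neg_log_posterior, np.zeros(3), jac=grad,
                 method="L-BFGS-B").x

# MAP under a Gaussian prior matches the closed-form ridge solution
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
```

With autodiff you'd drop the hand-written `grad` and swap in any non-Gaussian prior or likelihood without re-deriving anything.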
feel free to msg me, too.
Another question: are we thinking inverse optimal control (IOC)? How interpretable does the cost function need to be (network vs. hand-parameterized)? It's hard to get around model fitting for this problem. But we can do well on the latter using the Karush-Kuhn-Tucker (KKT) conditions.
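A toy sketch of what I mean by KKT-based IOC (entirely made-up numbers, and assuming a hand-parameterized diagonal quadratic cost): if the observed behavior is the optimum of a constrained quadratic program, the KKT stationarity condition is linear in the cost weights, so you can read them back off the observed solution, up to an overall scale.

```python
import numpy as np

# Forward: the "agent" minimizes 0.5 * x' Q x  subject to  A x = b
Q_true = np.diag([2.0, 1.0, 3.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

# Solve the KKT system  [[Q, A'], [A, 0]] [x; lam] = [0; b]
n, m = 3, 1
KKT = np.block([[Q_true, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(KKT, np.concatenate([np.zeros(n), b]))
x_star, lam = sol[:n], sol[n:]

# Inverse: stationarity q_i * x_i + lam = 0 determines q only up to scale
q_hat = 1.0 / x_star
q_hat *= Q_true[0, 0] / q_hat[0]  # fix the scale just to compare with truth
```

The scale ambiguity is generic in IOC (any positive multiple of the cost yields the same behavior); with noisy trajectories you'd replace the exact solve with a least-squares fit over the stationarity residuals.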
Do you mean something like dynamic mode decomposition with control? Or, e.g., the Q and R in LQR?
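For anyone following along, a minimal sketch of what the Q and R in LQR do (toy double-integrator system, my own arbitrary weights): Q prices state error, R prices control effort, and the Riccati solution turns them into a stabilizing feedback gain.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x = [position, velocity]
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])

Q = np.diag([10.0, 1.0])  # state cost: penalize position error more
R = np.array([[0.1]])     # control cost: cheap control -> aggressive gains

# Solve the continuous-time algebraic Riccati equation
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)  # optimal feedback gain, u = -K x
```

Cranking R up relative to Q gives a lazier controller; the ratio, not the absolute scale, is what matters.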
The data is the data. I just tried to listen to what it spoke to me.
I could not have done this without the sweat and immense effort driven by a whole team of people. @assiachericoni.bsky.social in particular.
Excited to share our latest preprint with @camillopadoasch.bsky.social and Xiao-Jing Wang! We present a biologically plausible framework showing how neural circuits compute & compare value to drive flexible economic decision making.
www.biorxiv.org/content/10.1...
So is V4 also premotor cortex?! Title change?
If you identify as a Jewish member of the academic community, and are appalled by the actions being taken in the name of defending our community, please consider signing this letter (which can be done anonymously) and also sharing it with others:
forms.gle/prnRbq69a6YN...
the authoritarianism is bad, but the loser energy is absolutely intolerable
New Lab Preprint! "Independent Continuous Tracking of Multiple Agents in the Human Hippocampus" led by my graduate student Assia Chericoni.
www.biorxiv.org/content/10.1...
👏
Very BOLD predictions to make.
This is really cool!
new working title: the prefrontal cortex is just premap cortex.
What’s the minimal amount of time I have to write before I play Elden? Asking for a friend.
👏🙌
😜
And on representations and dynamics, let's take a classic eco psych topic: optical flow and locomotion. If a network codes the lawful transitions between optical states, and self-motion is coded, then the network can code for the dynamics. Does it not represent, in the classic sense, using neural integration?
It wasn’t meant to be inflammatory. More a reflection of the training I had in grad school, where discussion of representational building blocks was scoffed at rather than treated with intrigue.
There's a weird history in psychology dealing with this, starting with people in ecological psychology (Gibsonian). They were desperate to be as non-representational as possible, and were really the first to use dynamics (Turvey, Kelso). That article's date tracks with that insular thinking.
Do you want to fit nonlinear tuning curves using Poisson (GAM/GLM) models, but want flexible basis functions, Bayesian machinery, and smoothing/regularization via penalized splines? I wanted a Bayesian approach in a modern framework (Pyro). Without guarantee: github.com/JustFineNeur...
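The core idea in miniature (a bare-numpy sketch with simulated spikes and Gaussian bumps standing in for splines; this is not the linked lib's API): expand the covariate in a basis, then fit the weights by penalized Poisson maximum likelihood, where the penalty is what the smoothing prior buys you.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 500)                       # 1-D covariate, e.g. position
rate_true = np.exp(1.0 + 2.0 * np.sin(2 * np.pi * x))
y = rng.poisson(rate_true)                       # simulated spike counts

# Gaussian-bump basis standing in for penalized splines
centers = np.linspace(0, 1, 8)
B = np.exp(-0.5 * ((x[:, None] - centers) / 0.1) ** 2)

lam = 0.1  # smoothing penalty on the basis weights

def neg_penalized_ll(w):
    # Poisson negative log-likelihood (log link) + L2 smoothing penalty
    eta = B @ w
    return np.sum(np.exp(eta) - y * eta) + 0.5 * lam * w @ w

def grad(w):
    return B.T @ (np.exp(B @ w) - y) + lam * w

w_hat = minimize(neg_penalized_ll, np.zeros(8), jac=grad,
                 method="L-BFGS-B").x
rate_hat = np.exp(B @ w_hat)                     # fitted tuning curve
```

Swapping the L2 penalty for a proper prior and running this under Pyro/NumPyro gets you posteriors over the tuning curve instead of a point estimate.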