Finally, our work was done in only two of the dozens of sensillum types, so we cannot rule out that other ORN types have different F-I curves and adaptation; however, nonlinear F-I curves are ubiquitous and strong adaptation has been observed in many studies...
27.01.2026 20:34
So ephaptic interactions mainly seem to matter when one of the ORNs in the sensillum is adapted by longer odour exposure, pointing to a more significant role of ephaptic interactions in novelty detection, as previously described.
All our observations are backed up by a computational model.
27.01.2026 20:31
To our surprise, the non-linearity of the frequency-input relation (F-I curve) of spiking olfactory receptor neurons (ORNs) means that ephaptic inhibition is all but invisible if ORNs are highly activated at odour onset. Adaptation additionally masks the effects.
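A toy sketch of this masking effect, assuming a generic Hill-type saturating F-I curve (an illustrative stand-in, not the paper's fitted ORN model): the same fixed "ephaptic" reduction of input current produces a large rate drop at low activation but almost none near saturation.

```python
def firing_rate(i, f_max=200.0, k=1.0, n=2):
    """Hill-type saturating F-I curve (illustrative parameters, all assumed)."""
    if i <= 0:
        return 0.0
    return f_max * i**n / (i**n + k**n)

# A fixed current reduction, standing in for ephaptic inhibition.
delta = 0.3

for i in (0.5, 1.0, 4.0):
    drop = firing_rate(i) - firing_rate(i - delta)
    print(f"input {i}: rate {firing_rate(i):6.1f} Hz, drop from inhibition {drop:5.1f} Hz")
```

With these (assumed) parameters, the drop near saturation is an order of magnitude smaller than at low input, which is the sense in which a nonlinear F-I curve can make inhibition "all but invisible" at high activation.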
27.01.2026 20:27
History-dependent ephaptic interactions in paired olfactory receptor neurons
Olfactory sensing begins with the transduction of odors into receptor currents on the dendrites of olfactory receptor neurons (ORNs). In insects and many other arthropods, ORNs are grouped stereotypically in hair-like sensilla on the surface of olfactory organs, enabling mutual inhibition through non-synaptic 'ephaptic' interactions (NSIs). Given the electrical, and therefore virtually instantaneous, nature of NSIs, it has been hypothesized that they contribute to processing fast temporal elements of mixed odor plumes. Here, we present single sensillum recordings and computational modeling that characterize NSIs during short offset dual-odor stimulations in the olfactory sensilla of adult female Drosophila melanogaster. We find in the experiments that the magnitude of inhibition between co-housed ORNs cannot be predicted by their instantaneous activity (firing rate) alone. It is adaptation-dependent, with strong effects only occurring when the inhibited ORN is adapted. This limits the usefulness of NSIs for fast odor processing when ORNs lack time to adapt. We reproduced the observed phenomena in a computational model and use this model to explain how the adaptation-dependence of NSI-mediated inhibition arises from nonlinearities in neural responses. We conclude that NSIs are unlikely to support the encoding of fast temporal dynamics in mixed odor stimuli, instead contributing to slower peripheral processing, supporting roles such as novelty detection. More broadly, we demonstrate how the nonlinear interactions of fairly simple electrical components lead to non-intuitive results, offering insight into the longstanding debate around ephaptic interactions in other systems, such as the mammalian CNS.

### Competing Interest Statement
The authors have declared no competing interest.

Funding: Leverhulme Trust, https://ror.org/012mzw131, RPG-2019-232; Engineering and Physical Sciences Research Council, https://ror.org/0439y7842, EP/P006094/1, EP/S030964/1
Happy to share our preprint on ephaptic interactions between olfactory receptor neurons in Drosophila by Lydia Ellison.
doi.org/10.64898/202...
We set out to see whether near-instantaneous electrical interactions help with processing tiny odour onset delays, only to find that they don't.
27.01.2026 20:24
Happy to share our preprint with @neworderofjamie.bsky.social , @danakarca.bsky.social and @drtnowotny.bsky.social !
We've been working on a neuron position learning algorithm by coupling space and time! See manuscript below.
27.01.2026 12:02
Excited to see the paper fully published. It's an important milestone for training SNNs with exact gradients, replacing our earlier tricks of a "delay line augmentation" to capture temporal relationships. Delays can now be learnt alongside weights naturally. Amazing work @mbalazs98.bsky.social !
25.11.2025 18:24
Project suggestions : Sussex AI PhD Programme : ... : AI Research Group : University of Sussex
If you're interested in doing a PhD at Sussex, lots of exciting PhD project ideas at www.sussex.ac.uk/research/cen... including several Spiking Neural Network projects with myself and @drtnowotny.bsky.social . Funding available via www.sussex.ac.uk/study/fees-f... or www.sussex.ac.uk/study/fees-f...
31.10.2025 13:11
Excellent paper and great new project. Very interesting how the race of technology between GPU and FPGA shapes up.
16.07.2025 15:25
06.07.2025 10:22
Strong results on Speech commands with GLE presented by Paul Haider at #cns2025florence today . @cnsorg.bsky.social
06.07.2025 10:19
And we are off - excited that #cns2025florence is under way with record attendance - first keynote starts in 5 minutes.
05.07.2025 14:08
Towards a Better Future: How Sussex Students are Using AI to Change the World!
YouTube video by University of Sussex
Very excited by our new Sussex AI promotional video www.youtube.com/watch?v=D5dS...
13.05.2025 12:18
Posting this after some recent conversations with potential international applicants - still time to apply to our Masters courses and International PhD Academy for 2025 entry - join the diverse and vibrant Neuroscience community on our beautiful campus next to Brighton
13.05.2025 16:12
... and you can see the great lineup of tutorials and workshops here: ocns.memberclicks.net/cns-2025-mee...
Of course, tutorial and workshop registration can also be added later if you have already registered for the main meeting. @cnsorg.bsky.social
17.03.2025 18:50
The deadline is tomorrow - last push! @cnsorg.bsky.social
17.03.2025 17:44
Only 5 days to go to the (extended) deadline. Make them count. A great lineup of keynotes, tutorials and workshops is secured - add your research as an oral or poster by submitting an abstract.
13.03.2025 09:37
@drtnowotny.bsky.social and I are again participating in Google Summer of Code under the @incforg.bsky.social. We have 3 paid projects involving GeNN for contributors with a range of skills and experience levels. If you're interested, please get in touch via the forums linked from the thread:
06.03.2025 12:23
CNS 2025 Abstract Submission
Call for Abstracts: 34th Annual Computational Neuroscience Meeting, CNS*2025
Join us in Florence, Italy, July 5-9, 2025!
Abstract Deadline: March 11, 2025
Submit here: www.cnsorg.org/cns-2025-abs...
Workshops: www.cnsorg.org/cns-2025-cal...
Tutorials: www.cnsorg.org/cns-2025-cal...
24.01.2025 06:32
This is awesome work from Balazs! Not only does our Eventprop-based method support multiple spikes per neuron and recurrent connectivity, but it also uses less than half the memory of the current state-of-the-art delay-learning method and is up to 26x faster.
23.01.2025 12:50
Wikipedia is one of the last major bastions of verified information. Which, of course, is why the oligarchs want to destroy it.
You can donate to them here.
donate.wikimedia.org/w/index.php?...
22.01.2025 14:37
Figure displaying the results of computational benchmarking of the algorithm. A) GPU memory use is plotted versus the number of timesteps in the simulation. We compared GeNN and SpyX, each with 256 hidden neurons and 1024 hidden neurons. The graph shows a linear increase of memory requirements for SpyX but essentially flat lines for GeNN. B) A similar graph for the training time as a function of timesteps. All curves increase, but the increase for SpyX is much steeper than for GeNN. C) A bar graph comparing the training time spent on different elements of the network as a function of timesteps. The most time is spent on the synapses, followed by neuron simulation and other components. The compile time is negligible. All times increase roughly linearly with the number of timesteps and somewhat sub-linearly with the number of hidden neurons.
We also classified the Spiking Speech Commands (SSC) dataset with good success. Finally, the GeNN implementation of Eventprop has very beneficial computational scaling properties compared to BPTT in Spyx (github.com/kmheckel/spyx). All details at doi.org/10.1088/2634.... @sussexai.bsky.social 5/5
21.01.2025 17:06
Bar graph showing the performance of our algorithm in classifying the Spiking Heidelberg Digits data set with four different variations: plain, with delay line input, with delay and shift, and with delay, shift and blend. The performance increases in this order. Bars are shown for all four conditions for three hidden layer sizes. The performance is best for the largest hidden layer but the differences are not very large.
We extended Eventprop to a wider class of loss functions and found that, with a "loss-shaping" term, we could achieve fast and reliable learning. Combining this with 3 data augmentations, we obtained a SOTA SHD classification accuracy of 93.5±0.7% (n=8) on the test set after rigorous validation. 4/5
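One of the three augmentations named in the figure above, "shift", can be sketched generically as translating all spike times of a trial by one shared random offset. This is an illustrative sketch under assumed parameters, not the paper's implementation; the function name and values are hypothetical.

```python
import random

def shift_augment(spike_times, max_shift=10.0, t_max=1000.0):
    """Shift every spike time (ms) in a trial by one shared random offset,
    clipping to the valid window [0, t_max]. Illustrative sketch only."""
    offset = random.uniform(-max_shift, max_shift)
    return [min(max(t + offset, 0.0), t_max) for t in spike_times]

# Example: all spikes move together, so relative timing is preserved
# (unless a spike is clipped at the window edge).
augmented = shift_augment([100.0, 200.0, 300.0])
print(augmented)
```

Because a single offset is drawn per trial, the temporal structure within the trial is preserved while the network sees the pattern at different absolute times.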
21.01.2025 16:57
This initially failed due to average cross-entropy loss creating unhelpful gradients in the hidden layer and the fact that spike creation and deletion is not "visible" in the exact gradients calculated by Eventprop. 3/5
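Why spike creation and deletion have no smooth gradient can be seen in a minimal toy example (a bare leaky integrate-and-fire loop with assumed parameters, not the GeNN/Eventprop implementation): the spike count jumps discontinuously as an input weight is scaled, so there is no derivative that accounts for a spike appearing or disappearing.

```python
def count_spikes(w, t_steps=100, tau=20.0, v_th=1.0, i_in=0.06):
    """Count output spikes of a toy LIF neuron driven by a constant input
    scaled by weight w (dt = 1; all parameter values are assumptions)."""
    v, n_spikes = 0.0, 0
    for _ in range(t_steps):
        v += (-v / tau) + w * i_in   # leaky integration
        if v >= v_th:                # threshold crossing: spike and reset
            v, n_spikes = 0.0, n_spikes + 1
    return n_spikes

# The spike count is piecewise constant in w, with discrete jumps:
for w in (0.5, 1.0, 1.5, 2.0):
    print(f"w = {w}: {count_spikes(w)} spikes")
```

Between the jumps the count does not change at all, and at the jumps it is discontinuous; exact gradients of spike times (as in Eventprop) therefore carry no information about creating or deleting spikes.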
21.01.2025 16:51
GeNN by genn-team
We implemented Eventprop (Wunderlich & Pehle, 2021) in our GeNN (genn-team.github.io) simulator and attempted to classify the Spiking Heidelberg Digits (SHD, zenkelab.org/resources/sp...) in a 3-layer network. 2/5
21.01.2025 16:50
Loss shaping enhances exact gradient learning with Eventprop in spiking neural networks - IOPscience
Loss shaping enhances exact gradient learning with Eventprop in spiking neural networks, Nowotny, Thomas, Turner, James P, Knight, James C
Just out in J Neuromorph Comput & Eng: Loss shaping enhances exact gradient learning with Eventprop in Spiking Neural Networks doi.org/10.1088/2634.... With @neworderofjamie.bsky.social.
We report how Eventprop scales to harder learning tasks.
TL;DR: It works great but not without extra effort. 🧪🧵
21.01.2025 16:49
"A study of federally funded research projects in the United States estimated that principal investigators spend on average about 45% of their time on administrative activities related to applying for and managing projects rather than conducting active research"
www.pnas.org/doi/10.1073/...
04.01.2025 13:26
We are proposing PhD project ideas along these lines:
10.01.2025 15:28
More than 60 German-speaking universities and research institutes have just jointly announced they will cease activities on X because "the current orientation of the platform is not compatible with their core values", incl. scientific integrity, transparency and democratic discourse. 🧪
10.01.2025 13:50
University of Sussex
Hi Manu,
I am a Professor of Informatics and work in Computational Neuroscience and neuromorphic computing. I would like to contribute to the Science feed. My staff page is profiles.sussex.ac.uk/p206151-thom... and Google Scholar scholar.google.com/citations?us...
08.01.2025 09:19
New paper on spike sorting www.sciencedirect.com/science/arti...
with Lydia Ellison, Georg Raiser, Alicia Garrido Peña and George Kemenes. TL;DR: SSSort software now handles overlapping spikes in addition to the extreme spike shape changes it was already good at. github.com/grg2rsr/SSSort
20.12.2024 20:30