We're glad to announce the NeSy 2025 Test of Time award for "Probabilistic Inference Modulo Theories"!
Rodrigo de Salvo Braz was here to accept the award.
This work laid the groundwork for recent NeSy approaches like DeepSeaProbLog and the probabilistic algebraic layer.
09.09.2025 21:11
5 likes · 5 reposts · 1 reply · 0 quotes
EurIPS is coming! 📣 Mark your calendar for Dec. 2-7, 2025 in Copenhagen!
EurIPS is a community-organized conference where you can present accepted NeurIPS 2025 papers. It is endorsed by @neuripsconf.bsky.social and @nordicair.bsky.social, and co-developed by @ellis.eu
eurips.cc
16.07.2025 22:00
143 likes · 70 reposts · 2 replies · 19 quotes
We propose Neurosymbolic Diffusion Models! We find diffusion is especially compelling for neurosymbolic approaches, combining powerful multimodal understanding with symbolic reasoning.
Read more 👇
21.05.2025 10:57
93 likes · 27 reposts · 4 replies · 6 quotes
This year I am co-organizing the 8th iteration of the Tractable Probabilistic Modeling #TPM workshop at #UAI2025 🌴 Rio de Janeiro edition 🌴
lnkd.in/dDK8T5Au
⏰ Submission deadline: May 23rd AoE
🌴 Date: July 15th
🧵👇
30.04.2025 10:54
16 likes · 9 reposts · 1 reply · 2 quotes
Today we have @lennertds.bsky.social from KU Leuven teaching us how to adapt NeSy methods to deal with sequential problems.
Super interesting topic combining DL + NeSy + HMMs! Keep an eye on Lennert's future work!
30.04.2025 14:13
9 likes · 3 reposts · 0 replies · 1 quote
Have you ever considered that, in computer memory, model weights are stored as discrete values anyway? So why not do probabilistic inference directly on the discrete (quantized) parameters? @trappmartin.bsky.social is presenting our work at #AABI2025 today. [1/3]
29.04.2025 06:58
45 likes · 11 reposts · 3 replies · 1 quote
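The point in the post above can be illustrated with a toy sketch. This is a generic illustration of exact Bayesian inference over a discrete (quantized) parameter grid, not the method presented at AABI; the grid size, noise level, and linear model are all made up for the example:

```python
import numpy as np

# Toy illustration: because a quantized weight lives on a small discrete
# grid, its posterior is a categorical distribution that we can normalize
# exactly by enumeration -- no sampling or variational approximation.

rng = np.random.default_rng(0)

grid = np.linspace(-2.0, 2.0, 256)   # 8-bit-style grid of candidate weights

# Synthetic data from y = w_true * x + noise
w_true = 0.7
x = rng.normal(size=50)
y = w_true * x + 0.1 * rng.normal(size=50)

# Uniform prior over the grid, Gaussian likelihood per candidate weight
log_prior = np.full(grid.shape, -np.log(grid.size))
log_lik = np.array([-0.5 * np.sum((y - w * x) ** 2) / 0.1**2 for w in grid])

log_post = log_prior + log_lik
log_post -= log_post.max()           # stabilize before exponentiating
post = np.exp(log_post)
post /= post.sum()                   # exact normalization over the grid

w_map = grid[np.argmax(post)]
print(f"posterior mode: {w_map:.3f}")  # close to w_true = 0.7
```

The same enumeration trick is what makes discrete parameter spaces attractive: the normalizing constant that is intractable for continuous weights becomes a finite sum.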
The #TPM Tractable Probabilistic Modeling Workshop is back at @auai.org #UAI2025!
Submit your work on:
- fast and #reliable inference
- #circuits and #tensor #networks
- normalizing #flows
- scaling #NeSy #AI
...& more!
deadline: 23/05/25
tractable-probabilistic-modeling.github.io/tpm2025/
16.04.2025 08:40
38 likes · 19 reposts · 1 reply · 3 quotes
Great to have David Watson (dswatson.github.io) visiting us today and talking about #trustworthy #AI #ML for tabular data with #trees and #circuits,
with connections to #generative modeling, #causality and #fast inference!
15.04.2025 10:30
6 likes · 3 reposts · 1 reply · 0 quotes
speakers | colorai
The AAAI Workshop on Connecting Low-Rank Representations in AI
You can find the speakers' bios and the abstracts of the presentations here: april-tools.github.io/colorai/spea...
Check them out!
04.03.2025 20:50
0 likes · 0 reposts · 0 replies · 0 quotes
The last speaker of the workshop is Alexandros Georgiou, who is giving an introduction to polynomial networks and equivariant tensor network architectures, as well as how to implement them.
04.03.2025 20:45
0 likes · 0 reposts · 1 reply · 0 quotes
After the lunch break, Andrew G. Wilson (@andrewgwils.bsky.social) is now giving his presentation on the importance of linear algebra structures in ML, as well as on how to navigate such structures in practice.
04.03.2025 19:36
2 likes · 0 reposts · 1 reply · 0 quotes
After Nadav, it is now the turn of Guillaume Rabusseau, who is joining us online.
Guillaume guides us through interesting expressiveness relationships among families of RNNs that are parameterized through tensor factorization techniques.
04.03.2025 17:05
3 likes · 1 repost · 1 reply · 0 quotes
Live from the CoLoRAI workshop at AAAI
(april-tools.github.io/colorai/)
Nadav Cohen is now giving his talk on "What Makes Data Suitable for Deep Learning?"
Tools from quantum physics are shown to be useful in building more expressive deep learning models by changing the data distribution.
04.03.2025 14:54
14 likes · 4 reposts · 1 reply · 0 quotes
[CoLoRAI] FinLoRA: Finetuning Quantized Financial Large Language Models Using Low-Rank Adaptation
YouTube video by april lab
we're almost ready for the @realaaai.bsky.social #AAAI25 Workshop on Connecting Low-rank Representations in #AI (#CoLoRAI) tomorrow!
we also have video presentations for some of the accepted papers you can already check (π april-tools.github.io/colorai/acce...)!
📽️ www.youtube.com/watch?v=JlVd...
03.03.2025 22:20
12 likes · 6 reposts · 0 replies · 0 quotes
We all know backpropagation can calculate gradients, but it can do much more than that!
Come to my #AAAI2025 oral tomorrow (11:45, Room 119B) to learn more.
27.02.2025 23:45
27 likes · 10 reposts · 1 reply · 0 quotes
We are going to present our poster "Sum of Squares Circuits" at AAAI in Philadelphia today!
Hall E, 12:30pm-2:00pm, poster #840
We trace expressiveness connections between different types of additive and subtractive deep mixture models and tensor networks.
arxiv.org/abs/2408.11778
27.02.2025 12:59
14 likes · 3 reposts · 0 replies · 1 quote
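A minimal numerical sketch of the intuition behind subtractive mixtures (a toy with two Gaussian components and a hand-picked negative weight, not the deep circuits studied in the paper):

```python
import numpy as np

# Toy sketch of the idea behind sum-of-squares (squared) circuits: allow a
# *negative* mixture weight, then square the whole combination so the result
# is nonnegative everywhere and can be renormalized into a valid density.

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

xs = np.linspace(-8.0, 8.0, 4001)
dx = xs[1] - xs[0]

# A subtractive combination: invalid as a standard mixture (whose weights
# must be nonnegative), but perfectly fine once squared.
f = 1.0 * gauss(xs, -1.0, 1.0) - 0.6 * gauss(xs, 1.0, 1.0)
sq = f ** 2                      # nonnegative by construction
density = sq / (sq.sum() * dx)   # renormalize numerically on the grid

assert (density >= 0.0).all()
print(f"mass on grid: {density.sum() * dx:.4f}")  # 1.0000 by construction
```

Squaring is what buys the extra expressiveness: the subtractive part can carve sharp dips into the density that no additive mixture of the same components can represent.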
Home | AAAI'25 tutorial
The AAAI'25 tutorial on Tensor Factorizations + Probabilistic Circuits
Are you at AAAI in Philadelphia and interested in #tensor-factorizations or #circuits, or even both?
Then join us today at our tutorial: "From tensor factorizations to circuits (and back!)"
Details and materials here
april-tools.github.io/aaai25-tf-pc...
Time: 4:15pm-6:00pm, Room 117
25.02.2025 17:38
31 likes · 11 reposts · 0 replies · 2 quotes
I am at @realaaai.bsky.social #AAAI25 in sunny #Philadelphia!
reach out if you want to grab coffee and chat about #probabilistic #ML #AI #nesy #neurosymbolic #tensor #lowrank models!
check out our tutorial
april-tools.github.io/aaai25-tf-pc...
and workshop
april-tools.github.io/colorai/
25.02.2025 15:33
20 likes · 8 reposts · 1 reply · 0 quotes
Causal normalizing flows: from theory to practice
In this work, we take a deeper look at the use of normalizing flows for causal reasoning. Specifically, we first leverage recent results on non-linear ICA to show that causal models are identifiable from observat...
Right, but what are Causal NFs again? In case you missed our NeurIPS 2023 Oral, Causal NFs are Deep Learning models that learn causal systems (SCMs) while having *theoretical guarantees*!
In short, you can accurately use them for causal inference tasks π§ͺ
arxiv.org/abs/2306.05415
13.02.2025 17:54
5 likes · 1 repost · 1 reply · 0 quotes
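The kind of query a causal model supports can be shown with a toy structural causal model. This is a hand-written two-variable SCM for illustration only, not the Causal NF model or its library; the coefficients are arbitrary:

```python
import numpy as np

# Toy SCM with graph x1 -> x2. Causal NFs *learn* such systems from data;
# here we just show the query they enable: an intervention do(x1 = 2)
# versus the plain observational distribution.

rng = np.random.default_rng(3)
N = 100_000

def sample(do_x1=None):
    u1, u2 = rng.normal(size=N), rng.normal(size=N)
    x1 = u1 if do_x1 is None else np.full(N, float(do_x1))
    x2 = 2.0 * x1 + u2          # structural equation for x2
    return x1, x2

_, x2_obs = sample()            # observational: E[x2] ~ 0
_, x2_int = sample(do_x1=2.0)   # interventional: E[x2 | do(x1=2)] ~ 4

print(f"E[x2] observational ~ {x2_obs.mean():.2f}")
print(f"E[x2 | do(x1=2)] ~ {x2_int.mean():.2f}")
```

The "theoretical guarantees" mentioned above are what ensure a learned model answers such do-queries consistently with the true underlying system.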
GitHub - adrianjav/causal-flows: CausalFlows: A library for Causal Normalizing Flows in Pytorch
CausalFlows: A library for Causal Normalizing Flows in Pytorch - adrianjav/causal-flows
Have you ever been curious to try Causal Normalizing Flows for your project but found them intimidating? Say no more!
I just released a small library to easily implement and use causal-flows:
github.com/adrianjav/ca...
13.02.2025 17:54
39 likes · 10 reposts · 1 reply · 2 quotes
Happy to see our work at TMLR!
We systematically show the relationships between two apparently different fields, tensor factorizations and circuits, and how bridging the two enables us to exchange results, research opportunities in ML, and practical implementation solutions.
12.02.2025 10:04
20 likes · 4 reposts · 0 replies · 0 quotes
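One direction of that bridge can be sketched in a few lines: a rank-R CP factorization of a 3-way tensor evaluates exactly like a shallow sum-product circuit. The sizes and random factors below are arbitrary toy choices, not code from the paper:

```python
import numpy as np

# A rank-R CP factorization T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
# is a shallow circuit: each entry is one sum node over R product nodes,
# whose inputs are entries of the factor matrices.

rng = np.random.default_rng(1)
I, J, K, R = 4, 5, 6, 3
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
C = rng.normal(size=(K, R))

# "Tensor view": contract the factors in one einsum
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# "Circuit view": evaluate entry (i, j, k) as a sum over product nodes
def circuit_eval(i, j, k):
    return sum(A[i, r] * B[j, r] * C[k, r] for r in range(R))

assert np.allclose(T[2, 3, 1], circuit_eval(2, 3, 1))
print("CP entry matches shallow sum-product circuit evaluation")
```

Deeper (hierarchical) factorizations map to deeper circuits in the same spirit, which is what lets results flow between the two fields.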
Interested in estimating posterior predictives in Bayesian inference? Really want to know if your approximate inference "is working"?
Come to our poster at the NeurIPS BDU workshop on Saturday - see TL;DR below.
11.12.2024 17:25
40 likes · 11 reposts · 3 replies · 0 quotes
📣 Does your model learn high-quality #concepts, or does it learn a #shortcut?
Test it with our #NeurIPS2024 dataset & benchmark track paper!
rsbench: A Neuro-Symbolic Benchmark Suite for Concept Quality and Reasoning Shortcuts
What's the deal with rsbench? 🧵
10.12.2024 19:10
35 likes · 8 reposts · 1 reply · 4 quotes
I wanted to make my first post about a project close to my heart. Linear algebra is an underappreciated foundation for machine learning. Our new framework CoLA (Compositional Linear Algebra) exploits algebraic structure arising from modelling assumptions for significant computational savings! 1/4
05.12.2024 15:15
140 likes · 21 reposts · 3 replies · 2 quotes
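The kind of saving algebraic structure enables can be sketched without CoLA itself. The snippet below uses plain NumPy with a Kronecker-product operator as a stand-in example (sizes are arbitrary and CoLA's actual API is not shown here):

```python
import numpy as np

# If a big operator is a Kronecker product A (x) B, we never need to form
# the (mn x mn) dense matrix. With NumPy's row-major flattening,
#   kron(A, B) @ vec(X) == vec(A @ X @ B.T),
# so a solve against A (x) B reduces to two small solves:
# O(m^3 + n^3) work instead of O((mn)^3).

rng = np.random.default_rng(2)
m, n = 30, 40
A = rng.normal(size=(m, m)) + m * np.eye(m)   # well-conditioned factors
B = rng.normal(size=(n, n)) + n * np.eye(n)
b = rng.normal(size=m * n)

# Structure-exploiting solve
Bmat = b.reshape(m, n)
X = np.linalg.solve(A, Bmat)          # A^{-1} @ Bmat
X = np.linalg.solve(B, X.T).T         # ... @ B^{-T}
x_fast = X.reshape(-1)

# Dense reference solve against the materialized Kronecker matrix
x_dense = np.linalg.solve(np.kron(A, B), b)
assert np.allclose(x_fast, x_dense)
print("structured solve matches dense solve")
```

Frameworks like CoLA automate exactly this kind of dispatch: the composition (Kronecker, block-diagonal, low-rank, ...) is tracked symbolically, and each solve or eigendecomposition picks the cheapest algorithm the structure allows.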
@ropeharz.bsky.social and his pet dinosaur are on bsky!
follow him for #probabilistic #ML content!
21.11.2024 21:08
3 likes · 2 reposts · 0 replies · 0 quotes