Job alert! Come and work with us at @oii.ox.ac.uk. We're recruiting a Postdoctoral Researcher working with @bmittelstadt.bsky.social and @cruss.bsky.social. Full-time position, starts 1 October 2025. Closing date for applications: noon, 30 July. Apply today: bit.ly/3TuJlGc #hiring
09.07.2025 09:13
One week left to submit your application!
Apply to work with Prof Sandra Wachter at the Hasso Plattner Institute and collaborate with me and Chris Russell at the Oxford Internet Institute, University of Oxford.
@swachter.bsky.social
@hpi.bsky.social
@cruss.bsky.social
@oii.ox.ac.uk
09.06.2025 11:18
Postdoctoral Researcher (m/f/x) in Technology and Regulation
Are you interested in the governance of emergent tech?
Come & work w/ me @bmittelstadt.bsky.social & @cruss.bsky.social
We are looking for three postdocs in:
Law: tinyurl.com/4rbhcndp
Ethics: tinyurl.com/yc2e2km4
Computer Science/AI/ML: tinyurl.com/yr5bvnn5
Application deadline is June 15, 2025.
27.05.2025 06:45
See our recent FAccT paper for an analysis of how many of these models are intended for generating non-consensual sexual imagery: arxiv.org/pdf/2505.03859
21.05.2025 18:34
Still time to apply to work with me and @bmittelstadt.bsky.social and @cruss.bsky.social @oii.ox.ac.uk
15.05.2025 10:59
OII | Dramatic rise in publicly downloadable deepfake image generators
New Oxford study uncovers an explosion of accessible deepfake AI image-generation models intended for the creation of non-consensual, sexualised images of women.
New! Latest study from @oii.ox.ac.uk reveals a concerning trend: easily accessible AI tools designed to create deepfake images, primarily targeting women, are rapidly proliferating. Read more: bit.ly/4kc1iVk 1/5
07.05.2025 10:27
Postdoctoral Researcher (m/f/x) in Machine Learning and Artificial Intelligence
Come & work with me @hpi.bsky.social & @bmittelstadt.bsky.social & @cruss.bsky.social @oii.ox.ac.uk
I am looking for three postdocs on the governance of emergent tech.
CS: tinyurl.com/yr5bvnn5
Ethics: tinyurl.com/yc2e2km4
Law: tinyurl.com/4rbhcndp
Application deadline is 15.06.2025.
05.05.2025 11:47
Editorial
Out now in #AIRe, the Journal of AI Law and Regulation, my new editorial discussing the state of research on fairness in AI in an increasingly hostile geopolitical climate, and the need for European leadership going forward.
Open access link: doi.org/10.21552/air...
#AI #DEI @oii.ox.ac.uk
14.04.2025 08:29
The 4th Monocular Depth Estimation Challenge (MDEC) is coming to #CVPR2025, and I'm excited to join the org team! After 2024's breakthroughs in monodepth driven by generative model advances in transformers and diffusion, this year's focus is on OOD generalization and evaluation.
21.12.2024 15:52
Diagram showing the combination of two heads.
The trick is model surgery on a validation set. We train a multi-head model: the first head solves the original task, and the other heads predict group membership using a squared loss. A weighted sum of all these heads can enforce any fairness definition, and it has the same architecture as the original network.
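As a rough sketch of the idea (not the toolkit's actual API; the backbone features, group labels, and head weights below are all hypothetical), note that a weighted sum of linear heads over shared features is itself a single linear head:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a frozen backbone produces features phi(x)
# on a held-out validation set.
n, d = 200, 5
phi = rng.normal(size=(n, d))           # backbone features
groups = rng.integers(0, 2, size=n)     # binary group membership

# Task head: assumed already trained (random here for illustration).
w_task = rng.normal(size=d)

# Extra head: predicts group membership with a squared loss,
# i.e. ordinary least squares on the frozen features.
w_group, *_ = np.linalg.lstsq(phi, groups.astype(float), rcond=None)

# "Model surgery": a weighted sum of the heads is again one linear
# head, so the patched model keeps the original architecture.
lam = -0.5                              # weight chosen on the validation set
w_patched = w_task + lam * w_group

scores_before = phi @ w_task
scores_after = phi @ w_patched          # same shape, shifted decision scores
```

Because the patched head has exactly the shape of the original one, it can be dropped back into the deployed model without any architectural change.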
11.12.2024 11:29
Cartoon logo of an ox and scales
An example showing how to enforce minimum group recall in computer vision.
New fairness toolkit at #NeurIPS today. This fixes most of the problems I've run into in the field.
It is robust to overfitting, works for #NLP and computer vision, and can enforce any definition of fairness that can be written as a function of a confusion matrix. t.ly/ZpRJ-
How do we do that....
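As a minimal illustration of the kind of criterion involved (hypothetical labels and groups, not the toolkit's API): any metric that is a function of per-group confusion matrices, such as minimum group recall, can be evaluated directly.

```python
import numpy as np

def confusion(y_true, y_pred):
    """2x2 confusion matrix: rows = true label, cols = predicted label."""
    m = np.zeros((2, 2), dtype=int)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1
    return m

def recall(m):
    # True positives over all actual positives (guard against empty groups).
    return m[1, 1] / max(m[1, 0] + m[1, 1], 1)

# Hypothetical predictions for two demographic groups.
y_true = np.array([1, 1, 0, 1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 0, 1, 1, 1, 0, 0])
groups = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# The fairness target is a function of the per-group confusion
# matrices; here, the minimum recall across groups.
per_group = [recall(confusion(y_true[groups == g], y_pred[groups == g]))
             for g in (0, 1)]
min_group_recall = min(per_group)
```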
11.12.2024 11:29
Sorry, not this year. Maybe next time.
01.10.2024 13:31
This is a common problem with LLMs if the temperature is set to zero. It might just be that these small models need a higher temperature.
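For illustration only (the logits are hypothetical, not from any particular model): temperature rescales the logits before the softmax, so temperature 0 reduces to greedy argmax decoding, while higher temperatures flatten the distribution and add diversity.

```python
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    """Temperature scaling: divide logits by T before the softmax.
    T = 0 means greedy (argmax) decoding, which is repetition-prone;
    larger T flattens the distribution and diversifies outputs."""
    if temperature == 0:
        return int(np.argmax(logits))
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                           # numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return int(rng.choice(len(p), p=p))

rng = np.random.default_rng(0)
logits = [2.0, 1.5, 0.5]                   # hypothetical next-token logits
greedy = sample_with_temperature(logits, 0, rng)              # always token 0
samples = {sample_with_temperature(logits, 1.5, rng) for _ in range(200)}
```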
01.01.2024 12:13