We’ll be releasing the first dataset of 5D & 6D lens blur fields for smartphone & SLR lenses—stay tuned!
And with more realistic renders, we can also do better device-specific image restoration.
Lens Blur Fields let you render device-specific depth of field, blurring anything from a resolution chart to a full 3D scene:
Two smartphones of the same make can have subtly different PSFs—your phone has its own blur signature 📱🔍
We show this with the lens blur fields of two iPhone 12 Pros:
Our capture setup only needs a monitor + a simple phone/camera stand. The pipeline is light ✨
1️⃣ Capture a focal stack of monitor patterns (in minutes)
2️⃣ Train an MLP via non-blind deconvolution
3️⃣ Get a continuous, device-specific PSF model
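The non-blind part of step 2️⃣ means the displayed monitor pattern is known, so only the PSF is unknown. A minimal sketch of that objective, assuming a spatially uniform PSF and FFT-based convolution for brevity (all names here are hypothetical, not the paper's code):

```python
import numpy as np

def blur_with_psf(pattern, psf):
    # Convolve the known monitor pattern with the PSF via FFTs.
    # Circular 'same'-size convolution, kept simple for this sketch.
    psf_pad = np.zeros_like(pattern)
    kh, kw = psf.shape
    psf_pad[:kh, :kw] = psf
    psf_pad = np.roll(psf_pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # center the kernel
    return np.real(np.fft.ifft2(np.fft.fft2(pattern) * np.fft.fft2(psf_pad)))

def deconv_loss(psf, pattern, captured):
    # Non-blind deconvolution objective: pattern is known, fit the PSF
    # so that blurring the pattern reproduces the captured focal-stack frame.
    return np.mean((blur_with_psf(pattern, psf) - captured) ** 2)

# Toy check: the true PSF drives the loss to zero.
rng = np.random.default_rng(1)
pattern = rng.random((64, 64))
true_psf = np.ones((5, 5)) / 25.0  # stand-in for a real defocus kernel
captured = blur_with_psf(pattern, true_psf)
assert deconv_loss(true_psf, pattern, captured) < 1e-12
```

In the actual pipeline this loss would be backpropagated into the MLP that predicts the PSF, summed over the whole focal stack.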
Optical blur, described by the point spread function (PSF), combines a laundry list of degrading effects: defocus, diffraction, and aberrations.
It’s hard to calibrate because it varies with sensor position, focus, target distance & image-plane location.
We introduce Lens Blur Fields—tiny MLPs that can model this high-dimensional PSF.
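In spirit, a blur field is just a coordinate MLP: feed in a 6D coordinate (image-plane x, y; PSF offset u, v; focus; depth) and get back a PSF intensity. A minimal sketch with random, untrained weights (layer sizes and names are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    # He-initialized weights and zero biases for each layer.
    return [(rng.standard_normal((m, n)) * np.sqrt(2 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_psf(params, coords):
    # Tiny MLP: ReLU hidden layers, softplus output so intensities are non-negative.
    h = coords
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)
    W, b = params[-1]
    return np.logaddexp(h @ W + b, 0.0)  # softplus = log(1 + e^x)

params = init_mlp([6, 64, 64, 1], rng)

# Query a 21x21 PSF at one image-plane location, focus setting, and depth.
u, v = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
xyfd = np.tile([0.25, -0.4, 0.5, 0.8], (21 * 21, 1))  # fixed (x, y, focus, depth)
coords = np.concatenate([xyfd, u.reshape(-1, 1), v.reshape(-1, 1)], axis=1)
psf = mlp_psf(params, coords).reshape(21, 21)
psf /= psf.sum()  # normalize so the PSF conserves energy
```

Because the field is continuous, you can query a PSF at any sensor position, focus, and depth, without storing a giant grid of calibrated kernels.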
Huge thanks to my amazing co-authors:
Zhecheng Wang, Rebecca Lin, Daniel Miau, Florian Kainz, Jiawen Chen, Xuaner (Cecilia) Zhang, David B. Lindell & Kiriakos N. Kutulakos
📄 Paper ➡️ blur-fields.github.io
💻 Code: coming soon!
#ComputationalPhotography #IEEECS #ComputerVision #Optics
Every lens leaves a blur signature—a hidden fingerprint in every photo.
In our new #TPAMI paper, we show how to learn it fast (5 mins of capture!) with Lens Blur Fields ✨
With it, we can tell apart ‘identical’ phones by their optics, deblur images, and render realistic blurs.