
Johannes Jakob Meyer

@jjmeyer

Quantum Information @ FU Berlin. Imprint on www.johannesjakobmeyer.com

503 Followers · 187 Following · 31 Posts · Joined 08.10.2023

Latest posts by Johannes Jakob Meyer @jjmeyer

Our work shows that there is a need for quantum communication theory and computational complexity to converge more fully, with many more exciting questions ahead!

23.01.2026 15:21 👍 1 🔁 0 💬 0 📌 0

First, we show that there exists a nearly maximal separation between the computational and the unbounded two-way quantum capacity. Second, we show a transition in the computational capacity from nearly maximal to zero as the complexity (in terms of Choi rank) goes from polynomial to super-polynomial.

23.01.2026 15:21 👍 0 🔁 0 💬 1 📌 0

We give a first definition of a computational capacity measure and show that it is related to the computational distillable entanglement for "efficiently stretchable" channels. We give a channel model for which we can tightly bound these quantities to establish two key results:

23.01.2026 15:21 👍 0 🔁 0 💬 1 📌 0
The computational two-way quantum capacity Quantum channel capacities are fundamental to quantum information theory. Their definition, however, does not limit the computational resources of sender and receiver. In this work, we initiate the st...

Last year, I became interested in the question of what happens if we impose computational efficiency onto information theory. In our latest paper, we analyze how this changes the amount of information that can be transmitted over a channel -- spoiler alert: a lot! scirate.com/arxiv/2601.1...

23.01.2026 15:21 👍 14 🔁 1 💬 1 📌 0
Efficient Quantum Measurements: Computational Max- and Measured Rényi Divergences and Applications Quantum information processing is limited, in practice, to efficiently implementable operations. This motivates the study of quantum divergences that preserve their operational meaning while faithfull...

I also wanted to add a shout-out to Álvaro, Thomas and Jan, who put out a work with a similar mindset today: arxiv.org/abs/2509.21308. It turns out that our results mostly complement each other quite nicely, so please have a look there as well!

26.09.2025 13:00 👍 5 🔁 0 💬 0 📌 0

A big thanks to my coauthors Asad, Jacopo, Lorenzo, Sofiene and Jens.

26.09.2025 13:00 👍 2 🔁 0 💬 1 📌 0

This is in stark contrast to how computational information theory has been approached to date, where the focus has been on single-shot statements. These are of course more exact, but also much more complicated to manipulate and build intuition for.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0

We believe that our definition has a nice level of abstraction that captures the essence of information theory under computational constraints while at the same time having the look and feel of unbounded information theory.

26.09.2025 13:00 👍 2 🔁 0 💬 1 📌 0

I invite you to discover many exciting things in the paper, like computationally measured divergences and a computational Stein's Lemma or a computational Rains bound on efficiently distillable entanglement.

26.09.2025 13:00 👍 2 🔁 0 💬 2 📌 0

Another highlight is the fact that we can obtain separations in hypothesis testing, both between bounded and unbounded observers as well as classical-quantum separations. This means there exist classical distributions that can be tested with access to a quantum computer, but not classically.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0

We can use the computational relative entropy as a mother quantity. We exemplify this by defining a computational entropy S̲(ρₙ) := −D̲(ρₙ‖𝕀) and show that it quantifies the best rate at which we can compress a state under complexity restrictions -- just like the entropy in the unbounded setting.

26.09.2025 13:00 👍 4 🔁 0 💬 1 📌 0
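A classical toy version of the "entropy from relative entropy" trick mentioned in the post: for a probability vector p, the Shannon entropy satisfies H(p) = −D(p‖𝟙), with 𝟙 the all-ones vector, mirroring S(ρ) = −D(ρ‖𝕀). A minimal pure-Python sketch (function names are mine, not from the paper):

```python
import math

def rel_entropy(p, q):
    """Classical relative entropy D(p||q) = sum_i p_i log2(p_i / q_i)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
ones = [1.0] * len(p)  # classical stand-in for the (unnormalized) identity
# Entropy is minus the relative entropy to the all-ones vector:
assert abs(shannon_entropy(p) - (-rel_entropy(p, ones))) < 1e-12
```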
The computational version of Pinsker's inequality.


For example, we obtain a computational version of Pinsker's inequality, which looks exactly like the unbounded relation, except that the involved quantities carry underscores, marking them as computational quantities.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0
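For reference, a quick numerical sanity check of the unbounded Pinsker inequality in its classical form, D(p‖q) ≥ ½‖p−q‖₁² in nats (the computational, underscored version is the paper's; this pure-Python sketch with my own function names only illustrates the shape of the relation):

```python
import math
import random

def kl_nats(p, q):
    """Classical relative entropy D(p||q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def l1_dist(p, q):
    """l1 distance between probability vectors."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

random.seed(0)
for _ in range(1000):
    raw_p = [random.random() + 1e-9 for _ in range(4)]
    raw_q = [random.random() + 1e-9 for _ in range(4)]
    p = [x / sum(raw_p) for x in raw_p]
    q = [x / sum(raw_q) for x in raw_q]
    # Pinsker (in nats): D(p||q) >= (1/2) * ||p - q||_1^2
    assert kl_nats(p, q) >= 0.5 * l1_dist(p, q) ** 2
print("Pinsker holds on 1000 random pairs")
```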

One of our main contributions is to turn this process of "polynomial regularization" into a rigorous notion. It allows us to reap the benefits of regularizing to obtain simpler expressions that closely resemble their counterparts from unbounded information theory.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0

Of course, talking about polynomial scaling only makes sense when there is a scaling parameter. Therefore, all our results pertain to families of quantum states indexed by some parameter n. This is often the number of qubits, but need not be.

26.09.2025 13:00 👍 2 🔁 0 💬 1 📌 0

We propose the following remedy: we define the computational relative entropy D̲(ρₙ‖σₙ) as the best error exponent in asymmetric hypothesis testing when restricted to polynomially many copies and polynomial-time tests and take the polynomials to be arbitrarily large.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0
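One schematic way to write a definition of this kind, with m(n) the (polynomial) number of copies and {Tₙ} the efficient tests -- the notation is mine, guessed from the post, and not necessarily the paper's:

```latex
\[
\underline{D}(\rho_n \,\|\, \sigma_n) \;:=\;
\sup_{\substack{m(n)\,\mathrm{poly}\\ \{T_n\}\ \mathrm{poly\text{-}time}}}
\;\liminf_{n\to\infty}\;
-\frac{1}{m(n)} \log \operatorname{Tr}\!\left[ T_n\, \sigma_n^{\otimes m(n)} \right],
\]
subject to the type-I error vanishing,
\(\operatorname{Tr}\!\left[(\mathbb{1}-T_n)\,\rho_n^{\otimes m(n)}\right] \to 0.\)
```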

But there is a catch! The connection between hypothesis testing and the relative entropy only holds if arbitrary tests can be performed. The optimal tests, however, could be exponentially complex to implement. We also need a large number of copies of the state to get sufficient statistics.

26.09.2025 13:00 👍 2 🔁 0 💬 1 📌 0

The reason the relative entropy and its derived quantities appear so often in information theory is that the relative entropy quantifies the best achievable error exponent in asymmetric hypothesis testing, a primitive to which many important information processing tasks can be reduced.

26.09.2025 13:00 👍 2 🔁 0 💬 1 📌 0
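This error-exponent statement is the (Chernoff-)Stein lemma. A classical toy illustration with Bernoulli hypotheses (function names and parameters are mine, not from the paper): the best type-II error exponent of a Neyman-Pearson test, at fixed type-I error, approaches D(p‖q) as the block length grows.

```python
import math

def binom_pmf(n, k, r):
    """Probability of k heads in n tosses of a coin with bias r."""
    return math.comb(n, k) * r**k * (1 - r)**(n - k)

def kl_bern(p, q):
    """D(p||q) between Bernoulli(p) and Bernoulli(q), in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def stein_exponent(n, p, q, alpha=0.05):
    """Best type-II error exponent at block length n for testing the null
    Bernoulli(p) against the alternative Bernoulli(q), q < p, with type-I
    error at most alpha. The Neyman-Pearson test thresholds on the number
    of heads k, since the likelihood ratio is monotone in k."""
    cum, c = 0.0, 0
    # largest threshold c such that P_p(k < c) <= alpha
    while cum + binom_pmf(n, c, p) <= alpha:
        cum += binom_pmf(n, c, p)
        c += 1
    beta = sum(binom_pmf(n, k, q) for k in range(c, n + 1))  # type-II error
    return -math.log(beta) / n

p, q = 0.6, 0.3
# The achievable exponent creeps up towards D(p||q) as n grows:
e200, e800 = stein_exponent(200, p, q), stein_exponent(800, p, q)
assert 0 < e200 < e800 < kl_bern(p, q)
```

The catch mentioned above is visible here: the optimal test needs the full likelihood ratio, which in general has no efficient implementation.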
The definition of the Umegaki relative entropy.


The relative entropy D(ρ‖σ) is a fundamental quantity in classical and quantum information theory. It appears in many important theorems and serves as a mother quantity to derive other important measures from -- for example mutual information or entropy.

26.09.2025 13:00 👍 3 🔁 0 💬 1 📌 0
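For the record, the standard Umegaki definition referenced in the image:

```latex
\[
D(\rho\|\sigma) \;=\; \operatorname{Tr}\!\left[\rho\,(\log\rho - \log\sigma)\right],
\]
with \(D(\rho\|\sigma) = +\infty\) whenever the support of \(\rho\)
is not contained in the support of \(\sigma\).
```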
The distracted boyfriend meme, with the "girlfriend" being the relative entropy and the distraction being the computational relative entropy.


You think we already have enough entropies and divergence measures? I don't think so!

In our latest work (arxiv.org/abs/2509.20472), we introduce yet another information theoretic measure -- the computational relative entropy.

Let's have a quick rundown 👇

26.09.2025 13:00 👍 18 🔁 3 💬 1 📌 1
Simulating quantum chaos without chaos Quantum chaos is a quantum many-body phenomenon that is associated with a number of intricate properties, such as level repulsion in energy spectra or distinct scalings of out-of-time ordered correlat...

arxiv.org/abs/2410.18196

12.05.2025 20:57 👍 1 🔁 0 💬 1 📌 0

I think the best bit is that they have a guy named "Clifford Mapp" on their team!

20.04.2025 18:27 👍 7 🔁 0 💬 0 📌 1

Very nice! Michael Wolf also has some very good newer ones: mediatum.ub.tum.de/download/170...

27.03.2025 21:31 👍 7 🔁 1 💬 1 📌 0
Registrations Registrations close: 14 March To register please create an account on our ConfTool portal, accessible through the link below (or use the same account you created for submissions of ta…

Registrations are now open for #QCTiP2025: qctip2025.com/registration....
Don't wait too long – spots for participants without a talk are first-come, first-served!

17.02.2025 08:49 👍 5 🔁 2 💬 0 📌 2

Context please

18.01.2025 12:18 👍 1 🔁 0 💬 0 📌 0

I guess that particular myth was formulated so that you can debunk it. I have never heard anyone claim that error mitigation cannot be used, just that it is not a scalable solution to the problem.

14.01.2025 08:20 👍 1 🔁 0 💬 1 📌 0

We should all recall once in a while that we as a community are in a hamster wheel of our own making. This means we can also just choose to be nicer and more helpful to each other.

09.12.2024 08:53 👍 5 🔁 0 💬 0 📌 0

That said, one-line rejects should not happen. At the end of the day, we've all been on the other side and been justifiably annoyed by bad reviews.

09.12.2024 08:53 👍 2 🔁 0 💬 1 📌 0

Therefore, it's more of an answer to the question "How likely is this to cross the bar relative to the other submissions in your batch?". In that way, "weak reject" doesn't mean the submission is not good science, just that there is so much other good stuff out there.

09.12.2024 08:53 👍 5 🔁 0 💬 2 📌 0

Something that is important for context – and that I only realized after seeing the process from the inside – is that the scores are not meant to judge the quality of the submission. The task of the PC is to put together a program.

09.12.2024 08:53 👍 6 🔁 0 💬 2 📌 0

Submissions for QCTiP 2025 are now open.

05.12.2024 10:45 👍 24 🔁 16 💬 1 📌 2