MCMC using bouncy Hamiltonian dynamics: A unifying framework for Hamiltonian Monte Carlo and piecewise deterministic Markov process samplers

This paper introduces a unifying framework based on bouncy Hamiltonian dynamics that rigorously connects Hamiltonian Monte Carlo and piecewise deterministic Markov process samplers. The framework enables the construction of rejection-free, competitive samplers that bridge the gap between these two major Bayesian inference paradigms.

Andrew Chin, Akihiko Nishimura

Published 2026-03-10

Here is an explanation of the paper "MCMC using bouncy Hamiltonian dynamics," translated into simple, everyday language with creative analogies.

The Big Picture: Two Rival Ways to Explore a Maze

Imagine you are trying to find the best spots in a giant, foggy maze (this maze represents complex data in statistics). You want to visit the most interesting areas (the "high probability" zones) without getting stuck in a corner or wandering aimlessly.

For a long time, statisticians have had two main ways to do this:

  1. The "HMC" Method (Hamiltonian Monte Carlo): Think of this as a skier. The skier has momentum. They glide down the slopes, using the shape of the terrain to guide them. They move fast and cover a lot of ground. However, if they hit a wall or a bad patch of ice, they have to stop, check if they made a mistake, and sometimes turn back. It's efficient, but that "checking and turning back" slows them down.
  2. The "PDMP" Method (Piecewise Deterministic Markov Process): Think of this as a bouncy ball or a zig-zag runner. This runner moves in a straight line until they hit an invisible wall, at which point they instantly bounce off and change direction. They never stop to "check" if they made a mistake; they just keep moving. This is great for speed, but it can be tricky to tune so they don't get stuck bouncing in the same small circle.
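
To make the skier's "checking and turning back" concrete, here is a minimal, textbook-style sketch of standard HMC on an easy one-dimensional Gaussian target. Everything in it (the target, the function names, the tuning values `eps` and `n_leapfrog`) is an illustrative choice of ours, not anything taken from the paper:

```python
import math
import random

# Toy terrain: a standard Gaussian bowl, U(x) = x^2 / 2, so grad U(x) = x.
def grad_U(x):
    return x

def hmc_step(x, eps=0.2, n_leapfrog=10):
    """One HMC step: glide with momentum, then stop and check (accept/reject)."""
    p0 = random.gauss(0.0, 1.0)       # fresh momentum: the skier pushes off
    q, p = x, p0
    # Leapfrog integration: deterministically glide along the terrain
    p -= 0.5 * eps * grad_U(q)
    for i in range(n_leapfrog):
        q += eps * p
        if i < n_leapfrog - 1:
            p -= eps * grad_U(q)
    p -= 0.5 * eps * grad_U(q)
    # The "stop and check": accept or reject based on the energy error
    h_old = 0.5 * x * x + 0.5 * p0 * p0
    h_new = 0.5 * q * q + 0.5 * p * p
    if random.random() < math.exp(min(0.0, h_old - h_new)):
        return q, True                # move accepted
    return x, False                   # move rejected: turn back

random.seed(0)
x, n_accept, samples = 0.0, 0, []
for _ in range(5000):
    x, accepted = hmc_step(x)
    n_accept += accepted
    samples.append(x)
print("acceptance rate:", n_accept / 5000)
```

Even on this easy problem the acceptance rate sits below 1: some simulated paths get thrown away, which is exactly the "checking and turning back" cost that a rejection-free sampler avoids.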

The Problem: These two methods have been like rival sports teams. They use different rules, different math, and rarely talk to each other.

The Solution: This paper introduces a new hybrid athlete called the Hamiltonian Bouncy Particle Sampler (HBPS). It combines the best of both worlds: the smooth, powerful momentum of the skier and the instant, no-stopping bounces of the runner.


The Secret Ingredient: The "Inertia Battery"

How did they build this hybrid? They invented a new concept called Inertia.

Imagine the skier (HMC) is carrying a battery that powers their momentum.

  • In standard skiing, if the terrain changes, the skier might crash or have to stop to check their map.
  • In this new system, the skier has a battery that drains as they move.
  • The Rule: As long as the battery has charge, the skier glides smoothly.
  • The Bounce: The moment the battery hits zero, the skier doesn't stop. Instead, they hit a "bounce" mechanism. They instantly reflect off an invisible wall and keep going, but now with a fresh battery.

This "battery" (called inertia in the paper) is the magic glue. It allows the system to move deterministically (like a machine) but bounce exactly when needed to stay on the right path, without ever needing to stop and ask, "Did I make a mistake?"
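
To make the battery picture slightly more concrete, here is a deliberately cartoonish Python sketch. Everything in it is illustrative: the one-dimensional Gaussian terrain, the drain rule (the battery drains only while climbing, at the rate the potential energy rises), and the exponential recharge are stand-ins for the idea, not the paper's actual bouncy Hamiltonian dynamics:

```python
import random

def grad_U(x):
    # Toy terrain: a Gaussian bowl, U(x) = x^2 / 2, so grad U(x) = x
    return x

def bouncy_trajectory(x, v, total_steps=2000, dt=0.01):
    """Glide deterministically; bounce (never stop) when the battery empties.

    A cartoon of the inertia idea -- the drain rule and step sizes here
    are invented for illustration, not the paper's equations.
    """
    inertia = random.expovariate(1.0)         # charge the battery
    n_bounces = 0
    for _ in range(total_steps):
        x += dt * v                           # glide in the current direction
        uphill = max(0.0, v * grad_U(x))      # drain only while climbing
        inertia -= dt * uphill
        if inertia <= 0.0:                    # battery empty: bounce instantly
            v = -v                            # in 1-D, a bounce flips direction
            inertia = random.expovariate(1.0) # fresh battery, keep moving
            n_bounces += 1
    return x, v, n_bounces

random.seed(1)
x_end, v_end, bounces = bouncy_trajectory(0.0, 1.0)
print("bounces:", bounces)
```

Note that the loop never pauses to accept or reject anything: the trajectory is deterministic between bounces, and a bounce just redirects it. In more than one dimension, the flip `v = -v` becomes a reflection of the velocity off the gradient of the terrain.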

Why is this a Big Deal?

1. No More "Stop and Check"

In the old HMC method, the computer often has to simulate a path, check if it's valid, and if it's not, throw it away and try again. This is like driving a car, stopping at every intersection to check a map, and then reversing if you took a wrong turn. It wastes time.
The new HBPS method is like a self-driving car that never stops. If it hits a wall, it bounces. It never rejects a move; it just adjusts its path instantly. This makes it much faster.
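
The "never rejects, just adjusts" step has a well-known closed form in the PDMP literature: at a bounce, the velocity is reflected off the gradient of the terrain, flipping its uphill component while preserving its speed. A minimal sketch (variable names are ours):

```python
def reflect(v, g):
    """Bounce a velocity v off a gradient g (both plain lists of floats):
    v' = v - 2 * (v . g / |g|^2) * g.
    The uphill component flips, the speed is preserved, and no move
    is ever thrown away.
    """
    dot = sum(vi * gi for vi, gi in zip(v, g))
    norm2 = sum(gi * gi for gi in g)
    return [vi - 2.0 * dot / norm2 * gi for vi, gi in zip(v, g)]

print(reflect([1.0, 0.0], [1.0, 0.0]))   # head-on bounce: [-1.0, 0.0]
print(reflect([1.0, 1.0], [1.0, 0.0]))   # glancing bounce: [-1.0, 1.0]
```

Because the reflection is deterministic given the current position and velocity, there is nothing to check and nothing to reverse: every unit of computation moves the exploration forward.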

2. It's the "Universal Translator"

The paper proves that the "Skier" (HMC) and the "Bouncy Ball" (PDMP) are actually the same thing, just viewed from different angles.

  • If you let the "battery" run out very slowly, the HBPS looks like a Skier.
  • If you recharge the battery constantly and randomly, the HBPS turns into a Bouncy Ball.

This unification means that ideas invented for one method can now be used to improve the other.

3. Real-World Superpowers

The authors tested this new method on two massive, real-world problems:

  • Medical Study: Analyzing data from nearly 73,000 patients to see which blood thinner is safer. This involved millions of variables.
  • Virus Evolution: Tracking how HIV mutations spread across different populations.

In both cases, the new HBPS method was significantly faster and more accurate than the current best methods: it handled the "foggy maze" of high-dimensional data better than the competing samplers and reached reliable answers in less time.

The Takeaway

Think of this paper as inventing a new type of vehicle for exploring complex data landscapes.

  • Old vehicles (HMC) were fast but stopped too often to check maps.
  • Other vehicles (PDMP) kept moving but were hard to steer.
  • The new HBPS vehicle has a smart battery that lets it glide smoothly and bounce instantly when needed, never stopping, never wasting time.

This breakthrough doesn't just give statisticians a faster tool; it proves that the two biggest schools of thought in Bayesian statistics are actually part of the same family, opening the door for even more innovations in the future.