Pretrained battery transformer (PBT): A foundation model for universal battery life prediction

This paper introduces the Pretrained Battery Transformer (PBT), a foundation model that leverages battery-knowledge-encoded mixture-of-experts layers to overcome data scarcity and heterogeneity, achieving state-of-the-art universal battery life prediction across diverse chemistries and conditions.

Ruifeng Tan, Weixiang Hong, Jia Li, Jiaqiang Huang, Tong-Yi Zhang

Published 2026-03-12

The Big Problem: The "Black Box" of Battery Life

Imagine you buy a new electric car. You want to know: "How many years will this battery last before it dies?"

Right now, figuring this out is like trying to guess the lifespan of a human by watching them run a single lap. To get a real answer, scientists have to run batteries through thousands of charge-and-discharge cycles. This takes months or even years. It's expensive, slow, and creates a bottleneck for making better batteries.

Worse, every battery is different. Some are made in Japan, some in China. Some use different chemicals (like Lithium-ion vs. Sodium-ion). Some are charged fast, some slow. Some are used in hot deserts, others in cold snow.

Because of these differences, a computer program that learns from one type of battery often fails when you show it a different type. It's like teaching a student to drive a Toyota and then asking them to drive a tractor immediately after. They might know how to steer, but they won't know how to handle the tractor's unique engine.

The Solution: The "Battery Super-Student" (PBT)

The researchers created a new AI model called PBT (Pretrained Battery Transformer). Think of PBT not just as a calculator, but as a super-student who has read every battery manual ever written.

Here is how it works, broken down into three simple concepts:

1. The "Mixture of Experts" (The Team of Specialists)

Usually, AI models try to be a "jack of all trades," learning one big rule for everything. But batteries are too complex for one rule.

PBT uses a strategy called Mixture of Experts (MoE). Imagine a hospital emergency room.

  • If a patient comes in with a broken leg, the Orthopedic Doctor takes over.
  • If they have a heart issue, the Cardiologist steps in.
  • If it's a skin rash, the Dermatologist handles it.

PBT does the same thing. It has a "Gatekeeper" (a smart router) that looks at the battery.

  • Is it a Lithium battery? The Lithium expert wakes up.
  • Is it a Sodium battery? The Sodium expert wakes up.
  • Is it being used in extreme heat? The Heat expert wakes up.

By only using the right "expert" for the specific battery, the model learns much faster and more accurately, even when it doesn't have much data to work with.
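The routing idea above can be sketched in a few lines of plain Python. This is a toy illustration, not the paper's actual architecture: the expert functions, feature layout, and gate scores below are all made-up assumptions chosen so the routing behavior is easy to follow.

```python
import math

def softmax(scores):
    # Turn raw gate scores into weights that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy "experts": each maps a feature vector to a predicted-life number.
# In a real MoE these are small neural networks, not formulas.
experts = {
    "lithium": lambda x: 2.0 * x[0] - 0.5 * x[1],
    "sodium":  lambda x: 1.5 * x[0] - 0.2 * x[1],
    "heat":    lambda x: 1.0 * x[0] - 1.0 * x[1],
}

def gate_scores(x):
    # The "Gatekeeper". A real gate is a learned network; here the
    # scores are hand-coded from the input features for clarity.
    # x = [capacity_feature, temperature_feature, is_lithium_flag]
    return [x[2] * 3.0, (1 - x[2]) * 3.0, x[1] * 2.0]

def moe_predict(x):
    weights = softmax(gate_scores(x))
    outputs = [f(x) for f in experts.values()]
    # Weighted combination: the relevant expert dominates the answer.
    return sum(w * o for w, o in zip(weights, outputs))

print(moe_predict([1.0, 0.1, 1.0]))  # lithium flag set: lithium expert dominates
```

For the lithium-flagged input, the gate gives the lithium expert roughly 90% of the weight, so its output drives the prediction while the other experts contribute only slightly.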

2. The "Knowledge Injection" (The Textbook vs. The Trial-and-Error)

Most AI models learn by trial and error. They look at data and guess. But in battery science, we already know a lot of physics (e.g., "Heat makes batteries degrade faster").

PBT is special because it injects human knowledge directly into its brain.

  • The Soft Encoder: Imagine a librarian who reads a book description and summarizes the key points for you. PBT reads the battery's specs (like "18650 size," "Graphite anode") and turns them into a smart summary.
  • The Hard Encoder: Imagine a strict bouncer at a club. If the battery is a "Lithium" type, the bouncer physically blocks the "Sodium" expert from entering the room. This forces the AI to focus only on the rules that actually apply to that specific battery.

This combination allows PBT to learn from a small amount of data because it already "knows" the physics behind the numbers.
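The "strict bouncer" behavior of the hard encoder can be sketched as a mask applied to the gate's scores before the softmax: experts that cannot apply to this battery get a score of negative infinity, so they end up with exactly zero weight. The expert names, chemistry rules, and scores here are illustrative assumptions, not the paper's code.

```python
import math

NEG_INF = float("-inf")
EXPERT_NAMES = ["lithium", "sodium", "heat"]

# Hand-written domain knowledge: which experts are even allowed
# to handle which chemistry (a stand-in for the hard encoder's rules).
ALLOWED = {
    "lithium-ion": {"lithium", "heat"},
    "sodium-ion":  {"sodium", "heat"},
}

def masked_softmax(scores, chemistry):
    # Hard encoding: force disallowed experts' scores to -inf, so
    # after the softmax their weight is exactly 0.0 (not just small).
    masked = [
        s if name in ALLOWED[chemistry] else NEG_INF
        for s, name in zip(scores, EXPERT_NAMES)
    ]
    exps = [math.exp(s) if s != NEG_INF else 0.0 for s in masked]
    total = sum(exps)
    return [e / total for e in exps]

weights = masked_softmax([1.2, 0.8, 0.5], "lithium-ion")
print(weights)  # the sodium expert's weight is exactly 0.0
```

The soft encoder would correspond to the learned scores themselves (a summary of the battery's specs), while the hard mask guarantees that physically irrelevant experts never touch the prediction, no matter what the data says.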

3. The "Universal Translator" (Transfer Learning)

The researchers trained PBT on 13 different datasets containing thousands of batteries. This is like the student reading 13 different textbooks on driving.

Once trained, they tested PBT on batteries it had never seen before (like new Sodium-ion batteries or huge industrial batteries).

  • Old AI: "I've never seen a Sodium battery. I give up." (High error).
  • PBT: "I've seen Lithium, and I know the physics of how ions move. Even though this is Sodium, the rules are similar. I can predict this." (Low error).
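The pretrain-then-adapt recipe behind this can be shown with a deliberately tiny model: fit on plentiful "lithium" data, then fine-tune from those learned weights using only two "sodium" samples. The data, model, and learning rates are synthetic assumptions for illustration; PBT itself is a transformer, not a one-variable regression.

```python
def fit(data, w=0.0, b=0.0, lr=0.01, steps=2000):
    # One-variable linear regression trained by gradient descent.
    # Starting from pretrained (w, b) instead of zeros is the
    # transfer-learning step.
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in data:
            err = (w * x + b) - y
            gw += err * x
            gb += err
        w -= lr * gw / len(data)
        b -= lr * gb / len(data)
    return w, b

# Plentiful source-domain data: life = 2x + 1 (stand-in for "lithium").
lithium = [(x, 2 * x + 1) for x in range(10)]
# Scarce target-domain data: life = 2x + 3 (stand-in for "sodium").
sodium = [(0, 3), (1, 5)]

w0, b0 = fit(lithium)                               # pretrain from scratch
w1, b1 = fit(sodium, w=w0, b=b0, lr=0.5, steps=200) # fine-tune from w0, b0

print(round(w1, 2), round(b1, 2))  # close to the true sodium values w=2, b=3
```

Because the slope (the shared "physics") was already learned from the lithium data, two sodium samples are enough to correct the offset; training on those two points from scratch would leave the slope badly underdetermined.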

The Results: Why It Matters

The paper shows that PBT is a massive leap forward:

  • It's Faster: It can predict battery life using just the first few cycles of data, saving months of testing time.
  • It's Smarter: It outperformed the best existing methods by 21.8% on average. In some difficult cases, it was 86.9% better.
  • It's Universal: It works on Lithium, Sodium, and Zinc batteries, and even on huge industrial batteries used in factories.

The Bottom Line

Think of PBT as the first "Foundation Model" for batteries. Just as large language models learned to handle any language by reading huge swaths of the internet, PBT learned to "speak" battery life by reading every battery dataset it could find.

This means we can now design better batteries, manufacture them faster, and predict when they will fail with much higher accuracy. It turns the slow, expensive process of battery testing into a quick, smart prediction, helping us move faster toward a world powered by clean energy.