Supporting physics instructors to use a variety of evidence-based approaches to improve student learning: An example from quantum mechanics

This paper illustrates how support from an online community enabled a quantum mechanics instructor to persist with evidence-based active-engagement strategies. When his initial clicker-based method failed, he adapted his approach to include productive struggle via graded corrections, ultimately improving student learning outcomes.

Original authors: Paul Justice, Emily Marshman, Chandralekha Singh

Published 2026-03-03

This is an AI-generated explanation of the paper below. It is not written or endorsed by the authors. For technical accuracy, refer to the original paper.

Imagine that teaching a complex subject like quantum mechanics is like trying to teach a group of people how to navigate a dense, foggy forest using a new, high-tech map. The map is supposed to be "evidence-based," meaning it was proven to work by the people who made it. But when you try to use it with your own group of hikers, they get lost anyway.

This paper tells the story of two physics professors (let's call them Professor A and Professor B) who faced this exact problem. It's a story about how support, patience, and having a "toolbox" of different teaching strategies can save the day when one method fails.

Here is the story broken down into simple parts:

1. The Problem: The "Magic Map" Didn't Work

The professors were teaching a tricky concept called the "addition of angular momentum." Think of this like trying to figure out the total spin of two spinning tops combined. It's abstract and confusing.

To help students, they used a Clicker Question Sequence (CQS).

  • How it works: The teacher asks a multiple-choice question. Students vote with a handheld device (a "clicker"). Then, they turn to their neighbor and argue about the answer. Finally, they vote again.
  • The Goal: This "peer discussion" is supposed to clear up confusion, like a group of hikers helping each other find the path.

Professor A's Experience:
Professor A tried this map. It worked well for some parts of the forest (students learned how to list the possible directions), but it failed for the tricky parts (calculating the actual forces). The students were still confused about why the map worked the way it did.

Professor B's Experience:
Professor B tried the same map the next year. But this time, it failed almost completely. The students were even more lost. Why? Because the students didn't have enough basic knowledge to start with. The "peer discussion" didn't work because half the group didn't know the rules of the game.

2. The Crisis: The "I Give Up" Moment

Usually, when a teacher tries a new method and it fails, they feel discouraged. They might think, "This evidence-based method is a scam. I'll just go back to lecturing from a chalkboard."

This is where the paper's main message comes in: You need a support system.

The authors argue that teaching is a process, not a switch. If a tool doesn't work immediately, you don't throw it away; you tweak it, or you grab a different tool from your toolbox. But you need other teachers to tell you, "Hey, don't quit! Try this other thing."

3. The Solution: The "Do-Over" Incentive

Professor B didn't give up. With encouragement from other physics educators, he tried a different strategy called "Incentives for Learning from Mistakes" (ILM).

Think of this like a video game "Continue" button.

  • The Old Way: You take a test, get it wrong, and the teacher gives you the answer key. You look at it, nod, and move on. You never really learned why you were wrong.
  • The New Way (ILM): Professor B gave the students their graded tests back. He marked the wrong answers but didn't give the solutions yet. He said: "If you figure out your own mistakes and fix them, I will give you back half the points you lost."

This forced the students to struggle productively. They had to open their notes, re-read the textbook, and think hard about why they got it wrong before they could see the right answer. It was like making them re-run the level in the video game until they beat it, rather than just watching someone else play.
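The "half the points back" incentive is simple arithmetic. Here is a minimal sketch of how such a rule could be computed; the function name and the exact grading scheme are assumptions for illustration, not details taken from the paper:

```python
# Hypothetical sketch of the ILM "half the points back" rule described above.
# The paper's actual grading scheme may differ; all names here are invented.

def ilm_adjusted_score(original: float, max_points: float, corrected: bool) -> float:
    """Return the post-incentive score: a student who successfully
    corrects their mistakes earns back half of the points they lost."""
    if not corrected:
        return original
    return original + 0.5 * (max_points - original)

# Example: a student scored 60/100 on the test.
print(ilm_adjusted_score(60, 100, corrected=True))   # corrected their errors -> 80.0
print(ilm_adjusted_score(60, 100, corrected=False))  # did not correct -> 60
```

Note that the incentive is capped by design: a student can never exceed the original maximum, and the biggest payoff goes to the students who lost the most points, which is exactly the group the intervention targets.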

4. The Result: The "Struggle" Paid Off

The results were amazing.

  • The students who took the time to fix their mistakes didn't just get their points back; they actually learned the material better.
  • On the final exam (the "boss battle" of the course), the students who had fixed their mistakes scored significantly higher than those who didn't bother to fix them.
  • Even the students who didn't fix their mistakes did better than before, just by seeing the correct answers later, but the ones who did the "hard work" of fixing them were the true winners.

The Big Takeaway

The paper concludes with a simple, powerful lesson for all teachers (and anyone learning anything new):

  1. Don't Panic: If a new teaching method fails, it doesn't mean you are a bad teacher or the method is bad. It just means it needs adjustment.
  2. Have a Toolbox: Don't rely on just one trick. If "Clicker Questions" don't work, try "Mistake Correction." If that doesn't work, try something else.
  3. Find Your Tribe: Teachers need a community (online or in-person) to talk to. When you are stuck, a friend can say, "I had that happen too! Try this instead." This support keeps teachers from quitting and helps students learn better.

In short: Teaching is a journey of trial and error. When you hit a wall, don't turn around and go home. Ask a friend for a different ladder, climb over the wall, and keep going. That's how students learn physics.
