A Fault Detection Scheme Utilizing Convolutional Neural Network for PV Solar Panels with High Accuracy

This paper proposes a straightforward and effective Convolutional Neural Network (CNN) scheme for detecting faults in PV solar panels. The model achieves 91.1% accuracy for binary classification and 88.6% for multi-class classification, outperforming previous models on the same datasets.

Maryam Paparimoghadamborazjani, Amin Kazemi

Published 2026-03-02

Imagine you have a massive field of solar panels, like a giant garden of glass flowers that drink sunlight to power your home. But just like real flowers, these glass ones can get sick. They might get cracked, covered in dust, or shaded by a tree branch. When they get sick, they stop producing energy, and if you don't catch the problem early, the whole garden suffers.

This paper is about building a super-smart digital detective to find these sick panels automatically.

The Problem: The "Needle in a Haystack"

Traditionally, checking thousands of solar panels is like trying to find a single broken tile in a giant roof by walking around and looking at every single one. It's slow, expensive, and humans get tired. Sometimes, the damage is so small (like a tiny crack) that the human eye misses it until it's too late.

The Solution: The "Digital Eye" (CNN)

The authors created a computer program called a Convolutional Neural Network (CNN). Think of this not as a boring calculator, but as a super-observant robot eye that has been trained to look at pictures of solar panels.

Here is how this "robot eye" works, using some simple analogies:

  1. The Training Phase (Learning to See):
    Imagine you are teaching a child to identify fruits. You show them a thousand pictures of apples, oranges, and rotten fruit. Eventually, the child learns that "apples are round and red," while "rotten fruit has brown spots."
    The authors did the same thing with their computer. They fed it thousands of photos of solar panels:

    • Healthy panels (Normal)
    • Cracked panels (Like a broken eggshell)
    • Dusty panels (Like a dirty window)
    • Shadowed panels (Like a plant in the shade)
  2. The Layers (The Detective's Toolkit):
    The computer doesn't just look at the whole picture at once. It breaks the image down into layers, like peeling an onion:

    • Convolution Layers: These are like magnifying glasses that scan the image for specific patterns. One magnifying glass looks for lines (cracks), another looks for dark patches (dust or shadows).
    • Pooling Layers: This is like summarizing a story. Instead of remembering every single pixel, it says, "Okay, there was a crack in this area," and moves on. This makes the computer faster and less confused.
    • Fully Connected Layers: This is the brain that puts all the clues together. It takes the "crack" clue and the "dust" clue and decides, "Ah, this is a cracked panel!"
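The convolution and pooling steps above can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' actual network: the 6x6 "image" and the vertical-line filter are made up to show how a convolution layer lights up on a crack-like pattern and how pooling summarizes the result.

```python
import numpy as np

# Toy 6x6 grayscale "panel image" with a bright vertical crack in column 2.
# (Values are invented for illustration; real inputs would be panel photos.)
image = np.zeros((6, 6))
image[:, 2] = 1.0  # the "crack"

# A 3x3 vertical-edge filter: it responds strongly to vertical lines.
kernel = np.array([[-1, 2, -1],
                   [-1, 2, -1],
                   [-1, 2, -1]], dtype=float)

def convolve2d(img, k):
    """Valid-mode 2D convolution (cross-correlation, as CNNs compute it)."""
    h = img.shape[0] - k.shape[0] + 1
    w = img.shape[1] - k.shape[1] + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def max_pool2x2(fm):
    """2x2 max pooling: keep only the strongest response in each block."""
    h, w = fm.shape[0] // 2, fm.shape[1] // 2
    return fm[:2 * h, :2 * w].reshape(h, 2, w, 2).max(axis=(1, 3))

feature_map = convolve2d(image, kernel)  # peaks along the crack column
pooled = max_pool2x2(feature_map)        # compact summary: "crack on the left"
```

The filter produces its biggest value exactly where it sits on top of the crack, and pooling shrinks the 4x4 response map to 2x2 while keeping that "crack detected" signal. A real CNN learns many such filters from data instead of having them hand-written.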

The Results: A Winning Detective

The authors tested their new detective against other existing methods (like comparing a new sports car to an old sedan).

  • The Binary Test (Sick vs. Healthy): When asked to simply say "Is this panel broken or not?", their new model got it right 91.1% of the time. The old models were only right about 75% of the time. That's a huge difference!
  • The Multi-Class Test (What's wrong with it?): When asked to be more specific ("Is it cracked, dusty, or shaded?"), the new model got it right 88.6% of the time, while the old models struggled at around 70%.
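The percentages above are plain classification accuracy: the fraction of panels the model labels correctly. A minimal sketch of that computation (the label lists below are invented for illustration, not the paper's data):

```python
def accuracy(y_true, y_pred):
    """Fraction of panels the model classified correctly."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Binary case: 0 = healthy, 1 = faulty (toy predictions)
true_binary = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
pred_binary = [1, 0, 1, 0, 0, 1, 0, 1, 1, 0]
print(accuracy(true_binary, pred_binary))  # 9 of 10 correct -> 0.9
```

The multi-class score is computed the same way, just with four possible labels (normal, cracked, dusty, shadowed) instead of two, which is why it is the harder test.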

Why Not Use Pre-Made Brains? (Transfer Learning)

The researchers also tried using "pre-trained" brains (like AlexNet or DarkNet). These are like hiring a chef who is an expert at cooking Italian food and asking them to cook sushi.

  • The Result: It didn't work well. The pre-trained chefs were used to different ingredients: these networks had learned from general-purpose photo collections, not from images of solar panels.
  • The Lesson: Sometimes, it's better to train a fresh, simple brain specifically for the job (like training a chef specifically for sushi) rather than trying to force an expert from a different field to do it.

The Bottom Line

This paper shows that we can build a simple, effective, and highly accurate system to scan solar farms. Instead of sending humans to climb on roofs or walk through fields, we can use this AI to look at photos and instantly spot the broken, dusty, or shaded panels.

Why does this matter?

  • Saves Money: You fix the problem before it gets worse.
  • Saves Energy: The solar farm keeps running at full power.
  • Safety: No humans need to risk injury inspecting dangerous heights.

In short, the authors built a smart, automated inspector that keeps our solar energy gardens healthy, ensuring we get the most power possible from the sun.