Large-scale automated detection of gray whales off California in panchromatic and multispectral satellite imagery.

This study demonstrates the feasibility of using artificial intelligence to automatically detect gray whales and other large cetaceans in sub-meter satellite imagery off the California coast, achieving high detection accuracy and enabling large-scale monitoring to support conservation policies.

Original authors: Houegnigan, L., Cuesta Lazaro, E.

Published 2026-04-19

This is an AI-generated explanation of a preprint that has not been peer-reviewed. It is not medical advice. Do not make health decisions based on this content.

Imagine you are trying to find a specific type of needle in a massive, ever-changing haystack that covers an entire ocean. That needle is a gray whale, and the haystack is the Pacific Ocean off the coast of California.

For decades, scientists have tried to count and track these whales to protect them from ships, fishing nets, and noise. Traditionally, they've used boats or planes to look for them. But that's like trying to find needles in a haystack by walking through it with a flashlight: it's slow, expensive, and you can only check a tiny patch at a time.

This paper describes a new, high-tech solution: using satellites and artificial intelligence (AI) to scan the entire ocean at once.

Here is a breakdown of how they did it, using simple analogies:

1. The Problem: The "Needle in a Haystack"

Gray whales migrate thousands of miles along the California coast. They are in danger of hitting ships or getting tangled in fishing gear. We need to know exactly where they are to keep them safe. But the ocean is huge, and whales are small, dark, and often underwater. Finding them is incredibly hard.

2. The Tool: The "Super-Eye" Satellite

The researchers used high-resolution satellite images (from companies like Maxar). Think of these satellites as having "super-eyes" that can see details as small as a single tile on a roof from space.

  • Multispectral vs. Panchromatic: The satellites take pictures in two ways.
    • Panchromatic: Like a black-and-white photo. It's very sharp but lacks color information.
    • Multispectral: Like a photo taken with a special camera that sees colors humans can't (like infrared). These extra color bands act like a kind of "X-ray vision," making it much easier to tell apart water, foam, and objects at or just below the surface.
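
To make the panchromatic/multispectral distinction concrete, here is a minimal numpy sketch on synthetic data. A panchromatic image is one 2-D brightness array; a multispectral image stacks several bands, and band combinations like NDWI (a standard water index, not necessarily what this paper used) can separate open water from surface objects in a way a single band cannot. All pixel values here are invented for illustration.

```python
import numpy as np

# A panchromatic image is a single 2-D brightness array (sharp, but one "color").
pan = np.full((4, 4), 0.2)
pan[1:3, 1:3] = 0.5                # a bright patch: object? foam? unclear

# A multispectral image stacks several bands (here just green and near-infrared).
# Open water reflects green but strongly absorbs NIR, so the two bands together
# carry information a single panchromatic band does not.
green = np.full((4, 4), 0.30)
nir = np.full((4, 4), 0.05)
nir[1:3, 1:3] = 0.40               # a bright surface object (whale, foam, boat)

# NDWI (Normalized Difference Water Index), a standard band combination:
# values near +1 indicate open water, low or negative values indicate objects.
ndwi = (green - nir) / (green + nir)

water_mask = ndwi > 0.3
print(water_mask.sum())            # open-water pixels; the object pixels fail
```

The point is not this particular index: it is that each extra band gives the detector another axis along which whales, foam, ships, and water can differ.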

3. The Training: Teaching the AI to Spot Whales

You can't just tell a computer "look for whales." You have to teach it.

  • The "Classroom": The researchers gathered a small "classroom" of 221 confirmed gray whales found in satellite images (mostly from Mexican lagoons where whales gather in huge groups).
  • The "Test": They showed these images to 20 different AI "students" (neural networks). These students had to learn to distinguish a whale from:
    • A ship (which looks like a long line).
    • A plane or helicopter (which looks like a dot).
    • Foam or rocks (which look like white blobs).
    • Just empty water.
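
The paper trains real neural networks for this, but the underlying idea of "learn each class from examples, then assign new detections to the closest class" can be sketched with a much simpler stand-in: a nearest-centroid classifier over two hand-picked features. The features (elongation, mean brightness) and every number below are invented for illustration and are not from the paper.

```python
import numpy as np

# Toy feature vectors: (elongation, mean brightness), invented for illustration.
# Ships read as long and bright, foam as compact and very bright, whales as
# moderately elongated and dark, open water as featureless and dark.
training = {
    "whale": np.array([[2.5, 0.20], [3.0, 0.25], [2.8, 0.22]]),
    "ship":  np.array([[8.0, 0.80], [9.5, 0.85], [7.5, 0.75]]),
    "foam":  np.array([[1.2, 0.95], [1.0, 0.90], [1.5, 0.92]]),
    "water": np.array([[1.0, 0.10], [1.0, 0.12], [1.0, 0.08]]),
}

# "Training": average each class's labeled examples into one centroid.
centroids = {label: feats.mean(axis=0) for label, feats in training.items()}

def classify(features):
    """Assign the label of the closest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

print(classify(np.array([2.7, 0.21])))   # an elongated, dark object
print(classify(np.array([9.0, 0.80])))   # a long, bright object
```

A deep network replaces the hand-picked features with ones it learns itself, which is why the researchers could train 20 different networks on the same 221 whale examples and compare them.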

4. The Breakthrough: The "Detective" Algorithm

The researchers didn't just let the AI guess. They used a two-step detective process:

  1. The Signal Processor: First, they used a mathematical trick (subspace methods) to quickly scan the image and say, "Hey, there's something weird here that isn't water." This narrows down the search area.
  2. The Deep Learning Classifier: Then, they fed those suspicious spots into a powerful AI (specifically, one called EfficientNetB0) that had been trained on millions of other images. This AI acts like a seasoned detective who can look at a blurry shape and say, "That's definitely a whale, not a rock."
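
The two-step pipeline can be sketched in numpy. This is only an illustration of the general subspace idea, not the paper's actual algorithm: model "plain water" with its top principal components, flag image chips whose reconstruction residual is large, and hand only those to the deep classifier (EfficientNetB0 in the paper; a stub here). The number of components and the threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: subspace anomaly screen. Flattened chips of plain water vary along
# only a few directions (glint, ripples); capture them with an SVD and keep
# the top components as the "water background" subspace.
water_chips = rng.normal(0.2, 0.02, size=(200, 64))    # background samples
mean = water_chips.mean(axis=0)
_, _, vt = np.linalg.svd(water_chips - mean, full_matrices=False)
background = vt[:5]                                    # 5-D water subspace

def residual(chip):
    """Distance from the water subspace: high = 'something that isn't water'."""
    centered = chip - mean
    projected = background.T @ (background @ centered)
    return np.linalg.norm(centered - projected)

plain_water = rng.normal(0.2, 0.02, size=64)
anomaly = plain_water.copy()
anomaly[20:30] += 0.5                                  # a bright object

# Stage 2 (stub): only chips failing the screen go to the deep classifier.
threshold = 0.5                                        # illustrative value
suspicious = [c for c in (plain_water, anomaly) if residual(c) > threshold]
print(len(suspicious))
```

The cheap screen is what makes the pipeline scale: the expensive network only ever sees the tiny fraction of the ocean that looks "weird."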

5. The Results: Finding Thousands of Whales

The system worked incredibly well.

  • Accuracy: When using the full-color "X-ray vision" (multispectral) images, the AI was 99.9% accurate. It almost never made a mistake.
  • The Black-and-White Test: Even when they only used the black-and-white (panchromatic) images, the AI was still 87% accurate. This is important because black-and-white images are available more often and cover more area.
  • The Big Discovery: They ran this system over 624,000 square kilometers of the California coast. The result? They found 3,353 gray whales in a single sweep. They also spotted other species they weren't even specifically looking for, including humpback, blue, and fin whales!
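
A quick back-of-envelope calculation shows why no human team could do this sweep by eye. Assuming 0.5 m pixels purely for illustration (the text only says "sub-meter", so the exact resolution is an assumption):

```python
# Rough scale of the survey, assuming 0.5 m pixels (illustrative; "sub-meter"
# only guarantees at most 1 m per pixel).
area_km2 = 624_000
pixels_per_km2 = (1000 / 0.5) ** 2      # 2000 x 2000 pixels per square km
total_pixels = area_km2 * pixels_per_km2

print(f"{total_pixels:.1e}")            # on the order of trillions of pixels
```

Trillions of pixels per sweep is far beyond manual review, which is exactly the gap the automated pipeline fills.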

6. Why This Matters: The "Traffic Cop" for the Ocean

Think of this technology as a real-time traffic cop for the ocean.

  • Before: We had to guess where whales might be based on old data or wait for a boat to report a sighting.
  • Now: We can see exactly where the whales are right now.

This allows governments and shipping companies to:

  • Slow down ships in specific areas where whales are currently swimming (preventing collisions).
  • Move fishing gear away from whale hotspots (preventing entanglement).
  • Map new migration routes that whales might be taking due to climate change.

The Bottom Line

This paper shows that we can use satellites and AI to count and track whales on a massive scale, something that was previously impossible. It turns the ocean from a "black box" where we guess what's happening into a clear window where we can see the whales, understand their movements, and protect them much more effectively. It's a giant leap forward for saving these magnificent creatures.
