Deep Learning-Based Meat Freshness Detection with Segmentation and OOD-Aware Classification

This study presents a deep learning framework for meat freshness detection that combines U-Net-based segmentation with OOD-aware classification, demonstrating that EfficientNet-B0 achieves the highest accuracy (98.10%) on RGB images while supporting practical on-device deployment via TensorFlow Lite.

Hutama Arif Bramantyo, Mukarram Ali Faridi, Rui Chen, Clarissa Harris, Yin Sun

Published 2026-03-03
📖 4 min read · ☕ Coffee break read

Imagine you are at a grocery store, holding a package of chicken. You want to know: "Is this fresh, or has it gone bad?"

Usually, you have to rely on your nose (smelling it) or your eyes (looking for weird colors), which can be tricky, especially if the meat is wrapped in plastic that reflects light. Sometimes, even experts get it wrong.

This paper describes a new smartphone app idea that acts like a "super-senses" assistant for checking meat freshness. Here is how it works, broken down into simple steps:

1. The Problem: The "Plastic Glare" and the "Unknown"

Most old computer programs for checking meat were trained only on raw meat sitting on a table. They got confused when they saw meat inside a plastic tray with shiny reflections, or if the lighting was weird. Also, if you accidentally pointed the camera at an empty tray or a plate, the old programs would just guess wildly, saying "This is fresh meat!" even though there was no meat there.

2. The Solution: A Two-Step Detective Team

The researchers built a system that works like a two-person detective team:

  • Detective #1: The "Cut-Out Artist" (Segmentation)
    Before the computer tries to guess if the meat is good, it first uses a tool called U-Net to act like a digital pair of scissors. It looks at the photo, finds the meat, and "cuts it out" from the background. It ignores the plastic tray, the price tag, the glare, and the table. It isolates just the meat so the next detective only has to look at the important part.

    • Analogy: Imagine trying to find a specific person in a crowded, noisy room. The "Cut-Out Artist" puts a spotlight only on that person and turns off the lights everywhere else, so you aren't distracted.
  • Detective #2: The "Expert Judge" (Classification)
    Once the meat is isolated, the researchers compared five different "brain" models (like ResNet, ViT, and EfficientNet) to decide whether the meat is Fresh or Spoiled. They tested these brains on both unpackaged meat and meat in plastic packages.

    • The Winner: One brain, called EfficientNet-B0, turned out to be the smartest and fastest. It got the answer right about 98% of the time (98.10%, to be exact).
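To make the two-step detective team concrete, here is a minimal sketch in plain Python/NumPy. It assumes the U-Net has already produced a binary mask (1 = meat, 0 = background); the `isolate_meat` and `classify` names are illustrative, not from the paper's code.

```python
import numpy as np

def isolate_meat(image, mask):
    """Detective #1 (the 'Cut-Out Artist'): zero out everything
    outside the segmentation mask, so only the meat pixels remain.

    image: (H, W, 3) RGB array; mask: (H, W) binary array from a
    segmentation model such as U-Net (given here, not computed).
    """
    return image * mask[..., None]

def classify(masked_image, model):
    """Detective #2 (the 'Expert Judge'): a stand-in for a
    classifier like EfficientNet-B0 returning [p_fresh, p_spoiled]."""
    probs = model(masked_image)
    return ["fresh", "spoiled"][int(np.argmax(probs))]

# Toy example: a 4x4 "photo" where the meat fills the top-left 2x2
# corner; everything else (tray, glare, table) gets blacked out.
image = np.ones((4, 4, 3))
mask = np.zeros((4, 4))
mask[:2, :2] = 1
masked = isolate_meat(image, mask)
```

The key design point is the hand-off: the classifier never sees the tray or the glare, only the pixels the segmentation step kept.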

3. The Safety Net: The "I Don't Know" Button (OOD Awareness)

This is the most important part. In the real world, sometimes a picture is too blurry, too dark, or shows something that isn't meat at all (like a shoe or an empty tray).

Old systems would force a guess, saying "It's fresh!" even when they were only 10% sure. This new system has a "No Result" button.

  • If the computer is unsure, or if the picture looks weird (Out-of-Distribution), it simply says, "I can't tell. Please check this manually."
  • Analogy: Think of a human doctor. If a patient has symptoms that don't match any known disease, a good doctor won't guess; they will say, "We need more tests." This system does the same thing to prevent dangerous mistakes.
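The "I Don't Know" button can be sketched as a confidence threshold on the classifier's output. The paper's exact OOD scoring rule isn't spelled out in this summary, so the sketch below uses the simplest common choice, thresholding the maximum softmax probability; the threshold value 0.8 is illustrative.

```python
import numpy as np

def softmax(logits):
    """Turn raw model scores into probabilities that sum to 1."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def decide(logits, threshold=0.8):
    """Return a label only when the model is confident enough;
    otherwise press the 'No Result' button instead of guessing."""
    probs = softmax(logits)
    if probs.max() < threshold:
        return "no result - please check manually"
    return ["fresh", "spoiled"][int(np.argmax(probs))]
```

With a confident score like `[4.0, 0.0]` this returns "fresh", while a near-tie like `[0.1, 0.0]` triggers the manual-check path, mirroring the doctor who orders more tests instead of guessing.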

4. The Real-World Test: Running on Your Phone

The researchers didn't just test this on a giant supercomputer. They shrank the system down (using TensorFlow Lite) to run on a regular Samsung smartphone.

  • They found that the EfficientNet and MobileNet models were like "sports cars": they were incredibly fast (taking less than 20 milliseconds to decide) and still very accurate.
  • The bigger, heavier models (like ViT) were like "trucks": noticeably slower to make a decision, which isn't great for a quick grocery store check.
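The "sports car vs. truck" comparison boils down to measuring per-image inference time. Here is a generic timing sketch; `infer_fn` stands in for a TFLite interpreter's invoke call (the function name and run counts are illustrative, and the sub-20 ms figures in the paper came from an actual Samsung phone, not this loop).

```python
import time
import numpy as np

def measure_latency_ms(infer_fn, input_array, runs=50, warmup=5):
    """Average single-image inference time in milliseconds.

    A few warmup calls are done first, since the first inferences
    are often slower (caching, lazy initialization).
    """
    for _ in range(warmup):
        infer_fn(input_array)
    start = time.perf_counter()
    for _ in range(runs):
        infer_fn(input_array)
    return (time.perf_counter() - start) / runs * 1000.0

# Example with a dummy "model" (just a pixel sum):
dummy_ms = measure_latency_ms(lambda x: x.sum(), np.ones((224, 224, 3)))
```

Running this for each candidate model on the target phone is how you'd separate the sub-20 ms "sports cars" (EfficientNet, MobileNet) from the slower "trucks" (ViT).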

Summary: Why This Matters

This paper presents a practical tool that:

  1. Ignores distractions (plastic glare, trays) by cutting out just the meat.
  2. Knows when it's unsure and refuses to guess, keeping you safe from bad meat.
  3. Runs fast on a regular phone, meaning you could potentially use this in a supermarket or your own kitchen tomorrow.

It's a step toward making food safety automatic, objective, and accessible to everyone, not just food scientists.