Conversational AI-Enhanced Exploration System to Query Large-Scale Digitised Collections of Natural History Museums

This paper presents a human-centred system design that leverages conversational AI and function-calling capabilities to enable natural language querying and visual-spatial exploration of nearly 1.7 million digitised natural history specimen records at the Australian Museum, overcoming the limitations of traditional keyword-based search tools.

Yiyuan Wang, Andrew Johnston, Zoë Sadokierski, Rhiannon Stephens, Shane T. Ahyong

Published Thu, 12 Ma

Imagine a massive library, but instead of books, it's filled with 21 million real-life treasures: stuffed birds, ancient shells, preserved insects, and fossils. This is the Australian Museum. For years, most of these treasures have been locked away in the "back room," visible only to scientists with special keys (databases) and the ability to speak "computer code."

The average person couldn't easily peek inside. They had to rely on a few items on display or ask a librarian (a museum staff member) to hunt for specific facts, which took a long time.

This paper describes a new digital magic wand created by researchers to solve this problem. They built a system called the Australian Museum Collection Explorer that lets anyone chat with the museum's entire collection using plain English.

Here is how it works, broken down into simple concepts:

1. The Problem: The "Black Box" of Data

Think of the museum's digitized records as a giant, messy warehouse.

  • The Old Way: To find something, you had to know the exact aisle number, the box label, and the filing system. If you didn't know the technical terms, you were stuck.
  • The Result: Most people never saw the vast majority of the museum's treasures.

2. The Solution: A Conversational Guide

The team built a system with two main parts, like a tour guide who is also a super-powered map:

  • The Interactive Map (The Visual Eye): Imagine a giant digital globe. Instead of just pins, it shows nearly 1.7 million dots, each representing a specimen. You can zoom in on your own neighborhood and see, "Oh, there's a bird specimen collected right here in 1920!" It turns dry data into a visual story of where life exists.
  • The Chatbot (The Talking Brain): This is the star of the show. Instead of typing complex search codes, you can just ask questions like:
    • "Show me all the kangaroos found in New South Wales in the 1980s."
    • "How many beetles do you have from Christmas Island?"
    • "Upload a photo of this bird I saw in my garden and tell me what it is."
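To answer questions like these reliably, the chatbot first has to turn free text into structured search parameters before anything touches the database. Here is a minimal sketch of that idea; the field names and record shape are hypothetical, not the museum's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical structured query the chatbot would extract from plain English.
@dataclass
class SpecimenQuery:
    taxon: Optional[str] = None    # e.g. "kangaroo"
    state: Optional[str] = None    # e.g. "New South Wales"
    decade: Optional[int] = None   # e.g. 1980 for "the 1980s"

def matches(record: dict, q: SpecimenQuery) -> bool:
    """Check one collection record against the structured query."""
    if q.taxon and q.taxon.lower() not in record["common_name"].lower():
        return False
    if q.state and record["state"] != q.state:
        return False
    if q.decade and not (q.decade <= record["year"] < q.decade + 10):
        return False
    return True

# Toy records standing in for the real collection database.
records = [
    {"common_name": "Eastern Grey Kangaroo", "state": "New South Wales", "year": 1984},
    {"common_name": "Red Kangaroo", "state": "South Australia", "year": 1982},
]

# "Show me all the kangaroos found in New South Wales in the 1980s."
query = SpecimenQuery(taxon="kangaroo", state="New South Wales", decade=1980)
results = [r for r in records if matches(r, query)]
```

The point is that every filter the user states in conversation becomes an explicit, checkable field, so the answer comes from the data rather than from the model's wording.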

3. The Secret Sauce: How the AI "Thinks"

You might worry, "Won't the AI just make things up?" (This is a common AI failure known as "hallucination.")

The researchers solved this with a clever trick called Function Calling.

  • The Analogy: Imagine the AI is a very smart but slightly forgetful tour guide. If you ask a question, instead of guessing the answer from its memory (which might be wrong), it has a magic phone connected directly to the museum's official database.
  • How it works: When you ask, "How many beetles?", the AI doesn't guess. It picks up the phone, asks the database, "Hey, how many beetles are in the system?", gets the exact number, and then tells you the truth. It never guesses; it always checks the source.
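In code, the "magic phone" boils down to a simple round trip: the model emits a structured request naming a function and its arguments, the application executes that function against the real database, and the exact result goes back to the model to be phrased as an answer. A hypothetical sketch, with illustrative function names and dispatch format rather than the paper's actual implementation:

```python
# Toy specimen table standing in for the museum's database.
DATABASE = [
    {"group": "beetle", "locality": "Christmas Island"},
    {"group": "beetle", "locality": "Christmas Island"},
    {"group": "bird", "locality": "Sydney"},
]

def count_specimens(group: str, locality: str) -> int:
    """Query the authoritative data directly, never the model's memory."""
    return sum(1 for r in DATABASE
               if r["group"] == group and r["locality"] == locality)

# Registry of functions the model is allowed to call.
TOOLS = {"count_specimens": count_specimens}

def handle_tool_call(call: dict) -> int:
    """Dispatch a model-emitted call like {'name': ..., 'arguments': {...}}."""
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Imagine the model turned "How many beetles from Christmas Island?" into:
call = {"name": "count_specimens",
        "arguments": {"group": "beetle", "locality": "Christmas Island"}}
answer = handle_tool_call(call)  # an exact count from the data, not a guess
```

Because the model can only pick from a registry of known functions and the numbers come back from the database, the system stays grounded even when the model itself would otherwise guess.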

4. The Journey: How They Built It

The team didn't just build it in a lab and hope for the best. They used a Human-Centred Design approach, which is like cooking a meal by tasting it as you go.

  • Step 1: They talked to museum scientists and public engagement staff. The scientists said, "We need to show the scale of our collections," and the public team said, "People need to see pictures, not just text."
  • Step 2: They built a rough prototype (a "beta" version) and tested it with volunteers.
  • Step 3: The volunteers gave feedback. They said, "The map is great, but I want to see photos of the animals," and "The AI sounded too robotic; make it sound like a scientist."
  • Step 4: They refined the system based on this feedback, creating the final version you see today.

5. Why This Matters: From "Access" to "Agency"

The paper argues that we are moving from Ubiquitous Access (just being able to get to the museum online) to Ubiquitous Agency (having the power to control your own exploration).

  • Before: The museum told you what to look at.
  • Now: You can follow your own curiosity. If you are curious about a specific bug in your backyard, you can ask the system about it, see where similar bugs were found nearby, and learn about them instantly.

The Big Picture

This project is like turning a static, dusty archive into a living, breathing conversation. It proves that we don't need to be computer experts to explore the wonders of nature. By combining a visual map with a truthful AI chatbot, the Australian Museum is opening its back room to the world, allowing anyone to become a temporary scientist, explorer, or storyteller.

It's not just about showing data; it's about weaving the museum's history into our daily lives, right from our phones, making the natural world feel a little closer and a lot more understandable.