SuperSkillsStack: Agency, Domain Knowledge, Imagination, and Taste in Human-AI Design Education

This study analyzes how 80 student design teams integrated generative AI into their creative process, revealing that while AI serves as a cognitive accelerator for early-stage tasks like brainstorming, human competencies in agency, domain knowledge, imagination, and taste remain essential for interpreting context, validating outputs, and refining design solutions.

Qian Huang, King Wang Poon

Published Tue, 10 Ma

Imagine you are a chef trying to create a brand-new, award-winning dish. Suddenly, a super-smart robot assistant walks into your kitchen. This robot can chop vegetables in a millisecond, suggest 500 different spice combinations, and write a perfect recipe description in seconds.

But here's the catch: The robot has never actually tasted the food, smelled the ingredients, or seen the customers who will eat it.

This is exactly what the paper "SuperSkillsStack" is about. It looks at how design students (our "chefs") are learning to work with Generative AI (the "robot assistant") in their creative projects. Instead of letting the robot take over the kitchen, the researchers found that the best students use a specific set of human skills to keep the robot helpful without letting it run the show.

They call these four human skills the "SuperSkillsStack." Here is what they are, explained with simple analogies:

1. Agency: Being the Captain, Not the Passenger

The Analogy: Think of AI as a very fast, powerful car. Agency is your ability to hold the steering wheel.

  • What the students did: They didn't just let the car drive itself to wherever it wanted. They decided when to turn the engine on, where to drive, and when to stop.
  • In the study: Students realized that for certain parts of the job (like interviewing people or looking at a messy construction site), the robot shouldn't be driving at all. They chose to use the robot only for brainstorming or organizing notes, but they kept the final decision-making power in their own hands. They treated the AI as a tool, not a boss.

2. Domain Knowledge: The "Local Expert" vs. The "Tourist"

The Analogy: Imagine a tourist with a map (the AI) and a local resident who has lived there for 20 years (the student). The tourist's map might say, "Turn left at the big red building," but the local knows that building was demolished last week.

  • What the students did: They recognized that the AI is the tourist: it has read millions of books and knows general facts, but it doesn't know the specific reality of their project.
  • In the study: When the AI suggested a design for a park, it might have been technically correct but totally wrong for that specific neighborhood. The students used their own "local knowledge" (gained by actually visiting the site and talking to people) to catch these mistakes. They acted as the fact-checkers who said, "No, that won't work here because the ground is wet and the locals hate that color."

3. Imagination: The Spark Plug

The Analogy: Think of AI as a machine that can shoot out 1,000 fireworks in a second. Imagination is your ability to look at that explosion and say, "Hey, that blue spark looks like a dragon!" and then build a story around it.

  • What the students did: They used the AI to break out of creative ruts. When they were stuck, the AI would give them a wild, weird idea.
  • In the study: The students didn't just copy the AI's ideas. They used the AI's suggestions as a "jumping-off point." The AI provided the raw material (the fireworks), but the students provided the vision to turn those sparks into a coherent design. The AI expanded their options, but the humans chose which path to walk down.

4. Taste: The Final Taste-Test

The Analogy: This is the most important part. Imagine the robot chef has made 100 different soups. Taste is your ability to take a spoonful of each, close your eyes, and say, "This one is too salty," "This one is boring," and "This one is a masterpiece."

  • What the students did: They recognized that the AI is great at producing lots of options but terrible at knowing which ones are good or appropriate.
  • In the study: Students constantly had to filter the AI's output. They rejected ideas that sounded fancy but were actually useless. They rewrote the AI's text to make it sound human and authentic. They used their "gut feeling" and experience to decide what was worth keeping and what should be thrown in the trash.

The Big Takeaway

The paper concludes that AI is a "Cognitive Accelerator," not a replacement.

Think of it like this:

  • Before AI: A student had to dig through a library for 10 hours to find 5 ideas.
  • With AI: The student gets 500 ideas in 5 seconds.
  • The Result: The student saves time on the search, but still has to do the hard work of judging which idea is best.

The Lesson for Education:
We shouldn't teach students how to just "use the robot." Instead, we need to teach them how to be better captains, better local experts, and better judges. If we do that, AI becomes a super-powerful sidekick that helps humans create amazing things, rather than a tool that makes humans lazy.

In short: The robot can generate the noise, but only a human can make the music.