barboragustafsson.com

Case study · 2026 · In user testing

Helping teachers actually use the AI tools they already have.

Teachers in Uppsala were given access to powerful AI tools and a short onboarding session. Most stopped using them. The interface assumed knowledge they didn’t have. This is the story of what happened when I tried to fix that.

 

MY ROLE

Product designer, end-to-end

YEAR

2026

DURATION

10 weeks

TOOLS

Figma, Claude

Three problems standing between teachers and AI.

How might we help teachers write reliable prompts without teaching them prompt engineering?

How might we remove language uncertainty before they even start?

How might we make the tool usable during actual lesson prep, not in dedicated training sessions?

Understanding the problem before building anything.

I conducted qualitative interviews with teachers and supplemented that with secondary research on platforms like Reddit, where people talk openly about their frustrations with AI tools. To speed that up, I used an AI tool to scrape and summarise discussion threads, which let me get through a much larger volume of material than I could have manually.

What came up consistently: teachers did not know what made a prompt work or how to reduce the risk of AI hallucinations. Small changes in wording produced completely different results, with no explanation of why. The tool felt unpredictable, so they stopped trusting it.

“AI can’t be trusted, it makes stuff up constantly.”

– Teacher interviewed during research

Johanna, 46

History and Swedish teacher at a primary school, mentor

  • Prepares own materials carefully and adapts content to student interests
  • Has tried AI tools but finds results unreliable and inconsistent
  • Unsure whether to write prompts in Swedish or English, and whether it matters
  • Feels that AI takes more time than it saves
  • Open to AI usage, uses it for brainstorming ideas
  • Interested in learning more about AI, but the time investment has not felt worth it so far

 

Several teachers were genuinely unsure whether to write in Swedish or English and whether it affected the quality of the answer. That uncertainty alone was enough to stop some of them from starting.

Double diamond, applied.

Discover
Teacher interviews and secondary research to understand real pain points

Define
Core barriers: language uncertainty and cognitive load

Develop
Three solution directions explored and evaluated against user needs

Deliver
Designed, built, and launched. Currently in user testing.

A: Prompting coach

Great as a learning tool. Did nothing to help teachers during actual lesson planning.

B: AI-powered checker

Functionally accurate but wasteful on tokens. Not sustainable from a practical standpoint.

C: Pre-filled forms + rule-based checker

Gentle guidance with a practical output. Ready to use straight away. This became the winning design.
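The token-cost argument behind picking C over B can be made concrete: a rule-based checker is a handful of local string checks that run instantly and cost nothing, while an AI-powered checker spends tokens on every single prompt. A minimal sketch of what such a checker could look like; the specific rules and wording here are illustrative, not the product's actual checks:

```python
# Hypothetical sketch of a rule-based prompt checker (option C).
# Each rule is a cheap local string check, unlike option B, which
# would have sent every prompt to a model and spent tokens doing so.

def check_prompt(prompt: str) -> list[str]:
    """Return human-readable warnings; an empty list means the prompt passes."""
    warnings = []
    if len(prompt.split()) < 10:
        warnings.append("Prompt is very short; add subject, grade, and topic.")
    if "{" in prompt or "}" in prompt:
        warnings.append("A form field was left unfilled.")
    if "grade" not in prompt.lower():
        warnings.append("No grade level given; results may not match the class.")
    return warnings
```

Because the checks are deterministic, the same prompt always gets the same feedback, which also addresses the unpredictability teachers complained about.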

A structured prompt in 3 steps.

PromptKee is a web-based prompt generator. Based on my research, I identified the 10 most common AI use cases for teachers. Each one has its own pre-filled form. Teachers fill in the relevant details, generate a prompt, and paste it into whichever AI tool they use.

The generated prompt includes built-in rules: the AI cannot lie, must admit when it does not know something, and must always produce output suitable for a school environment. Teachers do not need to know any of this. It is handled automatically.
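The mechanism can be sketched in a few lines: the teacher's form answers are slotted into a fixed template, and the guardrail rules are appended automatically. This is an illustrative sketch only; the field names, template, and rule wording are hypothetical, not PromptKee's actual copy:

```python
# Minimal sketch of the prompt-assembly idea: form fields in, guardrails appended.
# All names and wording here are illustrative assumptions, not the product's text.

GUARDRAILS = (
    "Rules: Do not invent facts. If you are unsure or do not know, "
    "say so explicitly. Keep all content suitable for a school environment."
)

# One pre-filled template per use case; the teacher only fills in the blanks.
LESSON_PREP_TEMPLATE = (
    "You are helping a teacher prepare a lesson.\n"
    "Subject: {subject}\n"
    "Grade level: {grade}\n"
    "Topic: {topic}\n"
    "Desired output: {output}\n"
)

def build_prompt(subject: str, grade: str, topic: str, output: str) -> str:
    """Combine the teacher's form answers with the fixed guardrail rules."""
    body = LESSON_PREP_TEMPLATE.format(
        subject=subject, grade=grade, topic=topic, output=output
    )
    return body + "\n" + GUARDRAILS
```

The key design point is that the guardrails live in the template, not in the teacher's head: the same rules ride along with every generated prompt, regardless of which AI tool it is pasted into.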

Early signals are encouraging.

2/2

Teachers who previously gave up on AI completed a structured prompt without help on their first try

 

0

Language hesitation observed when the interface was shown in Swedish

 

All three teachers in the broader testing group responded positively to the concept. Testing also revealed that the interface still assumed more understanding of how AI works than most teachers have: users lacked a mental model for what the tool was doing behind the scenes, something I had underestimated in the initial design. That finding directly shaped the next iteration, in which I simplified the language and added contextual cues to make the process more transparent.

The challenge I am still working on.

The core tension in this project is how to give teachers a genuinely useful tool without asking them to learn anything new. Teachers do not have time for onboarding. The design has to be immediately obvious or it will not get used at all.

  • Are the 10 categories the right 10, or are there gaps that only show up in real use?
  • Do teachers feel AI responses are more reliable when the built-in rules are applied?
  • Is the copy-paste step a friction point significant enough to affect whether people return?
  • Does using the tool change how teachers feel about AI over time, not just in the moment?

Access ≠ usability.

Giving people a tool is not enough. If the interaction is unclear, users will stop and blame the tool, not the onboarding. The teachers I spoke with were not wrong to be frustrated. The entry point just had too much friction.

Language turned out to be a UX problem, not just a content one. Offering the interface in the user’s own language is not a nice-to-have. For some users it directly affects whether they feel confident enough to start at all. That is an accessibility consideration as much as anything else.