Real progress in a design thinking workshop is not about students completing each stage — it is about what shifts in their thinking along the way. For HODs and educators evaluating a workshop’s impact, knowing what good progress looks like at each stage gives you something concrete to observe, not just a process checklist to tick off. This is what C-Academy’s facilitators look for across every stage of the EDIT Design Thinking® methodology when working with secondary school students in Singapore.

1. Why Measuring Design Thinking Workshop Progress Stage by Stage Matters for HODs

When schools brief C-Academy before a workshop, one of the most common questions from HODs is: “How will we know if it’s working?” The instinct is often to wait for the final pitch presentation — but by then, you have missed five earlier opportunities to see learning in action.

Stage-by-stage progress signals matter because design thinking is a process skill, not a knowledge test. A student can produce a polished prototype and still have skipped genuine empathy. Another student might present nervously but have done the deepest problem framing in the room. Watching for the right signals at each stage helps educators distinguish real growth from surface performance.

What good workshop facilitation requires is the ability to read the room at each stage — and adapt. At one partner school, C-Academy facilitators noticed early on that students were not engaging as expected during the opening sessions. Rather than pushing through, facilitators paused, consulted the teachers, and readjusted how information was being framed and delivered — shifting to a more interactive, hands-on approach for the next stage. The change was immediate. Once the design thinking methodology was recalibrated to meet the students where they were, engagement climbed and learning accelerated noticeably. This kind of facilitation skill — knowing when to pivot, not just how to present — is central to how C-Academy designs its workshops.

Across five Singapore secondary schools, C-Academy has recorded an average improvement of 37% in design thinking competence from pre- to post-workshop assessment. That improvement is not uniform: it reflects the cumulative effect of progress at each individual stage of the workshop.

2. Stage 1: Learning Journey — What Genuine Curiosity Looks Like

The Learning Journey is a site visit or real-world encounter with the community, organisation, or context students will be designing for. It opens the workshop and sets the tone for everything that follows. Getting the workshop agenda right for this stage — building in enough open observation time rather than back-to-back structured activities — is one of the most important workshop best practices a facilitator can apply.

Good progress here does not look like students dutifully filling in observation sheets. It looks like students who stop walking to ask an unscripted question. Who notice something that was not pointed out to them. Who come back with a detail that was not on the worksheet — a facial expression, a frustrated comment overheard, a moment of friction they spotted in the environment.

Students from Ngee Ann Secondary and Northbrooks Secondary went on a Learning Journey to Punggol Library as part of their design thinking workshop. Rather than simply exploring the space, students conducted guided observation exercises designed to deepen their empathy for library users — noticing how different people navigated the space, what friction points emerged, and where the experience fell short of user needs. The structured empathy exercise within the library setting helped students move beyond surface impressions and into genuine user journey mapping.

C-Academy facilitators brief students before a Learning Journey to resist the urge to jump to solutions. The instruction is simple: your only job today is to notice. Students who have genuinely absorbed this come back to the classroom quieter and more unsettled — in a productive way. They have encountered a real problem, not a tidy case study.

3. Stage 2: Empathise — Signs Students Are Truly Listening, Not Just Noting

The Empathise stage is where students build empathy maps, conduct empathy interviews, and attempt to understand the lived experience of their user. It is also the stage most commonly done superficially. Good workshop facilitation at this stage requires guiding students past their assumptions and into genuine listening — a form of user-centric innovation that begins with setting aside what you think you already know.

Surface-level empathy looks like this: students ask prepared questions, record the answers, and move on. Deep empathy looks different. Students pause during an interview when something unexpected comes up. They probe: “Can you tell me more about that?” They return to the facilitator with a finding that contradicts their assumption — and say so.

One of the most striking examples of this came from a Hougang Secondary School cohort working with SADeaf and the deaf community. None of the students had any prior connection to the deaf community. After a structured empathy session — listening carefully to the experiences shared by their deaf community partners from SADeaf — students were able to construct empathy maps that identified a wide range of genuine, specific challenges faced by deaf individuals in everyday life. The quality and depth of the empathy maps surprised the facilitating team. Without shared lived experience, and purely through attentive listening and structured empathy exercises, the students demonstrated exactly the kind of user needs understanding that underpins strong design thinking.

In C-Academy workshops, a reliable signal of genuine empathy is when a student revises something they wrote down in Stage 1. They came in with one observation; the interview revealed something different. That revision — that willingness to be wrong — is one of the strongest progress indicators at this stage.

4. Stage 3: Define — When a How Might We Statement Becomes a Real Problem

The Define stage asks students to synthesise their empathy findings into a clear problem statement, typically framed as a How Might We (HMW) question. This is the stage that separates workshops that produce creative problem solving from those that produce solutions looking for a problem.

Most early-draft HMW statements are either too broad (How might we make the world better for elderly people?) or too solution-specific (How might we build an app for elderly people to find help?). Neither is useful. A strong HMW is specific enough to point towards a solution space, but open enough to allow multiple approaches.

Progress at this stage is visible in the revision process. Students who are genuinely engaging with Define will rewrite their HMW statement at least twice. They will test it against their empathy map: does this actually reflect what the user told us? Does it address a real user need, or a problem we invented?

C-Academy facilitators use a simple prompt at this stage: “Read your HMW statement to someone who was not on your Learning Journey. Does it make them feel something?” If the answer is no, the statement is not specific enough yet. The goal is a problem statement with enough human detail that a stranger would immediately understand why it matters — and want to solve it. Affinity grouping of empathy data before writing the HMW is a workshop best practice that consistently improves the quality of Define outputs.

5. Stage 4: Ideate — The Shift From Safe Ideas to Surprising Ones

Ideation is where students use tools like C-Academy’s Random Cards and Idea Dice to generate a volume of ideas before selecting the most promising. The common mistake is stopping too early — at the first idea that sounds reasonable. Good workshop facilitation here alternates between divergent thinking (generating freely without judgement) and convergent thinking (evaluating and selecting with rigour).

Good progress in ideation has a specific texture: the room gets louder, then quieter, then louder again. The first phase is volume — students generating quickly without self-censoring. The second is reflection — students returning to their HMW and asking which ideas actually address it. The third is energy — when a genuinely unexpected idea surfaces and the group leans in.

At Methodist Girls’ School (Secondary), students used Random Cards and Idea Dice to ideate around reimagining their classroom — exploring how the learning space could be made more comfortable, more stimulating, and better suited to how students actually want to learn. The ideation tools pushed the groups past predictable suggestions (more tables, better lighting) into more unexpected territory: ideas about acoustic zoning, flexible furniture configurations, and personalised corner spaces that reflected different learning styles. The quality and specificity of the ideas generated through structured ideation tools consistently exceeded what the same students produced in unstructured brainstorms.

Facilitators at C-Academy watch for the moment a student pushes back on their own group’s favourite idea. That critical instinct — the ability to question a popular but weak idea — is a sign that creative collaboration is working as a thinking process, not just an exercise in creative confidence.

6. Stage 5: Prototype and Test — Progress Beyond the Pretty Model

Prototyping is the stage most likely to produce impressive-looking outputs that mask shallow thinking. A beautifully crafted model with no clear user in mind is not a strong prototype — it is a craft project. The goal of rapid prototyping in a design sprint context is to make an idea tangible enough to gather real user feedback — not to produce a finished product.

Good progress at the Prototype and Test stage shows up in how students handle user feedback. Strong groups build quickly and cheaply, then actively seek out criticism. They are not attached to the model; they are attached to the problem. When a tester says “I would never use this”, a progressing group asks why — and modifies.

Northbrooks Secondary students prototyped a sound detector for their library — a device intended to alert users when noise levels rose above a comfortable threshold. During user testing, the user feedback they gathered was largely critical: the idea was seen as potentially disruptive, difficult to implement practically, and unlikely to change behaviour on its own. But rather than abandoning the direction, the group used the feedback to interrogate their original intent. The core goal — reducing noise disruption in the library — was sound. What needed to change was the intervention mechanism. The prototype had done its job: not to succeed, but to teach. The group used what they learned from testing to explore alternative approaches that conveyed the same intended action in a less intrusive way.

C-Academy uses the Sweet Spot of Innovation framework at this stage: students evaluate their prototype against three questions — Is it desirable? Is it feasible? Is it viable? This gives groups a structured lens for deciding what to keep, change, or discard based on user feedback before the Final Pitch.

7. Stage 6: Final Pitch — Presenting the Sweet Spot of Innovation

The Final Pitch is where students present their solution — but the quality signal is not production value. By this stage, students are not presenting a value proposition statement in isolation. They are presenting their Sweet Spot of Innovation: the point at which desirability, feasibility, and viability intersect. Alongside this, they present the three prototypes developed and tested across the programme — each representing an iteration in their thinking. The evaluation is based on how clearly students can convey their entire EDIT Design Thinking® process: the empathy work, the problem framing, the ideation choices, and what they learned from testing.

A weak pitch describes the product: “We designed a device with three features.” A strong pitch demonstrates the journey: it references the specific user need uncovered during empathy, the HMW that guided ideation, the feedback that shaped the prototype, and the reasoning behind the final solution direction.

The Northbrooks Secondary group that prototyped the library sound detector is a strong illustration of this. Their user feedback had been largely negative — most testers did not find the prototype feasible or workable. And yet the group stood out during the final pitch. Not because their solution was polished, but because they could articulate exactly what they had learned. They walked through their design thinking process clearly — why the problem mattered, how their empathy data led them to the sound detector concept, what the testing revealed, and how that feedback shifted their thinking about alternative solutions. Their ability to reflect on their findings, acknowledge the limitations of their prototype, and demonstrate genuine learning throughout the process was precisely what the evaluating facilitators were looking for.

8. What These Signals Mean for Your School’s Design Thinking Workshop Design

For HODs bringing design thinking into their school’s applied learning programme or enrichment calendar, these stage-by-stage signals are practical briefing tools for leadership teams and for evaluating workshop providers.

Before a workshop, share this framework with your observers: at each stage, look for the behavioural shift, not just the output. The student who revises their empathy map is making more progress than the student who completed it neatly the first time. The group that rebuilds their prototype after negative user feedback is further along than the group that defended it.

Understanding what good progress looks like at each stage also helps schools brief providers more effectively. When workshop objectives are tied to specific stage-level behaviours — rather than general outcomes like “students will learn design thinking” — the workshop agenda, and where applicable the remote or virtual facilitation structure, can be aligned to what you actually want to see. This is where workshop best practices translate from theory into measurable student outcomes.

C-Academy facilitators debrief with HODs after each session and share stage-level observations that help schools understand what they observed and how to contextualise it. Schools interested in learning more about how the programme is structured can reach out through the C-Academy website.

