When a vendor tells you their design thinking workshop was “well-received,” that is not programme evaluation — it is feedback collection. For HODs in Singapore schools making decisions about enrichment time, ALP investment, and 21CC competency development, the difference between a post-session survey and a rigorous pre and post design thinking workshop assessment that Singapore schools can actually rely on is the difference between anecdote and evidence.
This guide explains what a credible assessment framework looks like, what C-Academy tracks across its programmes, and the questions every HOD should be asking any design thinking training provider before signing off on an engagement.

1. Why Most Workshop Evaluations Miss What Actually Matters
The most common form of workshop evaluation in Singapore secondary school contexts is the post-activity survey: students rate enjoyment, perceived usefulness, and facilitator quality. These surveys have their place, but they tell you nothing about whether student competence actually changed.
A student can enjoy a workshop thoroughly and walk away with no measurable improvement in their ability to empathise with users, frame problems, or generate and test ideas. Conversely, a workshop that challenges students — one that requires them to sit with ambiguity, discard weak ideas, and rebuild prototypes under feedback — may receive middling enjoyment scores while producing the strongest learning outcomes.
The core issue is intent mismatch: enjoyment surveys measure experience quality, not competence growth. HODs who rely on them alone are measuring the wrong thing. To answer the question “Did this programme develop my students’ design thinking capabilities?” you need a pre and post competency assessment — one administered before the programme begins and after it concludes, using a consistent rubric that tracks the same domains at both points.
This is what a design thinking assessment actually measures: the shift in student competence across specific, trackable domains — before and after the programme — using structured pre/post student surveys that produce defensible pre/post assessment data rather than positive sentiment.
C-Academy has structured its programme evaluation around this distinction from the outset. Across all delivery engagements, the assessment approach is designed to produce measurable student learning gains, not just post-session satisfaction scores.
2. What a Rigorous Pre and Post Assessment Should Measure
What is a pre and post assessment in education that schools can trust, and how do you measure the effectiveness of a design thinking workshop? The answer lies in identifying the specific design thinking principles the workshop is designed to develop, establishing a baseline before delivery, and re-measuring those same competencies after completion using the same instrument.
A strong pre and post assessment should measure overall design thinking competence — the student’s integrated ability to move through the design thinking process with increasing creative confidence and skill. This is not a single dimension. It reflects how students perform across the full arc of a design challenge, including their design thinking mindsets, their understanding of design thinking as a methodology, and their ability to apply human-centered design principles in a real-world context.
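To make the mechanics concrete, the short sketch below shows how per-domain pre and post scores might be compared once the same rubric has been administered at both points. It is a minimal illustration only: the domain names reflect the framework described in this guide, but the scores, the 1–5 scale, and the code itself are hypothetical rather than C-Academy’s actual instrument or scoring.

```python
# Illustrative only: per-domain gains from pre/post cohort averages.
# Domain names follow the framework in this guide; all scores are hypothetical.

pre_scores = {"Empathy": 2.1, "Experimentation": 2.4,
              "Collaboration": 2.8, "Creative Confidence": 1.9}   # baseline, 1-5 rubric
post_scores = {"Empathy": 3.8, "Experimentation": 3.5,
               "Collaboration": 4.0, "Creative Confidence": 3.6}  # same rubric, after the programme

for domain, pre in pre_scores.items():
    gain = post_scores[domain] - pre
    print(f"{domain}: {pre:.1f} -> {post_scores[domain]:.1f} (gain {gain:+.1f})")
```

The point of the exercise is the delta, not the absolute scores: the same instrument, applied twice, is what turns a satisfaction survey into a competency measurement.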
What Assessment Should Not Measure
Participation rates, session attendance, and group presentation polish are activity indicators. They are useful for programme management but tell you nothing about individual competency development. A rigorous instrument separates these clearly, so that the data you present to your department head or school principal reflects genuine learning growth rather than programme activity.
C-Academy’s assessment approach captures both student self-assessment and educator observation — producing quantitative and qualitative data that maps directly to your school’s 21CC reporting and applied learning programme objectives.
3. The Four Competency Domains HODs Should Track
C-Academy’s pre and post project surveys — administered to both students and educators — measure four specific design thinking mindsets that together constitute overall design thinking competence. These are the domains tracked across every programme, at schools such as Hougang Secondary, Pei Hwa Secondary, Ngee Ann Secondary, and Northbrooks Secondary, with structured survey questions for each.
Empathy
Do students find it easy to imagine what others are going through when they face difficulties? Can they put themselves in others’ shoes to see problems from a different point of view? C-Academy gathers empathy data through structured empathy interviews conducted during the programme, as well as pre and post self-assessment questions that track whether students have grown in their capacity to apply empathy maps and identify user needs beyond surface-level observation. Educators independently rate whether their students demonstrate empathic responses in team interactions.
This domain maps to the Empathise phase of C-Academy’s EDIT Design Thinking® methodology and is central to a human-centric approach to learning.
Experimentation
Are students willing to try out new ideas or solutions to solve problems, even when the outcome is uncertain? This domain reflects design thinking mindsets around experimentation — the willingness to move beyond the first obvious answer, engage in divergent ideation, and work through a prototype challenge using low-fidelity prototypes. Students produce at least one concrete artefact per session as evidence of active experimentation, making this domain trackable at the individual level, not just through facilitator observation.
Collaboration
Can students work effectively with others in a team to solve challenges? Do they find it useful to hear other people’s ideas so that solutions can be improved together? Educators observe and rate this shift from individual problem-solving to collective creative thinking — a core outcome of C-Academy’s signature activity structure. This domain also captures how teams hold up under pressure: how well student teams maintain collaborative focus when the problem is ambiguous or the prototype challenge is not going to plan.
Creative Confidence
Do students believe they can solve difficult problems that may not have a clear solution? Do they understand that one problem can have more than one answer? Creative confidence is often the domain with the sharpest pre-to-post movement, because most students arrive at workshops with limited prior exposure to open-ended problem definition. Tracking this shift allows HODs to see how students build design thinking capability progressively across sessions.
Design & Design Thinking Understanding
In addition to the four mindset domains, the survey also tracks whether students understand what the design thinking process is about, and whether educators have grown in their confidence to apply it across subject contexts. This domain captures the pedagogical logic of the programme and its relevance to Singapore curriculum alignment and MOE schools’ 21CC development goals. It also provides educator-specific data — important for schools integrating design thinking into Applied Learning Programme (ALP) or APLM frameworks across subject areas.
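As a purely illustrative sketch of how such an instrument might be organised, the mapping below groups sample survey items under the four mindset domains plus the understanding domain. The item wording is paraphrased from the descriptions in this section; it is not C-Academy’s actual questionnaire.

```python
# Hypothetical item-to-domain mapping for a pre/post student survey.
# Wording is paraphrased from the domain descriptions above, not an actual instrument.

survey_items = {
    "Empathy": [
        "I find it easy to imagine what others are going through when they face difficulties.",
        "I can put myself in others' shoes to see a problem from a different point of view.",
    ],
    "Experimentation": [
        "I am willing to try out new ideas or solutions even when the outcome is uncertain.",
    ],
    "Collaboration": [
        "I find it useful to hear other people's ideas so that solutions can be improved together.",
    ],
    "Creative Confidence": [
        "I believe I can solve difficult problems that may not have a clear solution.",
    ],
    "Design Thinking Understanding": [
        "I understand what the design thinking process is about.",
    ],
}

# The same items are administered before and after the programme (typically on a
# Likert scale), so the two measurements are directly comparable domain by domain.
```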
4. How Independent Assessment Protects the Integrity of Your Data
What makes a design thinking assessment credible enough to stand up in a principal’s office or a budget review? The answer is not just methodology coherence — it is independence.
Why Facilitator Observation Alone Is Not Enough
In many workshop contexts, outcome data is gathered solely through facilitator observation — the delivery team notes what they saw, interprets student behaviour, and reports on progress. This is useful for real-time facilitation, but it has a fundamental limitation: the data reflects the facilitator’s interpretation, not a direct, unmediated record of student or educator growth. Even the most experienced design thinking facilitator cannot eliminate the effect of student-facilitator trust dynamics or facilitator profile effects on how they perceive and report on student performance.
How C-Academy Collects Unbiased Outcome Data
C-Academy’s approach adds a different layer. The pre and post competency data is gathered directly from participants — both students and educators — through structured surveys administered without any intervention from the facilitating team. Students self-assess their own empathy, experimentation, collaboration, and creative confidence. Educators independently rate what they observed in their students’ design thinking mindsets. Neither group is guided by the facilitator in how to respond.
This independent assessor model means the outcome data reflects what participants themselves report, quantitatively and qualitatively, before and after the programme — producing pre/post assessment data that is clear, documented, and free from facilitator bias.
For HODs, this matters when the assessment data needs to serve a purpose beyond internal reflection — for example, when justifying programme renewal to senior leadership, presenting outcomes to the principal, or contributing to school-wide 21CC reporting. Data gathered directly from participants carries more evidential weight than data filtered through the delivery team’s interpretation.
When evaluating any design thinking training provider, ask specifically: how is your outcome data collected, and does your pre-programme briefing process include a structured, independent survey administered before delivery begins?
5. Reading the Results: What a Strong Outcome Looks Like
Across C-Academy’s programmes, the survey data consistently shows improvement in student competency across all design thinking competence domains — with the strongest gains typically in Empathy and Collaboration, reflecting the structured school onboarding and community stakeholder engagement built into C-Academy’s programme design.
The most striking single-cohort result to date is Sembawang Secondary School, where students moved from 13.5% to 69.5% — a 56-percentage-point improvement in overall design thinking competence across the full pre and post assessment. This is not a single-domain result; it reflects the complete picture across all measured domains at baseline and at programme completion, including all four mindset domains and the design thinking process understanding component.
What makes this figure meaningful is context. A 56-point gain from a high baseline would suggest a competent programme. The same gain from a baseline as low as 13.5% tells a different story: this cohort arrived with almost no prior design thinking experience, which is consistent with what most secondary school students bring to a first engagement. The programme met them at a genuine starting point and moved them substantially.
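For clarity on how that headline figure is read, the short calculation below works through the published pre and post numbers; the relative-gain line is added purely for illustration and is not a figure C-Academy reports.

```python
# Worked example using the Sembawang Secondary figures cited above.
pre, post = 13.5, 69.5              # overall design thinking competence, in per cent

delta_points = post - pre           # absolute gain in percentage points
relative_gain = delta_points / pre  # gain relative to the starting baseline

print(f"Gain: {delta_points:.1f} percentage points")          # Gain: 56.0 percentage points
print(f"Relative to baseline: {relative_gain:.0%} increase")  # Relative to baseline: 415% increase
```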
For HODs, a strong result is not simply a high post-score. It is a large, consistent delta between pre and post, replicated across multiple Singapore school contexts, with a documented methodology and independent assessor model behind it.
6. Questions Every HOD Should Ask a Design Thinking Provider Before Committing
Before signing off on a design thinking workshop engagement, HODs should ask the following questions directly. The answers will tell you quickly whether a provider has a credible, documented methodology — or whether they are making outcome claims without methodology to back them up.
Questions About Assessment Design
What specific competency domains does your assessment instrument measure? A credible design thinking facilitator should be able to name the specific competencies their survey covers — and explain how each maps to observable student behaviour. If the answer is “student engagement” or “workshop format satisfaction,” that is a red flag.
Is your assessment data gathered independently from your facilitation team? This is a structural integrity question. Ask explicitly who administers the surveys, and confirm that students and educators complete them without intervention from the delivery team.
Can you share aggregate student competency data from previous cohorts? Reputable design thinking training providers maintain this data and can share it appropriately. If a provider cannot produce any longitudinal or cross-cohort data, their assessment practice is likely retrospective rather than systematic.
Questions About School Experience and Context
What is the baseline competence you typically see at the start? This contextualises any post-programme result. A provider who knows their typical entry-point student competency data — including the student profile of schools they have worked with — has been running structured assessments consistently. Ask about their experience across different school delivery contexts, including ALP, CCE, and APLM frameworks.
How does your programme prepare educators, not just students? The educator survey data matters as much as the student data. A design thinking training provider with a structured school onboarding process will have educator pre/post data to share, not just student outcomes. Ask also whether their facilitators meet the programme minimum for school delivery experience — a WSQ ACLP-certified trainer background is a meaningful signal of professional facilitation standards.
Questions About Curriculum and Framework Alignment
How do your outcomes connect to MOE’s 21CC framework? Design thinking maps clearly to Adaptive and Inventive Thinking under the 21st Century Competencies framework. A provider who cannot articulate this connection may not understand the Singapore curriculum alignment their programme is operating within.
7. Turning Assessment Data Into a Case for Future Programme Investment
Good assessment data does not only validate a completed programme — it is the foundation for securing the next one.
HODs who receive rigorous pre and post data from C-Academy are equipped to present a structured outcome case to their school’s senior leadership: here is where our students were, here is where they are now, here is how this maps to our 21CC development goals, and here is why continued investment makes sense. This is a measurable, data-supported argument for programme continuity — the kind of argument that survives budget scrutiny and cross-department comparison, and speaks directly to the innovation and design thinking budget conversations HODs face each year.
C-Academy’s design thinking workshops use the EDIT Design Thinking® methodology across a programme minimum of four sessions, with the full programme structured from Learning Journey through to Final Pitch Presentation. The final presentation is itself an assessment moment — students demonstrate design thinking capability progressively built across the programme, in front of peers and educators, using the concrete artefacts developed during the workshop sequence.
The pre and post assessment is built into this arc — not added on as a reporting afterthought, but designed as an integral part of how programme impact is demonstrated.
Post Workshop Report: Beyond the Numbers
Depending on the context, type of project, and programme output, C-Academy also offers an optional Post Workshop Report — a documented output through which schools can see and assess the results of the design thinking workshop beyond the assessment numbers alone. This report captures qualitative insights, team outputs, facilitator observations, and student work samples, giving HODs a fuller picture of what the programme produced and how it can be built upon in subsequent sessions or future programme investment proposals.
For schools considering a design thinking programme and wanting to understand how outcomes are tracked and reported, C-Academy’s team is available to walk HODs through the full assessment methodology before any commitment is made.



