When Singapore secondary school HODs plan a design thinking programme, the learning outcomes are often the last thing written — and the first thing that causes problems. Vague goals like “students will appreciate creativity” make it nearly impossible to assess progress, justify budget, or demonstrate value to school leadership. Drawing on C-Academy’s delivery experience across Singapore secondary schools, this guide walks HODs through how to set outcomes that are specific, measurable, and meaningfully aligned to MOE’s 21st Century Competencies framework — and how experiential learning, structured well, produces design thinking skills that are both observable and lasting.
—
1. Why Generic Learning Outcomes Undermine Design Thinking Programmes
Most design thinking programmes in Singapore schools begin with good intentions and end with difficult questions: Did students actually improve? What changed? How do we know?
The problem usually traces back to outcomes written at the wrong level of abstraction. Statements like “students will develop creative confidence” or “students will learn to collaborate” are aspirational — but they are not assessable. When an HOD cannot point to observable, measurable student behaviour before and after a programme, it becomes very difficult to report meaningful outcomes to a principal, apply for ALP funding, or plan the next iteration.
Generic outcomes also tend to misalign with what a design thinking workshop actually delivers. Design thinking is a structured process — it develops specific, trainable competencies, not general personality traits. When outcomes are written as personality descriptors rather than competency indicators, the programme itself often drifts. Facilitators have no clear target, students have no clear expectation, and assessment becomes subjective.
When C-Academy began working with Methodist Girls’ School (Secondary), the planning process started with a frame and scope session with key teachers and school leaders. Rather than arriving with a pre-set programme, C-Academy facilitated a structured conversation to surface the school’s key challenges, identify what student development looked like in their specific context, and align the workshop’s objectives with the school’s desired learning outcomes. That session shaped everything that followed — from the design challenge brief to the session structure to the assessment criteria. It is an example of what happens when outcome-setting is treated as a collaborative process rather than a box to tick before delivery begins. For schools that take this session seriously, it becomes the pedagogical scaffold on which the entire programme is built.
C-Academy’s pre-programme planning process with Singapore schools always begins with this kind of structured conversation before any workshop design begins. The most consistent finding: schools that invest time in aligning objectives upfront produce clearer student outcomes, stronger facilitator focus, and more useful post-programme data.
—
2. What Good Design Thinking Outcomes Actually Sound Like
“When students leave this programme able to reframe a problem based on what a real user told them — not what they assumed — that is a learning outcome. Everything else is decoration.”
This is the standard C-Academy holds outcome-writing to: observable, specific, and grounded in what the design thinking process actually produces. It is a standard that matters for instructional design, for student motivation, and for curriculum documentation — and it is one that the best Singapore secondary schools have learned to apply consistently across their enrichment and ALP programmes.
Good outcomes share three features. They describe a student behaviour, not a programme activity. They are assessable through a concrete output — a document, a prototype, a survey response. And they connect to a recognised competency framework, so that the language of the outcome does double duty in MOE documentation. School leadership and curriculum planning teams find this approach significantly easier to work with than outcome lists written in vague aspirational language.
What makes this approach especially valuable in authentic learning contexts is that it forces programme designers to confront whether the workshop actually produces what it claims to produce. If an outcome cannot be observed or assessed, it is not an outcome — it is a hope. This distinction is what separates genuinely future-forward skill development from well-intentioned but unmeasured activity.
—
3. Aligning Outcomes to MOE’s 21st Century Competencies Framework
MOE’s 21st Century Competencies (21CC) framework gives Singapore schools a ready-made vocabulary for design thinking outcomes — and yet it is underused in programme planning.
The 21CC framework identifies three broad domains relevant to design thinking: Civic Literacy, Global Awareness and Cross-Cultural Skills; Critical and Inventive Thinking; and Communication, Collaboration and Information Skills. Of these, Critical and Inventive Thinking maps most directly to design thinking competencies — specifically the sub-competencies of adaptive thinking, inventive thinking skills, and the ability to generate and evaluate ideas.
For an Applied Learning Programme (ALP) submission or a school’s Learning for Life Programme (LLP) documentation, outcomes written using 21CC language are significantly easier to justify to MOE and school leadership. An outcome such as “Students will demonstrate the ability to reframe a real-world problem using evidence gathered through empathy interviews” is directly traceable to Critical and Inventive Thinking under 21CC — and it is assessable. This kind of outcome also reflects the educational priorities that MOE has consistently articulated for Singapore’s future workforce: students who can navigate multifaceted challenges, work across diverse perspectives, and design solutions that serve real communities.
C-Academy structures its post-programme reporting to map directly to these 21CC sub-competencies, giving partner schools evidence they can use in their own curriculum documentation. Across the five Singapore secondary schools C-Academy has worked with, this outcome mapping has been used to support ALP and school curriculum reporting — giving HODs a concrete, MOE-aligned evidence base that goes beyond anecdote.
When writing outcomes, HODs should test each one against two questions: (1) Does this appear in or closely relate to the 21CC framework? (2) Would a classroom observer be able to see evidence of this competency during the programme? If the answer to either is no, the outcome needs revision.
—
4. The Four Competency Domains Worth Measuring in a Design Thinking Workshop
Based on C-Academy’s pre- and post-programme assessment methodology, there are four competency domains that consistently show measurable development within a design thinking workshop for secondary school students. These domains form the foundation of any evidence-based approach to design thinking education and skills acquisition.
Empathy and Observation. The ability to suspend assumptions, observe users in context, and extract insight from what people do rather than what they say. This is assessed by examining whether students can identify unarticulated needs from their field research. It is the starting point of human-centered design and the competency that most reliably predicts the quality of everything that follows. Students who develop strong empathy skills are better equipped to understand student demographics, community needs, and the real human stakes of the problems they are working on.
Problem Framing. The ability to move from a broad challenge to a specific, actionable “How Might We” (HMW) statement that accurately reflects empathy findings. Students who cannot do this tend to generate solutions that do not address real user needs. Strong problem framing is the bridge between empathy research and productive ideation — and it is a higher-order thinking skill that transfers directly to academic, professional, and civic contexts. It is also one of the most transferable problem-solving approaches students can develop in school.
Ideation Fluency. The ability to generate a high volume of diverse ideas before converging on a preferred direction. C-Academy uses tools including Random Cards and Idea Dice to develop this competency — and it is one of the most reliably measurable, since volume and variety of ideas can be counted. Brainstorming techniques and divergent thinking skills are directly exercised here, and the tools lower the barrier to participation for students who might otherwise hold back. This is where creative activities move from entertainment to genuine competency development.
Prototyping and Iteration. The ability to translate an idea into a tangible, testable form and respond to user feedback by modifying the prototype. This domain assesses willingness to test, fail, and improve — a disposition observable within a single session. Physical artefacts produced in this phase — storyboards, dioramas, mockups, and other hands-on learning projects — serve as assessment evidence and give students something concrete to reflect on. Iteration over perfection is the core mindset this phase builds.
Across C-Academy’s programmes in five Singapore secondary schools, students show an average improvement of 37% in overall design thinking competence across these four domains between pre- and post-programme assessment. This figure represents the average improvement across all assessment questions, aggregated across the five schools. At Sembawang Secondary School specifically, the overall design thinking competence score rose from 13.5% to 69.5% within a single programme cycle — an improvement of 56 percentage points.
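When reporting figures like these, it is worth being precise about the difference between an absolute gain in percentage points and a relative gain over the baseline. A minimal sketch using the Sembawang scores quoted above (the function names are illustrative, not C-Academy's actual tooling):

```python
# Distinguish percentage-point gain from relative (%) gain when
# reporting pre/post survey scores. Values are the Sembawang figures
# quoted above; function names are illustrative only.

def point_gain(pre: float, post: float) -> float:
    """Absolute gain in percentage points."""
    return post - pre

def relative_gain(pre: float, post: float) -> float:
    """Relative improvement as a percentage of the baseline score."""
    return (post - pre) / pre * 100

pre_score, post_score = 13.5, 69.5
print(f"{point_gain(pre_score, post_score):.1f} percentage points")  # 56.0 percentage points
print(f"{relative_gain(pre_score, post_score):.0f}% relative gain")  # 415% relative gain
```

Quoting the percentage-point figure alongside the raw pre and post scores, as done above, is the least ambiguous way to present this in ALP reporting.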
—
5. How C-Academy Measures Student Progress: The Pre/Post Survey Approach
One of the most consistent gaps in school design thinking programmes is the absence of a structured measurement approach. Without a pre-programme baseline, outcome reporting reflects assumed change rather than actual change — and assumed change does not hold up in ALP reporting, school board presentations, or budget justification conversations.
C-Academy administers a pre-survey before the first workshop session and a post-survey after the final session. The assessment instrument is built on metrics from the DesignSingapore Council’s Learning by Design framework — a recognised benchmark for design thinking competence in Singapore schools — supplemented with open-ended questions crafted specifically for each school’s context. These additional questions are developed by the facilitation team to capture the nuances of each school’s goals and student interest, and they serve a dual purpose: providing school-specific data and helping the facilitators improve each subsequent session based on student feedback.
The surveys are administered independently of the delivery team to ensure objectivity. Results are compiled into the post-workshop report, which maps student growth across the four competency domains and flags areas where the programme could be strengthened in future iterations. This is formative assessment in the truest sense: data that improves the programme, not just data that justifies it. For institutes of higher learning and secondary schools alike, this kind of rigorous measurement is what distinguishes a genuine learning programme from a one-off event.
—
6. How C-Academy Structures Outcomes Across All Six Workshop Sessions
C-Academy’s full design thinking workshop runs across six structured stages: Learning Journey, Empathise, Problem Definition, Ideation, Prototyping and Testing, and Final Pitch Presentation. Well-written learning outcomes should be traceable to specific stages — not written as a single list covering the whole programme. Each stage is part of a project-based course structure designed to build competency progressively, not all at once.
The overall timeline is documented in the project proposal sent to each school before delivery begins. This document includes each session’s date and time, workshop title, and a detailed breakdown of the learning outcomes, key activities, and expected outputs for each session. The same information is shared with students in the presentation slides at the start of each workshop session — so students know what they are working towards from the moment they sit down. For schools that request it, all of this is also compiled into the post-workshop report at the end of the programme, giving both HODs and school leadership a complete record of what was delivered, what was measured, and what students produced.
A practical approach is to assign one primary outcome to each stage:
- Session 1, Learning Journey: Students will identify at least two unmet user needs through direct observation and conversation with real stakeholders in real-world contexts.
- Session 2, Empathise: Students will produce an empathy map that accurately distinguishes between what users say, think, do, and feel.
- Session 3, Problem Definition: Students will articulate a focused HMW statement supported by evidence from their empathy research.
- Session 4, Ideation: Students will generate a minimum of 20 ideas individually using eye-catching worksheets and structured ideation tools before evaluating options as a team.
- Session 5, Prototyping and Testing: Students will test at least one prototype with a real or simulated user and incorporate feedback into a revised version.
- Session 6, Final Pitch: Students will present a value proposition that demonstrates the Sweet Spot of Innovation — where user needs, technical feasibility, and business viability intersect.
This stage-by-stage structure gives HODs a natural assessment framework, makes it straightforward to identify which sessions are delivering against expectations, and gives students a clear sense of progress at every point in the programme.
—
7. Connecting Design Thinking Outcomes to School Culture and National Growth
The strongest design thinking programmes are not stand-alone events — they are embedded in the school’s broader teaching processes and aligned to its long-term educational priorities. Schools that approach programme design this way create genuine bottom-up initiatives: student-led work that surfaces real problems from the school’s own community and generates solutions with authentic learning contexts.
This connection to school culture also matters for sustainability. A one-off problem-surfacing workshop produces insight but not change. A programme integrated into the core curriculum — with scaffolded learning structures, clear competency progression, and links to existing CCE and ALP objectives — builds design thinking as a lasting capability rather than a memorable exception. Schools that create discussion spaces for students to explore design challenges across different themes — from digital services and online learning platforms to healthcare policies and ageing-population challenges — find that students develop broader civic awareness alongside their design competency. When HODs work with industry partners and externally partnered design challenges to ground the programme in the external environment, students develop the interpersonal skills, civic literacy, and changemaker mindset that Singapore’s future workforce will need.
Students who participate in hands-on learning projects across diverse themes — from food science and mobile apps to healthcare and community services — develop problem-solving methods that generalise across contexts. Visits to design education summits or to student learning spaces beyond the classroom expose students to studying zones and physical environments that challenge their assumptions about where and how learning happens. This experimental nature of design thinking — learning by doing, in real-world contexts — is what makes it genuinely transferable. Design tool fluency, whether with digital prototyping software or physical artefacts, develops as a byproduct of solving real problems rather than as an end in itself. This is the strategic advantage that schools investing in rigorous design thinking programmes are building, whether they frame it that way or not.
—
8. Common Mistakes HODs Make When Defining Design Thinking Outcomes
Writing for coverage, not depth. Listing ten outcomes across a four-session programme signals ambition but produces shallow assessment. Three to four well-written outcomes assessed rigorously will yield more useful data than ten outcomes assessed loosely.
Conflating learning outcomes with programme activities. “Students will complete an empathy interview” is an activity, not an outcome. The outcome is what students will be able to demonstrate as a result of doing the interview. This distinction is fundamental to instructional design and to meaningful educational assessment.
Omitting a baseline. Without a pre-programme measure, it is impossible to show growth. C-Academy conducts independent pre- and post-programme surveys using consistent assessment criteria — built on the DesignSingapore Council’s Learning by Design metrics — so that outcome reporting reflects actual change, not assumed change. The survey is administered before the first and after the last workshop session, with additional open-ended questions crafted by facilitators to capture school-specific insights.
Setting outcomes the school cannot assess. If an HOD cannot point to a concrete student output — a document, a prototype, a presentation, a survey response — that provides evidence of the outcome, the outcome is not assessment-ready. Real-world skill building and knowledge application must leave a trace.
Failing to align with school culture and existing frameworks. Outcomes written in isolation from the school’s ALP objectives, CCE curriculum priorities, and character development goals are harder to justify and less likely to be sustained. The best outcomes are written collaboratively — by HODs, school leaders, and programme facilitators together — which is exactly why the frame and scope session matters so much at the start of any engagement. A teacher training workshop or discussion space for staff can be a useful precursor to ensure the programme lands well across the whole school community.
—
9. A Practical Outcomes Template for Singapore Secondary School HODs
The following template can be adapted for any C-Academy design thinking programme or used as a starting point for a school’s own programme documentation. C-Academy provides schools with a version of this framework as part of the onboarding process — populated with programme-specific outcomes, mapped to 21CC sub-competencies, and aligned to the school’s stated goals from the frame and scope session. It reflects a scenario-based approach to curriculum design: outcomes are written against real student situations, not abstract competency statements.
| Competency Domain | Learning Outcome | Assessment Evidence | MOE 21CC Link |
|---|---|---|---|
| Empathy & Observation | Students will identify at least two distinct user needs through field research | Completed empathy map | Critical & Inventive Thinking |
| Problem Framing | Students will produce a HMW statement grounded in empathy findings | Written HMW statement + brief rationale | Critical & Inventive Thinking |
| Ideation Fluency | Students will generate ≥20 ideas before selecting a direction | Ideation record / Random Cards output | Communication, Collaboration & Information Skills |
| Prototyping & Iteration | Students will test and revise a prototype based on user feedback | Prototype + revision log | Critical & Inventive Thinking |
| Presentation & Communication | Students will present a value proposition to a panel using structured pitch format | Pitch presentation + evaluator rubric | Communication, Collaboration & Information Skills |
Used consistently, this framework gives HODs a programme that is easier to defend internally, more useful for MOE curriculum documentation, and — most importantly — more likely to produce genuine student growth. Schools that combine clear outcome-setting with C-Academy’s independent assessment approach leave each programme cycle with data they can actually use. For schools considering the International Baccalaureate or GCE A-Level as a context for design thinking integration, the same outcome framework applies — the competencies being measured are transferable across examination systems and learning pathways alike.


