—
When Sembawang Secondary School ran C-Academy’s design thinking programme, students entered with an overall design thinking competence score of 13.5%. By the end, that figure had risen to 69.5% — an improvement of 56 percentage points. That kind of shift does not happen by accident. It happens when schools treat competency data not as a reporting exercise, but as a tool for continuous programme improvement.
Most schools running design thinking programmes collect some form of feedback. Far fewer use that data systematically to refine what they deliver. This article is for HODs and programme coordinators who want to close that gap — and who want to understand how a structured design thinking competency model can transform a one-off workshop into a measurable educational intervention.
—
1. Why Competency Data Is the Missing Link in Most School Design Thinking Programmes
Design thinking has earned a firm place in Singapore’s applied learning landscape. Schools invest in workshops, facilitators, and curriculum time — yet many struggle to answer a straightforward question from their principals or parents: what did students actually learn?
The problem is not a lack of effort. It is a lack of structured measurement. Most post-programme feedback collects student satisfaction — enjoyment ratings, general impressions — rather than competency change. Satisfaction data tells you whether students liked the experience. Competency data tells you whether they grew.
The distinction matters for programme sustainability. When a school can show that students improved in problem-framing, ideation, or collaborative thinking, it builds the case for continued investment. When it cannot, the programme risks being cut in the next budget cycle — not because it failed, but because its value was never made visible.
This is where the design thinking competency model becomes essential. Rather than treating design thinking as a soft skill that resists measurement, a well-designed competency framework makes skill development visible, trackable, and actionable. It turns the design thinking process from an experience into an educational intervention with demonstrable outcomes.
C-Academy’s approach to design thinking assessment was built around this gap. Rather than relying on end-of-day feedback forms, the model uses a structured pre/post competency rubric that measures overall design thinking competence before and after the programme — across four core domains that map directly to the EDIT Design Thinking® methodology.
—
2. What a Design Thinking Competency Assessment Actually Measures
A well-constructed design thinking competency assessment does not test knowledge of frameworks. It measures whether students can apply design thinking behaviours — and whether that application has improved.
The design thinking framework C-Academy uses for assessment is grounded in the EDIT Design Thinking® methodology, which structures the design thinking approach across six stages: Learning Journey, Empathise, Define, Ideate, Prototype, and Pitch. The competency rubric focuses on the four core skill domains where measurable change is most observable:
Empathise. Can the student observe and listen to understand a real user’s needs, rather than assuming they already know the answer? This is the foundation of user experience thinking — the ability to set aside one’s own perspective and genuinely investigate someone else’s reality. It is also the starting point for customer experience design: understanding what a person actually needs, not what you assume they need.
Define. Can the student synthesise observations into a clear, actionable problem statement — a “How Might We” question that opens up solution space rather than closing it down? Strong problem definition is one of the most transferable design thinking skills, with direct applications in academic writing, project planning, and entrepreneurial thinking.
Ideate. Can the student generate a range of ideas, including unconventional ones, and evaluate them against user needs? This domain captures creative thinking, creative problem solving, and the ability to move beyond the first obvious answer — skills that underpin both entrepreneurial skills and creative synthesis.
Test. Can the student prototype quickly, gather feedback, and iterate — rather than defending their first idea? This domain measures feedback seeking, discovery process orientation, and the willingness to treat early failure as information rather than defeat.
These are not abstract competencies. They are observable, teachable, and measurable. A student who scores low on problem definition at the start of a programme and significantly higher at the end has demonstrably developed a skill that transfers across subjects, CCAs, and future careers.
The design thinking competency model also captures broader design thinking traits that emerge across all four domains: integrative thinking, systems thinking, and the ability to hold multiple perspectives simultaneously. These are the deeper cognitive shifts that distinguish a student who has genuinely internalised the design thinking methodology from one who has simply completed the activities.
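To make that structure concrete, here is a minimal Python sketch of how a four-domain pre/post rubric record could be represented. The class, the field names, and the 0–100 scale are illustrative assumptions, not C-Academy’s actual instrument.

```python
from dataclasses import dataclass

# Illustrative only: field names and the 0-100 scale are assumptions,
# not C-Academy's actual rubric schema.
@dataclass
class CompetencyRecord:
    """Pre/post rubric scores for one student, one field per domain (0-100)."""
    empathise: float
    define: float
    ideate: float
    test: float

    def overall(self) -> float:
        """Overall competence as the mean of the four domain scores."""
        return (self.empathise + self.define + self.ideate + self.test) / 4


pre = CompetencyRecord(empathise=20, define=10, ideate=12, test=12)
post = CompetencyRecord(empathise=75, define=68, ideate=65, test=70)
gain = post.overall() - pre.overall()  # percentage-point improvement
print(f"Overall: {pre.overall():.1f}% -> {post.overall():.1f}% (+{gain:.1f} pp)")
```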
—
3. How to Read Your Pre/Post Data: What the Gaps Are Telling You
Raw scores are only useful if you know how to interpret them. A pre/post competency assessment produces several types of data patterns, each pointing to something specific about programme design.
A large overall gap (e.g., 13.5% to 69.5%) signals strong programme effectiveness — but it also raises a follow-up question: which competency areas drove the gain? If empathy scores improved dramatically but ideation scores moved less, that tells you students responded well to the empathy-building activities but may need more structured ideation scaffolding in future cohorts. This is where the design thinking competency model earns its value — not in the headline number, but in the domain-level breakdown.
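As an illustration of what a domain-level breakdown involves, the following sketch computes the mean pre/post gain per domain for a hypothetical cohort. All scores are invented for demonstration.

```python
from statistics import mean

# Hypothetical cohort data: per-student pre/post scores (0-100) by domain.
cohort = {
    "empathise": {"pre": [15, 20, 10], "post": [70, 80, 65]},
    "define":    {"pre": [12, 18, 9],  "post": [55, 60, 50]},
    "ideate":    {"pre": [14, 16, 11], "post": [40, 45, 38]},
    "test":      {"pre": [13, 15, 12], "post": [62, 66, 58]},
}

# Domain-level breakdown: which competency areas drove the overall gain?
for domain, scores in cohort.items():
    gain = mean(scores["post"]) - mean(scores["pre"])
    print(f"{domain:>9}: +{gain:.1f} percentage points")
```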
A small overall gap signals one of three things:
- The programme was too short to produce measurable change
- The activities were not sufficiently challenging relative to students’ starting point
- The assessment was administered too close to the programme end, before learning had consolidated
Uneven gains across student groups are equally informative. If higher-ability students improved significantly while lower-ability students showed minimal change, the programme design may need differentiated scaffolding. If the reverse is true, the programme may be pitching its complexity too low for stronger students. Understanding learning styles and how different students engage with experiential learning is essential context for interpreting these patterns.
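One simple way to surface uneven gains is to split the cohort at the median starting score and compare the mean gain on each side. The sketch below uses invented numbers to show the idea.

```python
from statistics import mean, median

# Hypothetical (pre, post) overall scores for one cohort, 0-100 scale.
students = [(10, 62), (14, 70), (12, 66), (30, 48), (35, 52), (28, 45)]

cut = median(pre for pre, _ in students)
lower = [post - pre for pre, post in students if pre <= cut]
upper = [post - pre for pre, post in students if pre > cut]

# If one group's mean gain lags badly, the programme's scaffolding or
# challenge level may be mis-pitched for that group.
print(f"Gain, lower-starting half:  +{mean(lower):.1f} pp")
print(f"Gain, higher-starting half: +{mean(upper):.1f} pp")
```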
Competency development in the ideation domain tends to show the widest variation across school cohorts. This is because creative thinking and opportunity recognition are skills that many students have been actively discouraged from exercising in academic settings — where being right is valued over being generative. Schools that allocate more time to the ideation stage, and that explicitly frame “bad ideas” as part of the discovery process, consistently see stronger ideation gains.
C-Academy shares these pattern analyses with schools after each programme — not just the headline numbers, but the interpretation. The goal is to give HODs and programme coordinators the data literacy to make informed decisions about programme design, not just to report outcomes.
—
4. Using Assessment Findings to Strengthen Future Programme Design
Data without action is just reporting. The real value of design thinking competency assessment is what it enables you to change. This is where the design thinking approach shifts from a pedagogical philosophy to a practical tool for school improvement.
Here is how schools can translate specific data findings into programme adjustments:
If empathy scores are high but define scores are low: Students are observing well but struggling to synthesise. In the next cohort, extend the problem definition session. Add a structured “How Might We” practice exercise before students attempt their own. C-Academy’s facilitators have found that groups benefit from seeing two or three worked examples of strong versus weak HMW statements before writing their own. This is a direct application of design thinking pedagogy — using the design thinking process itself to improve how design thinking is taught.
If ideation scores are low across the board: The issue is usually psychological, not cognitive. Students are self-censoring. Introduce warm-up exercises that explicitly reward volume over quality — Random Cards, Worst Possible Idea, or rapid-fire brainstorming with a timer. The goal is to break the habit of evaluating ideas before generating them. This builds entrepreneurial skills and creative synthesis capacity that extends well beyond the design thinking classroom.
If test/prototype scores are low: Students are struggling to make their ideas tangible. Reduce the fidelity requirement for the first prototype. Explicitly frame the prototype as a communication tool, not a finished product. Introduce the concept of value creation through iteration — that each prototype cycle adds value not by perfecting the idea, but by revealing what needs to change.
If overall scores are high but interdisciplinary collaboration scores lag: The programme may be running in a context where students are working with familiar peers on familiar problems. Introduce cross-class or cross-level challenges. Interdisciplinary collaboration is one of the design thinking traits that requires genuine unfamiliarity to develop — students need to work with people who think differently from them.
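Findings like these can even be encoded as lightweight decision rules, so that each cohort’s domain gains map consistently to the same adjustments. The sketch below is a hypothetical illustration: the 20-point threshold and the rule set are assumptions for demonstration, not C-Academy’s actual logic.

```python
# Hypothetical decision rules mapping domain gains (in percentage points)
# to the adjustments described above. Threshold and rules are illustrative.
def recommend(gains: dict[str, float], threshold: float = 20.0) -> list[str]:
    advice = []
    if gains["empathise"] >= threshold > gains["define"]:
        advice.append("Extend problem definition; add worked HMW examples.")
    if gains["ideate"] < threshold:
        advice.append("Add warm-ups rewarding idea volume over quality.")
    if gains["test"] < threshold:
        advice.append("Lower first-prototype fidelity; frame it as communication.")
    return advice or ["Keep the current design; re-measure next cohort."]


print(recommend({"empathise": 45, "define": 12, "ideate": 30, "test": 15}))
```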
The design thinking model is not a fixed curriculum. It is a framework that should be continuously refined based on what the data reveals. Schools that treat each cohort as a learning cycle — running the programme, measuring outcomes, adjusting the design, and running again — are practising the same iterative improvement logic that design thinking teaches their students. This spirit of experimentalism — testing, learning, and adjusting — is what separates schools that improve their programmes from those that simply repeat them.
—
5. The Role of Entrepreneurial Education and Competency Development
One of the most significant shifts in Singapore’s secondary school curriculum over the past decade has been the growing emphasis on entrepreneurial education — not as a standalone subject, but as a disposition woven through applied learning programmes, CCAs, and enrichment activities.
Design thinking is one of the most effective vehicles for developing entrepreneurial competencies in secondary school students. The overlap is not incidental. The design thinking process — empathise, define, ideate, prototype, test — mirrors the entrepreneurial mindset: understand a real problem, generate novel solutions, test assumptions cheaply, and iterate based on feedback.
The specific entrepreneurial competencies that design thinking programmes develop include:
Opportunity recognition. Students who complete the empathy and define stages develop the habit of looking for unmet needs — the foundational skill of entrepreneurial thinking. They learn to see problems not as fixed conditions but as design challenges with potential solutions.
Creative problem solving. The ideation stage builds the capacity to generate multiple solutions to a single problem — a core entrepreneurial skill that distinguishes adaptive thinkers from those who default to the first available answer.
Feedback seeking. The prototype and test stages build the habit of actively seeking critical feedback and using it to improve — rather than avoiding feedback to protect a preferred solution. This is one of the most important entrepreneurial competencies, and one of the hardest to develop in academic settings where feedback is typically evaluative rather than generative.
Value creation. Students who complete a full design thinking cycle — from empathy research to final pitch — develop an intuitive understanding of how value is created through the alignment of user needs, feasible solutions, and viable delivery. This is the core logic of entrepreneurial education, expressed through the design thinking framework. The innovation process — from problem identification through to tested solution — becomes something students have lived, not just studied.
C-Academy’s programmes are designed with this entrepreneurial dimension explicitly in mind. The Final Pitch stage, in particular, is structured around the Sweet Spot of Innovation — the intersection of desirability, feasibility, and viability — which is the same framework used in entrepreneurial education contexts from secondary school through to business school.
—
6. Why Independent Assessment Produces More Actionable Data
One of the most important design decisions in C-Academy’s assessment model is that assessors are external to the facilitating team. The people who deliver the programme do not score the students.
This matters for two reasons.
First, it removes confirmation bias. Facilitators who assess their own students tend — unconsciously — to score generously. Independent assessment produces scores that reflect actual student performance, not the facilitator’s investment in the outcome. This is a basic principle of educational measurement, and one that is frequently overlooked in school enrichment programmes.
Second, it produces data schools can trust. When an HOD presents competency improvement data to a principal or a parent group, the credibility of that data depends on how it was collected. An independent assessment model means the school can say, with confidence, that the results were not self-reported by the programme provider.
Across C-Academy’s programmes in five Singapore secondary schools, the average improvement in overall design thinking competence is 37 percentage points — measured across empathy, problem-framing, ideation, and prototyping. That figure comes from independently assessed pre/post data. It is not a satisfaction rating. It is a measured competency shift, built on the same rigour that DesignSingapore’s Learning by Design metrics framework applies to design education outcomes.
The 37-point average also masks significant variation. Sembawang Secondary School achieved an improvement of 56 percentage points. Other schools have achieved gains in the 20–30 point range. The variation is informative — it reflects differences in programme length, challenge design, student starting points, and the degree to which schools invested in the pre-programme scoping session with C-Academy. Schools that treat the scoping session as a genuine design conversation, rather than an administrative step, consistently produce stronger outcomes.
—
7. Turning Competency Data Into a Case for Continued Investment
For HODs, the most practical use of design thinking competency data is internal advocacy. Schools that can demonstrate measurable student outcomes are far better positioned to secure continued programme funding, timetable allocation, and leadership support.
A strong data case for continued investment includes three elements:
Baseline and endpoint scores, presented clearly. Not just “students improved” but “students entered at X% and exited at Y% on a validated competency rubric.” The specificity matters — it signals that the measurement was rigorous, not impressionistic.
Comparison to school-wide learning goals. Map the competency gains to MOE’s 21st Century Competencies framework — specifically Adaptive and Inventive Thinking, Communication, and Collaboration. Design thinking competency data is not a standalone metric; it is evidence of 21CC development. Framing it this way connects the programme to the school’s existing strategic priorities, which makes the case for continued investment significantly stronger.
A forward recommendation. Data is most persuasive when it comes with a proposal. “Based on this cohort’s results, we recommend extending the ideation session by one period in the next run, and introducing a second prototyping round. Our projected outcome is a further improvement in test-phase competency scores.” This demonstrates that the school is using the data actively — not just collecting it — and that the programme is being treated as a continuous improvement cycle rather than a one-off event.
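For coordinators who want to assemble such a summary themselves, the sketch below shows how the three elements might be pulled together programmatically. The figures and the recommendation text are invented for illustration.

```python
# Hypothetical sketch: assembling the three elements of a leadership-ready
# summary. Scores and recommendation text are invented for illustration.
summary = {
    "baseline_pct": 13.5,
    "endpoint_pct": 69.5,
    "framework_mapping": ["Adaptive and Inventive Thinking",
                          "Communication", "Collaboration"],
    "recommendation": "Extend ideation by one period; add a second "
                      "prototyping round.",
}

gain = summary["endpoint_pct"] - summary["baseline_pct"]
print(f"Students entered at {summary['baseline_pct']}% and exited at "
      f"{summary['endpoint_pct']}% (+{gain:.1f} pp on the competency rubric).")
print("21CC alignment:", ", ".join(summary["framework_mapping"]))
print("Next-cohort recommendation:", summary["recommendation"])
```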
C-Academy provides schools with a post-programme data summary that includes all three elements — designed to be shared directly with school leadership. The summary includes domain-level score breakdowns, comparison to C-Academy’s cross-school benchmarks, and specific recommendations for the next cohort based on the data patterns observed.
—
8. Skill Development Beyond the Classroom
The most durable outcome of a well-delivered design thinking programme is not a score on a competency rubric. It is a shift in how students approach problems — a set of design thinking traits that persist long after the workshop ends.
Teachers who have worked with C-Academy consistently report the same observation: students who have completed a design thinking programme ask “why” more often, tolerate ambiguity more comfortably, and revise their thinking more readily when new information arrives. These are shifts a pre/post rubric can only partly capture. They are the deeper outcomes of genuine competency development — the kind that shapes how a student approaches a group project, a community challenge, or eventually a career.
The skill development that design thinking programmes produce is also cumulative. A student who completes one programme develops a foundation. A student who engages with design thinking across multiple years — through enrichment programmes, CCAs, and applied learning activities — builds a progressively more sophisticated design thinking approach. Schools that treat design thinking as a one-time exposure miss this compounding effect. Schools that build it into their enrichment architecture over time produce graduates who are genuinely equipped for the adaptive, collaborative, solutions-oriented work that Singapore’s future workforce will require.
This is the longer horizon that C-Academy’s programmes are designed with in mind. The competency data is the evidence. The skill development is the outcome. And the design thinking competency model is the bridge between the two.
—
[Links]
- Design thinking programmes for Singapore secondary schools: /design-thinking-course-singapore/
- C-Academy’s Applied Learning Programme support: /applied-learning-programme-singapore/
- About C-Academy’s facilitators and methodology: /about-us/
- Sustainability design thinking workshop: /let-out-your-creativity/sustainability-design-thinking-workshop/
- Reimagining Learning Spaces workshop: /let-out-your-creativity/reimagining-learning-design-thinking-workshop/



