MDM640 Week 4
Measuring Design Effectiveness
Among the frameworks covered this week, the Goals–Signals–Metrics (GSM) model stands out as particularly suitable for evaluating the effectiveness of my thesis project’s design solution. As discussed by Huang (2020), GSM begins by identifying clear user-centric goals, then determines the behavioral signals that indicate progress toward those goals, and finally selects quantifiable metrics to track those signals over time. This hierarchy ensures that measurement is grounded in purpose rather than mere data availability.
For Wonder Woods Workshops, the GSM model would allow me to define goals such as fostering community engagement and promoting sustainable practices. Signals could include repeat workshop registrations or social media interactions reflecting emotional connection. These would then translate into actionable metrics like Net Promoter Score (NPS), session completion rates, or qualitative feedback scores. Importantly, these metrics help identify friction points and opportunities, enabling iterative design refinement focused on user experience and brand resonance.
In contrast, the Cafédirect case study (2020) relied on outcome-driven business metrics such as market share, revenue growth, and social impact contributions. While highly effective, this approach measures performance only after implementation. The GSM framework, by comparison, creates a continuous feedback loop. This distinction is critical for Wonder Woods Workshops, as the project must sustain ongoing experiential quality alongside ethical integrity.
The comparison underscores how GSM supports early-stage design with real-time user insights, while Cafédirect’s model proves the long-term impact of emotionally and ethically grounded branding. Bridging these approaches could guide Wonder Woods Workshops to both measure and amplify its success across touchpoints.
References
Cafédirect. (2020, November). Week 4 case study: Substantiating a redesign effort – Cafédirect. https://effectivedesign.org.uk/case-study/cafedirect
Huang, K. (2020, February 25). 10 frameworks to help you measure success in design: Setting the right metrics. UX Collective. https://uxdesign.cc/how-to-measure-success-in-design-f63f96a0c541
Lollypop Design. (2023, July 17). Top 5 UX metrics frameworks to measure your design performance. https://lollypop.design/blog/2023/july/top-5-ux-metrics-frameworks-to-measure-your-design-performance/
The Process of Self & Peer Evaluation Forms
The weekly self-evaluations were instrumental in shaping the Wonder Woods Workshops Brand Playbook. By consistently assessing specific criteria such as strategic alignment, narrative clarity, and design cohesion, I was able to identify gaps early and iteratively refine each section. These evaluations created a cadence of accountability and focus, helping me clarify the brand’s tone of voice, visual language, and values with each update.
Using a structured form for peer evaluations provided clarity and consistency in offering feedback. It directed my attention to key evaluation criteria rather than surface impressions. Peer evaluations also gave me valuable insight into alternative approaches to storytelling, hierarchy, and visual coherence, elements I then reexamined in my own work.
That said, I struggled with one peer evaluation. The submission offered only the skeleton of a playbook: it addressed some of the required structural expectations but lacked the substance necessary to represent the brand meaningfully. I worried I might have been overly critical, yet I found it difficult to engage constructively when there was so little clarity about the brand’s identity or direction. The experience reminded me how essential narrative depth and brand articulation are, not just for evaluation but for audience connection.