Using UX methods to improve student satisfaction, course delivery, and day-to-day learning outcomes
Summary
In 2023, while working as a Lead UX/UI Design Teacher at Ironhack, I treated the classroom as a product experience: students had goals, constraints, feedback loops, and measurable outcomes. Ironhack already operated with weekly satisfaction surveys and structured rituals (daily standups and weekly retrospectives). My challenge was to use that existing system to continuously improve the learning experience based on evidence, not assumptions.
Over the year, I applied a UX approach—research, prioritization, iteration, and validation—to refine lesson delivery, increase engagement, and resolve recurring friction points. The result was a measurable increase in satisfaction and a more resilient teaching process that improved from cohort to cohort.
Timeline: 2023 — full year
Context
Ironhack runs intensive bootcamps where students learn through morning instruction and afternoon project work. Each cohort functions like multiple small product teams working in parallel, often with different projects and levels of confidence. The learning experience depends not only on the curriculum, but also on facilitation quality, clarity of expectations, psychological safety, and the effectiveness of feedback cycles.
The program included two key rituals that influenced the experience:
- Daily standups to align progress and plan the day
- Weekly retrospectives to surface what worked and what did not
In addition, students completed weekly surveys evaluating the teacher, content, and overall experience, creating a consistent measurement loop.
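That measurement loop can be sketched as a simple week-over-week signal check. This is purely illustrative — the score scale, field shapes, and drop threshold are assumptions for the sketch, not Ironhack's actual tooling:

```python
from statistics import mean

def flag_weekly_dips(weekly_scores, drop_threshold=0.5):
    """Flag weeks where average satisfaction drops sharply vs. the prior week.

    weekly_scores: one inner list of survey scores (e.g. 1-5) per week.
    Returns the 1-based week numbers that warrant a faster follow-up.
    """
    averages = [mean(week) for week in weekly_scores]
    flagged = []
    # Compare each week to the one before it; week numbering starts at 2
    for week, (prev, curr) in enumerate(zip(averages, averages[1:]), start=2):
        if prev - curr >= drop_threshold:  # sharp week-over-week drop
            flagged.append(week)
    return flagged

# Example: a dip in week 3 surfaces immediately instead of festering
print(flag_weekly_dips([[4.5, 4.2], [4.4, 4.3], [3.6, 3.5], [4.1, 4.4]]))
```

The point of the sketch is the cadence: because the signal arrives weekly, a single bad week is visible within days rather than at the end of the module.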
My Role
Lead UX/UI Design Teacher, responsible for course delivery, coaching project teams, and improving the learning experience using survey data, qualitative feedback, and continuous iteration.
The problem
The program had a strong foundation, but the learning experience was not consistent across cohorts or across moments in the curriculum:
- Some topics needed deeper coverage and more practice to ensure retention
- Students wanted stronger cross-cohort interaction and community moments
- Individual confidence gaps and unanswered doubts could persist too long before being detected
- Tool changes (e.g., Figma updates) created avoidable friction during key workshops
- A single negative experience could meaningfully affect satisfaction if not addressed quickly and transparently
Goals
- Improve student satisfaction and perceived learning quality across the cohort
- Increase clarity, confidence, and participation during lessons and project work
- Reduce recurring friction in difficult topics through better structure and practice
- Improve feedback responsiveness: detect issues earlier and act faster
- Create a repeatable improvement process that would scale across cohorts
Constraints and approach
The bootcamp pace is intense and leaves limited room for large structural changes. Improvements needed to be incremental, high-impact, and implementable within the weekly cadence.
Key decisions:
- Use weekly survey data and retrospectives as the primary measurement loop
- Prioritize issues by frequency and impact, not by personal preference
- Treat each iteration as an experiment: change, measure, refine
- Improve reliability by increasing preparation depth and reducing “unknowns” in delivery
- Ensure individual feedback was addressed without losing the overall cohort rhythm
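The prioritization rule above — frequency and impact over personal preference — can be sketched as a simple scoring pass. The issue names, weights, and scale are illustrative assumptions, not the actual backlog:

```python
def prioritize(issues):
    """Rank feedback issues by frequency x impact, highest first.

    issues: list of dicts with 'name', 'frequency' (mentions per week),
    and 'impact' (1-5 estimated effect on the learning experience).
    """
    return sorted(issues, key=lambda i: i["frequency"] * i["impact"], reverse=True)

# Hypothetical backlog drawn from weekly surveys and retrospectives
backlog = [
    {"name": "Figma workshop friction", "frequency": 2, "impact": 5},
    {"name": "More cross-cohort events", "frequency": 6, "impact": 2},
    {"name": "Topic X needs more practice", "frequency": 4, "impact": 4},
]
for issue in prioritize(backlog):
    print(issue["name"])
```

A crude score like this is enough to keep weekly iteration honest: the loudest request does not automatically win over the one that hurts learning most.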
Key improvements (iteration highlights)
1) Strengthening community and cross-cohort interaction
Survey feedback and direct conversations indicated a consistent desire for more interaction beyond the cohort. I worked with the school to revive a previously discontinued social event and reposition it around student needs. This improved community connection and addressed a recurring feedback point without changing the academic structure.
2) Deepening difficult content through structured reinforcement
Data and student feedback showed certain topics required more time and practice. I implemented two layers of improvement:
- Immediate reinforcement sessions to support the current cohort
- Curriculum-level adjustments for future cohorts: clearer progression, simplified explanations where appropriate, and more practical exercises designed for retention
3) Earlier detection of confidence gaps and unanswered questions
A key learning from a challenging cohort was that uncertainty and unresolved doubts can stay invisible for too long if the process relies only on end-of-week signals. I introduced lightweight daily checkpoints and reinforced a culture of questions, making it easier to surface blockers early and address them before they affected the overall experience.
4) Increasing reliability in tool-based workshops
A failed Figma workshop highlighted how fast-moving tools can create friction when updates change interaction patterns. I adjusted preparation practices to include testing workflows on the day of delivery and maintaining a mitigation plan (alternative steps, backup examples) for live sessions.
5) Raising the standard of instructional preparedness
To reduce uncertainty during teaching moments, I adopted a stricter preparation routine for every module—including topics I had taught before—by reviewing recent updates, revisiting edge cases, and improving explanation clarity. This improved consistency across cohorts and reduced moments where hesitation could undermine student confidence.
Outcomes
- Satisfaction scores improved by approximately 10% in the cohort that triggered the most changes, exceeding the class average
- Continued improvement across subsequent cohorts, reaching nearly 100% satisfaction in later classes
- Reduced recurring friction in previously challenging topics through better structure and practice
- Improved classroom confidence and participation by detecting doubts earlier and addressing them systematically
- A repeatable improvement loop established: feedback → prioritization → iteration → measurement
What this demonstrates
- Applying UX methods beyond screens: designing and improving a service experience
- Comfort with continuous improvement loops based on qualitative and quantitative signals
- Strong facilitation and leadership across multiple “teams” working in parallel
- Ability to translate feedback into actionable changes with measurable outcomes
- Focus on reliability and trust: reducing uncertainty, improving clarity, and building confidence
Next steps
- Continue adapting workshops to evolving tool ecosystems with proactive validation and backups
- Formalize the improvement framework into a reusable playbook for future instructors
- Expand peer learning rituals across cohorts to strengthen community outcomes