Using UX methods to improve student satisfaction, course delivery, and day-to-day learning outcomes

Summary

In 2023, while working as a Lead UX/UI Design Teacher at Ironhack, I treated the classroom as a product experience: students had goals, constraints, feedback loops, and measurable outcomes. Ironhack already operated with weekly satisfaction surveys and structured rituals (daily standups and weekly retrospectives). My challenge was to use that existing system to continuously improve the learning experience based on evidence, not assumptions.

Over the year, I applied a UX approach—research, prioritization, iteration, and validation—to refine lesson delivery, increase engagement, and resolve recurring friction points. The result was a measurable increase in satisfaction and a more resilient teaching process that improved from cohort to cohort.

Timeline: 2023 — full year

Context

Ironhack runs intensive bootcamps where students learn through morning instruction and afternoon project work. Each cohort functions like multiple small product teams working in parallel, often with different projects and levels of confidence. The learning experience depends not only on the curriculum, but also on facilitation quality, clarity of expectations, psychological safety, and the effectiveness of feedback cycles.

The program included two key rituals that influenced the experience:

- Daily standups, where students surfaced progress and blockers.
- Weekly retrospectives, complemented by the weekly satisfaction survey.

My Role

Lead UX/UI Design Teacher, responsible for course delivery, coaching project teams, and improving the learning experience using survey data, qualitative feedback, and continuous iteration.

The problem

The program had a strong foundation, but the learning experience was not consistently predictable across cohorts and moments in the curriculum:

- Confidence gaps and unresolved questions could remain invisible until end-of-week signals surfaced them.
- Tool-based workshops were vulnerable to product updates that changed interaction patterns.
- Certain difficult topics consistently needed more time and practice than the default schedule allowed.
- Students repeatedly asked for more community interaction beyond their own cohort.

Goals

- Increase and stabilize weekly satisfaction scores across cohorts.
- Improve engagement and clarity in day-to-day lesson delivery.
- Detect and resolve recurring friction points before they eroded the experience.
- Build a teaching process that improved from cohort to cohort.

Constraints and approach

The bootcamp pace is intense and leaves limited room for large structural changes. Improvements needed to be incremental, high-impact, and implementable within the weekly cadence.

Key decisions:

- Work within the existing rituals (standups, retrospectives, surveys) rather than adding new overhead.
- Prioritize small, high-impact changes that fit the weekly cadence.
- Treat each change as a hypothesis: apply it, observe the next feedback cycle, and keep or adjust it based on evidence.

Key improvements (iteration highlights)

1) Strengthening community and cross-cohort interaction

Survey feedback and direct conversations indicated a consistent desire for more interaction beyond the cohort. I worked with the school to revive a previously discontinued social event and reposition it around student needs. This improved community connection and addressed a recurring feedback point without changing the academic structure.

2) Deepening difficult content through structured reinforcement

Data and student feedback showed certain topics required more time and practice. I implemented two layers of improvement:

- Additional instruction time on the topics students found hardest.
- Structured practice to reinforce those topics beyond the initial lesson.

3) Earlier detection of confidence gaps and unanswered questions

A key learning from a challenging cohort was that uncertainty and unresolved doubts can stay invisible for too long if the process relies only on end-of-week signals. I introduced lightweight daily checkpoints and reinforced a culture of questions, making it easier to surface blockers early and address them before they affected the overall experience.

4) Increasing reliability in tool-based workshops

A failed Figma workshop highlighted how fast-moving tools can create friction when updates change interaction patterns. I adjusted preparation practices to include testing workflows on the day of delivery and maintaining a mitigation plan (alternative steps, backup examples) for live sessions.

5) Raising the standard of instructional preparedness

To reduce uncertainty during teaching moments, I adopted a stricter preparation routine for every module—including topics I had taught before—by reviewing recent updates, revisiting edge cases, and improving explanation clarity. This improved consistency across cohorts and reduced the moments of hesitation that can undermine student confidence.

Outcomes

- A measurable increase in weekly satisfaction scores over the year.
- A more resilient teaching process that improved from cohort to cohort.
- Earlier detection of blockers and confidence gaps, reducing end-of-week surprises.
- More reliable tool-based workshops, thanks to day-of testing and mitigation plans.

What this demonstrates

- UX methods—research, prioritization, iteration, and validation—transfer directly to non-product experiences such as teaching.
- An existing feedback system can drive continuous improvement without structural change.
- Small, evidence-based iterations compound into measurable outcomes.

Next steps

- Continue adapting workshops to evolving tool ecosystems with proactive validation and backups.
- Formalize the improvement framework into a reusable playbook for future instructors.
- Expand peer learning rituals across cohorts to strengthen community outcomes.