From Data Overload to Action-driven Dashboard

Reworking a client project under ambiguity to streamline a decision-driven analytics dashboard experience (specific names have been redacted for NDA purposes)

Design

Desktop App

My role

product designer

timeline

Feb - March 2025

industry

EduTech

Overview

We were assigned to design the analytics dashboard for educators on an AI-supported learning platform. Over this 4-week client project, by reframing the design problem from data visualization to decision support, I turned a chart-stacked dashboard into a coherent educator workflow, one where teachers can spot cross-class learning gaps and deploy an AI-supported remedial plan in a single connected flow.

design showcase

A more action-driven, informative analytics dashboard with better visuals, scannability, and usability.

After: a decision-ready workspace that replaces passive charts with categorized student performance, CTA buttons, AI-drafted nudges, and suggested remedial work.

the problem

Slogging through ambiguity: a lack of visibility into the full user story

The brief handed us a feature set - no user stories, no personas, no defined use cases. Without a clear picture of who we were designing for or how to measure success, we were building to spec, not solving a problem.

That ambiguity surfaced fast in testing: teachers couldn't tell what each chart was supposed to show them, let alone what action to take next. We shipped a functional dashboard that failed at its core job.

context & Design opportunity

The charts looked right, but users struggled to tell what they did for them. Why?

V1's charts looked credible but lacked purpose: risk-level tags labeled students without giving teachers the context to act on them. The signal was there, but the story wasn't.

The gap crystallized when a teacher asked mid-interview: "What am I supposed to do after seeing this?" That question reframed the entire problem. Every feature had been designed as a data display, not a decision-making tool. I took that as the design problem to own independently after the project wrapped.

🙋 uncovering the underlying user problem

Teachers want to see who needs help early enough to give meaningful help without burning out.

So I dug back into the interviews we had conducted, aggregating insights from 5 participants: high school and elementary school teachers plus 3 professors. Despite differences in workload and context, their most common problem was:


"I want the workflow of giving feedback to be straightforward and action-driven."

Teachers need a connected and simple workflow to help with feedback from student performance.


๐Ÿ’๐Ÿปuser personas

Identifying educators who are focused on reasoning-supported insights & who value efficiency

2 personas were created to capture the individual needs I discovered after synthesizing interview insights.

  • the professor is focused on coherent logic and reasoning-supported insights

  • the teacher values immediacy and a lighter workload on top of administrative tasks.

๐Ÿ–Œ๏ธ hmw questions

Two directions: aggregating student data vs. AI-supported coaching feedback

primary, a single connected platform

How might we aggregate scattered student performance data in a single, connected dashboard so educators can decide who to follow up with and what kind of support to give before students fall behind?

secondary, more ai-focused

How might we harness AI as a teaching assistant to translate student performance data into personalized, ready-to-act coaching feedback for the teacher to take action?


🔄 information architecture

Mapping out a circular journey to rewire the Information architecture

Rather than re-designing around the given feature list, I mapped the sample scenarios into parallel feature tracks to identify design scope across each area and integrated them into a user journey.


Based on 4 sample scenarios, what looked like a disconnected menu of features revealed a circular loop between analytics and action: a teacher creates content, assigns it, tracks progress, identifies gaps, and acts. That sequence became the spine.

early-stage brainstorming to rescope the design solutions

I then reconstructed the core workflow as a five-stage user journey:


Create, Assign, Track, Identify, Act.


The dashboard focused on the Identify stage, so features like exporting/reporting to parents and operational utilities were deliberately set aside to keep the primary decision flow coherent rather than comprehensive.

๐Ÿ–Œ๏ธ design scoping

From Data to Deployed: How the Identify → Act Flow Turns Performance Gaps Into Packaged Support

Based on the 5-stage workflow, a stronger user story emerged for teachers entering the dashboard: help them identify outliers who need support so they can quickly package remedial work via the AI tools.

The user flow follows a 2-step progression that accounts for the time gap between assigning work and receiving the report, including potential student collaboration or reminders for those who haven't submitted yet.

📒 solutions #1 - two-step filtering tabs

2-tab filtering to bring the right data to the right moment

Managing assignments across multiple classes creates a constant tension: teachers need both a wide-angle view and a close-up lens, but switching between them is slow and disorienting. The two-tab system resolves this by anchoring each view to a distinct intent.

  • Filter by Class: for course-level management. Teachers instantly see all assignments tied to a specific class, with direct access to actions like exporting progress reports for that cohort.

  • Filter by Assignment: for daily performance monitoring. Teachers get an immediate "who, what, when, and why" snapshot of a recently sent task across every class, without navigating away.

📒 solutions #2 - smart nudging system

Supporting quick nudges when low-performing students have been detected

A communication tool lets teachers send reminders to specific categories of students, such as those who "have not yet submitted" or were "absent," using smart recipient selection rather than manual cherry-picking. Teachers can also adjust the tone for different nudge needs.

📒 solutions #3 - AI-driven teacher support

One click to send out remedial work for students who struggle

This feature suggests next steps, such as triggering a study planner or remedial work for a group of students across different classes who struggled with the same lesson. Teachers can customize the order and deliver to the group immediately or on a schedule.

📒 iterations

Refining the product experience to make the circular journey intuitive for teachers

After reviewing the user flows & journeys, the existing screen designs and having conversations with educators, it was time to finally get on the drawing board to design with intentionality.

Challenge 1: Restructuring class charts to be visually scannable and more informative with progress bars

I used a completion bar to provide more visual context and added a CTA button (View Assignment) as a shortcut to the Filter by Assignment page for educators who want to check the assignment page first, aligning with action-first users' mental models.

Challenge 2: introducing an action-driven student list and color-coded visual hierarchy

The student list evolved from a passive status report into an action surface, with checkboxes, filters, and per-row controls that let teachers group outliers and deploy a nudge or study plan without ever leaving the list or losing the context.

Challenge 3: Integrated smart categories for immediate action

The remedial work page evolved from generating a text-based remedial suggestion to sequencing activity types with time estimates visible before anything is sent, so teachers can review and steer the content rather than approving it blindly.

🧠 Key learnings and next steps

✅ What went well:

  • Deciding quickly on a solution and moving into the design process is tempting given the time constraint, but design thinking has to stay rooted in the why behind the whole system.

  • When early testing revealed friction and confusion, that was the moment to reshape and reframe the user problem as well as the design's direction. A focus on surfacing actions within a purposeful user journey goes further than adding beautiful charts.

🤯 What could be improved:

  • In future iterations, educators could customize the unit of classes/courses in the settings based on their personal needs.

  • Rather than manually editing each content piece, a future iteration would replace manual content edits with a lightweight axis UI: adjust intensity (e.g., difficulty level, depth) and learning dimension (e.g., focus areas, topic scope), and the AI regenerates accordingly. Less authoring, more calibrating.