Skilljar Dashboard Home Page

Customer Enhancements and Principled Tradeoffs

The redesigned Dashboard Home Page changed the way Skilljar admins start their day. What used to be a static, ignored page is now a workspace where users can see underperforming courses, resume unfinished work, and monitor academy performance at a glance.
This was more than a UI refresh. The home page now guides users to the right signals at the right time. Feedback shows admins completing tasks faster, focusing on priority content, and feeling confident the system is helping them make decisions. Leadership can log in and understand program health without digging through reports. Content creators spend less time hunting for tasks and more time taking action that matters.
This work required judgment, not just ideation. Every choice balanced user value, engineering constraints, and long-term flexibility. The launch delivered immediate impact while positioning the dashboard to evolve responsibly over time.
Projected Impact
Estimates are based on observed customer workflows, average task frequency, and active academy usage data.
30 minutes saved per admin per month 
Eliminated a recurring manual enrollment export workflow previously required to monitor trends. 
25 seconds saved per interaction
Reduced course editing from several navigational steps to one, compounding across frequent use.
3 days → 30 seconds to insights
Analysis that previously required reviewing thousands of enrollment records and aggregating common sequences can now be surfaced instantly in a single interaction.
Before
Before this work, the Skilljar dashboard home page functioned primarily as a visual extension of the left-hand navigation. Large buttons linked to areas customers could already access elsewhere, offering no new information or guidance. The page provided no sense of momentum, no signal of what mattered, and no help prioritizing work.
Usage data and customer feedback confirmed what we suspected: customers ignored the home page altogether.
My Role
I led this project end-to-end with a specific mandate: ensure we were building the right thing for the right level of investment. That meant framing the problem and defining success alongside product and engineering leadership before a single pixel was pushed. 
From there I led customer research to identify high-value use cases, facilitated sessions across product, engineering, sales, and customer success to align on what mattered, and designed through iteration rather than declaration. Throughout, I worked closely with engineering to stay within scope and made principled tradeoffs to protect both focus and long-term flexibility.
Research and Validation
To start, I facilitated two 90-minute sessions with internal stakeholders across product, sales, and customer success to align customer needs with business objectives, demo considerations, and technical feasibility. Stakeholders were encouraged to dream and theorize; we wanted to say "Yes, and..."
Once these sessions were complete, I worked with our engineering team to fully understand constraints and, with other members of the product team, sharpened the persona we would focus on.
With clear objectives and a focused persona in mind, I led structured research sessions with customers across key roles. We walked seven participants through multiple high-level concepts for the dashboard and explored 20+ potential widget variations, gathering feedback on what would be most useful. Using the Kano model to gauge reactions, we prioritized the features that would deliver the most value.

In the Kano model, a feature the participant would "like" if present and "dislike" if missing moves to the top of the list to be worked on. Conversely, a feature they would "like" if present but merely "tolerate" if absent is one that can wait.
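The prioritization rule above can be sketched as a small lookup. This is a minimal, illustrative version assuming a simplified two-question Kano survey ("How would you feel if this feature were present / absent?"); the category names and example features are hypothetical, not Skilljar's actual rubric.

```python
def kano_category(functional: str, dysfunctional: str) -> str:
    """Map a (reaction if present, reaction if absent) pair to a priority bucket."""
    if functional == "like" and dysfunctional == "dislike":
        return "build now"      # valued, and its absence actively hurts
    if functional == "like" and dysfunctional == "tolerate":
        return "can wait"       # attractive, but absence is forgiven
    if functional == "tolerate" and dysfunctional == "dislike":
        return "must-have"      # expected baseline; missing it hurts
    return "deprioritize"       # indifferent or contradictory answers

# Hypothetical responses from a research session
responses = [
    ("trend widget", "like", "dislike"),
    ("custom layout", "like", "tolerate"),
]
for feature, if_present, if_absent in responses:
    print(feature, "->", kano_category(if_present, if_absent))
```

In practice we tallied these per participant and ranked features by how consistently they landed in the top bucket.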

During this process I learned three key things that would shape what the team did next.
First, customers did not like "magic widgets"; they needed to understand, and be able to explain, how and where the data was coming from. If something felt abstract, customers did not trust it.
Second, customers already had a very good idea of what courses were doing well. They needed to understand what courses were doing poorly so they could act on them. Negative signals on the dashboard were seen as very useful.
Third, customers expected the dashboard to provide insights they could act on. It wasn't enough to know how many enrollments and completions they had; they needed to know what to do next.
This process produced a clear, prioritized set of concepts that informed the design phase and ensured our work was both customer-centered and achievable within budget.
What We Built
After research and stakeholder alignment, we landed on three widgets focused on our core persona: the academy admin and content creator.
Track Academy Trends gave admins a rolling three-month view of enrollments and completions at a glance. Before this widget, seeing that data meant navigating to a report, downloading a CSV, and building a chart manually. Now it's on the dashboard at login. If they want more detail, one click takes them to the full report. This saves customers around 30 minutes of manual work every month.
Fine Tune Your Courses flipped the original approach on its head. Our instinct was to surface courses doing well. Research told us that was wrong; customers already knew what was working. What they needed was to see what wasn't. This widget surfaces low-enrollment courses automatically and gives admins a simple way to edit, remove, or dismiss them. It didn't just save time. It created a behavior that didn't exist before.
Continue Editing is the simplest of the three, and the one that surprised us most in testing. It's the last five courses worked on, plus a Create New Course button. Actions a person could already do, but buried several clicks deep. Customers got genuinely excited about this in sessions. The lesson: you don't always need a technological leap to deliver real value. This saved our customers roughly 25 seconds every time they took this action.
Making Tradeoffs
Working within a time-boxed engineering window and focused on the content creator persona, we made principled tradeoffs to maximize value while minimizing risk. Here’s how we approached three major tradeoffs during this project:
Tradeoff 1: Enrollment Trends – “Customers want it, but it’s big to do, so not yet.”
During the interviews several customers mentioned wanting a longer timeline on enrollments and completions. We ultimately limited enrollment trends to a three-month view. Displaying longer timelines would have required rebuilding backend infrastructure, delaying the launch. Instead, we linked users to the full enrollments report for extended data, delivering immediate value while leaving room for future enhancements. This kept the redesign lightweight, usable, and aligned with Phase 1 goals.
Tradeoff 2: Customization – “It’s neat, but we don’t have enough widgets or personas to justify it.”
While some customers and internal stakeholders requested customizable widgets, the first iteration didn't support enough roles or widget types to make customization meaningful. We deferred it until the core widgets were validated in real-world use. This allowed us to launch and continue evaluating needs, ensuring that future customization would be informed and impactful.

Tradeoff 3: AI Insights – “We NEED this, but rushing would be risky.”
Participants wanted actionable insights powered by AI, but launching AI alongside the redesign would have introduced legal, technical, and trust risks. We decided to sequence this capability as a follow-up release, ensuring legal review, engineering validation, and design rigor before launch. This decision allowed us to launch on time, maintain reliability, and prepare a foundation for a thoughtful, trustworthy AI experience.
Making a Fast Follow Real: AI Insights

Early thinking on the AI insights feature: this flyout appears when the Explore Insights button is selected.

In product work, “fast follow” often means “never.” We knew rushing AI into the initial dashboard release would compromise quality, but we also knew customers wanted actionable insights. The opportunity was real.

After discussions about pros, cons, and timelines, I recommended sequencing AI as a dedicated follow-up release, ensuring it received the legal review, engineering validation, and design rigor it required without jeopardizing the dashboard launch. This was a difficult choice to make because there was clear signal for this capability.
Rather than ship a black-box solution, I worked closely with a PM to ensure that recommendations would be accurate, explainable, and trustworthy. We partnered to design a structured prompting system aligned to our reporting data model, ensuring insights were relevant, accurate, and repeatable. For customers, this surfaced as a simple “Explore Insights” action. Powerful behind the scenes, effortless on the surface.
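To make "structured prompting aligned to the reporting data model" concrete, here is a hypothetical sketch of the idea: the question is fixed, and only whitelisted, schema-derived fields are interpolated into the prompt, so the model never sees unvetted columns. All names here (field names, question wording) are illustrative assumptions, not Skilljar's actual implementation.

```python
# Fields from the reporting data model we allow into the prompt (illustrative).
ALLOWED_FIELDS = {"course_title", "enrollments", "completions"}

PROMPT_TEMPLATE = (
    "Using only the enrollment data below, identify the most common "
    "sequences of courses learners take. Reference specific metrics and "
    "state uncertainty when the data is sparse.\n\nDATA:\n{data}"
)

def build_prompt(records):
    """Serialize only whitelisted fields, keeping outputs explainable
    and repeatable across runs."""
    lines = []
    for rec in records:
        safe = {k: v for k, v in rec.items() if k in ALLOWED_FIELDS}
        lines.append(", ".join(f"{k}={v}" for k, v in sorted(safe.items())))
    return PROMPT_TEMPLATE.format(data="\n".join(lines))

print(build_prompt([{"course_title": "Intro", "enrollments": 120,
                     "completions": 80, "internal_id": "x9"}]))
```

Constraining both the question and the data surface is what made the resulting insights feel repeatable rather than "magic."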

Early questions and queries using representative data and AI generated mockups.
These were used to rapidly test, validate, and refine prompts before diving into the UI design.

Early on, we explored more robust options like conversational analytics, but set those aside for the first version due to feasibility and hallucination concerns. We landed on a flyout as the right design pattern: it protected computing resources, avoided overwhelming users with too much data, and let us pair a dedicated question with a robust prompt behind the scenes. The flyout opens when the user selects the Explore Insights button. Typography, spacing, and motion were honed through rounds of iteration and critique to make every interaction feel intentional.
Recommendations referenced specific metrics, explained observed patterns, and communicated uncertainty when appropriate. We added visible guardrails and lightweight usability enhancements, including a one-click copy feature to make insights easy to share.
The final flyout design centered on a question we knew would deliver immediate value: "What are the most common paths that a learner takes through the academy?" This seemingly simple question (which course do learners take first, what do they take next, and what comes after that) was deceptively complex to answer before this feature existed.
The way learners actually moved through content had made this analysis difficult in the past, and it had taken customers days to figure out, if they attempted it at all. Enterprise customers with significant BI investments could surface this insight, and we'd even set it up for some of them. They loved it. That's how we knew it was the right first question to tackle for everyone.
Before launch, we secured legal approval, implemented quality controls for prompt outputs, and instrumented structured feedback to measure trust and usefulness from day one.
The result transforms analysis that once required days of digging into seconds and a button press. Dozens of columns and potentially thousands of rows were distilled into a focused page of insights, enabling customers to take faster, more informed actions.
Because we sequenced the work intentionally, the fast follow didn’t die. 
A Note on Craft
It's easy to get lost in all of the things that "have" to get done on a project like this: direct customer asks, balancing scope, managing RBAC, legal reviews, and so on. It's easy to get so focused on those elements that you forget about design craft.
When I work on a new feature I like to pause about midway through and ask questions like "What is an existing component in our design library that we could upgrade?" or "Is there some design debt we can pay off?" Sometimes timelines are not in your favor and you have to use what you've got.
In this case, we were able to update our empty states across the Skilljar Dashboard. Now they had personality: instead of just telling you that you could do a task, they told you why you would want to do it.
I got to draw new illustrations and contribute to the Gainsight global component library in a way that was both informational and... well... just fun.

Throughout the process, I looked for ways to raise the bar on design craft. One example was creating new empty states and drawing custom icons to make each interaction feel purposeful and polished.

Outcome
The redesigned dashboard home page fundamentally changed how work begins inside Skilljar. What had been a passive navigation surface is now an operational control center where admins log in to understand what requires action, not hunt for links. 
Negative performance signals are surfaced immediately, unfinished work is resumable in a single click, and insights are one interaction away. 
"I'm excited that leadership can just log in and see completion and enrollment numbers."
"I like the new admin dashboard [and] being able to see which courses have low engagement at a glance to know where to update every time you log into Skilljar."
That shift in posture, from navigation to operational intelligence, was significant enough that the home page was selected to lead our January 2026 release week. It became the clearest expression of where the product is headed: toward faster decision-making, more visible performance signals, and actionable guidance embedded directly into the workspace.