Skilljar Dashboard Home Page

Customer Enhancements and Principled Tradeoffs

The redesigned dashboard homepage changed the way Skilljar admins start their day. What used to be a static, ignored page is now a workspace where users can see underperforming courses, resume unfinished work, and monitor academy performance at a glance.
This was more than a UI refresh. The homepage now guides users to the right signals at the right time and sets up the dashboard for future features like AI-powered insights that are trusted and actionable. Feedback shows admins completing tasks faster, focusing on priority content, and feeling confident the system is helping them make decisions. Leadership can log in and understand program health without digging through reports. Content creators spend less time hunting for tasks and more time taking action that matters.
I led this redesign end-to-end because it required judgment, not just ideation. Every choice balanced user value, engineering constraints, and long-term flexibility. The launch delivered immediate impact while positioning the dashboard to evolve responsibly over time.
"The smarter dashboard homepage with AI insights, real-time actionable tips to improve enrollments, refresh stale content, that's going to help in showcasing program value to the company."
Before
Before this work, the Skilljar dashboard homepage functioned primarily as a visual extension of the left-hand navigation. Large buttons linked to areas customers could already access elsewhere, offering no new information and no guidance. The page provided no sense of momentum, no signal of what mattered, and no help prioritizing work.
Usage data and customer feedback confirmed what we suspected: customers ignored the home page altogether.
My Role
I led this project end-to-end with a specific mandate: ensure we were building the right thing for the right level of investment.
My responsibilities included:
• Framing the problem and defining success with product and engineering leadership
• Leading customer research to identify high-value dashboard use cases
• Facilitating stakeholder sessions across product, engineering, sales, and customer success
• Designing and iterating on wireframes and mockups
• Partnering closely with engineering to stay within scope and budget
• Making principled tradeoffs to protect focus and long-term flexibility
Research and Validation
To start, I facilitated sessions with internal stakeholders across product, sales, and customer success to align customer needs with business objectives, demo considerations, and technical feasibility. We ran two 90-minute sessions in which stakeholders were encouraged to dream and theorize; we wanted to say "Yes, and..."
Once these sessions were complete, I worked with our engineering team to fully understand constraints and, with other members of the product team, sharpened the persona we would focus on.
With clear objectives and a focused persona in mind, I led structured research sessions with customers across key roles. We walked seven participants through multiple high-level concepts for the dashboard and explored 20+ potential widget variations, gathering feedback on what would be most useful. Using the Kano model to gauge reactions, we were able to prioritize the features that would deliver the most value.

In the Kano model, a feature a participant would "like" if it were present and "dislike" if it were missing moves to the top of the list to be worked on. Conversely, if they would like it if it were there and merely tolerate its absence, that is an indicator the feature can wait.
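To make that prioritization rule concrete, here is a minimal sketch in Python. The response values and scoring are simplified for illustration, and the function and names are hypothetical, not our actual research tooling:

# Minimal sketch of the Kano-style prioritization used to rank widget
# concepts. Response values and scoring are simplified for illustration.

def kano_priority(if_present: str, if_missing: str) -> str:
    """Classify a feature from a participant's paired responses."""
    if if_present == "like" and if_missing == "dislike":
        return "build now"      # strong pull, and real pain if absent
    if if_present == "like" and if_missing == "tolerate":
        return "can wait"       # nice to have; absence is acceptable
    return "needs more signal"  # ambiguous; probe further in research

# Example: tallying responses for one widget concept across participants.
responses = [("like", "dislike"), ("like", "tolerate"), ("like", "dislike")]
votes = [kano_priority(p, m) for p, m in responses]
print(max(set(votes), key=votes.count))  # -> "build now"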

During this process I learned several key lessons that would shape what the team did next.
First, customers did not like "magic widgets": they needed to understand, and be able to explain, how and where data was coming from. If something was abstract, customers did not like it.
Second, customers already had a very good idea of which courses were doing well. What they needed was to understand which courses were doing poorly so they could act on them. Negative signals on the dashboard were seen as very useful.
Third, customers expected the dashboard to provide insights they could act on. It wasn't enough to know how many enrollments and completions they had; they needed to know what to do next.
This process produced a clear, prioritized set of concepts that informed the design phase and ensured our work was both customer-centered and achievable within budget.
Making Tradeoffs
Working within a time-boxed engineering window and focused on the content creator persona, we made principled tradeoffs to maximize value while minimizing risk. Here’s how we approached three major tradeoffs during this project:
Tradeoff 1: Enrollment Trends – “Customers want it, but it’s big to do, so not yet.”
During the interviews, several customers mentioned wanting a longer timeline on enrollments and completions. We ultimately limited enrollment trends to a three-month view: displaying longer timelines would have required rebuilding backend infrastructure, delaying the launch. Instead, we linked users to the full enrollments report for extended data, delivering immediate value while leaving room for future enhancements. This kept the redesign lightweight, usable, and aligned with Phase 1 goals.
Tradeoff 2: Customization – “It’s neat, but we don’t have enough widgets or personas to justify it.”
While some customers and internal stakeholders requested customizable widgets, the first iteration didn't support enough roles or widget types to make customization meaningful. We deferred it until the core widgets were validated in real-world use. This allowed us to launch and continue to evaluate needs, ensuring that future customization would be informed and impactful.

Tradeoff 3: AI Insights – “We NEED this, but rushing would be risky.”
Participants wanted actionable insights powered by AI, but launching them alongside the redesign would have introduced legal, technical, and trust risks. We decided to sequence AI insights as a follow-up release, ensuring legal review, engineering validation, and design rigor before launch. This decision allowed us to launch on time, maintain reliability, and prepare a foundation for a thoughtful, trustworthy AI experience.
Making a Fast Follow Real: AI Insights

Early thinking on the AI insights feature: this flyout appears when the Explore Insights button is selected.

In product work, “fast follow” often means “never.” We knew rushing AI into the initial dashboard release would compromise quality, but we also knew customers wanted actionable insights. The opportunity was real.

I recommended sequencing AI as a dedicated follow-up release, ensuring it received the legal review, engineering validation, and design rigor it required without jeopardizing the dashboard launch. My PM partner and I committed to actively pursuing it as the next item of work. This was a difficult choice because there was a clear signal for this capability.
Delivering it responsibly required cross-functional alignment. Legal needed clarity on data handling. Engineering needed confidence in model performance and reliability. Product and design needed assurance that recommendations would be accurate, explainable, and trustworthy.
Rather than ship a black-box solution, I worked closely with a PM to ensure the system was transparent by design. Together we designed a structured prompting system aligned to our reporting data model, ensuring insights were relevant, accurate, and repeatable. For customers, this surfaced as a simple "Explore Insights" action: powerful behind the scenes, effortless on the surface.
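As an illustration of the approach, here is a minimal sketch of a structured, data-grounded prompt. The field names, metrics, and template text are hypothetical stand-ins for the actual reporting data model:

# Illustrative sketch of a structured prompt built from reporting data.
# Field names and metrics are hypothetical stand-ins; the point is that
# every claim the model can make is grounded in values passed in
# explicitly, so insights stay explainable and repeatable.

INSIGHT_PROMPT = """You are analyzing training-program metrics.
Use ONLY the data below. Cite the specific metric behind each insight,
and state uncertainty when a trend is based on limited data.

Course: {course_title}
Enrollments (last 3 months): {enrollments}
Completion rate: {completion_rate:.0%}
Days since last content update: {days_since_update}

Return up to 3 insights, each with a recommended next action."""

def build_insight_prompt(course: dict) -> str:
    """Render the template from a row of reporting data."""
    return INSIGHT_PROMPT.format(**course)

print(build_insight_prompt({
    "course_title": "Onboarding 101",
    "enrollments": 42,
    "completion_rate": 0.37,
    "days_since_update": 210,
}))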

Questions and queries using representative data and AI-generated mockups.
These were used to rapidly test, validate, and refine prompts before diving into the UI design.

We quickly landed on a flyout, opened from the Explore Insights button, as the right design pattern: it conserves computing resources and avoids overwhelming users with too much data. Typography, spacing, and motion were honed through rounds of iteration and critique to make every interaction feel intentional.
Recommendations referenced specific metrics, explained observed patterns, and communicated uncertainty when appropriate. We added visible guardrails and lightweight usability enhancements, including a one-click copy feature to make insights actionable.
Before launch, we secured legal approval, implemented quality controls for prompt outputs, and instrumented structured feedback to measure trust and usefulness from day one.
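As a sketch of what that structured feedback could look like in practice, here is a hypothetical event payload. The event name and fields are illustrative, not the production schema:

# Illustrative feedback event for a single AI insight. The event shape
# is hypothetical; the real schema lives in the analytics layer.
from dataclasses import dataclass, asdict
import json, time

@dataclass
class InsightFeedbackEvent:
    insight_id: str
    rating: str          # "helpful" | "not_helpful"
    copied: bool         # did the user use one-click copy?
    acted_on: bool       # did they navigate to the recommended action?
    timestamp: float

event = InsightFeedbackEvent("ins_123", "helpful", copied=True,
                             acted_on=False, timestamp=time.time())
print(json.dumps(asdict(event)))  # ship to the analytics pipeline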
The result transformed analysis that once required hours, sometimes days, of digging through reports into seconds. Dozens of columns and potentially thousands of rows were distilled into a focused page of insights, enabling customers to take faster, more informed action.
Because we sequenced the work intentionally, the fast follow didn’t die. 

Throughout the process, I looked for ways to raise the bar on design craft. One example was creating new empty states and drawing custom icons to make each interaction feel purposeful and polished.

Outcome
The redesigned dashboard homepage fundamentally changed how work begins inside Skilljar. What had been a passive navigation surface is now an operational control center where admins log in to understand what requires action, not hunt for links. 
Negative performance signals are surfaced immediately, unfinished work is resumable without search, and insights are one interaction away. 
"I'm excited that leadership can just log in and see completion and enrollment numbers."
"I like the new admin dashboard [and] being able to see which courses have low engagement at a glance to know where to update every time you log into Skilljar. "
That shift in posture, from navigation to operational intelligence, was significant enough that the homepage was selected to lead our January 2026 release week. It became the clearest expression of where the product is headed: toward faster decision-making, more visible performance signals, and actionable guidance embedded directly into the workspace.

The home page was the first item announced during release week.

Equally important, we structured this release to set up what comes next. The research, principled tradeoffs, and instrumentation were not just about launching a better homepage; they were about defining the dashboard as a behavioral and strategic center of the platform. Before release, we identified the leading indicators that would tell us whether behavior had meaningfully changed, including engagement with surfaced performance signals, task resumption from the homepage, and interaction with AI insights. Those signals now guide how and where investment happens next.
Because we sequenced AI intentionally and avoided overbuilding in Phase 1, the dashboard is positioned to evolve responsibly. 
Rather than locking the team into an overextended solution, we created a foundation that can expand with confidence, informed by real usage, real trust signals, and real customer needs.
