Knowledge Checks

Led the end-to-end design of Spekit's Knowledge Checks feature, crafting a seamless learning validation experience that was successfully adopted by 115+ customers.

Project overview

In today's digital workplace, information overload isn't just a buzzword—it's a critical business challenge. With thousands of workflows, processes, and resources to master, employees can quickly become overwhelmed, leading to slower onboarding and reduced productivity.

Knowledge Checks transforms this challenge into an opportunity. Our lightweight solution seamlessly integrates into your team's daily workflow, reinforcing learning through quick, targeted assessments that boost retention without adding to the cognitive load. By making knowledge validation effortless and engaging, we help your employees master essential information at their own pace, right where they work.

My contributions

As the sole designer for the Learning Squad, I spearheaded the complete development lifecycle of Knowledge Checks—from initial concept to successful beta launch. My comprehensive role encompassed:

  • Leading opportunity discovery and market research to identify key user needs
  • Conducting extensive user research and validation sessions to refine the solution
  • Driving iterative design improvements based on qualitative and quantitative feedback
  • Ensuring design quality and consistency across all product touchpoints
  • Orchestrating a successful beta program that engaged key stakeholders and early adopters

Glossary

GA: General availability
FE: Front-end
BE: Back-end
QA: Quality assurance
MVP: Minimum viable product
LMS: Learning management system

Timeline

Lifespan: January–October 2022
Launch date: August 2, 2022

The team

  • 1 Project manager
  • 1 Product designer (that’s me 👋🏻)
  • 3 FE engineers
  • 1 BE engineer
  • 1 QA engineer

Problem to be solved

Employers have no insight into how effective the content they create is, or which content is driving employee engagement and performance. Likewise, employees have no way to gauge what they already know and what they still need to learn.

Competitive analysis

Our market research revealed a critical gap in learning assessment tools. While traditional Learning Management Systems (LMS) offer knowledge evaluation capabilities, they rely on an outdated, summative approach—delivering assessments only after training is complete. This disconnected model, where testing happens outside the natural flow of work, fails to capture learning moments when they matter most.

Scope planning

Despite shifting organizational priorities requiring a rapid pivot to Knowledge Checks, we implemented a strategic approach to ensure success. Our team immediately established a phased development roadmap that balanced speed with quality. We broke down the complex initiative into manageable iterations, allowing us to maintain momentum while validating our approach at each stage.

MVP ideation

To maximize impact while minimizing risk, we strategically chose an MVP (Minimum Viable Product) approach for Knowledge Checks. This decision allowed us to:

  • Validate core assumptions with real users early in the development cycle
  • Deliver immediate value to our customer base
  • Gather crucial user insights to inform our product roadmap
  • Accelerate time-to-market for essential features
  • Maintain development agility through rapid iteration cycles

Rather than pursuing a full-featured release that might miss the mark, our MVP strategy focused on building the critical features that would solve our users' most pressing knowledge validation challenges. This approach ensured we could quickly adapt to user feedback and evolving business needs while maintaining momentum.

Content creation

We prioritized simplicity and intuitiveness in the creation experience, recognizing that cognitive overload is the enemy of effective learning tools. Our design approach leveraged clear visual hierarchy, with auto-focused primary fields guiding users through a natural flow. Using progressive disclosure, we initially presented only essential configuration options while making advanced features accessible through intentional interaction. This thoughtful reduction of complexity ensured that both novice and advanced users could efficiently create Knowledge Checks without feeling overwhelmed, while maintaining access to powerful customization options when needed.
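As a rough illustration of that pattern (not the production code; the component and field names below are assumptions), this sketch auto-focuses the primary field and keeps advanced configuration collapsed until the author intentionally reveals it:

```tsx
// Illustrative-only sketch of progressive disclosure in the creation form.
import { useState } from "react";

export function KnowledgeCheckForm() {
  const [showAdvanced, setShowAdvanced] = useState(false);

  return (
    <form>
      {/* Primary field is auto-focused so authors can start typing immediately. */}
      <label>
        Question
        <input name="question" autoFocus placeholder="What do you want to ask?" />
      </label>

      {/* Advanced options stay hidden until intentionally revealed. */}
      <button type="button" onClick={() => setShowAdvanced((open) => !open)}>
        {showAdvanced ? "Hide advanced options" : "Show advanced options"}
      </button>

      {showAdvanced && (
        <fieldset>
          <legend>Advanced options</legend>
          <label>
            Due date
            <input type="date" name="dueDate" />
          </label>
        </fieldset>
      )}
    </form>
  );
}
```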

Error validation

Our multilayered validation strategy for Knowledge Checks minimizes user frustration by catching errors early and providing clear paths to resolution. We implemented a three-tiered approach: real-time field-level validation catches issues as they occur, tab-specific indicators help users locate problems across sections, and comprehensive page alerts provide a high-level error summary. This research-backed validation system ensures users always know exactly what needs fixing and how to fix it—transforming a typically frustrating experience into a guided correction process that maintains user confidence and momentum.
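To make the three tiers concrete, here is a minimal TypeScript sketch of how field-level errors might roll up into tab indicators and a page-level summary; the types, names, and messages are illustrative assumptions rather than Spekit's actual implementation:

```typescript
// Hypothetical roll-up of the three validation tiers.
type FieldError = { field: string; tab: string; message: string };

// Tier 1: field-level checks run as the user edits or leaves a field.
function validateQuestionTitle(title: string): FieldError | null {
  return title.trim().length > 0
    ? null
    : { field: "title", tab: "Questions", message: "A question title is required." };
}

// Tier 2: group errors by tab so each tab header can show an indicator badge.
function errorsByTab(errors: FieldError[]): Map<string, FieldError[]> {
  const grouped = new Map<string, FieldError[]>();
  for (const error of errors) {
    grouped.set(error.tab, [...(grouped.get(error.tab) ?? []), error]);
  }
  return grouped;
}

// Tier 3: a single page-level alert summarizing everything left to fix.
function pageSummary(errors: FieldError[]): string {
  return errors.length === 0
    ? ""
    : `Please fix ${errors.length} issue${errors.length === 1 ? "" : "s"} before publishing.`;
}
```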

Taking a Knowledge Check

We seamlessly integrated Knowledge Checks into Spekit's Chrome extension, enabling true just-in-time learning validation directly within users' workflows. This strategic implementation allows employees to validate their knowledge while actively referencing relevant content—eliminating the context-switching that plagues traditional learning platforms. By embedding assessments in the browser where work happens, we've removed the friction between learning, validation, and application, dramatically increasing both engagement and knowledge retention.

Results

We designed the Knowledge Checks results page to transform assessment data into actionable insights, helping users visualize their progress and identify knowledge gaps. This psychology-informed approach not only validates achievement but creates natural pathways back to learning content—turning each assessment into a catalyst for continued engagement.

Building a design system

The development of Knowledge Checks catalyzed a crucial transformation in Spekit's design infrastructure. Our analysis revealed a critical lack of consistency across the platform, including over 70 disparate button variants—a clear symptom of our missing design system. To solve this, we partnered with our development team to implement our first comprehensive design system, leveraging Chakra UI and Storybook.

This strategic shift from ad-hoc design to systematic components required extensive documentation and standardization, but laid the foundation for scalable, consistent user experiences across the platform. By creating detailed component specifications and usage guidelines, we not only streamlined the Knowledge Checks development process but established a robust design framework that would benefit all future product initiatives.
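As a simplified example of the kind of consolidation this enabled (the variant names and color tokens below are illustrative, not the actual Spekit theme), a single Chakra UI component config can stand in for dozens of ad-hoc button styles:

```typescript
// A minimal sketch of consolidating button styles into a Chakra UI theme.
import { extendTheme } from "@chakra-ui/react";

export const theme = extendTheme({
  components: {
    Button: {
      baseStyle: { fontWeight: "semibold", borderRadius: "md" },
      variants: {
        // A handful of sanctioned variants replaces the 70+ one-off styles.
        primary: { bg: "purple.500", color: "white", _hover: { bg: "purple.600" } },
        secondary: { bg: "white", color: "purple.500", border: "1px solid" },
        ghost: { bg: "transparent", color: "gray.600" },
      },
      defaultProps: { variant: "primary", size: "md" },
    },
  },
});
```

Each sanctioned variant then gets documented in Storybook, giving designers and engineers a single shared reference for when and how to use it.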

Prototyping the end-to-end experience

To validate our vision and align stakeholders, we developed a comprehensive prototype that demonstrated Knowledge Checks' seamless integration within the broader platform ecosystem. This high-fidelity prototype served multiple critical functions: it enabled rapid iteration of user flows, facilitated meaningful stakeholder feedback during development sprints, and provided clear visual direction for the engineering team. By mapping the complete user journey, we could effectively demonstrate both individual features and their cohesive interaction with existing platform capabilities—ensuring alignment between design vision and business objectives throughout the development process.

External beta program

To validate Knowledge Checks before full release, we conducted a structured two-week beta program with four key customers through discovery sessions, asynchronous user testing, and wrap-up meetings. This methodical approach yielded both qualitative and quantitative insights that validated our core assumptions and generated a prioritized roadmap of post-launch enhancements, directly influencing our launch strategy and product roadmap.

Common feedback themes

Reinforcement

“It’s a useful way to encourage the conversion of information into knowledge. It’s an interesting way to start thinking about reading something and then using it.”

“We will be able to test our users more frequently without it feeling like a test.”

“Increases engagement as a gentle forcing function.”

Centralization

“We don’t want our staff to go to another application just to take a quiz. We’re not going to get Lessonly just so they can have quizzes because we want everything to be in one space so they don’t get lost, especially in a remote environment.”

Simplicity

“I think the way you designed it is for it to be purposefully lightweight. You shouldn’t add complexity for the sake of adding complexity.”

“I think, to be honest, you guys have crafted a tool where it’s really simple to create quizzes, which is great as an admin. There’s not a huge amount of overhead that I’ve got to do as an admin to get that set up and published. So that’s nice and simple. I really like that.”

Timesaving

“They’re quick and not time consuming for the creator and end-users who are taking the quiz. We’re super busy so time is very important for us.”

“Time is the biggest thing. I made the quiz in two seconds between meetings. That’s what I love about Spekit, I can create things on the fly. It’s quick and easy. From the trainer perspective, it’s hard to keep up with a knowledge base so the time aspect is important in that sense.”

Motivation

“We sent out the quiz and people responded immediately. It was very self service. I don’t even think we told them that we’re going to do this and they took the quiz on their own. So I think that’s a good positive indicator.”

“I rolled it out without any warning to our advisors on the phone who help other call center advisors and they were thrilled. They just loved it. They were so excited about it.”

Users are taking quizzes without having to be asked.

Post launch enhancements

Through rigorous post-launch user testing and direct customer feedback, we identified critical areas for optimization in Knowledge Checks. Our data revealed specific pain points in the user experience, leading us to prioritize three key enhancements: improved question customization, streamlined reporting capabilities, and a live preview functionality. These strategic improvements not only addressed immediate customer needs but also strengthened the feature's long-term sustainability and scalability for enterprise users.

Sketching the solution

While digital tools dominate today's design landscape, hand sketching remains an invaluable yet underutilized technique for rapid ideation. During the enhancement phase, I leveraged this traditional approach, translating complex user workflows into quick iterative sketches. These rough visualizations helped crystallize the vision for the improved experience, allowing me to explore multiple solutions quickly before moving to high-fidelity digital designs.

Preview mode

A critical insight from employer feedback highlighted the need for a seamless preview functionality within the Knowledge Check creation workflow. We leveraged our existing Chrome extension architecture to implement a dual-pane interface: the authoring environment remained active on the left while a real-time preview rendered on the right. This split-screen approach enabled administrators to simultaneously craft assessments and experience them from an end-user perspective, streamlining the content refinement process and ensuring quality before deployment.
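A minimal sketch of that split-screen idea, assuming a React-based authoring surface (the component and state names here are hypothetical), shows how the preview can re-render from the same draft state on every edit:

```tsx
// Hypothetical split-pane authoring/preview layout.
import { useState } from "react";

type Draft = { question: string; options: string[] };

export function KnowledgeCheckEditor() {
  const [draft, setDraft] = useState<Draft>({ question: "", options: ["", ""] });

  return (
    <div style={{ display: "flex", gap: "1rem" }}>
      {/* Left pane: the authoring environment stays fully interactive. */}
      <section style={{ flex: 1 }}>
        <input
          value={draft.question}
          placeholder="Question"
          onChange={(e) => setDraft({ ...draft, question: e.target.value })}
        />
      </section>

      {/* Right pane: the preview renders from the same draft state in real time. */}
      <section style={{ flex: 1 }}>
        <h3>{draft.question || "Your question will appear here"}</h3>
        <ul>
          {draft.options.map((option, i) => (
            <li key={i}>{option || `Option ${i + 1}`}</li>
          ))}
        </ul>
      </section>
    </div>
  );
}
```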

Multiple select questions

While the initial release of Knowledge Checks featured single-select questions as our foundational assessment type, user feedback revealed a clear need for more sophisticated evaluation methods. Based on this insight, we expanded our question framework to include multiple-select capabilities, enabling administrators to craft more nuanced assessments that could effectively measure complex concepts and validate deeper understanding. This enhancement significantly broadened the platform's assessment capabilities, allowing organizations to better verify comprehensive knowledge retention.
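One way to model the expanded question framework, shown purely as an illustration (the field names and the all-or-nothing grading rule are assumptions, not Spekit's schema), is a discriminated union over the two question kinds:

```typescript
// Illustrative data model for single- and multiple-select questions.
type Question =
  | { kind: "single-select"; prompt: string; options: string[]; correctIndex: number }
  | { kind: "multiple-select"; prompt: string; options: string[]; correctIndexes: number[] };

// This sketch grades multiple-select as all-or-nothing: the learner must choose
// every correct option and no incorrect ones.
function isCorrect(question: Question, selected: number[]): boolean {
  if (question.kind === "single-select") {
    return selected.length === 1 && selected[0] === question.correctIndex;
  }
  const expected = new Set(question.correctIndexes);
  return selected.length === expected.size && selected.every((index) => expected.has(index));
}
```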

Analytics

To enhance training oversight, we developed a robust analytics dashboard that provided granular insights into employee assessment performance. The solution integrated two key components: a comprehensive data table displaying individual metrics such as completion timestamps, assessment scores, and completion status, alongside an executive summary featuring key performance indicators. This dual approach to data visualization empowered administrators to both quickly gauge overall training effectiveness and drill down into detailed user-level performance metrics, streamlining the process of identifying knowledge gaps and validating learning outcomes.
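To make the two components concrete, here is a hypothetical sketch of a results row and the roll-up behind the executive summary; the field names and shapes are assumptions for illustration only:

```typescript
// Hypothetical shape of one row in the results table.
type ResultRow = {
  learner: string;
  score: number;            // 0–100
  status: "completed" | "in_progress" | "not_started";
  completedAt?: string;     // ISO timestamp, present once completed
};

// Roll individual rows up into the summary indicators shown to admins.
function summarize(rows: ResultRow[]) {
  const completed = rows.filter((row) => row.status === "completed");
  const averageScore =
    completed.length === 0
      ? 0
      : completed.reduce((sum, row) => sum + row.score, 0) / completed.length;

  return {
    assigned: rows.length,
    completionRate: rows.length === 0 ? 0 : completed.length / rows.length,
    averageScore,
  };
}
```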

Adoption metrics

Our two-month post-launch analysis revealed compelling growth trends across all key performance indicators. The implementation of strategic enhancements directly correlated with increased user engagement, higher completion rates, and broader feature adoption across our customer base. This upward trajectory validated our iterative improvement strategy and demonstrated the growing impact of Knowledge Checks in strengthening organizational learning programs.

  • 114 customers have the Knowledge Checks feature flag enabled
  • 38 customers have published at least one Knowledge Check
  • 107 Knowledge Checks have been published in customer orgs
  • 741 unique viewers have been assigned at least one Knowledge Check
  • 353 unique viewers have completed at least one Knowledge Check

Top 3 takeaways

Content relevance matters

Performance analytics revealed a direct correlation between content quality and assessment outcomes. Regular content updates and clarity improvements consistently led to higher completion rates and improved learning outcomes.

Pay attention to user engagement and interaction

Through behavioral analytics, we tracked key metrics including participation rates, completion patterns, and drop-off points within the assessment flow. This data-driven approach revealed critical insights into user interaction patterns, enabling us to optimize the learning experience.

Keep your eye on market competitiveness

In the learning technology space, rigorous competitive analysis shaped Knowledge Checks' market positioning. By identifying both gaps and opportunities in existing solutions, we developed distinctive features that addressed underserved market needs while building upon industry best practices.