Analytics Overview

Redesigning Snyk Analytics: Turning Security Data Into Decisions
Snyk Analytics is the central reporting hub used by security stakeholders to understand vulnerability posture, developer behavior, and remediation performance across their organization. The team identified an opportunity to create an Analytics homepage: a central place where security stakeholders could track progress, generate reports, and address different use cases.

My role: Partnered with PMs, engineers, and fellow designers, facilitating stakeholder workshops to validate the idea and align on direction.

  1. 🧒 Role

    Product Designer

  2. 🙌 Collaborators

    Product Managers, Design System Designer

  3. 🗓️ Date

    2025

Problem

The existing Analytics experience was data-rich but insight-poor.
📈 No centralized view that told a coherent performance story
📈 High cognitive load: too many charts, filters, and technical terms competing for attention
📈 Unclear value for leadership: teams struggled to answer basic questions like “Are we improving?” or “Where are the biggest risks?”

Outcome

📈 Reframed the Analytics roadmap from “Which charts should we show?” to “What decisions must this data enable?”, changing how leadership and product prioritized what to build
📈 Established a repeatable analytics model now used to structure and evaluate new data modules
📈 Shifted the product toward decision-led insights, rather than raw data or chart volume

My Contribution

I led the research and reframed Analytics around decision-making, not data presentation.

Discovery opportunity

Identify which metrics are most valuable for AppSec users to measure performance, and how they prefer to explore data when answering questions or proving impact.

Figure 1: Initial production design

Methods

Ran workshop sessions with customers:
• Top-of-mind metrics exercise (left) → uncovered which numbers users instinctively track or report first.
• Wireframe design review (right) → tested early explorations with internal stakeholders and select customers.

Figure 2: Design methods diagram

From Patterns to a New Analytics Model (JTBD)

Through interviews, workflow mapping, and real dashboard reviews, I identified five core Jobs-to-be-Done that Analytics needed to support:
1. Strategic evaluation
2. Investment justification
3. Investigation & prioritization
4. Monitoring & reporting
5. Collaboration

Using these insights, I created Jobs-to-Be-Done models for our two primary personas:

Figure 3: Strategic persona JTBD

Figure 4: Operational persona JTBD

Analytics model

Zooming out from the JTBD, a universal analytics flow emerged:

📈 Question - Each begins with an intent or a prompt.
• Strategic persona: “Are we improving overall?”
• Operational persona: “Where are the risks?”

📈 Explore Data - They navigate metrics or drill deeper into details.

📈 Summarize & Interpret - They review trends, patterns, or comparisons to form an understanding of the situation.

📈 Act on Insights - They take action based on findings.
• Strategic persona: Shares or presents summaries to leadership.
• Operational persona: Prioritizes issues, assigns tasks, or communicates progress.

Figure 5: Analytics model storyboard

This model became the backbone for the new Analytics homepage, enabling leadership to understand overall security posture while allowing AppSec teams to drill into risk and prioritize what to fix.

Initial Sketch

I sketched multiple flows exploring how AI could simplify the experience:
📈 Ask a question → AI insights
📈 Search field → AI insights
📈 Hybrid: predefined dashboards + AI insights
📈 Hybrid: filters + AI insights

Figure 6: Initial sketch

Interaction Flow

The early interaction model mapped the user journey:
📈 Explore data → drill into relevant report
📈 Ask a question → result → refine → drill further
📈 Select a saved view → interpret → act
This visualized the shift toward a question-first analytics experience.

Figure 7: Interaction flow (early draft)

Data Visualization Principles

To ensure clarity and consistency, I defined four principles now used across Snyk’s analytics visuals:

📈 Highlight what matters – Reduce noise; emphasize trends
📈 Support exploration – Layered drill-down paths
📈 Respect hierarchy – Clear reading order and scannability
📈 Standardize visuals – Consistent chart types and interactions

Figure 8: Experimenting with different visualization approaches to balance clarity and actionability.

Prototype

This prototype became the reference model for next-generation Analytics. It aligned product, engineering, and leadership around a single mental model for how insights should be discovered and acted on:
📈 A question-first analytics flow
📈 Inline AI chat for ad-hoc queries and saved custom views
📈 Layered drill-through that reduced cognitive overload

We explored AI-driven, question-first analytics, but we also had to design a model that could work within the existing data architecture. While not all prototype features shipped immediately due to architectural constraints, several product decisions, including the emphasis on top-level security metrics and question-first workflows, were directly informed by this model.