Vendor Scorecard: Redesigning a legacy supplier performance workflow to NIQ standards
Modernized a legacy Precima scorecard into the NIQ design system while prototyping comparison features that helped clients evaluate supplier performance more clearly.


Overview
Duration
3 months
My Role
Lead Product Designer
Team
2 Product Managers
2 BI Developers
2 Data Scientists
3 Engineers
Scope
Research, UX, UI, Prototyping, Testing
Tools
Figma
Excel
Highcharts
Jira
Industry
Retail supply chain
Vendor performance management
Platform
Web (Desktop-first)
Users Impacted
Used by regional vendor and retail teams during recurring performance reviews
Evaluation Coverage
Designed to support standardized supplier evaluation across multiple regions
The Challenge
The supplier scorecard already existed in the older Precima environment, but its structure and presentation no longer matched the direction of NIQ’s newer design standards. At the same time, business stakeholders wanted to demonstrate more advanced comparison functionality to clients — including side-by-side chart comparison and clearer sparkline labeling.
The challenge was therefore twofold: redesign the legacy scorecard into a more consistent NIQ experience, and prototype new comparison patterns that could better support client review conversations.
Activate-aligned redesign direction

Business Requirement
My task was to redesign the legacy supplier scorecard using NIQ design standards and create a prototype that demonstrated comparison functionality for clients — including chart comparison patterns and sparkline label interactions.
The goal was to make the scorecard feel more modern, more consistent, and more effective for performance review conversations.
Align to NIQ standards
Bring the legacy scorecard into NIQ design patterns and visual standards.
Improve comparison workflows
Support side-by-side chart comparison and clearer metric interpretation.
Demonstrate future value
Use interaction concepts like sparkline labels to communicate future product value (sketched below).
Business requirements and KPI logic
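Highcharts was already part of the toolchain, so the sparkline-label concept could be prototyped with a per-point data label. The sketch below is illustrative only: the container id, metric name, and values are placeholders, not product data.

```ts
import Highcharts from 'highcharts';

// Minimal sparkline sketch: axes, legend, and credits are hidden so only
// the trend line and a label on the final point remain visible.
Highcharts.chart('osa-sparkline', { // container id is a placeholder
  chart: { type: 'line', height: 40, width: 160, margin: [8, 0, 8, 0] },
  title: { text: '' },
  credits: { enabled: false },
  legend: { enabled: false },
  tooltip: { enabled: false },
  xAxis: { visible: false },
  yAxis: { visible: false },
  plotOptions: { series: { marker: { enabled: false } } },
  series: [{
    type: 'line',
    name: 'OSA rate', // sample metric; values are illustrative
    data: [
      82.1, 83.4, 84.0,
      // Label only the last point, mirroring the sparkline-label
      // interaction demonstrated to clients.
      { y: 85.67, dataLabels: { enabled: true, format: '{y}%' } },
    ],
  }],
});
```

In the prototype itself this behavior was mocked in Figma; the code simply shows that the concept maps cleanly onto the charting library the product already used.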

Design Principles
The redesign was guided by three principles: align the experience to NIQ standards, make supplier performance easier to compare, and prototype interactions that communicated value more clearly in client conversations.
1
Aligned
Bring the experience into NIQ standards
The legacy supplier scorecard needed to evolve beyond older Precima patterns into a more modern, system-consistent NIQ experience. This meant updating structure, component behavior, and hierarchy to align with the broader supply-chain suite.
2
Comparable
Make supplier performance easier to compare
The scorecard needed to support clearer comparison across metrics, timeframes, and supplier views. KPI layout, chart structure, and comparison patterns were designed to help users interpret change faster.
3
Communicable
Prototype for clearer client conversations
The work also needed to help stakeholders communicate future product value to clients through interaction concepts like chart comparison and sparkline labels.
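The comparison principle can be made concrete with a small sketch. Assuming Highcharts as the renderer, plotting the current and previous periods as two series on a shared axis lets change read directly from the gap between the lines; the container id, categories, and values below are placeholders.

```ts
import Highcharts from 'highcharts';

// Current vs previous period on a shared axis: one way to sketch the
// comparison pattern. All data shown is illustrative.
Highcharts.chart('otif-comparison', { // container id is a placeholder
  title: { text: 'Orders OTIF: current vs previous period' },
  xAxis: { categories: ['W1', 'W2', 'W3', 'W4'] },
  yAxis: { title: { text: 'OTIF (%)' }, max: 100 },
  tooltip: { shared: true, valueSuffix: '%' },
  series: [
    { type: 'line', name: 'Current period', data: [83.1, 84.2, 85.0, 85.7] },
    { type: 'line', name: 'Previous period', data: [80.4, 81.0, 82.2, 82.5], dashStyle: 'Dash' },
  ],
});
```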
Research & Insights
To redesign the scorecard effectively, I looked beyond the interface itself. I reviewed the legacy reporting structure, business requirements, metric definitions, and workflow patterns to understand how users moved from high-level KPI monitoring into comparison, investigation, and detailed validation.
The goal was not simply to modernize the UI, but to clarify how dense supplier-performance data should be structured for faster review, deeper comparison, and more confident decision-making.

Information Architecture & User Flows
The scorecard needed to support three behaviors: first-time orientation, repeat monitoring, and deep investigation.
Rather than treating every user the same, I structured the experience around how people actually moved through the report. New users needed a clear path from KPI summary to deeper evidence. Returning users needed to quickly check what changed. Power users needed faster filtering, richer comparison, and detailed validation.
This led to a layered information architecture built around four core views: Overview, Performance, Trends, and Compliance Details. Together, these views supported a consistent progression from high-level scanning to focused comparison, then into evidence-backed investigation.

First-time orientation
Designed to help first-time users understand the report structure, review active filters, scan top-level KPIs, and move into supporting evidence only when needed.

Deep investigation
Optimized for deeper analysis through faster filter refinement, multi-dimensional comparison, tooltip-based interpretation, and detailed validation in compliance views.

Repeat monitoring
Built for returning users, helping them resume context quickly, compare periods, review key changes, and validate whether follow-up action was needed.
The architecture was designed to support different levels of familiarity while keeping the reporting flow consistent: scan → compare → investigate → validate.
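Expressed as a small navigation model, the layering looks roughly like this. A sketch in TypeScript: only the four view titles come from the design; the ids, the stage mapping, and the ReportView shape are my own naming.

```ts
// The four view titles come from the case study; ids, the stage
// mapping, and the ReportView shape are illustrative assumptions.
type Stage = 'scan' | 'compare' | 'investigate' | 'validate';

interface ReportView {
  id: string;
  title: string;
  stage: Stage;
}

const reportViews: ReportView[] = [
  { id: 'overview', title: 'Overview', stage: 'scan' },
  { id: 'performance', title: 'Performance', stage: 'compare' },
  { id: 'trends', title: 'Trends', stage: 'investigate' },
  { id: 'compliance-details', title: 'Compliance Details', stage: 'validate' },
];
```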
KPI and Reporting Architecture
The redesign needed to support more than a visual refresh. It had to organize executive KPI summaries, comparison analysis, trend evaluation, and detailed validation into a structure that felt clear, layered, and review-ready.

CORE DECISIONS
Executive KPI Summary
Surfaced high-level signals like OSA rate, missed sales, flags, value sales, and operational health.
Comparison and Trend Analysis
Supported current vs previous period review, standard vs promo analysis, and comparison across performance dimensions.
Detailed Validation
Connected summary signals to geo analysis, compliance details, root causes, and tabular evidence.
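One nuance these decisions had to encode: rate KPIs such as OSA and fill rate are compared in percentage points (pp), not relative percent change. A minimal sketch of that delta logic, with field names and sample values as assumptions:

```ts
// Hypothetical KPI card shape; the metric names appear in the
// scorecard, but field names and values here are illustrative.
interface KpiCard {
  label: string;    // e.g. 'OSA rate', 'Fill Rate'
  current: number;  // current period value, in percent
  previous: number; // previous period value, in percent
}

// Percentage-point delta: 85.67% vs 82.50% is a +3.17pp move,
// not a 3.8% relative change.
function deltaPp(card: KpiCard): string {
  const pp = card.current - card.previous;
  return `${pp >= 0 ? '+' : ''}${pp.toFixed(2)}pp`;
}

console.log(deltaPp({ label: 'OSA rate', current: 85.67, previous: 82.5 })); // "+3.17pp"
```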
Low-Fidelity Concepts and Iterations
Before moving into high-fidelity reporting screens, I explored how the scorecard should be structured at lower fidelity — from KPI grouping and chart hierarchy to comparison layouts and detailed evidence views.
These concepts helped test how users would move between summary signals, trend analysis, and tabular validation. The goal was to find a structure that felt modular, scalable, and easier to interpret before refining it into the final NIQ Discover-style reporting experience.
Convergence
After exploring multiple directions, I converged on a more balanced reporting model that combined the strengths of each concept without overcommitting to any single one.
The KPI-first concept improved top-level orientation, but pushed deeper comparison too far down. The comparison-first concept made trend analysis stronger, but weakened quick scannability. The investigation-first concept brought evidence forward, but felt too deep too early for first-pass review. The modular reporting direction provided the strongest foundation because it created a more scalable structure across summary, comparison, and detailed validation.
The final direction moved toward a layered reporting flow: orient first, compare second, investigate third, and validate with detailed evidence last. This structure aligned better with how users actually consumed the scorecard — starting with high-level health signals, then moving into period comparison, trend interpretation, and deeper drilldown only when needed.

Kept from KPI-First
Clear executive summary and stronger top-level orientation

Kept from Comparison-First
Trend and period analysis moved higher in the workflow

Kept from Investigation-First
Evidence and detailed validation remained essential

Kept from Modular Reporting
A more scalable, system-friendly structure across report modules
Final NIQ Discover Experience
The final direction translated the scorecard into a more structured NIQ Discover-style reporting experience — balancing executive KPI visibility, comparison workflows, trend analysis, and detailed evidence views within one system.
Executive KPI Overview
Surfaced top-level performance signals such as OSA rate, missed sales, flags, and value sales for faster orientation.

Comparison and Trend Analysis
Supported comparison across periods and performance dimensions, helping users understand what changed and where deeper investigation was needed.

Trends
Focused on performance over time, helping users distinguish sustained shifts from short-term fluctuation before moving into detailed drilldown.

Compliance Details
Enabled evidence-backed validation through detailed records, sorting, filtering, and drilldown-ready tables.
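A hedged sketch of how the compliance table's sorting and formatting could be configured; the column set and field names are assumptions based on the KPIs named above, not the shipped schema.

```ts
// Illustrative column model for the compliance details table.
interface ColumnDef<Row> {
  field: keyof Row;
  header: string;
  sortable?: boolean;
  format?: (value: unknown) => string;
}

interface ComplianceRow {
  supplier: string;
  ordersOtif: number;    // percent
  lineItemsOtif: number; // percent
  fillRate: number;      // percent
  flags: number;
}

const pct = (v: unknown) => `${Number(v).toFixed(2)}%`;

const complianceColumns: ColumnDef<ComplianceRow>[] = [
  { field: 'supplier', header: 'Supplier', sortable: true },
  { field: 'ordersOtif', header: 'Orders OTIF', sortable: true, format: pct },
  { field: 'lineItemsOtif', header: 'Line Items OTIF', sortable: true, format: pct },
  { field: 'fillRate', header: 'Fill Rate', sortable: true, format: pct },
  { field: 'flags', header: 'Flags', sortable: true },
];
```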

Impact & Outcome
The redesign created a more structured reporting workflow across summary, comparison, and validation. Rather than functioning as a single dashboard, the scorecard supported recurring supplier-performance reviews with clearer scannability, stronger comparison, and deeper investigation.
4+
Regional teams
Adoption
Adopted across recurring vendor review workflows, helping standardize how performance was monitored.
Validation
The final direction was shaped through iterative review and feedback on how the scorecard supported real reporting behaviors. The strongest signals were consistent: users needed faster top-level orientation, comparison had to become more visible, and detailed tables still needed to remain accessible for validation.
Rather than validating a single screen in isolation, the focus was on whether the overall reporting flow worked — from KPI summary, to comparison, to evidence-backed follow-up.
The evolving structure made the scorecard easier to scan at the top level while still supporting deeper analytical follow-up.

Jimmy Martins
— Director of Design, NIQ
Compliance Levels
Orders OTIF: 85.67% (21pp)
Line Items OTIF: 32.67% (21pp)
% On Time: 85.67% (21pp)
Fill Rate: 85.67% (21pp)
NSV: -- (N/A)
What improved
The summary layer became easier to scan. KPI grouping and status blocks made it clearer what needed attention first, without forcing users into detailed analysis too early.

What stayed essential
Detailed tables and compliance views remained critical. Even with stronger overview and trend views, users still needed evidence-backed records to validate what changed and why.

What feedback reinforced
Comparison had to be more visible in the workflow. Trend and period-based analysis became more useful once it moved higher in the reporting structure instead of being buried below summary content.

Feedback source: Retail Category Manager, Walmart
Comparison vs. simplicity
Problem:
Adding more comparison made the first view feel heavier.
Decision:
Made comparison easier to access without letting it dominate the overview.
Trade-off:
The report became more analytically useful, but slightly less minimal at first glance.
What Worked Well
Layering summary, comparison, and validation improved scannability.
Surfacing comparison earlier made performance shifts easier to interpret.
Keeping detailed evidence accessible preserved trust in the workflow.
What I'd Do Differently
Explore richer contextual guidance for repeat users.