Supplier scoreboard

Phase 2 — Scoring engine not yet active

The Supplier Scoreboard page is visible in the portal, but the automated scoring engine that calculates dimension scores and overall supplier ratings is not yet active. The page will show an empty state until the scoring engine is turned on for your account.

This page describes the full scoring methodology as it will work when the engine is active. Contact support@traceable.digital to be notified when scoring becomes available.

The Supplier Scoreboard provides a quantitative view of each supplier's performance across the three dimensions that most directly affect your ability to build complete, accurate, and up-to-date DPPs: their responsiveness, the quality of the data they provide, and the ongoing validity of their compliance documentation.

Navigate to Suppliers > Scoreboard to access the full scoreboard view.


What the Scoreboard Measures

The scoreboard evaluates each active supplier across three dimensions. Each dimension produces a score between 0 and 100. The three dimension scores are combined into a single Overall Supplier Score using a weighted average.


Dimension 1: Response Rate (40% weight)

Response rate measures how reliably a supplier responds to data requests within the agreed timeframe.

Calculation: A points-based percentage over a rolling 12-month window. Each counted request earns 100 points if submitted on time, 50 points if submitted late, and 0 points if never submitted.

Response Rate = (Sum of request points / Total counted requests)
  • A request counts as "submitted on time" if the supplier submits their response on or before the due date.
  • A request that is submitted late (after the due date) counts as a partial response: it contributes 50 points to the calculation rather than 100, to distinguish between a late response and no response.
  • A request that was cancelled before the due date (e.g., because the product was discontinued) is excluded from the calculation.
  • New suppliers with fewer than 3 completed requests display an "Insufficient data" indicator instead of a score, as a score based on 1 or 2 requests would not be statistically meaningful.
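The response-rate rules above can be sketched in Python. The request dictionaries and their keys (due_date, submitted_on, cancelled) are illustrative assumptions for this sketch, not the platform's actual data model:

```python
from datetime import date

def response_rate(requests, minimum=3):
    """Score requests: on time = 100 points, late = 50 points,
    no response = 0 points; cancelled requests are excluded."""
    points = []
    for req in requests:
        if req.get("cancelled"):
            continue  # cancelled before the due date: excluded entirely
        if req["submitted_on"] is None:
            points.append(0)    # no response
        elif req["submitted_on"] <= req["due_date"]:
            points.append(100)  # submitted on or before the due date
        else:
            points.append(50)   # late response: partial credit
    if len(points) < minimum:
        return None  # "Insufficient data" indicator
    return sum(points) / len(points)

requests = [
    {"due_date": date(2024, 3, 1), "submitted_on": date(2024, 2, 28)},  # on time
    {"due_date": date(2024, 4, 1), "submitted_on": date(2024, 4, 5)},   # late
    {"due_date": date(2024, 5, 1), "submitted_on": None},               # no response
    {"due_date": date(2024, 6, 1), "submitted_on": None, "cancelled": True},
]
print(response_rate(requests))  # → 50.0  ((100 + 50 + 0) / 3)
```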

Dimension 2: Data Completeness (40% weight)

Data completeness measures the quality of submitted responses — specifically, whether the supplier provides values for all the fields they were asked about.

Calculation: Across all submitted responses in the rolling 12-month window, the average percentage of requested fields that were filled with a value (not left blank).

Data Completeness = (Total fields filled with a value / Total requested fields) × 100
  • A field counts as completed whether the response is accepted or still awaiting review — completeness measures whether data was provided, not whether it was accepted.
  • If a supplier provides a value for a field but adds a note saying "value unknown — please advise", this still counts as a completed field. Completeness measures presence of a response, not accuracy.
  • Fields that the supplier marks as "Not applicable to our product" are excluded from the denominator (they are not counted as missing fields).
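A minimal sketch of the completeness rules above, assuming a simplified response structure (the fields list and the value / not_applicable keys are illustrative, not the platform's data model):

```python
def data_completeness(responses):
    """Percentage of requested fields filled across all responses.
    Fields marked "Not applicable" are removed from the denominator."""
    filled = requested = 0
    for resp in responses:
        for field in resp["fields"]:
            if field.get("not_applicable"):
                continue  # excluded from the denominator
            requested += 1
            if field.get("value") not in (None, ""):
                filled += 1  # any provided value counts, accepted or not
    if requested == 0:
        return None
    return filled / requested * 100

responses = [
    {"fields": [
        {"value": "PA6"},                             # filled
        {"value": ""},                                # left blank: missing
        {"value": None, "not_applicable": True},      # excluded
        {"value": "value unknown — please advise"},   # still counts as filled
    ]},
]
print(round(data_completeness(responses), 1))  # → 66.7 (2 of 3 counted fields)
```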

Dimension 3: Certificate Validity (20% weight)

Certificate validity measures the proportion of compliance certificates that this supplier has uploaded via the Supplier Portal (in response to your data requests) that are currently within their stated validity period.

Calculation: The percentage of documents uploaded by this supplier that have an expiry date set in the future.

Certificate Validity = (Documents with expiry date in future / Total documents with expiry date) × 100
  • Only documents with an expiry date are included in this calculation. Documents without an expiry date (such as a datasheet without a renewal requirement) are excluded from the denominator.
  • The calculation is based on the current date at the time the scoreboard is viewed — it recalculates dynamically as certificates expire.
  • A certificate that expires within the next 30 days is flagged with an "Expiring Soon" indicator on the supplier record, even though it is still counted as valid.
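The validity and "Expiring Soon" rules can be sketched as follows; the document dictionaries and the expires key are assumptions made for illustration:

```python
from datetime import date, timedelta

def certificate_validity(documents, today=None):
    """Percentage of dated documents whose expiry date is in the future.
    Documents without an expiry date are excluded from the denominator.
    Also returns the valid documents expiring within the next 30 days."""
    today = today or date.today()  # recalculated against the current date
    dated = [d for d in documents if d.get("expires") is not None]
    if not dated:
        return None, []
    valid = [d for d in dated if d["expires"] > today]
    expiring_soon = [d for d in valid
                     if d["expires"] <= today + timedelta(days=30)]
    return len(valid) / len(dated) * 100, expiring_soon

docs = [
    {"name": "REACH cert", "expires": date(2025, 1, 10)},
    {"name": "ISO 14001", "expires": date(2023, 6, 1)},   # expired
    {"name": "Datasheet", "expires": None},               # no expiry: excluded
]
score, soon = certificate_validity(docs, today=date(2024, 12, 20))
print(score)                          # → 50.0 (1 of 2 dated documents valid)
print([d["name"] for d in soon])      # → ['REACH cert'] (expires within 30 days)
```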

How Scores Are Calculated

The Overall Supplier Score is a weighted average of the three dimension scores:

Overall Score = (Response Rate × 0.40) + (Data Completeness × 0.40) + (Certificate Validity × 0.20)

Example:

  • Response Rate: 85
  • Data Completeness: 70
  • Certificate Validity: 90

Overall Score = (85 × 0.40) + (70 × 0.40) + (90 × 0.20) = 34 + 28 + 18 = 80
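The weighted average above is straightforward to reproduce; the weights are those stated in the formula, and the function name is illustrative:

```python
# Dimension weights as documented: 40% / 40% / 20%
WEIGHTS = {"response_rate": 0.40,
           "data_completeness": 0.40,
           "certificate_validity": 0.20}

def overall_score(scores):
    """Weighted average of the three dimension scores."""
    return sum(scores[dim] * weight for dim, weight in WEIGHTS.items())

score = overall_score({"response_rate": 85,
                       "data_completeness": 70,
                       "certificate_validity": 90})
print(round(score, 2))  # matches the worked example: 80.0
```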

The rolling 12-month window means the score reflects recent behaviour. A supplier who performed poorly 18 months ago but has improved significantly in the past year will show a healthy score today. Conversely, a previously strong supplier who has recently become unresponsive will see their score decline over the following weeks as missed or late requests enter the rolling window.


What a Good Score Looks Like

Score Range | Status | Interpretation
80 – 100 | Healthy | The supplier is reliably responsive, provides thorough data, and maintains current certificates. This is the baseline expectation for a mature supplier relationship.
60 – 79 | Needs Attention | One or more dimensions are underperforming. This typically warrants a conversation with the supplier to understand the cause — it may be a temporary resource issue, a process gap, or misaligned expectations about due dates.
0 – 59 | At Risk | Significant performance gap across one or more dimensions. Suppliers in this band create compliance risk: late or incomplete data delays DPP completion, and expired certificates reduce your own compliance score and may constitute a gap in your supply chain due diligence.

An overall score below 60 does not mean you should immediately terminate the supplier relationship — particularly if the supplier is a sole source for a critical component. It is a trigger to investigate and address the root cause. The dimension breakdown makes it clear whether the problem is responsiveness, data quality, or certificate management.
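The band thresholds are taken directly from the table above; the function name itself is illustrative:

```python
def score_band(overall):
    """Map an overall score (0–100) to its status band."""
    if overall >= 80:
        return "Healthy"
    if overall >= 60:
        return "Needs Attention"
    return "At Risk"

print(score_band(80))  # → Healthy
print(score_band(59))  # → At Risk
```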


Using the Scoreboard to Identify At-Risk Suppliers

The scoreboard view supports several ways to surface suppliers that need attention:

Sort by Score

Click the Overall Score column header to sort suppliers from lowest to highest. The lowest-scoring active suppliers appear at the top.

Filter by Dimension

Use the dimension filter buttons to see only suppliers who are underperforming on a specific dimension. For example, clicking Certificate Validity < 80 shows all suppliers who have at least one expiring or expired certificate.

Expiring Certificates Alert

The scoreboard highlights any supplier with a certificate expiring within 30 days with an amber badge. This allows you to proactively follow up with the supplier before the certificate lapses rather than discovering the gap after the fact.

Trend Indicators

Each supplier's score includes a trend arrow (up, down, or stable) showing whether their score has improved or declined compared to the same date 90 days ago. A downward trend on a supplier who is currently in the Healthy band is an early warning sign worth monitoring.


Exporting Scoreboard Data

To export the scoreboard data for use in reporting or internal supplier management processes:

  1. Navigate to Suppliers > Scoreboard.
  2. Apply any filters you need; the export reflects the currently filtered view.
  3. Click Export CSV.

The CSV export includes:

  • Supplier name, type, and internal reference
  • Overall score
  • All three dimension scores
  • Number of data requests sent and submitted (in the rolling window)
  • Number of documents uploaded and number expired/expiring soon
  • Date the score was last recalculated

The export is a point-in-time snapshot. Scores are recalculated dynamically each time the scoreboard is viewed, so the export reflects the state at the moment of download.
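A snapshot export can feed internal reporting directly. The sketch below pulls at-risk suppliers out of the CSV with Python's standard library; the column headers used here ("Supplier name", "Overall score") are assumptions — check the header row of your actual export:

```python
import csv

def at_risk_suppliers(csv_path, threshold=60):
    """Return supplier names whose overall score falls below the
    threshold (default 60, the "At Risk" band boundary)."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        # Column names are illustrative; match them to your export's header
        return [row["Supplier name"] for row in csv.DictReader(f)
                if float(row["Overall score"]) < threshold]
```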

Scoreboard data can be used in supplier review meetings, quarterly business reviews, and as supporting evidence in supply chain due diligence documentation required under Article 72 of EU Battery Regulation 2023/1542.