Scorecards
Use Scorecards to establish best practices, track migration, promote accountability among teams, enforce standardization across entities, or define maturity standards.
In a Scorecard, entities are scored against rules. Rules can reference entities' metadata within Cortex, and can pull data from third-party integrations. Use levels and points in Scorecards to gamify the process and encourage developers to make progress toward goals.

You can configure Scorecards directly in the Cortex UI, without needing to manage them solely through code. This makes it easier to get started, iterate over time, and involve more teammates without requiring deep context. If you prefer a GitOps approach, that's supported as well. See Scorecards as code for more information.
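If you do take the GitOps route, a Scorecard is defined in a YAML descriptor stored alongside your code. The sketch below is illustrative only: the field names (`tag`, `ladder`, `rules`, `expression`) are assumptions for this example, so consult the Scorecards as code documentation for the exact schema.

```yaml
# Illustrative sketch of a Scorecard descriptor; field names are
# assumptions — see the Scorecards-as-code docs for the exact schema.
tag: service-maturity        # unique identifier (cannot be edited later)
name: Service Maturity
description: Tracks baseline operational readiness.
draft: false
ladder:
  levels:
    - name: Bronze
      rank: 1
    - name: Silver
      rank: 2
rules:
  - title: Entity has an owner
    expression: ownership != null   # rules can reference entity metadata
    level: Bronze
  - title: Repository is linked
    expression: git != null
    level: Silver
```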
On this page, learn about viewing and evaluating Scorecards. For step-by-step instructions on creating Scorecards, see Creating a Scorecard.
Common Scorecard use cases
For information about common use cases and examples, see Scorecard examples.
Looking to dive deeper? Check out the Cortex Academy course on Scorecards, available to all Cortex customers and POVs.
Viewing and editing Scorecards
View Scorecards list
Click Scorecards in the main nav in Cortex.
All of your organization’s Scorecards are listed under All. Under Mine, you’ll find Scorecards you’ve created, as well as Scorecards that evaluate entities you own.

Edit a Scorecard
If you have not disabled UI editing, then you can edit a Scorecard in the Cortex UI. You can edit the name, description, levels, rules, draft status, filter criteria, and the entities being evaluated by the Scorecard. You cannot edit a Scorecard's unique identifier.
You must have the Edit Scorecard permission.
To edit:
Navigate to the Scorecard in Cortex.
Click Edit in the upper right corner of the Scorecard page.
Make changes to your Scorecard, then at the bottom of the page, click Save Scorecard.
Any time you edit and save your Scorecard, Cortex will automatically begin reevaluating the relevant entities to reflect your changes.
Review a Scorecard
When reviewing scores, identify where teams or entities have the most friction. Can you automate any processes via Cortex Workflows to improve efficiency?
When you navigate to a Scorecard's page in Cortex, you can see high-level information at the top of the page:
The median level entities have achieved
Percent of entities at the highest level
Percent of entities without a level
Number of entities per level
The scheduled Scorecard calculation
Whether notifications or rule exemptions are enabled
On the side of the page, you can see:
A description of the Scorecard
The total number of entities being scored
The number of scheduled rules
The next evaluation time for the Scorecard

On the Scorecard page, click through each of the tabs to learn more:
Under Scores, view each entity that is being scored via this Scorecard, the entity's current overall level, and its points or number of rules passed per level.
Point-based Scorecards will display progress based on points, while level-based Scorecards will display the number of passing rules.

For example, in the screenshot above, the first two entities achieved 50 out of 50 possible points within the Bronze level of the Scorecard. They also each achieved 3 out of 3 possible points under the Silver level.
View more about each entity
Click an entity to open a side panel with more information:

Next to the entity name: A link to its entity details page
At the top of the panel: The entity's current level, score, rank, percentile, and an overview of its rule progress
Under the Rules tab: A list of failing rules, passing rules, and rules not evaluated yet
Under the Exemptions tab: A list of exempted rules
Under the Progress tab: A graph showing the entity's progress
Filter and sort the scores list
In the upper right corner of the list, click Filter to narrow the scope of the list. You can filter by domain, failing rules, groups, levels, owners, passing rules, and teams.
Click Level to select a different method to sort by, and choose whether to sort ascending or descending.
Share the Scorecard
Need to show progress to leadership? Click Share in the upper right corner. This copies a link to your clipboard, which you can share with anyone who has access to view this Scorecard in your workspace.
Scorecard examples
Gamification motivates developers to not only progress through the levels of a Scorecard, but to maintain the quality of their entities over time.
See the Scorecard examples page to view common use cases and see how others are motivating their teams with Scorecards.
Scorecard evaluation
After you create and save a Scorecard, Cortex automatically evaluates the entities that the Scorecard applies to. How often the Scorecard is evaluated depends on the evaluation window you set when configuring the basic fields of the Scorecard.
Scorecard evaluations are also automatically initiated after Scorecard definition changes are processed via GitOps or other means. This evaluation is of lower priority than a manual evaluation and may take longer to return results. To see the re-evaluation results more quickly, you can cancel an automatic evaluation and trigger a manual evaluation.
When processing a Scorecard, the individual entity evaluations are spread out over the evaluation window you configured. The evaluation will finish before the next evaluation time is reached. Also note that CQL filters require their own independent evaluation, which can increase the time required to process the evaluation.
Trigger an evaluation
To manually trigger a Scorecard evaluation:
Navigate to the Scorecard's page in Cortex.
In the details panel on the right side of the page, click Evaluate now.
Cancel an evaluation
If a Scorecard is in the process of being evaluated, you can cancel it:
Navigate to the Scorecard's page in Cortex.
Click Cancel at the top of the page.
Configuring Scorecard settings
Admins can configure settings and view Scorecard rule exemptions under Settings > Scorecards.
See Scorecard settings for more information.
Troubleshooting and FAQ
Cortex offers a wide variety of out-of-the-box integrations compatible with Scorecard rules. Sometimes, a Scorecard may have multiple integration queries with different endpoints, each with its own rate limits. While we strive to manage these rate limits efficiently, please keep the following points in mind regarding rule failures and third-party integrations:
Are errors logged in the UI?
Yes, any errors will be displayed alongside the rule for the relevant entity in the UI.
Are the rules related to the errors retried?
If we encounter a rate limit error, we will retry the rule for most integrations. This process occurs up to five times with a backoff period between attempts. If the rule fails after five retries, the last recorded score will be used.
How are scores affected by missing values due to API rate limiting?
If all retries are still affected by rate limit errors, we will use the last known score for the rule. This also applies to 5xx errors from upstream APIs and any unexpected errors from the Cortex side. If a new rule fails without a previous score, the rule will fail and display a 429 error.
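The retry-and-fallback behavior described above can be modeled generically. This is an illustrative sketch only, not Cortex's actual implementation; the exception names and the exponential backoff schedule are assumptions made for the example.

```python
import time

class RateLimitError(Exception):
    """Stand-in for a 429 response from an upstream integration."""

class RuleEvaluationFailed(Exception):
    """Raised when a new rule has no previous score to fall back on."""

def evaluate_rule_with_retries(evaluate, last_known_score=None,
                               max_attempts=5, base_delay=1.0):
    """Retry a rate-limited rule evaluation up to five times with backoff,
    falling back to the last recorded score if every attempt is limited."""
    for attempt in range(max_attempts):
        try:
            return evaluate()
        except RateLimitError:
            # Backoff period between attempts (exponential here by assumption).
            time.sleep(base_delay * (2 ** attempt))
    # All attempts were rate limited: reuse the last known score if one exists.
    if last_known_score is not None:
        return last_known_score
    # A new rule with no previous score simply fails (surfaced as a 429 error).
    raise RuleEvaluationFailed("no previous score available")
```

The key design point the FAQ describes is that a transient rate limit never silently zeroes out a score: the last recorded value is preferred over a missing one.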