# Review and evaluate Scorecards

Reviewing a Scorecard's evaluation gives you insight into patterns across teams or entities, and areas to prioritize for improvement.

<div align="left"><figure><img src="/files/kin9iHz2D9vDZn4ARv1C" alt="" width="375"><figcaption></figcaption></figure></div>

When an entity is failing a rule in a Scorecard, Cortex provides the information needed to review and remediate the issue efficiently. From there, teams can [take action on the failing rule](#take-action-on-a-scorecard). Resolving these failures not only improves the health and maturity of individual entities but also strengthens the overall reliability and compliance posture of the engineering organization.

## Scorecard evaluation

After you create and save a Scorecard, Cortex automatically evaluates the entities that the Scorecard applies to. How often the Scorecard is evaluated depends on the evaluation window you set when [configuring the basic fields of the Scorecard](#step-1-configure-the-basic-scorecard-fields).

Scorecard evaluations are also automatically initiated after Scorecard definition changes are processed via GitOps or other means.

When a Scorecard is processed, the individual entity evaluations are spread out across the evaluation window you configured, and the evaluation finishes before the next scheduled evaluation begins. Also note that CQL filters require their own independent evaluation, which can increase the time required to process the evaluation.
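Conceptually, spreading entity evaluations across a window works like the sketch below. This is an illustration only (Cortex's internal scheduler is not documented here, and the function and its parameters are hypothetical): evaluations are spaced evenly so the last one completes before the next window begins.

```python
from datetime import datetime, timedelta
from typing import List

def spread_evaluations(window_start: datetime, window_hours: int,
                       entity_count: int) -> List[datetime]:
    """Illustrative only: evenly space entity evaluations across an
    evaluation window so they all finish before the next window starts."""
    if entity_count == 0:
        return []
    step = timedelta(hours=window_hours) / entity_count
    return [window_start + i * step for i in range(entity_count)]

# Four entities in a 24-hour window: evaluations at 0:00, 6:00, 12:00, 18:00.
times = spread_evaluations(datetime(2024, 1, 1), 24, 4)
```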

<details>

<summary>Manually trigger an evaluation</summary>

To manually trigger a Scorecard evaluation in the Cortex UI:

1. Navigate to the Scorecard's page in Cortex.
2. In the details panel on the right side of the page, click **Evaluate now**.

   <div align="left"><figure><img src="/files/MMWYtHa3gnpw6LHR6CbF" alt="Click &#x22;Evaluate now&#x22; in the right side of a Scorecard" width="563"><figcaption></figcaption></figure></div>

You can also kick off a [re-evaluation of a specific entity via the API](/api/readme/scorecards.md#post-api-v1-scorecards-tag-entity-entitytag-scores).
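Based on the endpoint linked above, a re-evaluation request could be constructed roughly as follows. This is a hedged sketch, not an official client: the path mirrors the linked API reference, while the base URL, tags, and token shown are placeholders you would replace with your own values.

```python
import urllib.request

def build_reevaluation_request(base_url: str, scorecard_tag: str,
                               entity_tag: str, token: str) -> urllib.request.Request:
    """Build a POST request asking Cortex to re-score one entity against
    one Scorecard, per the linked reference:
    POST /api/v1/scorecards/{tag}/entity/{entityTag}/scores"""
    url = f"{base_url}/api/v1/scorecards/{scorecard_tag}/entity/{entity_tag}/scores"
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )

# Placeholder values for illustration; not executed against the live API here.
req = build_reevaluation_request(
    "https://api.getcortexapp.com", "my-scorecard", "my-service", "<API_TOKEN>"
)
```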

</details>

<details>

<summary>Cancel an evaluation</summary>

If a Scorecard is in the process of being evaluated, you can cancel it:

1. Navigate to the Scorecard's page in Cortex.
2. Click **Cancel** at the top of the page.

   <div align="left"><figure><img src="/files/5Bic3nzNQJ2pYv0NdQY5" alt="Click &#x22;Cancel&#x22; at the top of a Scorecard" width="563"><figcaption></figcaption></figure></div>

</details>

## Review a Scorecard

### Reviewing with Cortex MCP

Use Cortex MCP to quickly gain insights into the performance of your Scorecard. Ask questions in natural language, such as: *What is going on with my DORA Metrics Scorecard?* or *What steps can I take to improve my Eng Intelligence Scorecard scores?*

<details>

<summary>Example MCP Scorecard assessment</summary>

First, ask a question about a Scorecard. The MCP will display which Cortex APIs are being used to find an answer:

<figure><img src="/files/vUoNQwFGgEfXw7V2iEqa" alt="" width="563"><figcaption></figcaption></figure>

Next, it will provide context to help you improve your scores:

<figure><img src="/files/GDuwHvRvuO31Cvp7RBnY" alt="" width="521"><figcaption></figcaption></figure>

For this example, after listing out the priorities, it also gave us a list of quick wins and a suggestion for a systematic approach:

<figure><img src="/files/CS6fnxjyDJ4S8pUJXUOe" alt=""><figcaption></figcaption></figure>

</details>

From there, [take action](#take-action-on-a-scorecard) to remediate the issues and start meeting your defined standards.

Learn how to get started in [Cortex MCP](/get-started/mcp.md).

### Reviewing in the Cortex UI

<figure><img src="/files/JF3ITESu7dP7Z4xvSMG3" alt="View basic information on the Scorecard page"><figcaption></figcaption></figure>

When you navigate to a Scorecard's page in Cortex, you can see high-level information, including the median level achieved per entity, the percentage of entities at the highest level, and the percentage of entities that haven't achieved a level. In the side panel, you can see basic information, including the total number of services being evaluated, scheduled rules, and the next evaluation time.

{% hint style="success" %}
When reviewing scores, identify where teams or entities have the most friction. Can you automate any processes via [Cortex Workflows](/streamline/workflows.md) to improve efficiency?
{% endhint %}

#### Share the Scorecard

Need to show progress to leadership? Click **Share** in the upper right corner. This copies a link to your clipboard, which you can share with anyone who has access to view this Scorecard in your workspace.

<figure><img src="/files/pf1Plh9p18ZVmEJB7ilV" alt="Click Share in the upper right."><figcaption></figcaption></figure>

#### Scorecard tabs

<div align="left"><figure><img src="/files/pru0s0vXFmwrRjX2wjvF" alt="" width="563"><figcaption></figcaption></figure></div>

On the Scorecard page, click through each of the tabs to learn more about the Scorecard:

<details>

<summary>Scores</summary>

Under **Scores**, view each entity that is being scored via this Scorecard, the entity's current overall level, and its points or number of rules passed per level.

Point-based Scorecards will display progress based on points, while level-based Scorecards will display the number of passing rules.

<figure><img src="/files/XT2uZn5AE5gogMOFBFbi" alt="The Scores tab shows scores and levels per entity."><figcaption></figcaption></figure>

For example, in the screenshot above, the first two entities achieved 50 of 50 possible points within the Bronze level of the Scorecard. They also each achieved 3 of 3 possible points under the Silver level.
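The level ladder can be pictured with a simplified sketch. This is not Cortex's actual scoring code (the level names and thresholds are invented for illustration); the idea is that an entity holds a level only when that level and every level below it are fully satisfied.

```python
def current_level(levels, points_earned):
    """Return the highest level whose requirement, and every lower
    level's requirement, is fully met. Simplified illustration only.

    levels: ordered list of (name, points_required), lowest level first.
    points_earned: dict mapping level name -> points achieved.
    """
    achieved = None
    for name, required in levels:
        if points_earned.get(name, 0) >= required:
            achieved = name
        else:
            break  # a lower level is incomplete, so stop climbing
    return achieved

# Mirrors the example above: 50/50 Bronze points and 3/3 Silver points,
# but Gold is incomplete, so the entity sits at Silver.
levels = [("Bronze", 50), ("Silver", 3), ("Gold", 10)]
level = current_level(levels, {"Bronze": 50, "Silver": 3, "Gold": 4})
```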

**View more about each entity**

Click an entity to open a side panel with more information:

<figure><img src="/files/EIXJi6EhtCEyGX0piUOv" alt="Click into an entity to view more information."><figcaption></figcaption></figure>

* Next to the entity name: A link to its entity details page
* At the top of the panel: The entity's current level, score, rank, percentile, and an overview of its rule progress
* Under the Rules tab: **A list of failing rules**, passing rules, and rules not evaluated yet
  * Review this section closely to understand areas you can [take action](#take-action-on-a-scorecard) on.
* Under the Exemptions tab: A list of exempted rules
* Under the Progress tab: A graph showing the entity's progress

**Filter and sort the scores list**

In the upper right corner of the list, click **Filter** to narrow the scope of the list. You can filter by domain, failing rules, groups, levels, owners, passing rules, and teams.

Click **Level** to select a different method to sort by, and choose whether to sort ascending or descending.

</details>

<details>

<summary>Overview</summary>

Under **Overview**, see the highest-ranking entity, the most at-risk entity, the most at-risk rule, the most improved entity, the entity with the worst drop, and the most-moved rule.

Under those metrics, see a graph displaying the progress of all rules. Click **All rules** above the graph to filter it by specific rules.

<figure><img src="/files/DV4kI67i7iFm5tHi8L8M" alt=""><figcaption></figcaption></figure>

By default, the **Trends and changes** blocks show data from the last month. Click **Last month** to select a different timeframe.

</details>

<details>

<summary>Rules</summary>

Under **Rules**, view all of the Scorecard's rules, per level. Expand a rule to see its CQL expression:

<figure><img src="/files/CwodZ8Kq40mXnN5JJOJH" alt=""><figcaption></figcaption></figure>

Note that if you did not set a name for a rule when you created it, it displays here as its CQL expression.

</details>

<details>

<summary>Reports</summary>

Under **Reports**, click to view a [Bird's Eye report](/improve/reports/birds-eye.md), [Progress report](/improve/reports/progress-report.md), or [Report card](/improve/reports/report-card.md).

<figure><img src="/files/Vyqhr5C7kU8OlVzKwUBq" alt=""><figcaption></figcaption></figure>

</details>

<details>

<summary>Exemptions</summary>

Under **Exemptions**, view a list of all rule exemptions that have been requested.

Each request includes its current status, the rule, an expiration date, the reason for the request, and the user who requested it.

**How exemptions affect scores**

When a rule is exempted, entities are not marked as passing or failing — the rule simply does not apply to those entities.
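In a simplified sketch (illustrative only, not Cortex's scoring implementation), exempting a rule removes it from both the numerator and the denominator of a pass rate, rather than counting it as a pass or a failure. The rule names below are invented examples.

```python
def pass_rate(rule_results, exempted):
    """rule_results: dict of rule name -> bool (True if passing).
    exempted: set of rule names to exclude entirely from the calculation."""
    applicable = {r: ok for r, ok in rule_results.items() if r not in exempted}
    if not applicable:
        return None  # every rule is exempted; nothing to evaluate
    return sum(applicable.values()) / len(applicable)

results = {"has-runbook": True, "pagerduty-configured": False, "has-owner": True}

# Without exemptions the entity passes 2 of 3 rules; exempting the failing
# rule means the score is computed over only the 2 remaining rules.
rate = pass_rate(results, {"pagerduty-configured"})
```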

**Approving or denying exemptions**

You must have the `Configure Scorecard Exemptions` permission to approve or deny exemptions.

To approve an exemption, click the checkmark icon. To deny an exemption, click the X icon:

<figure><img src="/files/mVD8Y1fdVqvsjHnjVyqO" alt=""><figcaption></figcaption></figure>

</details>

<details>

<summary>Initiatives</summary>

Under **Initiatives**, see all [Initiatives](/improve/initiatives.md) associated with the Scorecard.

<figure><img src="/files/tHjgk037E3oTk0t2nTIS" alt=""><figcaption></figcaption></figure>

</details>

## Take action on a Scorecard

A failed Scorecard rule gives teams actionable goals to make incremental improvements. Teams can focus on addressing the highest-impact failures first, track progress over time, and celebrate measurable wins, which reinforces a culture of improvement.

To remediate failed rules in a Scorecard:

1. Identify the failing rules

   * View the [Scores tab](#scores) on a Scorecard page to view a list of failing rules per entity. You can group by level to understand which entities have reached each level of the Scorecard.\
     In this example, the group of entities has met the Bronze level, but they are failing rules for Silver and Gold:

   <div align="left"><figure><img src="/files/fSJ3M8LhGbdfZ5CRQuY1" alt="" width="563"><figcaption></figcaption></figure></div>
2. Understand the cause

   * While viewing scores, click into an entity to open a side panel with more information about why rules are failing.
     * When [configuring a rule](/standardize/scorecards/create.md#step-3-create-a-rule), add a failure message to give clear steps on how you can remediate the rule if it fails.

   <div align="left"><figure><img src="/files/wTKV8oL0nZwFVgMd0CfL" alt="" width="533"><figcaption></figcaption></figure></div>

   * When you link to a Cortex Workflow that can be run to remediate a failing rule, Cortex automatically passes the entity context into the Workflow's entity selector (if applicable), so the entity is selected by default. You can also set this up manually by appending the Cortex entity tag or entity ID to the Workflow URL from a Scorecard or plugin (e.g., [?entity={{context.entity.tag}}](https://app.getcortexapp.com/admin/workflows/3403?entity=%7B%7Bcontext.entity.tag%7D%7D)).

   <div align="left"><figure><img src="/files/KvGmVKZdrf8xLDYhLKsg" alt="When linking to a Workflow from a failed Scorecard rule, Cortex automatically selects the entity to use when running the Workflow." width="375"><figcaption></figcaption></figure></div>

   * [Request exemptions](/standardize/scorecards/rule-exemptions.md) if needed: In some cases, a rule may not apply to an entity. You can request to exempt an entity from the rule.
3. Track and prioritize issues
   * Use the [Engineering homepage](/streamline/homepage.md) to see a prioritized to-do list of failing rules across your owned entities and Scorecards. This helps you focus on addressing the most impactful issues first.

     <figure><img src="/files/2Nlp5xbEwmXjIPJupouN" alt=""><figcaption></figcaption></figure>
4. Remediate the issues
   * Address the underlying issue for each failed rule. This could involve adding missing documentation, configuring a monitoring integration, or any other action required to meet the rule's criteria.
     * In the example screenshot above, the rule failed because PagerDuty is not configured for the entity. You can follow the [instructions in Cortex's PagerDuty documentation](/ingesting-data-into-cortex/integrations/pagerduty.md#editing-the-entity-descriptor) to ensure that a PagerDuty service, schedule, or escalation policy is configured in the entity's descriptor YAML.\
       ![](/files/e21UGoYqxmRY0810Ctmz)
   * [Use Initiatives](/improve/initiatives.md) to set specific deadlines for higher priority tasks. Cortex will notify owners and team members when deadlines are approaching, helping ensure accountability.
5. Re-evaluate
   * The Scorecard will automatically [re-evaluate](#scorecard-evaluation) based on its schedule (or you can manually trigger an evaluation), allowing you to see the progress made and any remaining failing rules.

     <div align="left"><figure><img src="/files/o15xDVPMeP848HgCnUcd" alt=""><figcaption></figcaption></figure></div>
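The manual workflow-link setup from step 2 (appending an entity parameter to a Workflow URL) can be sketched as follows. The workflow ID and entity tag here are placeholders; in a Scorecard failure message you would typically use the `{{context.entity.tag}}` template instead of a hard-coded tag.

```python
from urllib.parse import urlencode

def workflow_url_with_entity(base: str, workflow_id: str, entity_tag: str) -> str:
    """Append an entity query parameter so the Workflow's entity selector
    is pre-filled when the link is opened from a Scorecard or plugin."""
    return f"{base}/admin/workflows/{workflow_id}?" + urlencode({"entity": entity_tag})

url = workflow_url_with_entity(
    "https://app.getcortexapp.com", "3403", "my-service"
)
# -> https://app.getcortexapp.com/admin/workflows/3403?entity=my-service
```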

### Scalable remediation

When reviewing your Scorecard evaluations, you may notice trends — some teams struggling in specific areas, or some entities often not meeting standards.

In addition to remediating individual rules, you may want to think about how you can improve larger processes to drive excellence in a scalable way across your organization.

#### Use Workflows

Look for frequent challenging areas across teams. For example, is a team consistently failing to add runbooks to new services? Are there any failing rules relating to onboarding processes? When a standard is frequently being missed, it indicates an opportunity to automate that process with a [Cortex Workflow](/streamline/workflows.md).

You can use Workflows to streamline a set of repeatable steps, reducing human error and improving efficiency for your teams.

{% columns %}
{% column %}

<div align="left"><figure><img src="/files/mnVBAmw8csTSrQ3VWC7K" alt="" width="344"><figcaption></figcaption></figure></div>
{% endcolumn %}

{% column %}
For example, you could create a Workflow to [streamline the process of Scaffolding new services](https://github.com/cortexapps/hippocampus/blob/master/standardize/scorecards/broken-reference/README.md), and include a User Input step that requires your users to ensure a runbook has been documented.
{% endcolumn %}
{% endcolumns %}

#### Analyze and troubleshoot trends

While viewing a group of entities that are failing rules, also consider organizational ways to support your teams or adjust your processes.

**Team-specific patterns**

* Are all of the failing entities owned by a newer team?
  * Ensure that the newer team members are not feeling overwhelmed and have completed any necessary training.
  * Ensure that the team understands their scope and priorities.
  * Use [Eng Intelligence](/improve/eng-intelligence.md) dashboards and metrics visualizations to identify which parts of the software development lifecycle could be improved per team.
* Does one team own more entities than other teams?
  * Ensure the workload is spread fairly across your teams.
  * If one team owns more due to tribal knowledge, that is a signal to ensure team members document their knowledge. It can also help you identify areas where training sessions would be useful.
  * When knowledge has been documented, make sure it is being [linked on related entities](/ingesting-data-into-cortex/entities-overview/entities/external-docs.md).
* Are some teams or services successful across several areas?
  * When viewing teams that are successful or entities that meet all of their standards, find ways to recognize these achievements publicly. Public recognition reinforces good practices and motivates your team.

**Service lifecycle patterns**

* Do older, legacy services have low scores, while newer ones align with standards?
  * This highlights modernization opportunities.
  * Learn about using Cortex for modernization in the [Cortex Academy course "Migrations and Modernizations,"](https://academy.cortex.io/courses/migrations-and-modernizations) available to all customers and POVs.

**Adoption over time**

* Are scores trending upward across the quarter in line with company initiatives?
  * Tracking Scorecard trends across time shows whether your internal initiatives (like a push for better test coverage) are effective.
  * To promote specific standards, you can [create an Initiative in Cortex based on a Scorecard](/standardize/scorecards/create.md#create-initiative-from-scorecard), which will encourage remediation by a deadline.
