Scorecard examples
The Scorecard use cases and examples on this page are based on engineering teams across a wide spectrum of sizes and maturity levels.
Learn about Scorecard use cases in Cortex Academy.
Learn more about leveraging Cortex to achieve operational readiness.
Learn more about planning and managing migrations using Cortex.
Scorecards are often aspirational. For example, an SRE team may define a Production Readiness Scorecard with 20+ criteria that they think their services should meet to be considered "ready" for SRE support.
The engineering team may not be resourced to actually meet those goals, but setting objective targets helps drive organization-wide cultural shifts and sets a baseline for conversations around tech debt, infrastructure investment, and service quality.
See a DORA Metrics Scorecard as an example:
Assume the following for an entity:
Its last commit was within 24 hours
There were zero rollbacks in the last 7 days
The ratios of incidents and rollbacks to deploys in the last 7 days are both zero
It hasn't averaged at least one deploy per day in the last week
In this case, the entity would have No Level.
It's passing all of the rules in the Bronze level, but failing a rule in the Steel level, the first level an entity can achieve. Because levels are sequential, the entity cannot progress to Bronze until it has achieved Steel.
Once the entity averages at least one deploy per day over the last week, it will achieve the Bronze level automatically, since it's already passing the four rules required for that level.
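The level progression above can be sketched in code. This is a minimal illustration of the evaluation logic, not the Cortex implementation: the rule-to-level assignment follows the example (the deploy-frequency rule in Steel, the other four rules in Bronze), and all data structures and function names are assumptions.

```python
# Hypothetical sketch of leveled Scorecard evaluation. Levels are ordered;
# an entity must pass every rule in a level before it can progress.
LEVELS = [
    ("Steel", [
        lambda e: e["deploys_last_week"] / 7 >= 1,  # averages >= 1 deploy/day
    ]),
    ("Bronze", [
        lambda e: e["hours_since_last_commit"] <= 24,
        lambda e: e["rollbacks_last_week"] == 0,
        lambda e: e["incidents_per_deploy"] == 0,
        lambda e: e["rollbacks_per_deploy"] == 0,
    ]),
]

def achieved_level(entity: dict) -> str:
    """Return the highest level whose rules, and all prior levels' rules, pass."""
    achieved = "No Level"
    for name, rules in LEVELS:
        if all(rule(entity) for rule in rules):
            achieved = name
        else:
            break  # failing any rule in a level stops progression
    return achieved

entity = {
    "hours_since_last_commit": 20,
    "rollbacks_last_week": 0,
    "incidents_per_deploy": 0,
    "rollbacks_per_deploy": 0,
    "deploys_last_week": 3,  # fails the Steel deploy-frequency rule
}
print(achieved_level(entity))    # No Level

entity["deploys_last_week"] = 8  # now averages at least one deploy per day
print(achieved_level(entity))    # Bronze
```

Note that even though the entity passes all four Bronze rules from the start, it reports "No Level" until the Steel rule passes, mirroring the sequential behavior described above.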
This kind of gamification motivates developers to not only progress through the levels, but to maintain the quality of their entities over time.
We recommend making each level of the Scorecard achievable, even if challenging, to keep developers motivated.
You can add as many levels as you want to a Scorecard. You can also add as many rules to each level as makes sense for the Scorecard, but keep in mind that an entity must pass all rules in a given level in order to progress to the next one.
Cortex users commonly define Scorecards across several categories:
Development Maturity: Ensure services and resources conform to basic development best practices, such as established code coverage, checked-in lockfiles and READMEs, pinned package versions, and defined ownership.
Operational Readiness: Determine whether services and resources are ready to be deployed to production, checking for runbooks, dashboards, logs, on-call escalation policies, monitoring/alerting, and accountable owners.
Operational Maturity: Monitor whether services are meeting SLOs, on-call metrics look healthy, and post-mortem tickets are closed promptly, gauging whether there are too many customer-facing incidents.
Security: Mitigate security vulnerabilities, achieve security compliance across services, and measure code coverage.
Best Practices: Define organization-wide best practices across areas such as infrastructure and platform, SRE, and security. For example, a Scorecard might help you ensure the correct platform library version is being used.
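A platform-library version check like the one mentioned above could be sketched as follows. This is purely illustrative: the package name, minimum version, and function are hypothetical, not part of any Cortex API.

```python
# Hypothetical "correct platform library version" best-practice rule.
# The package name "acme-platform-lib" and minimum version are invented.
MIN_PLATFORM_VERSION = (2, 5, 0)

def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '2.5.1' into (2, 5, 1) for comparison."""
    return tuple(int(part) for part in version.split("."))

def passes_platform_rule(dependencies: dict) -> bool:
    """Pass if the entity declares the platform library at or above the minimum."""
    version = dependencies.get("acme-platform-lib")
    return version is not None and parse_version(version) >= MIN_PLATFORM_VERSION

print(passes_platform_rule({"acme-platform-lib": "2.6.0"}))  # True
print(passes_platform_rule({"acme-platform-lib": "2.4.9"}))  # False
```

A rule like this would typically read versions from a checked-in manifest (for example, a lockfile) rather than a hand-built dictionary.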
Best practices are unique to every organization and every application, so make sure to work across teams to develop a Scorecard measuring your organization's standards.
The following example uses JavaScript best practices: