Automate AI Maturity
To configure your Cortex workspace for AI Maturity, we recommend the following actions:
Connect Data: Ingest data and ensure ownership is assigned to your entities
Standardize: Configure a Scorecard to enforce standards for your AI tools
Streamline: Automate AI tooling requests via Workflows, and streamline how you obtain information from Cortex using Cortex MCP
Improve: Review Eng Intelligence metrics and look for trends relating to AI tool usage
Use Cortex features to drive AI Maturity
The steps below describe how to configure Cortex features to drive AI Maturity.
Step 1: Ingest data and solve ownership 🔌
For AI initiatives to succeed, it is crucial to import your services, resources, infrastructure, and other entities, and to have clear visibility into the ownership of your entities.
Connecting your entities to Cortex establishes a single source of truth across your engineering organization. It enables you to get timely information from Cortex MCP, track progress via Scorecards, automate Workflows, and gain insights from Eng Intelligence.

Setting ownership of entities ensures that every service and system is clearly linked to accountable teams or individuals, enabling faster incident response, reducing handoff friction, and making it possible to enforce standards consistently.
Relevant integrations
To focus on driving AI Maturity, Cortex recommends integrating with tools that provide visibility and control over code, deployments, monitoring, on-call, and documentation. This allows you to see the impact that AI tooling has on efficiency across engineering processes.
Make sure you have configured integrations for the following categories:
Version control: Azure DevOps, Bitbucket, GitHub, GitLab
Observe metrics like PR size, review times, and merge frequency to see if AI is accelerating workflows
Project management: GitHub, Jira, Azure DevOps, ClickUp
Compare lead time for AI-influenced work items vs. traditional work
External docs: Cortex also recommends linking to runbooks and documentation for your entities, ensuring your users have access to critical information.
Use Scorecards to ensure that your entities contain external docs links to AI diagnostics.
With your data in Cortex, you have a jumping-off point to start driving AI maturity.
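To make the lead-time comparison above concrete, the sketch below contrasts two hypothetical sets of work-item lead times pulled from a project management tool. The numbers and the AI-assisted/traditional split are illustrative assumptions, not values Cortex produces directly:

```python
from statistics import median

# Hypothetical lead times in hours for completed work items.
# The data and the cohort split are illustrative assumptions.
ai_assisted = [18, 22, 14, 20, 16]
traditional = [30, 41, 27, 35, 33]

def pct_change(before: float, after: float) -> float:
    """Percent change from a baseline value to a new value."""
    return (after - before) / before * 100

baseline = median(traditional)  # 33
with_ai = median(ai_assisted)   # 18
print(f"Median lead time changed by {pct_change(baseline, with_ai):.0f}%")
```

Medians are used rather than means so that a single outlier work item does not skew the comparison.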
Step 2: Configure Scorecards to track AI adoption and standards 📋
Action Item: Create a Scorecard for AI Maturity
Scorecards automate the process of checking whether services meet criteria for your AI Maturity goals. Cortex's AI Maturity template includes a set of predefined rules which can be customized based on your organization's requirements, infrastructure, and goals. It is structured into three levels — Bronze, Silver, and Gold — with each representing increasing levels of AI Maturity.
The Scorecard template contains rules that check for industry best practices, such as verifying that AI tool instructions and rules are in your repositories.
Step 2.1: Create the Scorecard and configure the basics
On the Scorecards page in your workspace, click Create Scorecard.
On the AI Maturity template, click Use.
Configure basic settings, including the Scorecard's name, unique identifier, description, and more.
Learn about configuring the basic settings in the Creating a Scorecard documentation.
Step 2.2: Review and modify the rules
While Cortex's template is based on common industry standards, you may need to adjust the rules based on which tools you use and how your organization prioritizes standards and requirements. You can reorder, delete, and edit rules; add more rules to a level; and assign more points to a rule to signify its importance.
When adding or changing the template rules, you can select from a list of available pre-built rules. Behind each rule is a Cortex Query Language (CQL) query; you can also write your own queries to further refine your rules.
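For illustration only, a rule pairs a plain-language expectation with a query. The expressions below are hypothetical sketches in the style of CQL, not verified built-in functions; consult the CQL reference in your workspace for the real syntax available to you:

```
// Hypothetical, unverified CQL-style expressions:
git != null                            // entity is linked to a source repository
custom("ai-assistant-rules") != null   // a custom field recording AI tool rules exists
```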
Step 3: Automate processes via Workflows ⚙️
Action Item: Configure Workflows
You can use Workflows to streamline and standardize processes relating to your AI adoption initiatives.
Workflow to automate AI tool requests
You can use Workflows to automate the process of requesting access to AI tools. See an example Workflow for automating GitHub Copilot access in Dev onboarding: User access to Copilot.
Workflows to establish adherence to best practices
When scaffolding new services, you can use Scaffolder templates to ensure that every new service starts with baseline standards (e.g., runbooks include AI-driven diagnostics, services integrate AI-based anomaly detection, etc.).
See the documentation on registering a Scaffolder template and configuring a Scaffolder block.
Step 4: Configure Cortex MCP 🤖
Action Item: Configure Cortex MCP
Cortex MCP can significantly boost efficiency by providing instant, conversational access to critical service and team information directly from your MCP client. Including adoption of Cortex MCP in your AI initiatives helps your team move faster:
It provides real-time, structured answers: Ask questions like "Who is on call for backend-server?" or "Give me all the details for parser-service." MCP fetches the data in real time from Cortex's API, ensuring accurate and up-to-date information about service health, ownership, and operational readiness.
It gives you quick access to your centralized data: Efficiency goals can be slowed down by uncertainty over who owns models, pipelines, and other AI services. Use the MCP to quickly find out which teams are accountable for each tool in your environment.
It enables quick access to Scorecard details: If you implement Scorecards to measure AI-related initiatives, you can use the MCP to understand quickly how healthy your services are and how you can improve scores.
Step 5: Review Eng Intelligence metrics attributable to AI 📈
Action Items:
Review Eng Intelligence metrics and establish baselines.
Configure a user label to identify teams using AI tools.
Use Eng Intelligence features — DORA dashboard, Velocity Dashboard, and Metrics Explorer — to understand how well teams are performing before and after AI initiatives have been implemented.

Review baselines in areas such as incident frequency and time to resolution, and review the trends over time as AI initiatives (such as AI-assisted coding) are adopted.
This visibility helps leaders prove ROI and justify further investment in AI tooling, while also giving insight into where teams can improve efficiency across the software development lifecycle.
Create user labels
If you have some teams who are adopting AI tools before others, you can apply user labels to those teams. User labels allow you to group users into cohorts to analyze metrics based on different factors. Compare metrics between engineers who use AI tools and those who don't to understand the value that AI tools provide.
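The cohort comparison described above can be sketched as follows. The label name, metric, and numbers are illustrative assumptions, not real Eng Intelligence output:

```python
from statistics import mean

# Hypothetical per-engineer cycle times (days), grouped by a user label.
# The "ai-tools" label and all values are illustrative assumptions.
cycle_time_days = {
    "ai-tools": [2.1, 1.8, 2.4, 1.9],  # cohort labeled as AI tool adopters
    "no-label": [3.2, 2.9, 3.5, 3.1],  # everyone else
}

ai = mean(cycle_time_days["ai-tools"])    # ~2.05
rest = mean(cycle_time_days["no-label"])  # ~3.18
print(f"AI cohort cycle time: {ai:.2f}d vs. {rest:.2f}d for the baseline cohort")
```

In practice you would segment the dashboards by the label rather than exporting raw numbers, but the comparison logic is the same: establish a baseline cohort, then measure the labeled cohort against it.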