Automate AI Governance

To configure your Cortex workspace for AI Governance, we recommend the following actions:

Use Cortex features to drive AI Governance

The following steps describe how to configure Cortex features to drive AI Governance.

Step 1: Ingest data and solve ownership 🔌

For AI initiatives to succeed, it is crucial to import your services, resources, infrastructure, and other entities, and to have clear visibility into the ownership of your entities.

Connecting your entities to Cortex establishes a single source of truth across your engineering organization. It enables you to get timely information from Cortex MCP, track progress via Scorecards, automate processes via Workflows, and gain insights from Eng Intelligence.

In addition to the built-in entity types that Cortex supports, you can create custom entity types to represent your AI-related entities. For example, you might want to create a type called "AI Models" to categorize models.
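As a sketch of what an entity of a custom type might look like, the snippet below generates a minimal catalog descriptor for a model registered under a hypothetical custom type with the tag ai-model. The x-cortex-* keys mirror Cortex's YAML descriptor format, but the custom type must already exist in your workspace, and the exact keys should be confirmed against the catalog descriptor documentation.

```python
# Sketch: generate a cortex.yaml descriptor for an entity of a custom "AI Models" type.
# The "ai-model" type tag, owner, and other values are illustrative; the custom type
# must already exist in your workspace, and key names should be confirmed against
# Cortex's catalog descriptor documentation.
import yaml  # pip install pyyaml

descriptor = {
    "openapi": "3.0.1",
    "info": {
        "title": "Fraud Detection Model",
        "description": "Gradient-boosted model that scores payment fraud risk.",
        "x-cortex-tag": "fraud-detection-model",
        "x-cortex-type": "ai-model",  # tag of the custom entity type
        "x-cortex-owners": [
            {"type": "group", "name": "ml-platform", "provider": "CORTEX"}
        ],
    },
}

with open("cortex.yaml", "w") as f:
    yaml.safe_dump(descriptor, f, sort_keys=False)
```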

Setting ownership of entities ensures that every service and system is clearly linked to accountable teams or individuals, enabling faster incident response, reducing handoff friction, and making it possible to enforce standards consistently.

We recommend adding a group called ai-enabled to your AI-enabled services. This will allow you to filter your AI-enabled services in the relationship graph, Scorecards, and other places in Cortex where filters are supported.
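Once the group is in place, you can also pull your AI-enabled entities programmatically, for example to feed reports or automation. The sketch below assumes Cortex's public REST API base URL and a groups filter on the catalog list endpoint; verify the exact parameters against the Cortex API reference.

```python
# Sketch: list entities tagged with the "ai-enabled" group via the Cortex REST API.
# The base URL, endpoint path, and "groups" query parameter are assumptions to verify
# against Cortex's API reference; CORTEX_API_TOKEN is an API token for your workspace.
import os
import requests

CORTEX_API = "https://api.getcortexapp.com/api/v1"  # adjust for self-hosted workspaces
headers = {"Authorization": f"Bearer {os.environ['CORTEX_API_TOKEN']}"}

resp = requests.get(
    f"{CORTEX_API}/catalog",
    headers=headers,
    params={"groups": "ai-enabled"},
)
resp.raise_for_status()

for entity in resp.json().get("entities", []):
    print(entity.get("tag"), "-", entity.get("name"))
```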

Relevant integrations

To focus on driving AI Governance, Cortex recommends integrating with tools that provide visibility and control over code, project management, CI/CD, observability, and documentation. This enables visibility into which AI tools are being used and how they're managed.

Make sure you have configured integrations for each of these categories: code, project management, CI/CD, observability, and documentation.

With your data in Cortex, you have a jumping-off point to start driving AI Governance.

Step 2: Configure Scorecards to track AI governance 📋

Scorecards automate the process of checking whether services meet the criteria for your AI Governance goals. Cortex's AI Governance template includes a set of predefined rules that can be customized based on your organization's requirements, infrastructure, and goals. It is structured into three levels (Bronze, Silver, and Gold), with each level representing a higher bar for governance.

The Scorecard template contains rules that check for industry best practices, such as:

  • PR reviews required from codeowners, ensuring humans review AI-written code

  • Automated security testing in CI/CD

  • Incident response plan for AI security

Prerequisites

One of the rules in this template requires a custom data field on entities to track whether entity owners have read and reviewed the MITRE ATLAS matrix. Before using this rule, create a custom data field named owners-reviewed-mitre-atlas-matrix.
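If you prefer to set that field programmatically once owners complete the review, a call along the lines of the sketch below can do it. The custom data endpoint path and payload shape are assumptions to verify against Cortex's API reference.

```python
# Sketch: record that an entity's owners have reviewed the MITRE ATLAS matrix by
# setting the custom data field the Scorecard rule checks. The endpoint path and
# payload shape are assumptions; confirm them in Cortex's custom data API docs.
import os
import requests

CORTEX_API = "https://api.getcortexapp.com/api/v1"
headers = {"Authorization": f"Bearer {os.environ['CORTEX_API_TOKEN']}"}

entity_tag = "fraud-detection-model"  # any entity tag from your catalog
payload = {"key": "owners-reviewed-mitre-atlas-matrix", "value": True}

resp = requests.post(
    f"{CORTEX_API}/catalog/{entity_tag}/custom-data",
    headers=headers,
    json=payload,
)
resp.raise_for_status()
print("Custom data set for", entity_tag)
```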

Step 2.1: Create the Scorecard and configure the basics

  1. On the Scorecards page in your workspace, click Create Scorecard.

  2. On the AI Governance template, click Use.

  3. Configure basic settings, including the Scorecard's name, unique identifier, description, and more.

Step 2.2: Review and modify the rules

While Cortex's template is based on common industry standards, you may need to adjust the rules based on which tools you use and how your organization prioritizes standards and requirements. You can reorder, edit, or delete rules, add more rules to a level, and assign more points to a rule to signify its importance.

When adding or changing the template rules, you can select from a list of available pre-built rules. Behind each rule is a Cortex Query Language (CQL) query; you can also write your own queries to further refine your rules.
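Once the Scorecard is saved, it can be useful to spot-check how entities score against the adjusted rules outside the UI. The sketch below assumes a Scorecard tag of ai-governance and a scores endpoint similar to the one in Cortex's public API; adjust both to match your workspace.

```python
# Sketch: pull per-entity scores for an "ai-governance" Scorecard to see which
# entities are falling short of the customized rules. The Scorecard tag and the
# scores endpoint path are assumptions; verify them against the Scorecards API docs.
import os
import requests

CORTEX_API = "https://api.getcortexapp.com/api/v1"
headers = {"Authorization": f"Bearer {os.environ['CORTEX_API_TOKEN']}"}

resp = requests.get(f"{CORTEX_API}/scorecards/ai-governance/scores", headers=headers)
resp.raise_for_status()

for entry in resp.json().get("scores", []):
    print(entry)  # inspect the response shape for entity tags, levels, and rule results
```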

Step 3: Automate processes via Workflows ⚙️

You can use Workflows to streamline and standardize processes relating to your AI governance initiatives.

Workflows to establish adherence to best practices

  • When Scaffolding new services, you can use templates to ensure that every new service starts with baseline standards (e.g., runbooks include AI-driven diagnostics, services integrate AI-based anomaly detection, risk assessment files are included, etc.).

    • As a best practice, your Scaffolder template should include files for:

      • AI service configuration security

      • AI security documentation and guidelines

      • Data privacy and PII protection measures

      • External AI vendor risk assessments

      • AI model access controls and authentication

Note that the AI Governance Scorecard template checks for the files listed above.
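As a lightweight complement to that Scorecard check, a script like the following can verify locally or in CI that a scaffolded repository actually contains the expected governance files. The file paths here are hypothetical placeholders; align them with whatever your Scaffolder template generates and your Scorecard rules look for.

```python
# Sketch: verify that a scaffolded repository contains the expected AI governance
# files. The paths below are hypothetical placeholders; align them with the file
# names your Scaffolder template generates and your Scorecard rules check for.
from pathlib import Path
import sys

EXPECTED_FILES = [
    "security/ai-service-config.md",     # AI service configuration security
    "docs/ai-security-guidelines.md",    # AI security documentation and guidelines
    "docs/data-privacy-pii.md",          # data privacy and PII protection measures
    "risk/ai-vendor-assessment.md",      # external AI vendor risk assessments
    "security/ai-access-controls.md",    # AI model access controls and authentication
]

repo_root = Path(sys.argv[1] if len(sys.argv) > 1 else ".")
missing = [f for f in EXPECTED_FILES if not (repo_root / f).exists()]

if missing:
    print("Missing governance files:")
    for f in missing:
        print(f"  - {f}")
    sys.exit(1)

print("All expected governance files are present.")
```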

Step 4: Configure Cortex MCP 🤖

Cortex MCP can significantly boost efficiency by providing instant, conversational access to critical service and team information directly from your MCP client. Use Cortex MCP to ask questions about visibility, compliance, accountability, and risks.

  • It provides real-time, structured answers: Ask questions like "What AI models are currently deployed in our environment?" or "Give me next steps for my AI Governance Scorecard." MCP fetches the data in real time from Cortex's API, ensuring accurate and up-to-date information about service health, ownership, and operational readiness.

  • It gives you quick access to your centralized data: Efficiency goals can be slowed down by uncertainty over who owns models, pipelines, and other AI services. Use the MCP to quickly find out which teams are accountable for each tool in your environment.

  • It enables quick access to Scorecard details: If you implement Scorecards to measure AI-related initiatives, you can use the MCP to quickly understand how healthy your services are and how to improve their scores.

Step 5: Review Eng Intelligence metrics attributable to AI 📈

When focusing on an AI governance use case, you might want to know whether AI tools are improving efficiency without sacrificing reliability.

Use Eng Intelligence features — the DORA dashboard, Velocity Dashboard, and Metrics Explorer — to understand how well teams are performing before and after AI initiatives have been implemented.

Review trends in Eng Intelligence graphs and metrics.

Review baselines in areas such as cycle time, incident frequency, and time to resolution, and review the trends over time as AI initiatives are adopted.

This visibility gives insight into where teams can improve efficiency across the software development lifecycle and whether the addition of AI tooling is impacting delivery and reliability.
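If you export metric data for deeper analysis (for example from Metrics Explorer or your delivery tooling), a simple before-and-after comparison can make these trends concrete. The CSV columns and the adoption date in the sketch below are assumptions for illustration.

```python
# Sketch: compare median cycle time before and after an AI tooling rollout.
# Assumes a CSV export with "merged_at" and "cycle_time_hours" columns and an
# illustrative adoption date; adapt both to your actual data.
import pandas as pd

ADOPTION_DATE = pd.Timestamp("2025-01-01")

df = pd.read_csv("cycle_times.csv", parse_dates=["merged_at"])
df["period"] = df["merged_at"].apply(
    lambda ts: "after AI rollout" if ts >= ADOPTION_DATE else "before AI rollout"
)

print(df.groupby("period")["cycle_time_hours"].median())
```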

AI Governance in action

Learn about what ongoing AI Governance looks like in AI Governance in action.
