# Measure AI impact

You can use the [Eng Intelligence](/improve/eng-intelligence.md) features in Cortex to measure the impact of AI.

## Copilot Dashboard

The [Copilot Dashboard](/improve/eng-intelligence/dashboards/ai-impact.md) provides insights into the adoption and impact of AI coding tools (starting with GitHub Copilot) across your engineering teams. It allows you to:

* **View AI adoption rates by team:** See how many developers are actively using Copilot and how adoption trends over time.
* **Correlate AI usage with engineering performance:** Compare key metrics (cycle time, deployment frequency, incident frequency, merged PRs, PR size, time to resolution, and work items completed) between teams or users who use AI tools and those who do not.
* **Track AI adoption over time:** Monitor how Copilot usage changes and how it affects delivery and reliability metrics.
* **Filter and analyze:** Use filters for time range, team, user, and more to drill into the data that matters most to you.

These insights help you identify whether AI tools are improving efficiency, accelerating delivery, or introducing new risks, and allow you to take targeted action based on real data.

## Metrics Explorer

[Metrics Explorer](/improve/eng-intelligence/metrics-explorer.md) includes AI-specific metrics such as:

* **AI adoption rate:** Percentage of licensed seats that are active users of AI coding tools.
* **Active AI users:** Number of users who used AI tools in a given period.
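As a rough sketch, the adoption-rate metric above reduces to a simple ratio of active AI users to licensed seats. The function and field names below are illustrative, not Cortex API names:

```python
def ai_adoption_rate(active_ai_users: int, licensed_seats: int) -> float:
    """Percentage of licensed seats actively using AI coding tools.

    Illustrative calculation only; Cortex computes this metric for you.
    """
    if licensed_seats == 0:
        return 0.0
    return 100.0 * active_ai_users / licensed_seats

print(ai_adoption_rate(42, 120))  # 35.0
```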

You can group any Eng Intelligence metric by the Cortex-generated "AI Usage" user label to compare performance between AI users and non-AI users.

## Drive continuous improvement

You can use insights from the Copilot Dashboard and Metrics Explorer to understand what your teams are doing well and areas where they could improve. Use this data to inform your next steps and promote continuous improvement.

For example, if you notice a spike in incidents among AI users, you could [create an Initiative](/improve/initiatives.md) to improve test coverage or code review practices.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.cortex.io/guides/ai-excellence/measure-ai-impact.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.
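For example, a client could build the request URL like this. This is a minimal sketch assuming the endpoint accepts a standard URL-encoded `ask` parameter; the question text is illustrative:

```python
from urllib.parse import urlencode

BASE = "https://docs.cortex.io/guides/ai-excellence/measure-ai-impact.md"

def ask_url(question: str) -> str:
    """Build the documentation-query URL with a URL-encoded `ask` parameter."""
    return f"{BASE}?{urlencode({'ask': question})}"

# Illustrative question; any specific, self-contained question works:
url = ask_url("Which metrics does the Copilot Dashboard compare between AI users and non-AI users?")

# The GET request itself can then be made with any HTTP client, e.g.:
#   import urllib.request
#   answer = urllib.request.urlopen(url).read().decode()
```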

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
