Copilot Dashboard (Beta)

The Copilot Dashboard provides insight into Copilot adoption and engagement across your engineering teams, including visualizations of impact by team AI adoption rate, the impact of Copilot users vs. non-Copilot users, and AI adoption trends over time. Use these insights to identify issues and drive continuous improvement.

The Copilot Dashboard contains three charts: impact by team AI adoption rate, Copilot users vs. non-Copilot users, and Copilot adoption over time.

The Copilot Dashboard is available for cloud customers in the beta program.

Using the Copilot Dashboard

Prerequisites

Before getting started:

  • You must have already configured the GitHub integration.

    • If you configured it before October 14, 2025, you must update your permissions so that Cortex can pull in Copilot data. Your fine-grained access token should include the admin:org read permission. See this KB article for more information. A quick way to sanity-check the token is sketched after this list.

  • In Cortex, you must have the View Eng Intelligence permission to view the dashboard, and you must have the Configure Eng Intelligence permission to configure it.
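
To confirm the token has the needed org-level read access before wiring it into Cortex, you can call GitHub's REST API directly. The sketch below is only a permissions sanity check, not how Cortex ingests data; the org name is a placeholder and the token is read from an environment variable.

```python
# Sanity check: can this token read org-level Copilot data from GitHub?
# Placeholder org name; GITHUB_TOKEN must be set in the environment.
import os

import requests

ORG = "your-org"  # placeholder: your GitHub organization slug

resp = requests.get(
    f"https://api.github.com/orgs/{ORG}/copilot/billing/seats",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    },
)
if resp.ok:
    print(f"OK: token can see {resp.json().get('total_seats')} Copilot seats")
else:
    print(f"Check permissions: HTTP {resp.status_code} - {resp.json().get('message')}")
```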

View the Dashboard

Navigate to Eng Intelligence > Dashboards to see the full list of Cortex-built and Custom Dashboards available in your workspace. Click the Copilot Dashboard.

The Copilot Dashboard contains multiple charts that you can filter and compare with key engineering metrics:

Impact by team AI adoption rate

View the impact that a team's AI adoption rate has on their cycle time, deployment frequency, incident frequency, merged PRs, PR size, time to resolution, and work items completed.

Hover over the data in the graph to see more information about a team and its metrics.

Eng Intelligence compares active users against Copilot seats once daily.
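
For intuition, that comparison boils down to simple division: active Copilot users over assigned seats. The snippet below is an illustration of that arithmetic, not Cortex's actual computation.

```python
# Illustration only: adoption rate as active Copilot users / assigned seats.
def adoption_rate(active_users: int, assigned_seats: int) -> float:
    """Return the team's AI adoption rate as a percentage (0.0 if no seats)."""
    if assigned_seats == 0:
        return 0.0
    return 100.0 * active_users / assigned_seats

# Example: 12 of a team's 20 assigned seats were active today -> 60.0
print(adoption_rate(12, 20))
```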

Copilot users vs. non-Copilot users

See the impact that using AI, or not using AI, has on cycle time, merged PRs, PR size, time to resolution, and work items completed. Cortex automatically creates an "AI Usage" user label that is attached to any team member who was active in Copilot within the last 7 days.
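
That label logic amounts to a recency check. Here is a rough sketch under an assumed data model, where last_copilot_activity is a hypothetical per-user timestamp (not Cortex's actual schema):

```python
# Sketch of the "active in Copilot within the last 7 days" rule.
# `last_copilot_activity` is a hypothetical field, not Cortex's schema.
from datetime import datetime, timedelta, timezone

def has_ai_usage_label(last_copilot_activity: datetime | None) -> bool:
    """True if the user was active in Copilot within the last 7 days."""
    if last_copilot_activity is None:
        return False
    return datetime.now(timezone.utc) - last_copilot_activity <= timedelta(days=7)

# Example: a user active 3 days ago gets the "AI Usage" label.
print(has_ai_usage_label(datetime.now(timezone.utc) - timedelta(days=3)))  # True
```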

Hover over the data in the graph to see more information about the metrics.

Copilot adoption over time

View your overall Copilot adoption metrics over time.

Filter and configure the Dashboard

Overlay a data point

To overlay a data point, click the dropdown in the upper left corner of a chart and choose the data point you want to compare.

Filter the chart

Further configure your dashboard with the following options:

  • Time range: In the upper right corner of the page, click the time range filter to apply a different time range. By default, the dashboard shows data from the last 30 days.

  • Display: In the upper right corner of the page, click Display and choose whether to view the graphs by day, week, or month.

  • Operation: In the upper right corner of a graph, click Average to open the dropdown menu for operations. You can choose from average, max, median, min, P95, and sum (illustrated in the sketch after this list).

  • Other filters: In the upper right corner of a graph, click Filter to apply filters for entity type, entity, group, owner, repository, reviewer, reviewer user label, status, team, user, and user label.
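
To make the operations concrete, the snippet below applies each aggregation to the same series of daily values; P95 is the value below which roughly 95% of the points fall. This is an illustration of the options, not Cortex's exact math (its percentile interpolation, for example, may differ).

```python
# Illustration of the operation dropdown's aggregations.
# Cortex's exact percentile interpolation may differ.
import statistics

values = [4, 7, 2, 9, 5, 11, 6]  # e.g., a team's merged PRs per day

print("average:", statistics.mean(values))
print("max:", max(values))
print("median:", statistics.median(values))
print("min:", min(values))
# statistics.quantiles with n=100 returns 99 cut points; index 94 is P95.
print("P95:", statistics.quantiles(values, n=100)[94])
print("sum:", sum(values))
```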

Driving continuous improvement

Use the Copilot Dashboard to track key AI adoption metrics and correlate them with engineering performance metrics. These insights help you identify issues in your processes so you can take action and drive improvements.

Example scenario: You view your dashboard and notice that teams who have higher rates of AI adoption have a lower average cycle time (the time it takes for a single PR to go through the entire coding process). However, you also see a spike in incident frequency for those teams.

  • Identify the issue: You conclude that their process is more efficient, but as a tradeoff, they're shipping code that causes more incidents. You brainstorm with the affected engineering teams to learn how their processes have changed since adopting AI. You learn that SonarQube code coverage is lower than it has been previously.

  • Take action: You already have an AI Readiness Scorecard launched, and you see that the rule "Test coverage minimum met" is failing. You can create an Initiative on that Scorecard to ask developers to meet a particular rule by a specified deadline. In this case, you create an Initiative that asks the developers to pass the failing rule (i.e., they need to ensure higher than 80% test coverage) by the end of the month.

    • Initiatives and Scorecard tasks appear as to-do items on a developer's homepage in Cortex.

  • Review the dashboard: As developers work toward meeting the requirements, check back on the dashboard for a real-time view of their progress.
