Internally hosted integrations
Integrating internally hosted services into Cortex
You can use Cortex Axon Relay to source data from an internally-hosted system and reflect that data in Cortex. Cortex Axon is a framework that can be used to build jobs that run in your environment and securely send data to Cortex. Everything that is possible via a direct API integration can also be achieved through Axon, enabling flexible and secure deployment models without sacrificing capability.
Use Axon Relay to allow Cortex to access internally-hosted integrations for Bitbucket, GitHub, GitLab, Jira, Prometheus, and SonarQube.
You can also use Axon Relay to call internal service endpoints via a Workflow in Cortex.
How it works

Cortex Axon uses an open-source project published by Snyk called Snyk Broker. Snyk Broker uses WebSockets to create a secure tunnel between the internal network and cloud-hosted Cortex. HTTP requests are redirected through this WebSocket channel to the Axon agent. You do not need to open inbound firewall ports, as the tunnel is initiated from the internal network.
When deploying Axon, you provide your API tokens or credentials as secrets stored on infrastructure you own, within your network. Axon securely holds these credentials and uses them to proxy requests to third-party integrations. Your sensitive information stays inside your virtual private cloud (VPC) and is never exposed to Cortex cloud services.
This is what the process looks like:
1. On the Cortex side, you register the integration using an alias name you provide.
2. The Cortex Axon Docker container is started with your Cortex API key, the integration type, and the alias (a minimal example is sketched just after this list).
3. The Cortex Axon agent connects to the Cortex service, authenticates, and registers itself with the integration type and alias name.
4. The agent starts an instance of the snyk-broker client process and uses configuration details from the /register call (the registration in the previous step) to connect to the snyk-broker server instance in the Cortex backend.
Once this is established, API calls made on the Cortex side are relayed to the internal network, and the responses are relayed back to the Cortex service.
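For illustration only: the container in step 2 is the same one you will configure with Docker Compose in Step 1.2. A roughly equivalent docker run invocation, reusing the image, flags, alias, and .env variables shown later on this page, might look like this:

# Assumes .env (created in Step 1.2) holds CORTEX_API_TOKEN and your integration credentials.
docker run --rm --env-file .env \
  ghcr.io/cortexapps/cortex-axon-agent:latest \
  relay -i github -a github-relay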
How to use Cortex Axon
Axon is composed of an agent that runs in a Docker container (cortex-axon-agent), standalone or in Kubernetes, and creates a secure tunnel between the broker client in your network and Cortex.
Prerequisites
Before getting started:
Create an API key in Cortex.
Create authentication credentials for the integration you're configuring.
Step 1: Set up the Cortex Axon agent
Step 1.1: Configure the Relay in Cortex
In Cortex, click Integrations. Search for the integration you are setting up, then click +Install.

For the configuration type, select Relay.
In the side panel, enter an alias and configure any other necessary fields. At the bottom, click Save.
Step 1.2: Create a .env file and a docker-compose.yml file
Locally on your machine, create a file called .env. Inside the file, add contents for the integration you are configuring; see the variables for your integration in the README.

For example, for GitLab you would add:

CORTEX_API_TOKEN=your_cortex_token
GITLAB_TOKEN=your_gitlab_token

More information will be coming soon on Kubernetes deployments.
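In the meantime, if you plan to run the agent in Kubernetes rather than Docker Compose, the same variables can be held in a standard Kubernetes Secret and exposed to the container as environment variables. This is a minimal, hypothetical sketch; the Secret name is an assumption, not Cortex-documented configuration:

apiVersion: v1
kind: Secret
metadata:
  name: cortex-axon-env   # hypothetical name
type: Opaque
stringData:
  CORTEX_API_TOKEN: your_cortex_token
  GITLAB_TOKEN: your_gitlab_token

The agent's pod spec could then reference this Secret with envFrom and secretRef, which injects each key as an environment variable.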
Locally on your machine, create a file called docker-compose.yml. Inside the file, add contents for the integration you are configuring:
services:
  axon:
    image: ghcr.io/cortexapps/cortex-axon-agent:latest
    env_file: .env
    environment:
      - GITHUB_API=api.github.com
      - GITHUB_GRAPHQL=api.github.com/graphql
    command: [
      "relay",
      "-i", "github",
      "-a", "github-relay", # this is the alias you set up in the Cortex UI
      # if you are using a GitHub App token, add the following line
      # "-s", "app",
    ]

For integrations other than GitHub, use the following format.
Make sure to replace INTEGRATION under environment:, and any mentions of integrationName, with the name of the integration you are configuring.
services:
  axon:
    image: ghcr.io/cortexapps/cortex-axon-agent:latest
    env_file: .env
    environment:
      - INTEGRATION_API=api.integrationName.com
    command: [
      "relay",
      "-i", "integrationName",
      "-a", "integrationName-relay", # this is the alias you set up in the Cortex UI
    ]
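Before starting the agent, you can optionally sanity-check the file. The standard docker compose config command validates the YAML and prints the resolved configuration:

docker compose config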
Step 2: Run the agent
Run the agent in a production environment
In a production environment, use the Helm chart provided by Cortex.
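Cortex provides the chart and its installation details; the commands below are only a hedged sketch of what a Helm-based install generally looks like. The repository URL, chart name, and release name are placeholders, not real Cortex values:

# All names and URLs below are placeholders; use the chart coordinates Cortex provides.
helm repo add cortex-axon https://example.com/helm-charts
helm repo update
helm upgrade --install axon-relay cortex-axon/cortex-axon-agent \
  --namespace cortex-axon --create-namespace \
  --values values.yaml

Your API token, integration type, and alias would be supplied through the chart's values; the exact keys depend on the chart.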
Run the agent in a sandbox environment
In your CLI, run the command docker compose up. You should see the agent start and connect to Cortex.
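If you prefer to keep your terminal free, you can instead run the agent in the background and follow its output; both are standard Docker Compose commands, using the axon service name from the file above:

docker compose up -d
docker compose logs -f axon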
Verify that your agent is working:
In Cortex, go to Integrations, then navigate to your integration's settings page.
Next to the Relay configuration you set up in the previous steps, click the play icon to test the integration.
If you watch the logging output in your CLI, you should see the agent receive the request and forward it to your internal service.
The page in Cortex should display a success message.
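When you are done testing in the sandbox, you can stop and remove the agent container with the standard Docker Compose command:

docker compose down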
Examples
See examples of using Axon with unsupported tools in the Cortex Axon repository.