Out-of-the-box discovery of Compliance Rules and their metadata


The LeanIX VSM SonarQube integration offers automated out-of-the-box creation and updating of LeanIX Compliance Rule Fact Sheets. In this way, we provide Compliance information that can be linked to Software Artifacts to understand which Compliance Rules a Software Artifact is breaking. VSM focuses on the high-level information of Compliance Rules.

Integrate with SonarQube to:

  • Get a holistic overview of your organization's Compliance Rules
  • Understand how many Compliance Rules your Software Artifacts are breaking, along with related insights


The SonarQube Integration works with the LeanIX Integration Hub to scan your company data, process it into an LDIF and automatically trigger the Inbound Integration API processor.


Activating the integration

Please reach out to your CSE/M to have the integration activated in your workspace.


This integration requires running a Docker image in your environment with certain environment variables passed to it; more details can be found in the sections below.


  1. In your VSM workspace go to 'Administration > Integration Hub'
  2. Create a new data source by clicking on 'New Data source'
  3. Enter the following details:
    1. Connector Name: 'vsm-sonarqube-connector'
    2. Data source name: any name to identify the data source.
  4. Enter the following attributes in the connector template:
    1. sonarqubeUrl – The URL of the SonarQube instance the connector should connect to,
      e.g.: http://localhost:9000
    2. sqMaxPageSize – Optional technical parameter to set the page size for obtaining data
      from SonarQube. This is not mandatory for the connector to work.
    3. sqToken – A SonarQube token that grants the connector access to the SonarQube API for
      retrieving data
    "name": "vsm-sonarqube-datasource",
    "connectorConfiguration": {
        "sonarqubeUrl": ""
    "secretsConfiguration": {
        "sqToken": "********"
    "bindingKey": {
        "connectorType": "leanix-vsm-connector",
        "connectorId": "leanix-sonarqube-connector",
        "connectorVersion": "1.0.0",
        "processingDirection": "inbound",
        "processingMode": "full",
        "lxVersion": "1.0.0"

Since SonarQube is usually deployed on-premises, this particular integration requires one more step to successfully bring in data. Please find the details below.


"self-start" mode in Integration Hub

The SonarQube connector starts in "self-start" mode. The Integration Hub data source cannot be started or scheduled manually. Instead, schedule the connector setup in your environment to run periodically (e.g. via a Kubernetes CronJob, a manual trigger, etc.).
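
As a minimal sketch, periodic scheduling on a plain Linux host can be done with cron. The schedule, env-file path, and image reference below are placeholders, not values from this guide; substitute the environment file and image described in the following sections.

```shell
# Sketch: assemble a cron entry that runs the connector every 6 hours.
# ENV_FILE and IMAGE are placeholders; replace them with your env-file path
# and the image reference pulled from the container registry.
ENV_FILE="/opt/leanix/sonarqube_connector_env.list"
IMAGE="<connector-image>"
CRON_LINE="0 */6 * * * docker run --pull always --env-file $ENV_FILE $IMAGE"
echo "$CRON_LINE"   # add this line to your crontab (crontab -e)
```

On Kubernetes, the same command line would go into the container spec of a CronJob instead.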

Setup for External Execution of the Connector

The connector docker image is available in our public Azure container registry. The execution of this docker image can be scheduled in the environment in which the SonarQube instance to be scanned is available.

Prerequisites for Docker image execution

  • Docker CLI is available in your environment. You can test this by running the command
    docker -v in your terminal. The output should look like
    Docker version 20.10.5, build 55c4c88. If you do not have Docker installed on your host
    machine, you can install it from here.

Docker Image Pull

To pull the external-executable-sonarqube-connector from our public Azure container
registry, please run the below command in your terminal

docker pull

Setting up environment variables for Docker run

We suggest creating an environment file with the required environment variables listed below. This file's path needs to be passed as an input to the Docker container that will run the external-executable-sonarqube-connector image.

  1. LX_HOST: The LeanIX host name the connector is going to connect to.
  2. LX_APITOKEN: A valid API token obtained from the workspace the data will be delivered to, on the host specified under LX_HOST.
  3. LX_DATASOURCE_NAME: The name of the vsm-sonarqube-connector data source in the respective workspace that will be triggered by the connector.
LX_HOST={your domain}
LX_APITOKEN={your API token}
LX_DATASOURCE_NAME={Integration hub datasource name}
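
The steps above can be sketched as a small script; the values written here are placeholders, not real credentials.

```shell
# Sketch: create the env file consumed later by `docker run --env-file`.
# All three values are placeholders; substitute your LeanIX host, a valid
# API token, and the Integration Hub data source name.
cat > sonarqube_connector_env.list <<'EOF'
LX_HOST={your domain}
LX_APITOKEN={your API token}
LX_DATASOURCE_NAME={Integration hub datasource name}
EOF
```

Keep this file out of version control, since it contains the API token.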

Running the Docker Container

Run the below command to execute the connector. Please replace ./sonarqube_connector_env.list with the path to the actual environment file you created above.

docker run --pull always  --env-file ./sonarqube_connector_env.list

The connector should start, and you should see output similar to the following in Docker stdout:

> [email protected] start-self /usr/src/app
> node SonarQubeConnectorSelfStart

Attempting to self start via Integration Hub.
Successfully fetched the access token.
Successfully initiated the self start. Progress can also be checked in Sync Logging.
Updated IN_PROGRESS status to Integration Hub with status 200 and message: Starting to get data from sonarqube and process the LDIF!


Sync Logging

Open the "Sync Logging" tab to understand the progress of your current integration run. Sync Logging also provides information on previous integration runs.


Accessing SonarQube server

The above docker run command may need to be adjusted according to your network settings so that Docker can access the SonarQube server instance. For example, if Docker and the SonarQube server are running on the same host network, the --network host flag should be included in the above command.
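
For illustration, this is what the adjusted command would look like with host networking; the trailing image reference is a placeholder, and the command is only assembled here, not executed.

```shell
# Sketch: the run command from above with --network host added, for the case
# where SonarQube listens only on localhost of the same machine.
# <connector-image> is a placeholder for the pulled image reference.
CMD="docker run --pull always --network host --env-file ./sonarqube_connector_env.list <connector-image>"
echo "$CMD"
```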

Synchronized Data

Below you find how objects fetched from SonarQube are translated into LeanIX Fact Sheets and their attributes.


SonarQube → LeanIX Value Stream Management

  • Project → Software Artifact Fact Sheet
  • Rule → Compliance Rule Fact Sheet
  • Issues in a Project → Relation between Software Artifact and Compliance Rule Fact Sheets, with count
  • Rule Language → Tag on the Compliance Rule Fact Sheet
  • Project Dashboard Link → Link available on the Software Artifact Fact Sheet as a resource
  • Rule Dashboard Link → Link available on the Compliance Rule Fact Sheet as a resource

Additionally, every synchronized Compliance Rule Fact Sheet receives the tag SonarQube.

Mapping discovered APIs to Software Artifacts

Our integration will automatically discover all Rules you have in SonarQube and create Compliance Rule Fact Sheets (as detailed above). Currently, information stored in SonarQube does not allow LeanIX to connect the Compliance Rule Fact Sheets to Software Artifacts (e.g. Microservices) automatically. To leverage a semi-automatic mapping mechanism follow the steps below:

  1. Retrieve and copy the 'Project Key' of the relevant project from your SonarQube instance.
  2. In the to-be-linked Software Artifact Fact Sheet, paste the project key from step 1 into the field SonarQube Project Key.
  3. Run the connector in a one-off run as per Configuration to instantly link the Fact Sheets. As part of your scheduled connector run, the processors will attempt to match Compliance Rule and Software Artifact Fact Sheets automatically.
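
Step 1 can also be done against the SonarQube Web API. The sketch below only assembles the request; api/projects/search is a standard SonarQube endpoint (it may require administration permissions, so verify it against your server version), and the token is passed as the basic-auth username with an empty password. URL and token are placeholders.

```shell
# Sketch: list project keys via the SonarQube Web API.
# SONARQUBE_URL and SQ_TOKEN are placeholders; the request is assembled but
# not sent, so you can review it before running it against your instance.
SONARQUBE_URL="http://localhost:9000"
SQ_TOKEN="<your-sonarqube-token>"
REQUEST="curl -s -u ${SQ_TOKEN}: ${SONARQUBE_URL}/api/projects/search"
echo "$REQUEST"
```

The response lists components whose "key" field is the project key to paste into the Fact Sheet.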

Removing irrelevant data

  • If one of your projects in SonarQube no longer breaks a particular Compliance Rule, the relation between the respective Software Artifact and Compliance Rule is removed.
  • If one of your compliance rules in the SonarQube instance is no longer tracked (deleted), the corresponding Compliance Rule Fact Sheet is archived in the workspace.

Extending the Integration

The integration also supports Integration API execution groups via Integration Hub, enabling users to add custom processors. To process the data correctly, you need to add a custom processor set.


Sample data source configuration with execution group

"executionGroup": "vsmSonarqubeInbound"

The unique execution group name for this integration is vsmSonarqubeInbound.
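
As an illustration, a custom processor set registered under this execution group could look like the fragment below. The processor body is hypothetical and only shows where the executionGroup key goes; align the fields with your workspace's data model and the Integration API processor reference.

```json
{
  "executionGroup": "vsmSonarqubeInbound",
  "processors": [
    {
      "processorType": "inboundFactSheet",
      "processorName": "Custom Compliance Rule processor",
      "processorDescription": "Hypothetical example; adjust to your data model",
      "type": "ComplianceRule",
      "filter": { "exactType": "rule" },
      "updates": [
        {
          "key": { "expr": "name" },
          "values": [ { "expr": "${content.id}" } ]
        }
      ]
    }
  ]
}
```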

The integration API will pick up your processors and merge them with the base processors at execution time. Make sure to set the Integration API run number accordingly.

For more information on the execution groups visit:


How do I generate a SonarQube token?

  1. Go to your org's SonarQube Home page
  2. Go to "My Account" and open "Security" tab
  3. Enter a name for the token and click "Generate Token"
  4. A new token is generated and shown in the UI.
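
To check that a freshly generated token works, you can call the standard SonarQube endpoint api/authentication/validate. The sketch below only assembles the request; URL and token are placeholders.

```shell
# Sketch: verify a new token against the server. The request is assembled
# but not sent; replace the placeholders with your instance URL and the
# token generated in step 4 before running it.
SONARQUBE_URL="http://localhost:9000"
SQ_TOKEN="<token-from-step-4>"
CHECK="curl -s -u ${SQ_TOKEN}: ${SONARQUBE_URL}/api/authentication/validate"
echo "$CHECK"
```

A valid token makes the endpoint respond with {"valid":true}.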
