# cki.kcidb.datawarehouse_submitter

Webhook responsible for submitting KCIDB data from GitLab jobs to DataWarehouse

CKI pipeline results are stored in an artifact called `kcidb_all.json` that contains data structured according to the KCIDB schema.
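
As a rough illustration of the artifact layout, the snippet below loads such a file and inspects its top-level objects. The checkouts/builds/tests grouping follows the public KCIDB schema; the exact schema version emitted by the pipeline is an assumption here.

```python
import json

# Load a kcidb_all.json artifact and inspect its top-level layout.
# The field names follow the public KCIDB schema; the version shown
# in the comment below is illustrative, not guaranteed.
with open("kcidb_all.json") as artifact:
    data = json.load(artifact)

print(data["version"])  # e.g. {"major": 4, "minor": 3}
for kind in ("checkouts", "builds", "tests"):
    print(kind, len(data.get(kind, [])))
```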

The datawarehouse_submitter listens for `herder.build` messages from the pipeline-herder and submits the corresponding results to DataWarehouse.
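
For orientation, here is a minimal consumer sketch using pika that shows how this message flow fits together. It is an illustration only: the real service is built on the CKI messaging helpers, the payload fields of `herder.build` messages are not documented here, and the queue/exchange names come from the environment variables described below.

```python
import json
import os

import pika

# Sketch of the AMQP wiring implied by the environment variables documented
# below; TLS setup for port 443 is omitted for brevity.
connection = pika.BlockingConnection(pika.ConnectionParameters(
    host=os.environ["RABBITMQ_HOST"],
    port=int(os.environ["RABBITMQ_PORT"]),
    credentials=pika.PlainCredentials(
        os.environ["RABBITMQ_USER"], os.environ["RABBITMQ_PASSWORD"]),
))
channel = connection.channel()

# Bind the submitter queue to the webhook receiver exchange for each routing
# key (assuming a whitespace-separated list, e.g. "herder.build").
queue = os.environ["DATAWAREHOUSE_SUBMITTER_QUEUE"]
channel.queue_declare(queue=queue, durable=True)
for routing_key in os.environ["DATAWAREHOUSE_SUBMITTER_ROUTING_KEYS"].split():
    channel.queue_bind(queue=queue,
                       exchange=os.environ["WEBHOOK_RECEIVER_EXCHANGE"],
                       routing_key=routing_key)


def on_message(channel, method, properties, body):
    # A herder.build message identifies the finished GitLab job whose
    # kcidb_all.json artifact should be submitted; the exact payload fields
    # are an assumption, so this sketch just prints the message.
    print("received:", json.loads(body))
    channel.basic_ack(method.delivery_tag)


channel.basic_consume(queue=queue, on_message_callback=on_message)
channel.start_consuming()
```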

The service can be deployed locally by running:

```shell
python -m cki.kcidb.datawarehouse_submitter
```

## Environment variables

| Name | Secret | Required | Description |
|------|--------|----------|-------------|
| `CKI_DEPLOYMENT_ENVIRONMENT` | no | no | Define the deployment environment (production/staging) |
| `GITLAB_TOKENS` | no | yes | URL/environment variable pairs of GitLab instances and private tokens as a JSON object (see the example after this table) |
| `GITLAB_TOKEN` | yes | yes | GitLab private token as configured in `GITLAB_TOKENS` above |
| `CKI_METRICS_ENABLED` | no | no | Enable Prometheus metrics. Default: `false` |
| `CKI_METRICS_PORT` | no | no | Port where Prometheus metrics are exposed. Default: `8000` |
| `DATAWAREHOUSE_URL` | no | yes | URL to DataWarehouse |
| `DATAWAREHOUSE_TOKEN_SUBMITTER` | yes | yes | Token for DataWarehouse |
| `RABBITMQ_HOST` | no | yes | AMQP host |
| `RABBITMQ_PORT` | no | yes | AMQP port; TLS is used for port 443 |
| `RABBITMQ_USER` | no | yes | AMQP user |
| `RABBITMQ_PASSWORD` | yes | yes | AMQP password |
| `RABBITMQ_CAFILE` | no | yes | AMQP CA file path |
| `RABBITMQ_CERTFILE` | no | yes | AMQP certificate + private key file path |
| `WEBHOOK_RECEIVER_EXCHANGE` | no | yes | AMQP exchange to receive messages from |
| `DATAWAREHOUSE_SUBMITTER_QUEUE` | no | yes | AMQP queue name that is bound to the exchange |
| `DATAWAREHOUSE_SUBMITTER_ROUTING_KEYS` | no | yes | AMQP routing keys for the messages sent to the queue |
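
The interaction between `GITLAB_TOKENS` and `GITLAB_TOKEN` is easiest to see with an example. The sketch below assumes an illustrative `GITLAB_TOKENS` value; the JSON object maps each GitLab instance URL to the name of the environment variable that holds the actual private token.

```python
import json
import os

# Assuming e.g. GITLAB_TOKENS='{"https://gitlab.com": "GITLAB_TOKEN"}':
# the JSON object maps a GitLab instance URL to the *name* of the
# environment variable that holds the actual private token.
gitlab_tokens = json.loads(os.environ["GITLAB_TOKENS"])
token_variable = gitlab_tokens["https://gitlab.com"]  # -> "GITLAB_TOKEN"
private_token = os.environ[token_variable]
```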

## Manual triggering

Results can also be submitted manually for a single GitLab job via:

```shell
CKI_LOGGING_LEVEL=INFO \
  CKI_DEPLOYMENT_ENVIRONMENT=production \
  python3 -m cki.kcidb.datawarehouse_submitter JOB_URL
```