Artifact storage

How the CKI pipeline manages storage for various kinds of artifacts

Data consumed and produced by the CKI pipeline is mainly stored in S3 buckets.

Pipeline artifact storage

GitLab pipeline jobs can store up to 1 GB of artifacts on gitlab.com. As kernel build jobs can produce kernel RPMs bigger than that, pipeline artifacts are stored on S3 instead, controlled by artifacts_mode=s3 and the BUCKET_ARTIFACTS pipeline variable.
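
For illustration, the relevant variables could be set roughly like this when a pipeline is triggered (the values are placeholders; the bucket specification format is described below):

artifacts_mode="s3"
BUCKET_ARTIFACTS="https://endpoint.url/artifacts-bucket/prefix/"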

S3 bucket specifications

Two different formats are used to reference S3 buckets, depending on the context.

Pipeline

In pipeline code, the following format is used:

BUCKET_SOFTWARE="https://endpoint.url/name-of-bucket/prefix/"
BUCKET_NAME_OF_BUCKET_PROXY="https://proxy.url"
BUCKET_NAME_OF_BUCKET_AWS_ACCESS_KEY_ID="cki_temporary"
BUCKET_NAME_OF_BUCKET_AWS_SECRET_ACCESS_KEY="super_secret"

The optional read-only proxy URL is used in any user-facing URLs for the bucket, but not for the pipeline's S3 API calls, which hit the original endpoint. The access key ID and secret access key only need to be exposed on the runner if a job actually requires them. Path-style access is used everywhere.
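
As a minimal sketch, the variables above could be wired into the AWS CLI roughly as follows. The endpoint, bucket and prefix are taken from the example, artifact.rpm is a placeholder file name, and building the user-facing link from the proxy URL plus bucket and prefix is an assumption, not necessarily the pipeline's exact behavior:

# credentials from the suffixed variables, only needed for authenticated access
export AWS_ACCESS_KEY_ID="$BUCKET_NAME_OF_BUCKET_AWS_ACCESS_KEY_ID"
export AWS_SECRET_ACCESS_KEY="$BUCKET_NAME_OF_BUCKET_AWS_SECRET_ACCESS_KEY"

# enforce path-style access for all S3 operations
aws configure set default.s3.addressing_style path

# S3 API calls hit the original endpoint from BUCKET_NAME_OF_BUCKET
aws --endpoint-url "https://endpoint.url" s3 cp artifact.rpm "s3://name-of-bucket/prefix/artifact.rpm"

# user-facing links are built from the read-only proxy URL instead (assumed layout)
echo "https://proxy.url/name-of-bucket/prefix/artifact.rpm"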

Deployment

In deployment-all, the following format is used:

BUCKET_SOFTWARE="http://localhost:9000|cki_temporary|super_secret|software|subpath/"

Here, the connection details (endpoint, access key ID, secret access key) and the file location (bucket name and prefix) are folded into a single pipe-separated variable.
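
As a small sketch, assuming the pipe-separated layout shown above, such a value can be split into its parts with bash (the field names are just illustrative):

# split the pipe-separated specification into its five fields
IFS='|' read -r ENDPOINT ACCESS_KEY SECRET_KEY BUCKET PREFIX <<< "$BUCKET_SOFTWARE"
echo "endpoint=${ENDPOINT} bucket=${BUCKET} prefix=${PREFIX}"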