GitLab CI/CD in Detail: Stages, Jobs, Pipelines for Modern Software
Fabian Peter · 8 min read


GitLab CI/CD: Automation in the Modern SDLC
compliance-campaign-2026 gitlab ci-cd pipelines automation devops
Read the whole series (40 articles)

This series systematically explains how modern software is developed and operated in a compliant way – from EU regulations to technical implementation.

  1. Compliance Compass: EU Regulations for Software, SaaS, and Cloud Hosting
  2. GDPR: Privacy by Design as the Foundation of Modern Software
  3. NIS-2: Cyber Resilience Becomes Mandatory for 18 Sectors
  4. DORA: ICT Resilience for the Financial Sector Starting January 2025
  5. Cyber Resilience Act: Security by Design for Products with Digital Elements
  6. Data Act: Portability and Exit Capability Become Mandatory from September 2025
  7. Cloud Sovereignty Framework: Making Digital Sovereignty Measurable
  8. How EU Regulations Interconnect: An Integrated Compliance Approach
  9. 15 Factor App: The Evolution of Cloud-Native Best Practices
  10. 15 Factor App Deep Dive: Factors 1–6 (Basics & Lifecycle)
  11. 15 Factor App Deep Dive: Factors 7–12 (Networking, Scaling, Operations)
  12. 15 Factor App Deep Dive: Factors 13–15 (API First, Telemetry, Auth)
  13. The Modern Software Development Lifecycle: From Cloud-Native to Compliance
  14. Cloud Sovereignty + 15 Factor App: The Architectural Bridge Between Law and Technology
  15. Standardized Software Logistics: OCI, Helm, Kubernetes API
  16. Deterministically Checking Security Standards: Policy as Code, CVE Scanning, SBOM
  17. ayedo Software Delivery Platform: High-Level Overview
  18. ayedo Kubernetes Distribution: CNCF-compliant, EU-sovereign, compliance-ready
  19. Cilium: eBPF-based Networking for Zero Trust and Compliance
  20. Harbor: Container Registry with Integrated CVE Scanning and SBOM
  21. VictoriaMetrics & VictoriaLogs: Observability for NIS-2 and DORA
  22. Keycloak: Identity & Access Management for GDPR and NIS-2
  23. Kyverno: Policy as Code for Automated Compliance Checks
  24. Velero: Backup & Disaster Recovery for DORA and NIS-2
  25. Delivery Operations: The Path from Code to Production
  26. ohMyHelm: Helm Charts for 15-Factor Apps Without Kubernetes Complexity
  27. Let's Deploy with ayedo, Part 1: GitLab CI/CD, Harbor Registry, Vault Secrets
  28. Let's Deploy with ayedo, Part 2: ArgoCD GitOps, Monitoring, Observability
  29. GitLab CI/CD in Detail: Stages, Jobs, Pipelines for Modern Software
  30. Kaniko vs. Buildah: Rootless, Daemonless Container Builds in Kubernetes
  31. Harbor Deep Dive: Vulnerability Scanning, SBOM, Image Signing
  32. HashiCorp Vault + External Secrets Operator: Zero-Trust Secrets Management
  33. ArgoCD Deep Dive: GitOps Deployments for Multi-Environment Scenarios
  34. Guardrails in Action: Policy-Based Deployment Validation with Kyverno
  35. Observability in Detail: VictoriaMetrics, VictoriaLogs, Grafana
  36. Alerting & Incident Response: From Anomaly to Final Report
  37. Polycrate: Deployment Automation for Kubernetes and Cloud Migration
  38. Managed Backing Services: PostgreSQL, Redis, Kafka on ayedo SDP
  39. Multi-Tenant vs. Whitelabel: Deployment Strategies for SaaS Providers
  40. From Zero to Production: The Complete ayedo SDP Workflow in an Example

TL;DR

  • GitLab CI/CD is much more than a build tool: When used correctly, it becomes the central backbone of your delivery process – from commit to production, including documentation and audit trail.
  • Core components like .gitlab-ci.yml, stages, jobs, and artifacts allow you to model a clear, reproducible supply chain – including build (Kaniko, SBOM), test (pytest, integration, Trivy), package (push to Harbor, signature), deploy (GitOps with ArgoCD).
  • Modern requirements from the Cyber Resilience Act, NIS‑2, DORA, and GDPR can be directly embedded in GitLab CI/CD: SBOM generation, CVE scanning, signed images, audit trails, and clean secrets management.
  • A well-thought-out pipeline structure makes compliance an integrated feature of your software supply chain – rather than a downstream control mechanism.
  • ayedo supports you in orchestrating GitLab CI/CD, Harbor, ArgoCD, and your Kubernetes ecosystem to enhance delivery speed, security, and compliance equally.

GitLab CI/CD as the Backbone of Your Delivery Process

Today, those responsible for software manage not just source code but entire supply chains: source code management, build processes, container images, policy checks, deployments, audits. GitLab CI/CD brings these threads together in one place.

Instead of loosely linking individual tools, with GitLab CI/CD you define:

  • when each step is executed,
  • what artifacts result from it,
  • which quality and compliance gates precede it,
  • and how changes reproducibly reach your production environment.

The pipeline thus becomes the formal, verifiable description of your delivery process. This is not only operationally valuable but also a strong lever in audits in the context of NIS‑2, DORA, and the upcoming Cyber Resilience Act.


GitLab CI/CD Basics: .gitlab-ci.yml, Stages, Jobs, Artifacts

.gitlab-ci.yml as Central Contract

The .gitlab-ci.yml in the repository is the “contract” between your team and GitLab:

  • It describes the entire pipeline – from tests to builds to deployment.
  • Every change is versioned and can be reviewed within the framework of merge requests.
  • The review and change history of the pipeline itself become part of your audit trail.

Organizationally, it makes sense to clearly define responsibility for this file: e.g., ownership in the platform team, review obligation with product teams.

Stages: The Logical Structure of Your Supply Chain

Stages structure the pipeline into successive phases. A common pattern for modern container-based development:

  1. build – Build container images and generate the SBOM
  2. test – Quality assurance (unit, integration, and security tests)
  3. package – Publish and sign images and charts
  4. deploy – GitOps sync to target environments

Within a stage, multiple jobs can run in parallel. Only when all jobs of a stage have completed successfully does the next stage begin. This lets you model explicit quality gates, for example: no packaging or deployment as long as tests or security scans fail.
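
As a sketch, such a stage layout translates into a `.gitlab-ci.yml` skeleton like this (job names and scripts are placeholders; the ordering shown is one common variant – some teams let tests gate the build instead):

```yaml
# Illustrative pipeline skeleton – stages run in the order listed
stages:
  - build
  - test
  - package
  - deploy

build-image:
  stage: build
  script:
    - echo "build container image and SBOM"

unit-tests:
  stage: test
  script:
    - echo "run pytest"

cve-scan:
  stage: test          # runs in parallel with unit-tests
  script:
    - echo "run vulnerability scan"

push-and-sign:
  stage: package
  script:
    - echo "push to Harbor, sign image"

deploy-staging:
  stage: deploy
  script:
    - echo "update GitOps repo"
```

Because `unit-tests` and `cve-scan` share a stage, they run concurrently; `push-and-sign` only starts once both have passed.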

Jobs: Specific Tasks with Clear Responsibility

A job describes a specific task, e.g.:

  • “Run unit tests with pytest”
  • “Build container image with Kaniko”
  • “Scan image for CVEs with Trivy”
  • “Package Helm chart and push to Harbor”
  • “Trigger ArgoCD sync for staging”

Each job:

  • runs in a defined container image (your “build environment”),
  • has a clear script (e.g., shell commands),
  • can generate artifacts,
  • and can be specifically repeated or manually triggered.
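
Taken together, a job combining these properties might look like this (the image, paths, and job name are illustrative):

```yaml
smoke-test:
  stage: test
  image: python:3.12-slim        # the job's defined build environment
  script:                        # the job's actual task
    - pip install -r requirements.txt
    - pytest tests/smoke --junitxml=smoke-report.xml
  artifacts:
    reports:
      junit: smoke-report.xml    # handed on to GitLab and later jobs
  when: manual                   # example of a manually triggered job
  retry: 1                       # repeated automatically once on failure
```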

Artifacts: The Evidence of Your Pipeline

Artifacts are files that a job generates and passes on to subsequent jobs or human reviewers – particularly important for compliance:

  • Test reports (JUnit, coverage)
  • Security reports (Trivy, SAST/DAST)
  • SBOM files (e.g., CycloneDX, SPDX)
  • Packaged artifacts (Helm charts, manifests)

In a world shaped by the Cyber Resilience Act and NIS‑2, artifacts are not only technically useful but also evidence: They document that defined checks have actually taken place.


A Practical Pipeline: From Kaniko to ArgoCD

Below, we outline a typical, proven pipeline structure for containerized applications on a modern platform with Kubernetes, GitLab, Harbor, and ArgoCD.

Stage “build”: Container Image and SBOM

In the build phase, it’s about reproducible, secure images.

Kaniko for Container-Native Builds

Instead of running a Docker daemon in the CI, the pipeline uses a Kaniko image as the build environment. The job:

  • pulls the application source code,
  • builds the container image from it,
  • tags it with meaningful tags (commit hash, SemVer, possibly build number),
  • and pushes it into a project in Harbor.

Important from a compliance perspective:

  • The image is built from a clearly defined base (trusted base image).
  • The build logs are reproducible and viewable in the job log.
  • The base images used can be part of your approved catalog.
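
A minimal sketch of such a Kaniko job – the registry host `harbor.example.com`, the project path, and the `HARBOR_USER`/`HARBOR_PASSWORD` CI variables are assumptions for illustration:

```yaml
build-image:
  stage: build
  image:
    name: gcr.io/kaniko-project/executor:debug
    entrypoint: [""]
  before_script:
    # registry credentials for the push (assumed masked CI variables)
    - mkdir -p /kaniko/.docker
    - AUTH="$(printf '%s:%s' "$HARBOR_USER" "$HARBOR_PASSWORD" | base64 | tr -d '\n')"
    - echo "{\"auths\":{\"harbor.example.com\":{\"auth\":\"$AUTH\"}}}" > /kaniko/.docker/config.json
  script:
    - /kaniko/executor
      --context "$CI_PROJECT_DIR"
      --dockerfile "$CI_PROJECT_DIR/Dockerfile"
      --destination "harbor.example.com/myproject/app:$CI_COMMIT_SHORT_SHA"
```

Tagging with `$CI_COMMIT_SHORT_SHA` ties every image unambiguously to the commit that produced it.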

SBOM Generation in the Build

Parallel or directly afterwards, another job generates a Software Bill of Materials (SBOM) for the image just built, e.g., in CycloneDX or SPDX format. The SBOM is:

  • stored as an artifact,
  • linked with the image tag,
  • and ideally stored together with the image in Harbor.

The Cyber Resilience Act demands transparency over software components; automated SBOM generation in the build is the most practical way to achieve this.
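
Sketched here with Trivy's SBOM output in CycloneDX format (registry path and job names are illustrative; a tool like syft works similarly):

```yaml
generate-sbom:
  stage: build
  needs: ["build-image"]         # runs as soon as the (hypothetical) build job is done
  image:
    name: aquasec/trivy:latest
    entrypoint: [""]
  script:
    - trivy image --format cyclonedx --output sbom.cdx.json
      "harbor.example.com/myproject/app:$CI_COMMIT_SHORT_SHA"
  artifacts:
    paths:
      - sbom.cdx.json            # the SBOM becomes pipeline evidence
```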

Stage “test”: Quality and Security

The test stage can execute multiple jobs in parallel.

Unit and Integration Tests with pytest

One job runs, for example, unit tests with pytest in the application container or a corresponding test image; another job runs integration tests (e.g., against a temporary database).

Best Practices:

  • Test jobs save their reports as artifacts.
  • Coverage metrics are fed back into GitLab so you can see trends over time.
  • Merge requests are only approved if defined quality criteria are met.
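
A pytest job covering these points might look like this (image, package paths, and the coverage regex are assumptions to adapt to your project):

```yaml
unit-tests:
  stage: test
  image: python:3.12-slim
  script:
    - pip install -r requirements.txt pytest pytest-cov
    - pytest --junitxml=report.xml --cov=app --cov-report=term
  coverage: '/TOTAL.*\s+(\d+%)$/'   # extracts the coverage value from the job log
  artifacts:
    when: always                    # keep reports even when tests fail
    reports:
      junit: report.xml             # surfaces test results in merge requests
```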

Security Scans with Trivy

Another job uses Trivy or a similar tool for vulnerability scanning:

  • Either against the freshly built container image,
  • or directly against the SBOM (SBOM-based scanning).

Trivy reports are stored as artifacts. In a compliance context, you can define:

  • which severity levels (CVSS scores) are tolerated,
  • for which severity levels a merge or deployment is blocked,
  • and how exceptions (temporarily accepted vulnerabilities) are documented.

This directly bridges to requirements from NIS‑2 and DORA for systematic vulnerability management.
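
A Trivy job with a blocking severity threshold could be sketched like this (registry path is illustrative):

```yaml
cve-scan:
  stage: test
  image:
    name: aquasec/trivy:latest
    entrypoint: [""]
  script:
    # exit code 1 on findings at or above the chosen severities,
    # which fails the job and blocks the downstream stages
    - trivy image --severity CRITICAL,HIGH --exit-code 1
      --output trivy-report.txt
      "harbor.example.com/myproject/app:$CI_COMMIT_SHORT_SHA"
  artifacts:
    when: always
    paths:
      - trivy-report.txt
```

Temporarily accepted findings can be recorded in a versioned `.trivyignore` file, which doubles as documentation of the exception.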

Stage “package”: Registry, Signatures, and Helm Charts

In the package phase, the results of the previous stages are transformed into reusable units.

Push to Harbor and Signature with Notary

A job handles publishing the container image in Harbor:

  • Images are pushed into dedicated projects (e.g., “dev”, “staging”, “prod”).
  • Metadata like SBOM references are stored alongside the actual image.

Subsequently, another job signs the image – for instance, via Notary or Cosign:

  • A cryptographic signature is created, ensuring that the image originates from your pipeline.
  • Downstream systems (e.g., admission controllers in the cluster) can enforce that only signed images are deployed.

Signed container images are a central component for a trustworthy supply chain in the sense of the Cyber Resilience Act.
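
With Cosign, such a signing job can be sketched as follows – `COSIGN_PRIVATE_KEY` and `COSIGN_PASSWORD` are assumed to be protected, masked CI variables, and the registry path is illustrative:

```yaml
sign-image:
  stage: package
  image:
    name: gcr.io/projectsigstore/cosign:latest
    entrypoint: [""]
  script:
    # the key is read from a CI variable; COSIGN_PASSWORD unlocks it
    - cosign sign --yes --key env://COSIGN_PRIVATE_KEY
      "harbor.example.com/myproject/app:$CI_COMMIT_SHORT_SHA"
```

The signature is stored alongside the image in the registry, where admission controllers can verify it before a deployment.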

Helm Charts Packaging

For Kubernetes-based deployment, Helm charts are built and stored in the same registry:

  • Chart versions are automatically derived from Git metadata.
  • The chart references the previously built and signed images.
  • Charts can also be signed to ensure that deployments only use approved versions.
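
A sketch of such a packaging job, assuming a chart named `app` in `chart/` and Git tags of the form `v1.4.0` (all names are placeholders):

```yaml
package-chart:
  stage: package
  image: alpine/helm:latest
  script:
    - CHART_VERSION="${CI_COMMIT_TAG#v}"        # tag v1.4.0 -> chart version 1.4.0
    - helm registry login harbor.example.com -u "$HARBOR_USER" -p "$HARBOR_PASSWORD"
    - helm package chart/ --version "$CHART_VERSION" --app-version "$CI_COMMIT_SHORT_SHA"
    - helm push "app-${CHART_VERSION}.tgz" oci://harbor.example.com/myproject/charts
  rules:
    - if: $CI_COMMIT_TAG                        # only package on tagged releases
```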

Stage “deploy”: GitOps with ArgoCD

Instead of triggering deployments directly from the CI pipeline, GitOps is gaining ground: Your pipeline only changes Git repositories, while ArgoCD synchronizes these changes to target clusters.

A typical deploy job:

  • updates the image tags or chart versions in the GitOps repository (e.g., via a commit on an “environment” repo),
  • optionally triggers an ArgoCD sync or health check,
  • documents in the job log which version was deployed where.

Advantages:

  • Every change to an environment is traceable as a Git commit – including author, timestamp, and comment.
  • Rollbacks are standardized: ArgoCD can revert to previous commits.
  • Audit requirements from NIS‑2 and DORA for traceability and recoverability are met at the process level.
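
A sketch of such a deploy job – the GitOps repository URL, the `GITOPS_TOKEN` variable, and the `staging/values.yaml` layout are assumptions for illustration:

```yaml
update-gitops:
  stage: deploy
  image: alpine:latest
  script:
    - apk add --no-cache git yq
    # clone the environment repo with a write-capable access token
    - git clone "https://oauth2:${GITOPS_TOKEN}@gitlab.example.com/ops/environments.git"
    - cd environments
    # bump the image tag that ArgoCD watches
    - yq -i '.image.tag = strenv(CI_COMMIT_SHORT_SHA)' staging/values.yaml
    - git config user.name "gitlab-ci"
    - git config user.email "ci@example.com"
    - git commit -am "deploy staging ${CI_COMMIT_SHORT_SHA}"
    - git push origin HEAD:main
```

The commit itself is the deployment record: author, timestamp, and the exact version change are all in Git history.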

Compliance-by-Design: CRA, NIS‑2/DORA, and GDPR in the Pipeline

Cyber Resilience Act: SBOM, CVE Scanning, Signed Images

The Cyber Resilience Act was adopted at the EU level in 2024 and entered into force on 10 December 2024; most obligations become applicable after a transition period of 36 months. GitLab CI/CD is the ideal vehicle to pragmatically implement these obligations:

  • Transparency over components: SBOM generation in the build as a mandatory step.
  • Vulnerability management: Automated CVE scanning of images and SBOMs; defined thresholds block releases in case of critical vulnerabilities.
  • Supply chain integrity: Signed container images and charts, enforced by policy engines in the cluster.

Instead of relying on manual documentation, your pipeline automatically generates this evidence – consistently and auditable.

NIS‑2 and DORA: Audit Trails and Control Mechanisms

The NIS‑2 directive (Directive (EU) 2022/2555) came into force on 16 January 2023; member states must transpose it into national law by 17 October 2024. The DORA regulation (Regulation (EU) 2022/2554) applies directly to financial companies in the EU from 17 January 2025.

Both regimes require robust processes and traceable controls. GitLab CI/CD supports you in this:

  • Transparent change paths: Every pipeline is linked to a commit, a merge request, and an author.
  • Approval workflows: Deployments in sensitive environments can be tied to manual confirmations (manual jobs) or merge request approvals.
  • Automated logging: Job logs, artifacts, and environment information serve as an audit trail, without teams having to maintain additional tables.
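
A manual approval gate is a few lines in GitLab CI (job name and script are placeholders):

```yaml
deploy-prod:
  stage: deploy
  environment: production   # tracked as a GitLab environment
  when: manual              # pipeline pauses until a human triggers the job
  script:
    - echo "promote the approved release via the GitOps repository"
```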

GitLab thus becomes an important piece of your overarching compliance strategy.

GDPR: No Secrets in Git, Clean Handling of Personal Data

The GDPR has been in force since 25 May 2018 and also addresses technical and organizational measures in software development.

Key points for your pipeline:

  • No plaintext secrets in Git: Credentials, tokens, private keys do not belong in the repository. Instead:
    • GitLab CI variables (masked, protected),
    • external secret stores (e.g., Vault) and their integration via the External Secrets Operator in the cluster.
  • Log hygiene: Job logs should not contain personal data. This concerns
