Docker actions offer maximum flexibility: any programming language, full control over system dependencies, and an isolated, reproducible execution environment. The price is startup time, because every action invocation has to start a container. In this chapter we develop Docker actions from design to optimization.
Docker actions are the right choice when JavaScript is not enough:
| Aspect | JavaScript action | Docker action |
|---|---|---|
| Startup time | ~1 second | 10-60 seconds |
| Language | JavaScript/TypeScript | Any |
| Dependencies | npm packages | Anything (apt, pip, etc.) |
| Runner compatibility | All (Linux, macOS, Windows) | Linux only |
| Isolation | None (runs on the runner) | Full (container) |
| Image size | Small (bundled) | Variable (MB to GB) |
| Debugging | Node.js debugging | Container logs |
Startup time is the biggest drawback. For an action invoked 20 times in a single workflow, even 30 seconds per container start add up to ten extra minutes. For one-off, complex operations it is acceptable.
A Docker action consists of at least three files:
my-docker-action/
├── action.yml # Metadata and configuration
├── Dockerfile # Container definition
├── entrypoint.sh # Entry point (or .py, etc.)
└── src/ # Optional source files
    └── main.py
name: 'Security Scanner'
description: 'Scans the repository for security vulnerabilities with custom tools'
author: 'Security Team'
branding:
icon: 'shield'
color: 'red'
inputs:
severity:
description: 'Minimum severity to report (low, medium, high, critical)'
required: false
default: 'medium'
config-file:
description: 'Path to scanner configuration'
required: false
default: '.security-scan.yml'
fail-on-findings:
description: 'Fail the action if findings are detected'
required: false
default: 'true'
outputs:
findings-count:
description: 'Number of security findings'
report-path:
description: 'Path to the generated report'
sarif-path:
description: 'Path to SARIF output for GitHub Security tab'
runs:
using: 'docker'
image: 'Dockerfile'
args:
- ${{ inputs.severity }}
- ${{ inputs.config-file }}
    - ${{ inputs.fail-on-findings }}

The key difference from JavaScript actions: `using: 'docker'` instead of `using: 'node20'`. The `image` field can point to a Dockerfile in the repository or to a pre-built image.
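For comparison, the pre-built variant would look like this (the registry path is illustrative):

```yaml
runs:
  using: 'docker'
  # Pre-built image instead of a local Dockerfile
  image: 'docker://ghcr.io/myorg/security-scanner:v1.0.0'
```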
# Choose a base image
FROM python:3.12-slim

# Metadata
LABEL maintainer="security-team@example.com"
LABEL org.opencontainers.image.source="https://github.com/myorg/security-scanner"

# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    git \
    curl \
    jq \
    && rm -rf /var/lib/apt/lists/*

# Python dependencies
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt

# Copy the action code
COPY src/ /action/src/
COPY entrypoint.sh /action/entrypoint.sh

# Make it executable
RUN chmod +x /action/entrypoint.sh

# Set the working directory
WORKDIR /github/workspace

# Define the entrypoint
ENTRYPOINT ["/action/entrypoint.sh"]

The entrypoint is the script that runs when the container starts. It receives the inputs as arguments:
#!/bin/bash
set -euo pipefail

# Inputs from arguments
SEVERITY="${1:-medium}"
CONFIG_FILE="${2:-.security-scan.yml}"
FAIL_ON_FINDINGS="${3:-true}"

echo "Starting security scan..."
echo "  Severity threshold: $SEVERITY"
echo "  Config file: $CONFIG_FILE"
echo "  Fail on findings: $FAIL_ON_FINDINGS"

# Invoke the main logic
python /action/src/scanner.py \
  --severity "$SEVERITY" \
  --config "$CONFIG_FILE" \
  --output "/github/workspace/security-report.json"

# Evaluate the result
FINDINGS=$(jq '.findings | length' /github/workspace/security-report.json)

# Set outputs
echo "findings-count=$FINDINGS" >> "$GITHUB_OUTPUT"
echo "report-path=security-report.json" >> "$GITHUB_OUTPUT"

# Fail if there are findings and the flag is enabled
if [[ "$FINDINGS" -gt 0 && "$FAIL_ON_FINDINGS" == "true" ]]; then
  echo "::error::Found $FINDINGS security issues"
  exit 1
fi

echo "Scan completed successfully"

Docker actions can receive inputs in three ways:
1. As arguments (args)
# action.yml
runs:
using: 'docker'
image: 'Dockerfile'
args:
- ${{ inputs.severity }}
    - ${{ inputs.timeout }}

# entrypoint.sh
SEVERITY="$1"
TIMEOUT="$2"

2. As environment variables
# action.yml
runs:
using: 'docker'
image: 'Dockerfile'
env:
INPUT_SEVERITY: ${{ inputs.severity }}
    INPUT_TIMEOUT: ${{ inputs.timeout }}

# entrypoint.sh
SEVERITY="${INPUT_SEVERITY:-medium}"
TIMEOUT="${INPUT_TIMEOUT:-300}"

GitHub automatically sets INPUT_<NAME> for every declared input (uppercase, hyphens become underscores), so you can also rely on that directly:

# Automatically available for the input "fail-on-findings"
FAIL_ON_FINDINGS="${INPUT_FAIL_ON_FINDINGS:-true}"

3. Combination (recommended)
runs:
using: 'docker'
image: 'Dockerfile'
args:
- '--severity'
- ${{ inputs.severity }}
- '--config'
- ${{ inputs.config-file }}
env:
    GITHUB_TOKEN: ${{ inputs.token }}

Named arguments are self-documenting; secrets belong in environment variables.
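Inside the entrypoint, the named arguments from the combination pattern can be parsed with a small loop. A sketch only; the function name `parse_args` is my own:

```shell
#!/bin/bash
set -euo pipefail

parse_args() {
  SEVERITY="medium"
  CONFIG_FILE=".security-scan.yml"
  while [[ $# -gt 0 ]]; do
    case "$1" in
      --severity) SEVERITY="$2"; shift 2 ;;
      --config)   CONFIG_FILE="$2"; shift 2 ;;
      *) echo "Unknown argument: $1" >&2; return 1 ;;
    esac
  done
}

parse_args "$@"
# Secrets arrive via the environment, never via argv:
GITHUB_TOKEN="${GITHUB_TOKEN:-}"
echo "severity=$SEVERITY config=$CONFIG_FILE"
```

Unknown flags fail loudly, and the defaults mirror the ones declared in action.yml.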
Outputs are written to the file that `$GITHUB_OUTPUT` points to:
#!/bin/bash

# Simple output
echo "version=1.2.3" >> "$GITHUB_OUTPUT"

# Output with special characters
VALUE="String with spaces and 'quotes'"
echo "message=$VALUE" >> "$GITHUB_OUTPUT"

# Multiline output with a random heredoc delimiter
EOF=$(dd if=/dev/urandom bs=15 count=1 status=none | base64)
echo "changelog<<$EOF" >> "$GITHUB_OUTPUT"
cat CHANGELOG.md >> "$GITHUB_OUTPUT"
echo "$EOF" >> "$GITHUB_OUTPUT"

In Python:
import os
def set_output(name: str, value: str) -> None:
output_file = os.environ.get('GITHUB_OUTPUT')
if output_file:
with open(output_file, 'a') as f:
f.write(f"{name}={value}\n")
set_output('findings-count', '42')
set_output('report-path', 'security-report.json')

Besides `$GITHUB_OUTPUT` there are other special files:
| Variable | Purpose | Example |
|---|---|---|
| `GITHUB_OUTPUT` | Set outputs | `echo "key=value" >> $GITHUB_OUTPUT` |
| `GITHUB_ENV` | Env vars for subsequent steps | `echo "MY_VAR=value" >> $GITHUB_ENV` |
| `GITHUB_PATH` | Extend PATH | `echo "/custom/bin" >> $GITHUB_PATH` |
| `GITHUB_STEP_SUMMARY` | Job summary | `echo "## Results" >> $GITHUB_STEP_SUMMARY` |
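All of these files follow the same append-only protocol, so in Python a single helper covers them. A sketch; the name `append_github_file` is my own:

```python
import os

def append_github_file(var: str, line: str) -> None:
    """Append one line to the special file that the env var `var` points to."""
    path = os.environ.get(var)
    if path:
        with open(path, 'a') as f:
            f.write(line + '\n')

# Env var for subsequent steps, a PATH extension, and a summary heading:
append_github_file('GITHUB_ENV', 'SCAN_TIMESTAMP=2024-01-01T00:00:00')
append_github_file('GITHUB_PATH', '/action/bin')
append_github_file('GITHUB_STEP_SUMMARY', '## Results')
```

If the environment variable is unset (e.g. in local tests), the helper is a no-op, which matches how the bash `>>` idiom behaves when you mock the files.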
#!/bin/bash

# Set a variable for subsequent steps
echo "SCAN_TIMESTAMP=$(date -Iseconds)" >> "$GITHUB_ENV"

# Add a tool to the PATH
echo "/action/bin" >> "$GITHUB_PATH"

# Write the summary
{
  echo "## Security Scan Results"
  echo ""
  echo "| Severity | Count |"
  echo "|----------|-------|"
  echo "| Critical | 0 |"
  echo "| High | 3 |"
  echo "| Medium | 12 |"
} >> "$GITHUB_STEP_SUMMARY"

GitHub automatically mounts the repository at `/github/workspace`. This directory is the default place to work in:
WORKDIR /github/workspace

#!/bin/bash
# Repository files are directly available
if [[ -f "package.json" ]]; then
  echo "Node.js project detected"
fi

# Create files (for artifacts etc.)
mkdir -p reports
./scanner --output reports/scan-results.json
# Files remain available in the workspace after the container exits

graph LR
subgraph "GitHub Runner"
A[/home/runner/work/repo/repo]
B[/home/runner/work/_temp]
C[/home/runner/work/_actions]
end
subgraph "Docker Container"
D[/github/workspace]
E[/github/workflow]
F[/github/file_commands]
end
A -->|mount| D
B -->|mount| E
| Host | Container | Content |
|---|---|---|
| Repository checkout | `/github/workspace` | Checked-out code |
| Workflow temp | `/github/workflow` | Temporary workflow files |
| Event payload | `/github/workflow/event.json` | Trigger event as JSON |
#!/bin/bash
# Read the event payload
EVENT_NAME="$GITHUB_EVENT_NAME"
EVENT_PATH="$GITHUB_EVENT_PATH"
echo "Event: $EVENT_NAME"
# For pull requests
if [[ "$EVENT_NAME" == "pull_request" ]]; then
PR_NUMBER=$(jq -r '.pull_request.number' "$EVENT_PATH")
PR_TITLE=$(jq -r '.pull_request.title' "$EVENT_PATH")
echo "PR #$PR_NUMBER: $PR_TITLE"
fi

In Python:
import os
import json
def load_event():
event_path = os.environ.get('GITHUB_EVENT_PATH')
if event_path and os.path.exists(event_path):
with open(event_path) as f:
return json.load(f)
return {}
event = load_event()
if 'pull_request' in event:
pr = event['pull_request']
    print(f"PR #{pr['number']}: {pr['title']}")

Let's build a practical Docker action in Python that checks license compliance.
license-checker/
├── action.yml
├── Dockerfile
├── entrypoint.sh
├── requirements.txt
├── src/
│ ├── __init__.py
│ ├── main.py
│ ├── scanner.py
│ └── reporter.py
└── tests/
└── test_scanner.py
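The tests/ directory holds plain unit tests. A self-contained sketch of what test_scanner.py could cover, exercising the auto-detection logic that main.py implements below (the function is re-declared here with a subset of indicators so the example runs standalone):

```python
import tempfile
from pathlib import Path

def detect_package_manager(workspace: Path) -> str:
    """Mirror of the detection logic in src/main.py (subset of indicators)."""
    indicators = {
        'npm': ['package.json', 'package-lock.json'],
        'pip': ['requirements.txt', 'pyproject.toml'],
        'cargo': ['Cargo.toml'],
        'go': ['go.mod'],
    }
    for pm, files in indicators.items():
        if any((workspace / f).exists() for f in files):
            return pm
    return 'unknown'

def test_detects_cargo():
    with tempfile.TemporaryDirectory() as d:
        ws = Path(d)
        (ws / 'Cargo.toml').touch()
        assert detect_package_manager(ws) == 'cargo'

def test_unknown_for_empty_workspace():
    with tempfile.TemporaryDirectory() as d:
        assert detect_package_manager(Path(d)) == 'unknown'

test_detects_cargo()
test_unknown_for_empty_workspace()
```

Because the detection only touches the filesystem, it tests cleanly against temporary directories without any GitHub environment.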
name: 'License Compliance Checker'
description: 'Scans dependencies for license compliance issues'
author: 'Legal & Engineering'
branding:
icon: 'check-circle'
color: 'green'
inputs:
allowed-licenses:
description: 'Comma-separated list of allowed licenses (e.g., MIT,Apache-2.0,BSD-3-Clause)'
required: false
default: 'MIT,Apache-2.0,BSD-2-Clause,BSD-3-Clause,ISC,CC0-1.0,Unlicense'
denied-licenses:
description: 'Comma-separated list of explicitly denied licenses'
required: false
default: 'GPL-2.0,GPL-3.0,AGPL-3.0,LGPL-2.1,LGPL-3.0'
package-manager:
description: 'Package manager to scan (auto, npm, pip, cargo, go)'
required: false
default: 'auto'
fail-on-violation:
description: 'Fail the workflow if license violations are found'
required: false
default: 'true'
exclude-packages:
description: 'Packages to exclude from scanning (one per line)'
required: false
default: ''
outputs:
total-packages:
description: 'Total number of packages scanned'
violation-count:
description: 'Number of license violations found'
unknown-count:
description: 'Number of packages with unknown licenses'
report-path:
description: 'Path to the detailed JSON report'
runs:
using: 'docker'
image: 'Dockerfile'
env:
INPUT_ALLOWED_LICENSES: ${{ inputs.allowed-licenses }}
INPUT_DENIED_LICENSES: ${{ inputs.denied-licenses }}
INPUT_PACKAGE_MANAGER: ${{ inputs.package-manager }}
INPUT_FAIL_ON_VIOLATION: ${{ inputs.fail-on-violation }}
  INPUT_EXCLUDE_PACKAGES: ${{ inputs.exclude-packages }}

FROM python:3.12-slim AS builder
# Build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
gcc \
libffi-dev \
&& rm -rf /var/lib/apt/lists/*
# Python dependencies into a virtual environment
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
FROM python:3.12-slim
# Runtime dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
git \
jq \
curl \
    # For npm license scanning
nodejs \
npm \
    # For cargo license scanning
cargo \
&& rm -rf /var/lib/apt/lists/* \
&& npm install -g license-checker
# Copy the virtual environment from the builder stage
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
# Copy the action code
COPY src/ /action/src/
COPY entrypoint.sh /action/
RUN chmod +x /action/entrypoint.sh
WORKDIR /github/workspace
ENTRYPOINT ["/action/entrypoint.sh"]

requests>=2.31.0
pyyaml>=6.0
toml>=0.10.2
packaging>=23.0
#!/bin/bash
set -euo pipefail
echo "::group::License Compliance Checker"
echo "Scanning dependencies for license compliance..."
echo "::endgroup::"
# Run the Python script
python /action/src/main.py

# Propagate the Python script's exit code
exit $?

#!/usr/bin/env python3
"""
License Compliance Checker - Main Entry Point
"""
import os
import sys
import json
from pathlib import Path
from scanner import LicenseScanner
from reporter import Reporter
def get_input(name: str, default: str = '') -> str:
    """Read an action input from its environment variable."""
env_name = f"INPUT_{name.upper().replace('-', '_')}"
return os.environ.get(env_name, default)
def get_multiline_input(name: str) -> list[str]:
    """Read a multiline input as a list."""
value = get_input(name, '')
if not value:
return []
return [line.strip() for line in value.split('\n') if line.strip()]
def set_output(name: str, value: str) -> None:
    """Set an action output."""
output_file = os.environ.get('GITHUB_OUTPUT')
if output_file:
with open(output_file, 'a') as f:
f.write(f"{name}={value}\n")
def log_error(message: str, file: str | None = None, line: int | None = None) -> None:
"""GitHub Actions Error Annotation."""
if file and line:
print(f"::error file={file},line={line}::{message}")
elif file:
print(f"::error file={file}::{message}")
else:
print(f"::error::{message}")
def log_warning(message: str) -> None:
"""GitHub Actions Warning."""
print(f"::warning::{message}")
def write_summary(reporter: Reporter) -> None:
    """Write the job summary."""
summary_file = os.environ.get('GITHUB_STEP_SUMMARY')
if summary_file:
with open(summary_file, 'a') as f:
f.write(reporter.generate_markdown_summary())
def detect_package_manager(workspace: Path) -> str:
    """Auto-detect the package manager."""
indicators = {
'npm': ['package.json', 'package-lock.json', 'yarn.lock', 'pnpm-lock.yaml'],
'pip': ['requirements.txt', 'Pipfile', 'pyproject.toml', 'setup.py'],
'cargo': ['Cargo.toml', 'Cargo.lock'],
'go': ['go.mod', 'go.sum'],
}
for pm, files in indicators.items():
for filename in files:
if (workspace / filename).exists():
                print(f"Detected package manager: {pm} (found {filename})")
return pm
return 'unknown'
def main() -> int:
workspace = Path(os.environ.get('GITHUB_WORKSPACE', '/github/workspace'))
    # Load inputs
allowed = [l.strip() for l in get_input('allowed-licenses').split(',') if l.strip()]
denied = [l.strip() for l in get_input('denied-licenses').split(',') if l.strip()]
package_manager = get_input('package-manager', 'auto')
fail_on_violation = get_input('fail-on-violation', 'true').lower() == 'true'
exclude_packages = get_multiline_input('exclude-packages')
print(f"Allowed licenses: {', '.join(allowed)}")
print(f"Denied licenses: {', '.join(denied)}")
print(f"Excluded packages: {', '.join(exclude_packages) or 'none'}")
    # Detect the package manager
if package_manager == 'auto':
package_manager = detect_package_manager(workspace)
if package_manager == 'unknown':
log_warning("Could not detect package manager, skipping scan")
set_output('total-packages', '0')
set_output('violation-count', '0')
set_output('unknown-count', '0')
return 0
    # Initialize and run the scanner
scanner = LicenseScanner(
workspace=workspace,
package_manager=package_manager,
allowed_licenses=allowed,
denied_licenses=denied,
exclude_packages=exclude_packages
)
print(f"\n::group::Scanning {package_manager} dependencies")
results = scanner.scan()
print("::endgroup::")
    # Create the report
reporter = Reporter(results)
report_path = workspace / 'license-report.json'
reporter.write_json_report(report_path)
    # Set outputs
set_output('total-packages', str(results.total_packages))
set_output('violation-count', str(results.violation_count))
set_output('unknown-count', str(results.unknown_count))
set_output('report-path', str(report_path.relative_to(workspace)))
    # Write the summary
write_summary(reporter)
    # Log violations
for violation in results.violations:
log_error(
f"License violation: {violation.package} uses {violation.license} "
f"(denied license)",
file=violation.source_file
)
for unknown in results.unknown:
log_warning(f"Unknown license for package: {unknown.package}")
    # Result
print(f"\n{'='*50}")
print(f"Total packages scanned: {results.total_packages}")
print(f"Violations found: {results.violation_count}")
print(f"Unknown licenses: {results.unknown_count}")
print(f"{'='*50}")
if results.violation_count > 0:
if fail_on_violation:
log_error(f"Found {results.violation_count} license violations")
return 1
else:
log_warning(f"Found {results.violation_count} license violations (not failing)")
return 0
if __name__ == '__main__':
    sys.exit(main())

"""
License scanning implementation for various package managers.
"""
import json
import subprocess
from dataclasses import dataclass, field
from pathlib import Path
@dataclass
class PackageInfo:
package: str
version: str
license: str
    source_file: str | None = None
@dataclass
class ScanResults:
packages: list[PackageInfo] = field(default_factory=list)
violations: list[PackageInfo] = field(default_factory=list)
unknown: list[PackageInfo] = field(default_factory=list)
@property
def total_packages(self) -> int:
return len(self.packages)
@property
def violation_count(self) -> int:
return len(self.violations)
@property
def unknown_count(self) -> int:
return len(self.unknown)
class LicenseScanner:
def __init__(
self,
workspace: Path,
package_manager: str,
allowed_licenses: list[str],
denied_licenses: list[str],
exclude_packages: list[str]
):
self.workspace = workspace
self.package_manager = package_manager
self.allowed_licenses = [l.lower() for l in allowed_licenses]
self.denied_licenses = [l.lower() for l in denied_licenses]
self.exclude_packages = [p.lower() for p in exclude_packages]
def scan(self) -> ScanResults:
        """Run the scan for the configured package manager."""
scanners = {
'npm': self._scan_npm,
'pip': self._scan_pip,
'cargo': self._scan_cargo,
'go': self._scan_go,
}
scanner = scanners.get(self.package_manager)
if not scanner:
raise ValueError(f"Unsupported package manager: {self.package_manager}")
packages = scanner()
return self._evaluate_packages(packages)
def _scan_npm(self) -> list[PackageInfo]:
        """Scan npm/yarn/pnpm dependencies."""
packages = []
try:
result = subprocess.run(
['license-checker', '--json', '--production'],
cwd=self.workspace,
capture_output=True,
text=True,
timeout=300
)
if result.returncode != 0:
print(f"license-checker warning: {result.stderr}")
data = json.loads(result.stdout) if result.stdout else {}
for name_version, info in data.items():
# Format: "package@version"
if '@' in name_version:
# Handle scoped packages (@org/package@version)
parts = name_version.rsplit('@', 1)
name = parts[0]
version = parts[1] if len(parts) > 1 else 'unknown'
else:
name = name_version
version = 'unknown'
packages.append(PackageInfo(
package=name,
version=version,
license=info.get('licenses', 'UNKNOWN'),
source_file='package.json'
))
except subprocess.TimeoutExpired:
print("::warning::npm license scan timed out")
except json.JSONDecodeError as e:
print(f"::warning::Failed to parse npm license output: {e}")
return packages
def _scan_pip(self) -> list[PackageInfo]:
        """Scan Python dependencies."""
packages = []
try:
            # pip-licenses must be installed
result = subprocess.run(
['pip-licenses', '--format=json', '--with-system'],
cwd=self.workspace,
capture_output=True,
text=True,
timeout=300
)
data = json.loads(result.stdout) if result.stdout else []
for pkg in data:
packages.append(PackageInfo(
package=pkg.get('Name', 'unknown'),
version=pkg.get('Version', 'unknown'),
license=pkg.get('License', 'UNKNOWN'),
source_file='requirements.txt'
))
except FileNotFoundError:
print("::warning::pip-licenses not found, using fallback method")
packages = self._scan_pip_fallback()
except subprocess.TimeoutExpired:
print("::warning::pip license scan timed out")
return packages
def _scan_pip_fallback(self) -> list[PackageInfo]:
        """Fallback method for pip without pip-licenses."""
packages = []
result = subprocess.run(
['pip', 'list', '--format=json'],
capture_output=True,
text=True
)
for pkg in json.loads(result.stdout or '[]'):
            # Without pip-licenses we can only determine package names
packages.append(PackageInfo(
package=pkg['name'],
version=pkg['version'],
license='UNKNOWN',
source_file='requirements.txt'
))
return packages
def _scan_cargo(self) -> list[PackageInfo]:
        """Scan Rust/Cargo dependencies."""
packages = []
try:
result = subprocess.run(
['cargo', 'license', '--json'],
cwd=self.workspace,
capture_output=True,
text=True,
timeout=300
)
for line in result.stdout.strip().split('\n'):
if not line:
continue
try:
pkg = json.loads(line)
packages.append(PackageInfo(
package=pkg.get('name', 'unknown'),
version=pkg.get('version', 'unknown'),
license=pkg.get('license', 'UNKNOWN'),
source_file='Cargo.toml'
))
except json.JSONDecodeError:
continue
except FileNotFoundError:
print("::warning::cargo-license not found")
return packages
def _scan_go(self) -> list[PackageInfo]:
        """Scan Go dependencies."""
packages = []
try:
result = subprocess.run(
['go', 'list', '-json', '-m', 'all'],
cwd=self.workspace,
capture_output=True,
text=True,
timeout=300
)
            # go list -json emits a stream of JSON objects (not an array),
            # so decode them sequentially instead of matching braces with a
            # regex, which breaks on nested objects (Replace, Error fields)
            decoder = json.JSONDecoder()
            text = result.stdout
            idx = 0
            while idx < len(text):
                # Skip whitespace between the concatenated objects
                while idx < len(text) and text[idx].isspace():
                    idx += 1
                if idx >= len(text):
                    break
                try:
                    pkg, idx = decoder.raw_decode(text, idx)
                except json.JSONDecodeError:
                    break
                if pkg.get('Main'):
                    continue  # Skip the main module
                packages.append(PackageInfo(
                    package=pkg.get('Path', 'unknown'),
                    version=pkg.get('Version', 'unknown'),
                    license='UNKNOWN',  # Go has no built-in license info
                    source_file='go.mod'
                ))
except subprocess.TimeoutExpired:
print("::warning::go module scan timed out")
return packages
def _evaluate_packages(self, packages: list[PackageInfo]) -> ScanResults:
        """Evaluate packages against allowed/denied licenses."""
results = ScanResults()
for pkg in packages:
# Excluded?
if pkg.package.lower() in self.exclude_packages:
print(f" Skipping excluded package: {pkg.package}")
continue
results.packages.append(pkg)
license_lower = pkg.license.lower()
# Denied License?
if any(denied in license_lower for denied in self.denied_licenses):
results.violations.append(pkg)
print(f" ✗ {pkg.package}@{pkg.version}: {pkg.license} (DENIED)")
# Unknown License?
elif license_lower in ('unknown', 'unlicensed', ''):
results.unknown.append(pkg)
print(f" ? {pkg.package}@{pkg.version}: {pkg.license} (UNKNOWN)")
# Allowed License?
elif any(allowed in license_lower for allowed in self.allowed_licenses):
print(f" ✓ {pkg.package}@{pkg.version}: {pkg.license}")
else:
                # Not explicitly allowed, not denied -> warning
results.unknown.append(pkg)
print(f" ? {pkg.package}@{pkg.version}: {pkg.license} (not in allowed list)")
        return results

"""
Report generation for license scan results.
"""
import json
from pathlib import Path
from datetime import datetime, timezone

class Reporter:
    def __init__(self, results):
        self.results = results
        # datetime.utcnow() is deprecated since Python 3.12
        self.timestamp = datetime.now(timezone.utc).isoformat()
def write_json_report(self, path: Path) -> None:
        """Write a detailed JSON report."""
report = {
'timestamp': self.timestamp,
'summary': {
'total_packages': self.results.total_packages,
'violations': self.results.violation_count,
'unknown': self.results.unknown_count,
},
'packages': [
{
'name': p.package,
'version': p.version,
'license': p.license,
'source': p.source_file,
'status': self._get_status(p)
}
for p in self.results.packages
],
'violations': [
{'name': p.package, 'version': p.version, 'license': p.license}
for p in self.results.violations
],
'unknown': [
{'name': p.package, 'version': p.version, 'license': p.license}
for p in self.results.unknown
]
}
with open(path, 'w') as f:
json.dump(report, f, indent=2)
def _get_status(self, package) -> str:
if package in self.results.violations:
return 'violation'
elif package in self.results.unknown:
return 'unknown'
return 'ok'
def generate_markdown_summary(self) -> str:
        """Generate a Markdown summary for GitHub Actions."""
lines = [
"## 📋 License Compliance Report\n",
f"Scanned at: {self.timestamp}\n",
"",
"### Summary\n",
"| Metric | Count |",
"|--------|-------|",
f"| Total Packages | {self.results.total_packages} |",
f"| ✅ Compliant | {self.results.total_packages - self.results.violation_count - self.results.unknown_count} |",
f"| ❌ Violations | {self.results.violation_count} |",
f"| ⚠️ Unknown | {self.results.unknown_count} |",
"",
]
if self.results.violations:
lines.extend([
"### ❌ License Violations\n",
"| Package | Version | License |",
"|---------|---------|---------|",
])
for v in self.results.violations:
lines.append(f"| {v.package} | {v.version} | {v.license} |")
lines.append("")
if self.results.unknown:
lines.extend([
"### ⚠️ Unknown Licenses\n",
"| Package | Version | License |",
"|---------|---------|---------|",
])
for u in self.results.unknown[:10]: # Limit to 10
lines.append(f"| {u.package} | {u.version} | {u.license} |")
if len(self.results.unknown) > 10:
lines.append(f"\n*...and {len(self.results.unknown) - 10} more*")
lines.append("")
        return '\n'.join(lines)

Instead of building the Dockerfile on every action invocation, you can reference a pre-built image:
# action.yml
runs:
using: 'docker'
  image: 'docker://ghcr.io/myorg/license-checker:v1.2.0'

| Aspect | Dockerfile | Pre-built image |
|---|---|---|
| Startup time | Slow (build + run) | Faster (pull + run only) |
| Reproducibility | Depends on base-image updates | Pinned exactly |
| Size in the repo | Small (Dockerfile only) | No artifacts |
| CI time | Build on every run | Build only on release |
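For maximum reproducibility, a pre-built image can also be pinned by digest instead of by tag; the digest below is a placeholder:

```yaml
runs:
  using: 'docker'
  # @sha256 digests are immutable, unlike tags
  image: 'docker://ghcr.io/myorg/license-checker@sha256:<digest>'
```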
# .github/workflows/publish-image.yml
name: Publish Docker Image
on:
push:
tags:
- 'v*.*.*'
jobs:
build-and-push:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- uses: actions/checkout@v4
- name: Log in to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract version
id: version
run: echo "version=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
push: true
tags: |
ghcr.io/${{ github.repository }}:${{ steps.version.outputs.version }}
ghcr.io/${{ github.repository }}:latest
labels: |
org.opencontainers.image.source=${{ github.server_url }}/${{ github.repository }}
            org.opencontainers.image.revision=${{ github.sha }}

After publishing, update the action.yml:
runs:
using: 'docker'
  image: 'docker://ghcr.io/myorg/license-checker:v1.2.0'

Reduce the image size with multi-stage builds:
# Build Stage
FROM golang:1.22 AS builder
WORKDIR /build
COPY go.* ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 go build -ldflags="-s -w" -o /scanner ./cmd/scanner
# Runtime Stage
FROM alpine:3.19
RUN apk add --no-cache ca-certificates git
COPY --from=builder /scanner /usr/local/bin/scanner
ENTRYPOINT ["scanner"]

The resulting image is megabytes instead of gigabytes in size.
| Base image | Size | Use case |
|---|---|---|
| `ubuntu:24.04` | ~78 MB | Full Linux distribution |
| `debian:bookworm-slim` | ~74 MB | Debian without extras |
| `python:3.12-slim` | ~125 MB | Python without build tools |
| `alpine:3.19` | ~7 MB | Minimal Linux |
| `gcr.io/distroless/base` | ~20 MB | Runtime only, no shell |
| `scratch` | 0 MB | Only for statically linked binaries |
# ❌ Bad: every code change invalidates the cache
COPY . /app
RUN pip install -r requirements.txt

# ✓ Good: dependencies are cached
COPY requirements.txt /app/
RUN pip install -r requirements.txt
COPY src/ /app/src/

Files that rarely change (dependencies) should come early in the Dockerfile.
# Build the image
docker build -t my-action .

# Start the container with a mocked GitHub environment
docker run --rm \
  -e GITHUB_OUTPUT=/tmp/output \
  -e GITHUB_ENV=/tmp/env \
  -e GITHUB_STEP_SUMMARY=/tmp/summary \
  -e GITHUB_WORKSPACE=/workspace \
  -e INPUT_SEVERITY=high \
  -e INPUT_FAIL_ON_VIOLATION=false \
  -v $(pwd):/workspace \
  -v /tmp:/tmp \
  my-action

#!/bin/bash
set -euo pipefail
# Debug mode when ACTIONS_STEP_DEBUG is set
if [[ "${ACTIONS_STEP_DEBUG:-false}" == "true" ]]; then
  set -x  # Print every command
  echo "::group::Debug Environment"
  env | sort
  echo "::endgroup::"
  echo "::group::Debug Workspace"
  ls -la /github/workspace/
  echo "::endgroup::"
fi

# Rest of the script...

During development you can start a shell instead of the entrypoint:
docker run --rm -it \
-v $(pwd):/github/workspace \
--entrypoint /bin/bash \
my-action
# Now inside the container:
/action/entrypoint.sh  # Run manually

Docker actions run only on Linux runners. For cross-platform actions this is a problem:
# ❌ Does NOT work on Windows/macOS
jobs:
scan:
runs-on: macos-latest
steps:
      - uses: ./my-docker-action # Fails!
# ✓ Explicitly use Linux
jobs:
scan:
runs-on: ubuntu-latest
steps:
      - uses: ./my-docker-action # OK

If cross-platform support is required, you have to choose either a JavaScript action or a composite action that selects a different implementation per platform.
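A sketch of such a composite action; all paths, image names, and the script location are assumptions:

```yaml
# action.yml of a composite action with per-OS implementations
runs:
  using: 'composite'
  steps:
    - name: Scan (Linux, containerized)
      if: runner.os == 'Linux'
      shell: bash
      run: docker run --rm -v "$PWD:/github/workspace" ghcr.io/myorg/scanner:v1
    - name: Scan (macOS/Windows, native)
      if: runner.os != 'Linux'
      shell: bash
      run: python "${{ github.action_path }}/src/scanner.py"
```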
# .github/workflows/license-check.yml
name: License Compliance
on:
pull_request:
paths:
- 'package.json'
- 'package-lock.json'
- 'requirements.txt'
- 'Cargo.toml'
- 'go.mod'
push:
branches: [main]
jobs:
check-licenses:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Check license compliance
id: license-check
uses: myorg/license-checker@v1
with:
allowed-licenses: 'MIT,Apache-2.0,BSD-3-Clause,ISC'
denied-licenses: 'GPL-3.0,AGPL-3.0'
fail-on-violation: 'true'
exclude-packages: |
internal-package
legacy-tool
- name: Upload report
if: always()
uses: actions/upload-artifact@v4
with:
name: license-report
path: ${{ steps.license-check.outputs.report-path }}
- name: Comment on PR
if: failure() && github.event_name == 'pull_request'
uses: actions/github-script@v7
with:
script: |
github.rest.issues.createComment({
...context.repo,
issue_number: context.payload.pull_request.number,
body: '⚠️ License compliance check failed. Please review the license report.'
            })

The action automatically scans the matching package manager, checks it against the license policies, and generates a detailed report. Thanks to the Docker isolation, all required tools (license-checker, cargo-license, etc.) are guaranteed to be available without polluting the runner.