# Remote build cache
Bleep can cache compiled classes in S3 (or any S3-compatible storage such as MinIO or Cloudflare R2) so that CI pipelines skip compilation for unchanged projects.
## Setup

Add the cache location to your `bleep.yaml`:

```yaml
remote-cache:
  uri: s3://my-bleep-cache/builds
  region: eu-north-1
```
Add credentials to `~/.config/bleep/config.yaml`:

```yaml
remoteCacheCredentials:
  accessKeyId: AKIA...
  secretAccessKey: wJal...
```
Or use environment variables (following the standard AWS convention):

```shell
export BLEEP_REMOTE_CACHE_S3_ACCESS_KEY_ID=AKIA...
export BLEEP_REMOTE_CACHE_S3_SECRET_ACCESS_KEY=wJal...
```
## Usage

```shell
# Pull cached classes before building
bleep remote-cache pull

# Build only what changed
bleep build invalidated --base origin/main | xargs bleep compile

# Push compiled classes to cache
bleep remote-cache push
```
## How it works

Each project gets a SHA-256 digest computed from:
- Project configuration (dependencies, compiler flags, Scala version, platform, etc.)
- Source file contents (using git blob hashes for speed, with filesystem fallback)
- Resource file contents (affects the digest but resources are NOT cached)
- Transitive dependency project digests (changes propagate downstream)
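The digest computation above can be sketched roughly as follows. This is an illustrative Python sketch under the assumptions in the bullet list, not Bleep's actual implementation (Bleep is written in Scala); the function name and parameters are hypothetical:

```python
import hashlib

def project_digest(config: str,
                   source_hashes: list[str],
                   resource_hashes: list[str],
                   dependency_digests: list[str]) -> str:
    """Combine the four digest inputs into one SHA-256 hex digest."""
    h = hashlib.sha256()
    h.update(config.encode())              # project configuration
    for s in sorted(source_hashes):        # source contents (e.g. git blob hashes)
        h.update(s.encode())
    for r in sorted(resource_hashes):      # resources affect the digest, but aren't cached
        h.update(r.encode())
    for d in sorted(dependency_digests):   # upstream digest changes propagate downstream
        h.update(d.encode())
    return h.hexdigest()
```

Sorting the inputs keeps the digest stable regardless of traversal order, and including dependency digests means any upstream change invalidates all downstream projects.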
Cache entries are tar.gz archives stored at `s3://<bucket>/<prefix>/<project>/<digest>.tar.gz`, containing the compiled classes directory and the Zinc incremental-compilation analysis.
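Under that layout, constructing the object key and the archive might look like this sketch (the helper names are hypothetical, and the `classes`/analysis archive member names are assumptions for illustration):

```python
import tarfile
from pathlib import Path

def cache_key(prefix: str, project: str, digest: str) -> str:
    # Object key under the configured bucket: <prefix>/<project>/<digest>.tar.gz
    return f"{prefix}/{project}/{digest}.tar.gz"

def make_archive(classes_dir: Path, analysis_file: Path, out: Path) -> None:
    # Bundle the compiled classes directory and the Zinc analysis into one tar.gz
    with tarfile.open(out, "w:gz") as tar:
        tar.add(classes_dir, arcname="classes")
        tar.add(analysis_file, arcname=analysis_file.name)
```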
### Pull behavior
- Checks if each project's digest matches a cached archive
- Skips projects that are already compiled locally
- Downloads and extracts matching archives
- Zinc analysis is included so subsequent incremental compilation works correctly
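The pull flow above can be sketched as a loop over projects; every callable here is a hypothetical stand-in injected for illustration, not Bleep's API:

```python
def pull(projects, digests, is_compiled, cache_has, download_and_extract):
    """Sketch of the pull flow: skip locally compiled projects,
    fetch archives whose digest matches a cache entry."""
    pulled = []
    for p in projects:
        if is_compiled(p):                 # already compiled locally: skip
            continue
        d = digests[p]
        if cache_has(p, d):                # digest matches a cached archive
            download_and_extract(p, d)     # classes + Zinc analysis land locally
            pulled.append(p)
    return pulled
```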
### Push behavior
- For each compiled project, checks whether the cache already has an entry for that digest
- Uploads a tar.gz of classes + Zinc analysis for new entries
- Skips projects that aren't compiled or are already cached
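The push flow is the mirror image; again the callables are hypothetical stand-ins, not Bleep's API:

```python
def push(projects, digests, is_compiled, cache_has, upload):
    """Sketch of the push flow: upload archives only for projects that
    are compiled locally and not yet present in the cache."""
    pushed = []
    for p in projects:
        if not is_compiled(p):             # nothing to upload
            continue
        d = digests[p]
        if cache_has(p, d):                # entry already cached: skip
            continue
        upload(p, d)                       # tar.gz of classes + Zinc analysis
        pushed.append(p)
    return pushed
```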
## CI integration

### GitHub Actions

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: bleep-build/bleep-setup-action@v1
      - name: Pull cache
        run: bleep remote-cache pull
        env:
          BLEEP_REMOTE_CACHE_S3_ACCESS_KEY_ID: ${{ secrets.BLEEP_REMOTE_CACHE_S3_ACCESS_KEY_ID }}
          BLEEP_REMOTE_CACHE_S3_SECRET_ACCESS_KEY: ${{ secrets.BLEEP_REMOTE_CACHE_S3_SECRET_ACCESS_KEY }}
      - name: Build changed projects
        run: bleep build invalidated --base origin/${{ github.event.pull_request.base.ref }} | xargs bleep compile
      - name: Push cache
        run: bleep remote-cache push
        env:
          BLEEP_REMOTE_CACHE_S3_ACCESS_KEY_ID: ${{ secrets.BLEEP_REMOTE_CACHE_S3_ACCESS_KEY_ID }}
          BLEEP_REMOTE_CACHE_S3_SECRET_ACCESS_KEY: ${{ secrets.BLEEP_REMOTE_CACHE_S3_SECRET_ACCESS_KEY }}
```
### GitLab CI

```yaml
build:
  script:
    - bleep remote-cache pull
    - bleep build invalidated --base origin/$CI_MERGE_REQUEST_TARGET_BRANCH_NAME | xargs bleep compile
    - bleep remote-cache push
```
## S3-compatible services

The remote cache works with any S3-compatible service:

### MinIO (self-hosted)

```yaml
remote-cache:
  uri: http://minio.internal:9000/bleep-cache/builds
  region: us-east-1
```

### Cloudflare R2

```yaml
remote-cache:
  uri: https://<account-id>.r2.cloudflarestorage.com/bleep-cache/builds
  region: auto
```

### AWS S3

```yaml
remote-cache:
  uri: s3://my-bleep-cache/builds
  region: eu-north-1
```

### Google Cloud Storage (S3-compatible)

```yaml
remote-cache:
  uri: https://storage.googleapis.com/my-bleep-cache/builds
  region: auto
```
Use HMAC keys (from the GCS interoperability API) as credentials.
## Cache expiration
Bleep doesn't manage cache expiration — configure a lifecycle rule on your bucket to automatically delete old entries.
### AWS S3

```shell
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-bleep-cache \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-cache",
      "Status": "Enabled",
      "Filter": {},
      "Expiration": {"Days": 30}
    }]
  }'
```
Google Cloud Storage
gsutil lifecycle set /dev/stdin gs://my-bleep-cache <<'JSON'
{
"rule": [{
"action": {"type": "Delete"},
"condition": {"age": 30}
}]
}
JSON
### Cloudflare R2
Set object lifecycle rules in the R2 dashboard under Settings > Object lifecycle rules.