CI pipelines that download dependencies on every run waste
massive amounts of time. Install npm packages? 2 minutes. Download Maven dependencies? 5 minutes. Pull Docker
layers? 10 minutes. GitHub Actions caching eliminates this waste by reusing dependencies across runs—transforming
10-minute builds into 2-minute builds.
This guide covers production-ready GitHub Actions caching
strategies that can reduce build times by 50-80%. We’ll build lightning-fast CI pipelines that developers actually
enjoy using.
Why Caching Transforms CI Performance
The No-Cache Problem
Uncached CI pipelines suffer from:
- Slow builds: 10+ minutes just to install dependencies
- Wasted bandwidth: the same packages downloaded thousands of times
- Higher CI cost: paying for compute time spent downloading dependencies
- Developer frustration: long waits for feedback on simple changes
- Flaky builds: network failures while downloading packages
- Limited CI minutes: free-tier minutes burned on downloads
Caching Benefits
- 50-80% faster: 10-minute builds become 2-minute builds
- More reliable builds: less dependence on the network
- Lower cost: less compute time, more of the free tier left over
- Better developer experience: fast feedback on PRs
- More builds: run tests more frequently
- Greener builds: fewer network-related failures
Pattern 1: npm/Yarn Caching
Node.js Dependency Caching
```yaml
name: Node.js CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      # ❌ BAD: No caching - downloads every time
      # - name: Install dependencies
      #   run: npm ci

      # ✅ GOOD: Cache node_modules
      - name: Cache node_modules
        id: cache-node-modules
        uses: actions/cache@v4
        with:
          path: node_modules
          key: ${{ runner.os }}-node-${{ hashFiles('package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-node-

      # npm ci deletes node_modules, so skip it on an exact cache hit
      - name: Install dependencies
        if: steps.cache-node-modules.outputs.cache-hit != 'true'
        run: npm ci

      - name: Run tests
        run: npm test

      - name: Build
        run: npm run build

      # Results:
      #   Without cache:     3m 45s (2m 30s installing)
      #   With cache (cold): 3m 45s (first run)
      #   With cache (warm): 1m 15s (0s installing)
      #   66% faster!

      # Even better: use setup-node's built-in caching
      - name: Setup Node.js with cache
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'  # Automatically caches based on package-lock.json
```
Pattern 2: Maven/Gradle Caching
Java Dependency Caching
```yaml
name: Java CI

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up JDK 17
        uses: actions/setup-java@v4
        with:
          java-version: '17'
          distribution: 'temurin'

      # Maven caching
      - name: Cache Maven packages
        uses: actions/cache@v4
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-

      - name: Build with Maven
        run: mvn clean install

      # Gradle caching (alternative)
      - name: Cache Gradle packages
        uses: actions/cache@v4
        with:
          path: |
            ~/.gradle/caches
            ~/.gradle/wrapper
          key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}
          restore-keys: |
            ${{ runner.os }}-gradle-

      - name: Build with Gradle
        run: ./gradlew build

      # Results:
      #   Maven without cache: 8m 20s (5m downloading deps)
      #   Maven with cache:    2m 45s
      #   67% faster!

      # Even better: setup-java has built-in caching
      - name: Set up JDK with cache
        uses: actions/setup-java@v4
        with:
          java-version: '17'
          distribution: 'temurin'
          cache: 'maven'  # or 'gradle'
```
Pattern 3: Docker Layer Caching
Speed Up Docker Builds
```yaml
name: Docker Build

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Set up Docker Buildx for caching
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      # Login to registry (for cache push)
      - name: Login to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # Build with cache
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ghcr.io/${{ github.repository }}:latest
          cache-from: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache
          cache-to: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache,mode=max
```

Order the Dockerfile so rarely-changing layers come first:

```dockerfile
# Dockerfile optimization for caching
FROM node:20-alpine

WORKDIR /app

# Copy package files first (cached if unchanged)
COPY package*.json ./

# Install dependencies (cached layer)
RUN npm ci

# Copy source code last (changes frequently)
COPY . .

# Build (only runs if source changed)
RUN npm run build

CMD ["npm", "start"]

# Results:
#   First build:                        8m 45s
#   Subsequent builds (deps unchanged): 1m 30s
#   83% faster!
```
Pattern 4: Python pip/Poetry Caching
Python Dependency Caching
```yaml
name: Python CI

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      # pip caching
      - name: Cache pip packages
        uses: actions/cache@v4
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
          restore-keys: |
            ${{ runner.os }}-pip-

      - name: Install dependencies
        run: |
          pip install -r requirements.txt

      # Poetry caching (better approach)
      - name: Install Poetry
        uses: snok/install-poetry@v1
        with:
          virtualenvs-create: true
          virtualenvs-in-project: true

      - name: Cache Poetry virtualenv
        uses: actions/cache@v4
        with:
          path: .venv
          key: ${{ runner.os }}-poetry-${{ hashFiles('poetry.lock') }}
          restore-keys: |
            ${{ runner.os }}-poetry-

      - name: Install dependencies with Poetry
        run: poetry install

      - name: Run tests
        run: poetry run pytest

      # Results:
      #   Without cache: 4m 15s (2m 45s installing)
      #   With cache:    1m 30s
      #   65% faster!

      # Even better: setup-python has built-in caching
      - name: Set up Python with cache
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'
          cache: 'pip'  # or 'poetry'
          cache-dependency-path: 'requirements.txt'
```
Pattern 5: Multi-Stage Caching
Cache Build Artifacts
```yaml
name: Multi-Stage Build

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Stage 1: Cache dependencies
      - name: Cache npm dependencies
        uses: actions/cache@v4
        with:
          path: node_modules
          key: deps-${{ hashFiles('package-lock.json') }}

      - name: Install dependencies
        run: npm ci

      # Stage 2: Cache build output
      - name: Cache build output
        uses: actions/cache@v4
        with:
          path: dist
          key: build-${{ github.sha }}
          restore-keys: |
            build-

      - name: Build
        run: npm run build

      # Stage 3: Cache test coverage
      - name: Cache coverage
        uses: actions/cache@v4
        with:
          path: coverage
          key: coverage-${{ github.sha }}

      - name: Run tests with coverage
        run: npm run test:coverage

      # Upload artifacts for other jobs
      - name: Upload build artifacts
        uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist

  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      # Download cached artifacts
      - name: Download build artifacts
        uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist

      - name: Deploy to production
        run: |
          echo "Deploying pre-built artifacts..."
          # Deploy dist/ folder
```
Pattern 6: Conditional Caching
Smart Cache Invalidation
```yaml
name: Smart Caching

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Full history for change detection

      # Detect if dependencies changed
      - name: Check for dependency changes
        id: deps-changed
        run: |
          git diff --quiet HEAD~1 HEAD -- package-lock.json || echo "changed=true" >> $GITHUB_OUTPUT

      # Only restore the cache if dependencies haven't changed
      - name: Restore cache
        id: restore-deps
        if: steps.deps-changed.outputs.changed != 'true'
        uses: actions/cache/restore@v4
        with:
          path: node_modules
          key: ${{ runner.os }}-deps-${{ hashFiles('package-lock.json') }}

      # Install dependencies (when they changed, or when the cache missed)
      - name: Install dependencies
        if: steps.deps-changed.outputs.changed == 'true' || steps.restore-deps.outputs.cache-hit != 'true'
        run: npm ci

      # Save cache only on main branch
      - name: Save cache
        if: github.ref == 'refs/heads/main' && steps.deps-changed.outputs.changed == 'true'
        uses: actions/cache/save@v4
        with:
          path: node_modules
          key: ${{ runner.os }}-deps-${{ hashFiles('package-lock.json') }}
```

Matrix builds use the same approach, with the cache keyed per Node version:

```yaml
    strategy:
      matrix:
        node-version: [18, 20, 22]
    steps:
      - name: Cache per Node version
        uses: actions/cache@v4
        with:
          path: node_modules
          key: ${{ runner.os }}-node${{ matrix.node-version }}-${{ hashFiles('package-lock.json') }}
```
Pattern 7: Monorepo Caching
Selective Caching for Workspaces
```yaml
name: Monorepo CI

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Detect changed packages
      - name: Detect changed packages
        id: changes
        uses: dorny/paths-filter@v2
        with:
          filters: |
            frontend:
              - 'packages/frontend/**'
            backend:
              - 'packages/backend/**'
            shared:
              - 'packages/shared/**'

      # Cache root dependencies
      - name: Cache root node_modules
        uses: actions/cache@v4
        with:
          path: node_modules
          key: root-${{ hashFiles('package-lock.json') }}

      # Cache frontend (only if changed)
      - name: Cache frontend
        if: steps.changes.outputs.frontend == 'true'
        uses: actions/cache@v4
        with:
          path: packages/frontend/node_modules
          key: frontend-${{ hashFiles('packages/frontend/package-lock.json') }}

      # Cache backend (only if changed)
      - name: Cache backend
        if: steps.changes.outputs.backend == 'true'
        uses: actions/cache@v4
        with:
          path: packages/backend/node_modules
          key: backend-${{ hashFiles('packages/backend/package-lock.json') }}

      # Install dependencies
      - name: Install dependencies
        run: npm ci

      # Build only changed packages
      - name: Build frontend
        if: steps.changes.outputs.frontend == 'true'
        run: npm run build --workspace=packages/frontend

      - name: Build backend
        if: steps.changes.outputs.backend == 'true'
        run: npm run build --workspace=packages/backend

# Results:
#   Full build without cache:    12m 30s
#   Full build with cache:        3m 45s
#   Selective build (1 package):  1m 20s
#   89% faster for small changes!
```
Real-World Example: Complete CI Pipeline
Production-Ready GitHub Actions
```yaml
name: Production CI/CD

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

env:
  NODE_VERSION: '20'
  CACHE_VERSION: v1  # Bump to invalidate all caches

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Setup with caching
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      # Install dependencies
      - name: Install dependencies
        run: npm ci

      # Cache Playwright browsers (CACHE_VERSION is part of the key, so bumping it invalidates the cache)
      - name: Cache Playwright browsers
        uses: actions/cache@v4
        with:
          path: ~/.cache/ms-playwright
          key: ${{ env.CACHE_VERSION }}-${{ runner.os }}-playwright-${{ hashFiles('package-lock.json') }}

      - name: Install Playwright
        run: npx playwright install --with-deps

      # Run tests
      - name: Run unit tests
        run: npm run test:unit

      - name: Run E2E tests
        run: npm run test:e2e

      # Upload coverage
      - name: Upload coverage
        uses: codecov/codecov-action@v3

  build:
    runs-on: ubuntu-latest
    needs: test
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      # Cache build output
      - name: Cache build
        uses: actions/cache@v4
        with:
          path: |
            dist
            .next/cache
          key: ${{ env.CACHE_VERSION }}-${{ runner.os }}-build-${{ github.sha }}

      - name: Build
        run: npm run build

      # Upload artifacts
      - name: Upload build artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build
          path: dist
          retention-days: 7

  docker:
    runs-on: ubuntu-latest
    needs: build
    if: github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      # Download pre-built artifacts
      - name: Download build
        uses: actions/download-artifact@v4
        with:
          name: build
          path: dist

      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: |
            ghcr.io/${{ github.repository }}:latest
            ghcr.io/${{ github.repository }}:${{ github.sha }}
          cache-from: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache
          cache-to: type=registry,ref=ghcr.io/${{ github.repository }}:buildcache,mode=max

# Performance results:
#   Without caching:  18m 45s
#   With all caching:  3m 20s
#   82% faster!
```
Best Practices
- Hash lock files: use hashFiles('package-lock.json') (or your ecosystem's lock file) in cache keys
- Use restore-keys: fall back to partial matches on a cache miss
- Cache early, use late: restore the cache in a separate step before installing
- Prefer built-in caching: setup-node, setup-java, and setup-python ship cache options
- Version your caches: a CACHE_VERSION env var in the key enables manual invalidation
- Limit cache size: GitHub evicts old caches once a repository exceeds 10 GB
- Clean old caches: delete unused caches regularly (see the sketch after this list)
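Cleanup can be automated with the GitHub CLI, which is preinstalled on GitHub-hosted runners. A minimal sketch of a scheduled cleanup workflow, assuming a blunt wipe-everything policy is acceptable (the weekly schedule and job name are illustrative):

```yaml
name: Cache Cleanup

on:
  schedule:
    - cron: '0 3 * * 0'  # weekly; the schedule is an assumption
  workflow_dispatch:

jobs:
  prune:
    runs-on: ubuntu-latest
    permissions:
      actions: write  # required to delete caches
    steps:
      - name: Delete all caches in this repository
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GH_REPO: ${{ github.repository }}
        run: |
          # Blunt but simple: wipe every cache; the next build repopulates them.
          gh cache delete --all
```

A more surgical approach is to list caches with `gh cache list` and delete only stale entries, but the full wipe keeps the example short.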
Common Pitfalls
- Caching node_modules with platform differences: Use npm ci, not npm install
- No fallback keys: Cache miss rebuilds everything
- Caching build output incorrectly: Include a hash of the build inputs in the key (see the sketch after this list)
- Not invalidating cache: Stale dependencies cause bugs
- Caching too much: Hit 10GB limit, old caches evicted
- Complex cache keys: Hard to debug cache misses
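On that build-output pitfall: keying on a hash of the inputs rather than the commit SHA lets commits that don't touch the build inputs reuse the output. A minimal sketch, assuming the sources live under src/ and the output lands in dist/:

```yaml
- name: Cache build output
  uses: actions/cache@v4
  with:
    path: dist
    # Hash everything that influences the output, not the commit SHA,
    # so unrelated commits still get an exact cache hit.
    key: ${{ runner.os }}-build-${{ hashFiles('src/**', 'package-lock.json') }}
```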
Cache Key Strategies
| Strategy | Key Pattern | Use Case |
|---|---|---|
| Exact match | ${{ hashFiles('package-lock.json') }} | Dependencies |
| OS-specific | ${{ runner.os }}-deps-${{ hashFiles('…') }} | Native modules |
| Version prefix | v1-${{ hashFiles('…') }} | Manual invalidation |
| Commit-based | build-${{ github.sha }} | Build outputs |
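These strategies compose into a single key. A minimal sketch combining a version prefix, the OS, and a lock-file hash (the CACHE_VERSION variable mirrors the env var from the pipeline above and is otherwise an assumption):

```yaml
- name: Cache dependencies
  uses: actions/cache@v4
  with:
    path: node_modules
    # version prefix + OS + lock-file hash; bump CACHE_VERSION to invalidate everything
    key: ${{ env.CACHE_VERSION }}-${{ runner.os }}-deps-${{ hashFiles('package-lock.json') }}
    restore-keys: |
      ${{ env.CACHE_VERSION }}-${{ runner.os }}-deps-
```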
Key Takeaways
- GitHub Actions caching reduces build times by 50-80%
- Use setup-node/setup-python built-in cache options when possible
- Cache keys should include hashFiles() of your lock files
- Always provide restore-keys for fallback
- Docker layer caching requires registry push/pull
- Cache dependencies separately from build outputs
- Monitor cache hit rates and adjust strategies (a logging sketch follows this list)
- Clean up old caches to stay under the 10 GB per-repository limit
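One lightweight way to monitor hit rates is to log the cache-hit output of actions/cache in every job; the step id below is an assumption:

```yaml
- name: Cache node_modules
  id: deps-cache
  uses: actions/cache@v4
  with:
    path: node_modules
    key: ${{ runner.os }}-deps-${{ hashFiles('package-lock.json') }}

# Surface hit/miss in the job log so it can be tracked over time
- name: Report cache result
  run: echo "Dependency cache hit: ${{ steps.deps-cache.outputs.cache-hit }}"
```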
Caching transforms GitHub Actions from frustratingly slow to blazingly fast. By avoiding repeated downloads of
the same dependencies, you cut build times by 50-80%, save CI minutes, and give developers instant feedback. The
setup is simple—a few lines of YAML—but the impact is transformative. Never run an uncached pipeline again.