GitHub Actions Caching: 3x Faster Pipelines Without Extra Costs
Dependency caching, Docker layer caching, and custom cache strategies can drastically speed up GitHub Actions pipelines and save runner minutes.
Jean-Pierre Broeders
Freelance DevOps Engineer
A typical CI/CD pipeline installs the same dependencies every time, builds the same Docker layers, and downloads the same tools. That takes time. Sometimes a lot of time. A Node.js project with 500 packages? Easily 2-3 minutes just for npm install. On every push.
Caching solves this. GitHub Actions has built-in caching capabilities that can speed up pipelines by a factor of 3 or more. But it only works when implemented correctly.
Dependency Caching: The Basic Strategy
The most common use case: caching dependencies. As long as package-lock.json doesn't change, npm's download cache can be reused instead of fetching every package from the registry again.
```yaml
- name: Cache dependencies
  uses: actions/cache@v4
  with:
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
```
The key determines when the cache is rebuilt. As long as package-lock.json stays the same, the cache gets used. When dependencies update, the cache invalidates and rebuilds.
The restore-keys work as a fallback. If there's no exact match, GitHub uses the most recent cache that matches the prefix. That means a pipeline never starts from scratch, even when dependencies change.
For Python, it works the same way:
```yaml
- uses: actions/cache@v4
  with:
    path: ~/.cache/pip
    key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
    restore-keys: |
      ${{ runner.os }}-pip-
```
For .NET projects, NuGet packages can be cached:
```yaml
- uses: actions/cache@v4
  with:
    path: ~/.nuget/packages
    key: ${{ runner.os }}-nuget-${{ hashFiles('**/*.csproj') }}
    restore-keys: |
      ${{ runner.os }}-nuget-
```
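For the common cases, there's a shortcut: the official setup actions (actions/setup-node, actions/setup-python, and friends) ship built-in dependency caching. A minimal sketch for Node.js — this caches ~/.npm keyed on the lockfile automatically, with no separate actions/cache step:

```yaml
# Built-in caching via the setup action. The "cache" input handles
# the cache key and restore logic internally.
- uses: actions/setup-node@v4
  with:
    node-version: 20
    cache: 'npm'
```

The explicit actions/cache approach is still worth knowing when you need custom paths or keys.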
Docker Layer Caching: Doubling Build Speed
Docker builds are slow. Every layer gets rebuilt, even when code hasn't changed. Layer caching helps, but by default it doesn't work in GitHub Actions because each job runs on a fresh runner.
The solution: Docker Buildx with cache export/import.
```yaml
- name: Set up Docker Buildx
  uses: docker/setup-buildx-action@v3

- name: Cache Docker layers
  uses: actions/cache@v4
  with:
    path: /tmp/.buildx-cache
    key: ${{ runner.os }}-buildx-${{ github.sha }}
    restore-keys: |
      ${{ runner.os }}-buildx-

- name: Build Docker image
  uses: docker/build-push-action@v5
  with:
    context: .
    push: false
    tags: myapp:latest
    cache-from: type=local,src=/tmp/.buildx-cache
    cache-to: type=local,dest=/tmp/.buildx-cache-new,mode=max

- name: Move cache
  run: |
    rm -rf /tmp/.buildx-cache
    mv /tmp/.buildx-cache-new /tmp/.buildx-cache
```
The cache-to writes to a new directory deliberately: the local cache backend doesn't prune old layers, so reusing the same directory makes the cache grow on every build. The final mv replaces the old cache with the fresh one.
The difference is noticeable. A typical Next.js app that takes 8 minutes for a full build does it in 2-3 minutes with layer caching.
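As an alternative to managing a local cache directory, Buildx can also talk to the GitHub Actions cache service directly via the gha cache backend. A sketch — this drops both the actions/cache step and the move-cache workaround, though mode=max exports all intermediate layers and therefore eats into the 10 GB repository cache limit faster:

```yaml
# Alternative: GitHub Actions cache backend for Buildx.
# No local cache directory, no mv workaround.
- name: Build Docker image
  uses: docker/build-push-action@v5
  with:
    context: .
    push: false
    tags: myapp:latest
    cache-from: type=gha
    cache-to: type=gha,mode=max
```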
Custom Cache for Build Artifacts
Dependencies aren't the only thing that can be cached. Build artifacts, generated files, and test fixtures can also go in the cache.
A build whose inputs haven't changed doesn't need to start from scratch. Next.js, for example, keeps its Webpack build cache in .next/cache:
```yaml
- uses: actions/cache@v4
  with:
    path: .next/cache
    key: ${{ runner.os }}-nextjs-${{ hashFiles('**/package-lock.json') }}-${{ hashFiles('**/*.js', '**/*.jsx', '**/*.ts', '**/*.tsx') }}
    restore-keys: |
      ${{ runner.os }}-nextjs-${{ hashFiles('**/package-lock.json') }}-
      ${{ runner.os }}-nextjs-
```
For a Gatsby site, the .cache directory can be cached:
```yaml
- uses: actions/cache@v4
  with:
    path: |
      .cache
      public
    key: ${{ runner.os }}-gatsby-${{ hashFiles('gatsby-*.js', 'package-lock.json') }}
```
Multi-path Caching: Multiple Directories in One Cache
Sometimes multiple directories need to be cached. A TypeScript project with node_modules/ and build output in dist/:
```yaml
- uses: actions/cache@v4
  with:
    path: |
      node_modules
      dist
    key: ${{ runner.os }}-build-${{ hashFiles('**/package-lock.json', '**/tsconfig.json') }}
```
The pipe (|) syntax allows specifying multiple paths. The cache only rebuilds when one of the hash inputs changes.
Cache Size Limitations and Best Practices
GitHub limits total cache per repository to 10 GB. Older caches get automatically deleted when the limit is reached.
That means:
- Keep caches small. Cache ~/.npm instead of node_modules/.
- Avoid caching large binaries that change frequently.
- Use specific cache keys to prevent old caches from sticking around.
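A common convention for keeping control over cache lifetime is to bake a version prefix into the key (the v1 label here is arbitrary, not a GitHub feature). Bumping it forces a clean rebuild immediately, without waiting for the 10 GB eviction to kick in:

```yaml
- uses: actions/cache@v4
  with:
    path: ~/.npm
    # Bump "v1" to "v2" to force a fresh cache on the next run.
    key: v1-${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      v1-${{ runner.os }}-node-
```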
| Cache Type | Typical Size | Speed Impact |
|---|---|---|
| npm dependencies | 50-200 MB | 1-3 min saved |
| Docker layers | 500 MB - 2 GB | 3-6 min saved |
| Build artifacts | 10-100 MB | 30s - 2 min saved |
| pip/NuGet packages | 100-500 MB | 1-4 min saved |
Cache Invalidation: When Should the Cache Rebuild?
A cache key needs to be precise enough to detect changes, but not so precise that it rebuilds every time.
Too broad:
```yaml
key: ${{ runner.os }}-deps
```
This never invalidates. Dependencies change, but the cache stays the same. Results in outdated packages.
Too specific:
```yaml
key: ${{ runner.os }}-deps-${{ github.sha }}
```
This invalidates on every commit. The cache never gets reused. No speed gain.
Just right:
```yaml
key: ${{ runner.os }}-deps-${{ hashFiles('**/package-lock.json') }}
```
This only invalidates when dependencies change. Perfect.
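One refinement for matrix builds: include the matrix variables in the key, otherwise jobs for different runtime versions keep clobbering each other's caches. A sketch, assuming a node-version matrix:

```yaml
strategy:
  matrix:
    node-version: [18, 20]
steps:
  - uses: actions/cache@v4
    with:
      path: ~/.npm
      # One cache per Node version in the matrix.
      key: ${{ runner.os }}-node${{ matrix.node-version }}-deps-${{ hashFiles('**/package-lock.json') }}
```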
When Caching DOESN'T Help
Not everything needs caching. Some steps are so fast that caching adds overhead instead of removing it.
Skip caching for:
- Steps that take less than 10 seconds
- Repositories with few dependencies
- Jobs that run once a day
Also: if a project updates dependencies daily, the cache constantly rebuilds. Then the gain is minimal.
Conclusion
A well-cached GitHub Actions pipeline can save 50-70% of total runtime. That means faster feedback loops, less waiting time, and, for private repositories billed per runner minute, lower costs.
Start with dependency caching. Add Docker layer caching if builds take long. Experiment with custom caches for build artifacts. The impact is immediately noticeable.
