In my previous blog post, Using DDEV snapshots to speed up GitHub Actions workflows, I explained how DDEV snapshots can be used to speed up continuous integration workflows for end-to-end tests. A workflow runs on the main development branch to create a DDEV snapshot to be cached and reused by subsequent jobs. If that cache can speed up continuous integration workflows, why can't it also speed up developer onboarding?
It is actually pretty simple! You will need somewhere to upload the snapshot archive, such as AWS S3 or some other storage. I already use S3 for database dumps that I reuse with Tugboat so that a sample database can be seeded. I just had never put it all together: have your CI populate the dump artifact and automatically upload it so it can be shared across your team!
Before going further, here is the workflow from the previous blog post. Again, the trick is checking the cache-hit output to skip running steps if there is an existing snapshot cache.
name: DDEV snapshot cache
# Only run on pushes to `main`
on:
  push:
    branches:
      - 'main'
# Do not allow multiple runs.
concurrency: cache_ddev_snapshot
jobs:
  setup_cache:
    name: Set up snapshot cache
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Download cache
      - name: Download DDEV snapshot
        id: ddev-snapshot
        uses: actions/cache@v3
        with:
          path: .ddev/db_snapshots
          key: ddev-db_snapshots
      # Install DDEV if cache miss and generate snapshot.
      - name: Install ddev
        if: steps.ddev-snapshot.outputs.cache-hit != 'true'
        run: curl -LO https://raw.githubusercontent.com/drud/ddev/master/scripts/install_ddev.sh && bash install_ddev.sh
      - name: Start ddev
        if: steps.ddev-snapshot.outputs.cache-hit != 'true'
        run: ddev start
      - name: Install Drupal
        if: steps.ddev-snapshot.outputs.cache-hit != 'true'
        run: ddev site-install
      - name: Take DDEV snapshot
        if: steps.ddev-snapshot.outputs.cache-hit != 'true'
        run: ddev snapshot --name ci
Then we add steps to upload the snapshot to S3. I use the aws-actions/configure-aws-credentials action (and always have to dig through the documentation to set up access). This action authenticates the aws command line tool, which comes preinstalled on GitHub-hosted runners, using secrets configured in your GitHub repository.
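Those secrets need to exist in the repository before the workflow runs. One way to create them, assuming you have the GitHub CLI (gh) installed and an IAM user with write access to the bucket, is:

# Each command prompts for the secret value; you can also pass it with --body.
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY

With the secrets in place, the new workflow steps look like this: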
- name: "Configure AWS for snapshot upload"
if: steps.ddev-snapshot.outputs.cache-hit != 'true'
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: "Upload snapshot to bucket"
if: steps.ddev-snapshot.outputs.cache-hit != 'true'
working-directory: .ddev/db_snapshots
run: aws s3 sync . s3://snapshots
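The sync step publishes whatever is in .ddev/db_snapshots to the bucket. If the bucket is not publicly readable, a presigned URL is one way to turn the uploaded object into a shareable download link. This is just a sketch: the object name ci.snapshot.gz is a hypothetical stand-in for whatever file ddev snapshot actually wrote.

# List the uploaded snapshots to find the exact object name.
aws s3 ls s3://snapshots/
# Generate a download link that stays valid for seven days (604800 seconds).
aws s3 presign s3://snapshots/ci.snapshot.gz --expires-in 604800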
You now have a URL that you can share with other team members. They can download and move the snapshot into the .ddev/db_snapshots directory to be used via ddev snapshot restore.
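For example, a teammate with read access to the bucket could seed their local database like this (again a sketch: the object name is hypothetical, and the project needs to be started with ddev start first):

# Pull the snapshot into the project's snapshot directory.
aws s3 cp s3://snapshots/ci.snapshot.gz .ddev/db_snapshots/ci.snapshot.gz
# Restore it by name, or run ddev snapshot restore with no argument to pick from a list.
ddev snapshot restore ci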
Instead of having to create the snapshot manually and remember to keep it updated, your continuous integration workflow now generates your snapshots automatically!
Want more? Sign up for my weekly newsletter