
The Coolify monorepo webhook that silently skipped 3 of 4 apps
A commit to shared CSS across four Next.js apps in the same monorepo deployed to only one of them. The other three reported "finished" but served stale code.
I maintain a monorepo that holds four Next.js apps: preparemescours.fr, draftmylesson.com, przygotujlekcje.pl, and creaclases.com. Each app lives in its own directory under apps/. Each has its own Dockerfile under tools/docker/<app>.Dockerfile and a BUILD_APP=<app> build argument. All four point at the same git remote and watch the same main branch.
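Each app's image is built from the same tree with a different Dockerfile and build argument. A sketch of the shape of that invocation (Coolify's real command is internal to its build runner; the app directory names here are assumptions inferred from the domains, and the echo just prints the parameterized form rather than running it):

```shell
#!/bin/bash
# Sketch only: prints the per-app build command implied by the layout above.
# Coolify's actual invocation is internal; directory names are assumed.
for APP in preparemescours draftmylesson przygotujlekcje creaclases; do
  echo docker build \
    -f "tools/docker/${APP}.Dockerfile" \
    --build-arg "BUILD_APP=${APP}" \
    -t "${APP}:latest" .
done
```

Four apps, one tree, one branch: the only thing that varies per build is the Dockerfile path and the BUILD_APP argument.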
The documented Coolify monorepo pattern says: push to main, all four webhooks fire, each app builds independently, and one failed build does not block the others. That part works. The part that does not work is the caching.
The commit that only reached one app
Commit a33cca4 touched shared layout and CSS. Specifically, it rewired the next/font configuration across all four apps. I pushed, waited a few minutes, and checked the live sites. Creaclases had the new font. The other three did not.
I pulled the live HTML from each site and grepped for the new font URL. Only creaclases had it. I checked Coolify's deployment list for each app. All four said "deployment finished" and showed the same commit SHA. I checked the running Docker containers. Creaclases had an image tag matching the new SHA. The other three had older tags.
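Those spot checks are scriptable from the server. A sketch for one app, where the marker string is a placeholder since it depends on the commit, and the docker check has to run on the host where the containers live:

```shell
#!/bin/bash
# Spot-check one app: does the live HTML carry the new marker,
# and is the running container tagged with the new commit?
SHA="a33cca4"            # commit that should be live
MARKER="<NEW_FONT_URL>"  # placeholder: any string the commit introduced
if curl -s "https://creaclases.com" | grep -q "$MARKER"; then
  echo "HTML: new code"
else
  echo "HTML: stale"
fi
# Must run on the Docker host where the app containers run.
docker ps --format '{{.Names}} {{.Image}}' | grep "$SHA" || echo "no container on $SHA"
```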
The auto-webhook had fired for all four. But only one actually rebuilt.
How image-layer caching ate the builds
Coolify uses Docker image-layer caching by default. When a commit lands and the webhook fires, each app queues a build in its own context. The build runner checks if a cached image exists for the same base layers, same package.json, same Node version, same Dockerfile fingerprint. If it finds a match, it skips the actual docker build step and marks the deployment as finished.
The problem is that a cross-app commit changes shared source files that never feed into that cache fingerprint. The package.json did not change. The Dockerfile did not change. The Node version did not change. So three of the four builds concluded "I already have this image" and kept serving the old compiled code.
The build context sees the monorepo root. But the cache key is computed from the Dockerfile and the base image layers, not from the entire source tree. Coolify's build runner does not invalidate the cache when a shared file changes. It only invalidates when the Dockerfile or its direct dependencies change.
The fix was a force-triggered deploy
I had to bypass the cache manually. Coolify's API has a force=true parameter that tells the build runner to ignore any cached image and do a full rebuild.
curl -s -X GET 'http://192.168.0.103:8000/api/v1/deploy?uuid=<APP_UUID>&force=true' \
  -H 'Authorization: Bearer <TOKEN>'
I ran that for the three stale apps. Within a few minutes all four had the new font. Total wasted time: 90 minutes of users seeing stale layouts.
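With three apps stale, that is three nearly identical curls. A small loop makes it one command (the UUIDs and token are placeholders; the endpoint is the same deploy endpoint shown above):

```shell
#!/bin/bash
# Force a full rebuild for each stale app via the Coolify API.
# UUIDs and token are placeholders, not real values.
COOLIFY="http://192.168.0.103:8000"
TOKEN="<TOKEN>"
for UUID in "<UUID_APP_1>" "<UUID_APP_2>" "<UUID_APP_3>"; do
  curl -s "$COOLIFY/api/v1/deploy?uuid=${UUID}&force=true" \
    -H "Authorization: Bearer $TOKEN"
  echo    # separate the JSON responses
done
```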
The root cause is a missing cache invalidation toggle
Coolify does not expose a setting that says "never cache between cross-app builds in the same monorepo." The cache is useful for single-app repos. For monorepos with shared code, it is a trap.
The pragmatic answer is a verification step. After any cross-app commit, I now grep the live HTML of each app for a marker from the commit. A changed string, a new className, a different font URL. If a marker is missing, I force-trigger the deploy for that app.
This is not a complaint about Coolify. I moved from Vercel to Coolify for cost and control, and I wrote about that migration earlier. Coolify is good. But it is not built for monorepos by default. The caching behavior is correct for single-app projects. For multi-app monorepos, it needs manual handling.
How to verify a monorepo deploy actually took
Here is the workflow I use now.
After a push, I wait for all four apps in the Coolify dashboard to show "deployment finished." Then I run a script that fetches each app's homepage and greps for a string I know changed in the commit. If any app is missing the string, I force-trigger that app.
#!/bin/bash
# Fetch each app's homepage and check for a string the latest commit changed.
APPS=("preparemescours.fr" "draftmylesson.com" "przygotujlekcje.pl" "creaclases.com")
MARKER="font-family: Inter"
for APP in "${APPS[@]}"; do
  if curl -s "https://$APP" | grep -q "$MARKER"; then
    echo "$APP: OK"
  else
    echo "$APP: MISSING"
  fi
done
The script takes 10 seconds. It saves me from another 90-minute cache blind spot.
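The MISSING branch can also trigger the fix directly instead of just reporting. A sketch that combines the check with the force-redeploy call, assuming a per-app UUID map (the UUIDs, token, and marker are all placeholders):

```shell
#!/bin/bash
# Sketch: verify each app, then force-redeploy any that missed the commit.
# The app-to-UUID map, token, and marker are placeholders, not real values.
COOLIFY="http://192.168.0.103:8000"
TOKEN="<TOKEN>"
MARKER="font-family: Inter"
declare -A UUIDS=(
  ["preparemescours.fr"]="<UUID_1>"
  ["draftmylesson.com"]="<UUID_2>"
  ["przygotujlekcje.pl"]="<UUID_3>"
  ["creaclases.com"]="<UUID_4>"
)
for APP in "${!UUIDS[@]}"; do
  if curl -s "https://$APP" | grep -q "$MARKER"; then
    echo "$APP: OK"
  else
    echo "$APP: MISSING, forcing redeploy"
    curl -s "$COOLIFY/api/v1/deploy?uuid=${UUIDS[$APP]}&force=true" \
      -H "Authorization: Bearer $TOKEN"
  fi
done
```

The only manual step left is editing MARKER to a string from the commit before running it.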
The tradeoff I accepted
I could disable Docker caching entirely. That would guarantee every build is fresh. It would also make every deploy slower and more expensive. My apps are small, so the extra build time is maybe 30 seconds per app. But I do not want to burn compute on full rebuilds for trivial commits that touch only one app.
I could also split the monorepo into separate repos. That would eliminate the cross-app cache problem. It would also multiply my CI/CD configuration, secret management, and local development setup by four. I have been down that road with Carriva and Profnova. Monorepo is better for my solo-founder workflow even with this gotcha.
For now, the verification script and the force-trigger habit are good enough. If Coolify ever adds a "disable cache for this app" toggle, I will flip it on. Until then, I check the HTML after every cross-app commit.
Another Coolify gotcha to watch for
This is not the only monorepo gotcha I have hit with Coolify. The GitHub deploy key uniqueness issue is another one. Each app needs its own deploy key, but GitHub refuses to register the same SSH key twice, so you cannot reuse one key across the apps pointing at the same repo. The workaround is to use a GitHub personal access token or a machine user. I wrote about that separately.
The pattern is the same. Coolify is a solid tool for a solo founder running multiple apps. But it assumes each app is its own repo. When you run a monorepo, you need to test the edge cases yourself. The dashboard will not tell you when a deploy is a no-op. You have to look at the live output.
The takeaway
Do not trust "deployment finished" in Coolify for monorepo apps. A finished deployment does not mean the new code is live. It means the build runner decided it did not need to rebuild. Verify with a live check after every cross-app commit. Force-trigger the ones that skipped. That is the cost of running a monorepo on a tool designed for single-app repos.
It is a small cost. The alternative is splitting repos, which has its own costs. I will take the 10-second verification script and move on.