I'm currently facing an issue with DatoCMS: I need to schedule the release of multiple changes simultaneously, but the current process only allows publishing one record at a time. Additionally, each publish fires the build trigger, causing inefficiencies in the build process.
Is there a feature or workaround for bulk publishing multiple changes across various records simultaneously, without triggering a build for each individual record? Any insights or tips on streamlining this process and managing build triggers more effectively would be greatly appreciated.
EDIT: Actually, please see the next post instead. It has an easier and more reliable workaround.
This blurred post lists several other inferior workarounds, just in case. But my recommendation is to use the one in the next post.
Unfortunately, I don't believe we have that functionality built-in, but there are a few workarounds I can think of…
Option 1: Incremental builds
What framework are you using for your frontend?
If it's Next.js or something similar, you should be able to use Incremental Static Regeneration (Pages Router) or data revalidation (App Router) to rebuild just the affected page or component instead of the whole app. IMO that would be the cleanest route and wouldn't require any changes to your workflow or build triggers.
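As a hedged illustration, in Next.js (App Router) this can be as small as a route segment config fragment. The path and interval below are placeholders, not something from your project:

```typescript
// app/posts/[slug]/page.tsx  (hypothetical page in your frontend)
// Route segment config: Next.js will re-generate this page in the
// background at most once every 300 seconds, instead of requiring a
// full site rebuild on each publish.
export const revalidate = 300;
```

With on-demand revalidation you could go further and revalidate only when DatoCMS actually publishes something, but the time-based version above needs no extra wiring.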
You can also semi-automate this by creating a separate model, let's call it "Release groups". Make a record in that model and have it scheduled to publish a few minutes after all the other ones:
Option 4: Custom webhook on a build debounce (not recommended)
This is probably overkill and fragile. I wouldn't recommend this, but it most closely matches your original request so I'm mentioning it just in case…
Instead of a build trigger connecting directly to Vercel/Netlify/etc. and building on every publish, you can make it a custom webhook request to an external handler that you'd have to write. That handler could receive any number of build requests over time, but debounce them (e.g. keep a timestamp in an external database or key-value store) and only trigger a build at most once every 5 min/1 hour/whatever you set.
(Or alternatively, just have it rebuild every hour on a timer. Maybe add a simple diff to see whether anything actually changed, like by hashing a GraphQL response.)
Again, this isn't recommended… it introduces a lot of unnecessary complexity and fragility.
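For completeness, here's a minimal sketch of the debounce-plus-diff logic described above. Everything here is hypothetical: it keeps state in memory for simplicity, whereas a real handler would persist it in Redis or another key-value store so it survives restarts:

```typescript
import { createHash } from "node:crypto";

// In-memory debounce state. A real handler would persist this in an
// external database or KV store, as described above.
interface DebounceState {
  lastBuildAt: number; // epoch ms of the last build we triggered
  lastHash: string;    // hash of the content snapshot we last built
}

// Returns true only if (a) at least `intervalMs` has passed since the
// last accepted build AND (b) the content hash has actually changed.
// `contentSnapshot` could be, e.g., a stringified GraphQL response.
export function shouldTriggerBuild(
  state: DebounceState,
  contentSnapshot: string,
  now: number,
  intervalMs: number = 5 * 60 * 1000, // 5 min default, tune as needed
): boolean {
  const hash = createHash("sha256").update(contentSnapshot).digest("hex");
  if (now - state.lastBuildAt < intervalMs) return false; // still debouncing
  if (hash === state.lastHash) return false; // nothing actually changed
  state.lastBuildAt = now;
  state.lastHash = hash;
  return true;
}
```

On each webhook call from DatoCMS, the handler would fetch (or receive) the current content, run it through `shouldTriggerBuild`, and hit the Vercel/Netlify deploy hook only when it returns true.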
I know none of these are simple solutions, but might any of them work for you? If not, I'm afraid it might be a Feature Request instead?
@claudio.gebbia, I thought about this some more, and I think there is an even simpler solution: using linked records inside a master record, and publishing only the master record on a schedule. I think that would be simpler than any of the above options?
Its schema is very simple. In my example I added a title field for clarity, but really it only needs one field, a "Multiple links" field with its validations set to "Publish also the referenced records":
Then, I create a new record of bulk_publish_group and just link to all the posts I actually want to schedule. Save that record and then set a scheduled publish date ONLY for the bulk_publish_group, not the individual posts:
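If you ever want to automate creating these group records via the Content Management API, the payload would look roughly like this. This is a hedged sketch: `buildBulkPublishPayload` is a hypothetical helper, and the field API keys (`title`, `links`) are assumptions based on the model described above, so check them against your own project:

```typescript
// Hypothetical helper that builds a create-record payload for the
// DatoCMS Content Management API (e.g. to pass to client.items.create
// from @datocms/cma-client-node). The `title` and `links` API keys are
// assumptions matching the bulk_publish_group model sketched above.
export function buildBulkPublishPayload(
  itemTypeId: string,        // the bulk_publish_group model's ID
  title: string,
  linkedRecordIds: string[], // IDs of the posts to publish together
) {
  return {
    item_type: { type: "item_type" as const, id: itemTypeId },
    title,
    links: linkedRecordIds, // "Multiple links" fields take an array of record IDs
  };
}
```

With the real client you'd pass this to `client.items.create(...)`, then set the scheduled publish date on the resulting record (in the UI or via the API), exactly as in the manual flow above.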