Continuous Webhook Rebuilds

After looking into a nightly build script to keep a site up to date (it has upcoming-events and past-events sections that must stay correct each day), I've finally discovered Dato's webhooks feature.

It seems I can rebuild my site (hosted on Netlify) any time I want. I'm thinking of just having my staging and production sites update on every record save. I can use the two environments and the draft/published workflow to avoid unfinished content making it to production.
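For anyone curious what the webhook actually does: it's just an HTTP POST to a Netlify build hook URL, so you can make the same call yourself from a script. Here's a minimal sketch; the hook IDs below are placeholders, not my real ones:

```ts
// trigger-rebuild.ts: a minimal sketch; the build hook IDs are placeholders.
// A Netlify build hook is triggered by an empty POST to its URL, which is
// exactly the call the DatoCMS webhook makes on every record save.

const BUILD_HOOKS = {
  staging: "https://api.netlify.com/build_hooks/STAGING_HOOK_ID",
  production: "https://api.netlify.com/build_hooks/PRODUCTION_HOOK_ID",
};

async function triggerRebuild(env: keyof typeof BUILD_HOOKS): Promise<void> {
  const res = await fetch(BUILD_HOOKS[env], { method: "POST" });
  if (!res.ok) {
    throw new Error(`Failed to trigger ${env} rebuild: ${res.status}`);
  }
  console.log(`Triggered ${env} rebuild`);
}

// Rebuild both environments, the way an on-save webhook effectively does.
await triggerRebuild("staging");
await triggerRebuild("production");
```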

Are there any downsides to doing that continuously? My users have never really understood or cared about the deploy process; having the deploy button around just causes confusion. My understanding is that if the built site is identical Netlify won't bother finishing the deploy, and that even if the site is rebuilt many times a day Netlify will still correctly cache unchanged pages and assets.

Is anyone doing this? Is there anything Iā€™m missing?

Update: it's working pretty well. It obviously burns through your Netlify build minutes, but they're not too expensive, and once my site leaves development the updates will be less frequent.

A big problem, though, is the notifications. They were already a bit large and intrusive, but it gets crazy if you save and publish within a few seconds: that leads to four builds across two environments, plus notifications for scraping.

I would love to see a better notification design, perhaps a stack in the bottom left with the latest one on top, expanding to show all of them when clicked?

Thank you for this report!

Yes, we need to improve how notifications stack and collapse on top of each other, I agree.

Update on this: I fixed the notification problem by deleting my environments and just relying on the webhooks. "Just wait a couple of minutes for the site to update" is much easier for editors to understand than the deployment menu, notifications, etc.


Big gotcha with removing the deployment environments: site search no longer works, failing with an INVALID_DEPLOYMENT_ENVIRONMENT error.

Oh yes! We only scrape the site on deployment :frowning:

One thing you can do is create a deployment environment and block your editors' access to it. Then you can either trigger a deploy manually to make us spider the site, or trigger a deploy with the API every n hours.
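For reference, here's a rough sketch of what the "every n hours" option could look like as a small script run from a scheduler (cron, a GitHub Actions schedule, etc.). The build-trigger ID, token, and exact endpoint are assumptions based on the Content Management API's "trigger a deploy" route, so double-check them against the docs:

```ts
// scheduled-deploy.ts: a minimal sketch of triggering a deploy on a schedule.
// Assumptions: the token and build-trigger ID are placeholders read from the
// environment, and the endpoint path/API version match the current
// DatoCMS Content Management API docs.

const API_TOKEN = process.env.DATOCMS_API_TOKEN!;
const BUILD_TRIGGER_ID = process.env.DATOCMS_BUILD_TRIGGER_ID!; // the hidden deployment environment

async function triggerDeploy(): Promise<void> {
  const res = await fetch(
    `https://site-api.datocms.com/build-triggers/${BUILD_TRIGGER_ID}/trigger`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        Accept: "application/json",
        "X-Api-Version": "3",
      },
    },
  );
  if (!res.ok) {
    throw new Error(`Deploy trigger failed: ${res.status}`);
  }
}

await triggerDeploy();
```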


Yes thanks, that's exactly what I ended up doing (I'm still doing a 'continuous deployment' setup using the on-save webhooks).

What does an editor see if I deny them access to deployment environments? Do they still see the deployment dropdown menu? Do they still get lots of notifications when the deployments complete?

They shouldn't see the button or get the notifications. If they do see anything, please let me know, as they shouldn't :slight_smile: