hi… I am not sure where the right place for this question is, so if it belongs somewhere else, please just move it. Thanks.
I wanted to know if there is any option related to rate limits, and/or any warning when one of my projects goes over its current plan. I have seen many problems related to this in other services, and I am a bit worried about it. It would be awesome to know how DatoCMS manages these issues.
Hi @web,
What do you mean by “any option” related to the rate limits?
You can see what they are here:
Either one will return an HTTP 429 error if you exceed its rate limit. You can use libraries like bottleneck, limiter, or p-queue to stay within them.
You can also use our JS clients for the CMA and CDA to help with the rate limiting (they’ll auto-retry).
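Here's a rough sketch with the official CMA client (assuming the @datocms/cma-client-node package is installed); it queues and retries requests for you when the API answers with a 429:

```typescript
// Rough sketch: the official CMA client handles rate limiting for you,
// retrying automatically when the API answers with a 429.
import { buildClient } from "@datocms/cma-client-node";

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN ?? "" });

async function listItemTypes() {
  // Any call made through the client is retried by the library if the rate limit is hit.
  const itemTypes = await client.itemTypes.list();
  for (const itemType of itemTypes) {
    console.log(itemType.name);
  }
}

listItemTypes();
```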
Does that help, or is there a specific use case you're worried about? If you can share your specific issue, we can try to help you find a solution or a way to prevent it.
Thank you, Roger, for your reply. My concern is about the case where, for some reason unrelated to the site's regular traffic, requests increase in an abnormal way and exceed the amount of traffic included in my current plan. I just don't want to receive a letter saying that I have to pay a large amount of money for traffic that isn't normal. I ask this because I have seen it happen, for instance, with Vercel, and because it's possible to receive attacks that consume large amounts of data. I don't know if that's clearer now.
My question was whether we have any kind of protection/security by default with DatoCMS. As I understand from your answer, I guess we do, but it could be improved with the libraries you suggest.
I see what you’re saying… sorry, I thought you meant a rate limit like from our APIs. You mean more like a cost limiter option that some SaaSes have, or how to prevent runaway overage charges, right? Unfortunately, we don’t offer that right now
When a customer sees unexpectedly high usage (such as API calls from a misconfigured script, or bandwidth from image hot reloading during development or looping streaming videos), we usually work with them on a case-by-case basis, both to see what optimization opportunities exist and what might be causing the issue, and, where possible, to forgive some of the overages via credits. Our ability to do that, however, is limited by the costs we have to pay our own upstream providers and CDNs.
In the specific case of a DDoS (distributed denial-of-service) attack: first, our CDNs are usually pretty good at automatically mitigating those before they ever impact our API or your project. In the rare cases that they don't, we will again work with each customer on a case-by-case basis to resolve it. This hasn't happened in any major way in the several years that I've been with DatoCMS. That doesn't mean it can't; it just means we've taken what precautions we could, and they seem to be holding up OK so far. We'll have to see.
But back to the topic of overages. I think a cost-limiting feature would be nice, and it’s something I would like to see implemented too. Let’s definitely keep this feature request open (please remember to vote on it).
That said, a few tips that might help:
- Pay attention to your DatoCMS emails. We send several automated notifications when it looks like you’re going to exceed your limits: https://www.datocms.com/docs/plans-pricing-and-billing/overcharges-on-api-and-bandwidth#progressive-notifications
- You can always check your usage and projected overage costs (if any) by logging into your DatoCMS dashboard
- We very strongly recommend good caching on your frontend, either statically building your sites or using SSR plus CDN caching, so that you're not fetching data from our API on every visit (see the sketch after this list). The worst offenders are frontends that fetch data directly from our API with client-side fetches, where each visitor triggers several queries (and thus API calls) as they navigate around the site, or even multiple API calls on a single page. That will very quickly eat up your API quota, and we really suggest not doing it that way; even if you can afford to pay the overages, it's a waste of network resources and makes the UX worse for your visitors.
- For videos, please see this guide: https://www.datocms.com/docs/streaming-videos/how-to-stream-videos-efficiently
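To illustrate the caching tip above, here's a rough sketch of the SSR + CDN caching approach (assuming Node 18+, an Express server, and a DATOCMS_API_TOKEN environment variable; the route and the GraphQL query are just placeholders):

```typescript
// Rough sketch: the server fetches from the Content Delivery API once and lets the
// CDN cache the response, so visitors' browsers never call the DatoCMS API directly.
// Assumes Node 18+ (global fetch), Express, and a DATOCMS_API_TOKEN environment variable.
import express from "express";

const app = express();
const CDA_ENDPOINT = "https://graphql.datocms.com/"; // DatoCMS Content Delivery API

app.get("/api/articles", async (_req, res) => {
  const response = await fetch(CDA_ENDPOINT, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DATOCMS_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    // Placeholder query; replace with your own models and fields.
    body: JSON.stringify({ query: "{ allArticles { id title } }" }),
  });
  const { data } = await response.json();

  // Let the CDN cache this route for 5 minutes and serve stale copies while
  // revalidating, so most visitors never trigger a call to the DatoCMS API.
  res.set("Cache-Control", "public, s-maxage=300, stale-while-revalidate=60");
  res.json(data);
});

app.listen(3000);
```

With s-maxage set, most visitors are served from the CDN's cache, so the number of API calls scales with cache misses rather than with page views.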
Generally speaking, I'd say that the majority of our customer sites stay within our limits or go over only a little during certain months. Our plans are pretty lenient in terms of their built-in limits, and it's not easy for small-to-medium sites to go over them. The overage charges are pretty granular, so you only pay for the units of overage you actually incur. Enterprise customers can negotiate all of that upfront, or as their business grows, to make sure their plan is right for them.
And we’re happy to work with customers on any plan to try to give them optimization tips for their frontend, depending on their stack (on a best-effort basis… as a headless CMS, we can’t officially manage your frontend for you, but we’re happy to take a look as fellow devs and give tips where we can). You can always email us at support@datocms.com for that, or post it here if it’s not confidential information.
None of the above is an excuse for not having a cost-limiting / budget-cap feature… I'd still love to see that someday, if only for customers' peace of mind. Please vote on this feature if you want to see it too.
Exactly, you've made it very clear now. Sorry for being a little confused. I think it would be a very important feature, especially considering that these kinds of attacks happen more often these days. Even LLM crawlers are crawling all over the web, and sometimes they increase websites' traffic.
Thanks for your support!