Is there a way to bulk insert/update records?
There is a bulk destroy that is really useful, and I am looking for a way to do the same with inserts/updates.
It would be really helpful where large collections are populated from an external source rather than from the UI.
Our use case is to create or update 2000 records twice a day via API, because the data comes from an ERP.
The only issue I have found is performance.
I am making sequential calls to the create/update API, but each call takes around 2 to 3 seconds with a payload of around 18 KB, so it would take around 80 minutes for only 2000 records, which is not acceptable.
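For reference, this is roughly the sync loop I am running today (a simplified sketch: the `@datocms/cma-client-node` client, the model ID and the record payloads are placeholders for our actual setup):

```ts
import { buildClient } from '@datocms/cma-client-node';

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

type ErpRecord = { datoId?: string; payload: Record<string, unknown> };

// One API round trip per record: at ~2-3 s each, 2000 records take ~80 minutes.
async function syncFromErp(records: ErpRecord[]) {
  for (const record of records) {
    if (record.datoId) {
      await client.items.update(record.datoId, record.payload);
    } else {
      await client.items.create({
        item_type: { type: 'item_type', id: 'YOUR_MODEL_ID' }, // placeholder model ID
        ...record.payload,
      });
    }
  }
}
```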
I am using the free plan before having the customer move to a paid one.
Do you have any suggestions on how we can speed it up?
It looks like parallel calls are fast to send (obviously), but looking at the timings they become slower after the 5th call (5 to 7 seconds each).
If a bulk feature is not possible for this scenario, it would help to have the insert/update API execute much faster… is that possible? Does it depend on the plan?
That seems like a lot! Would you mind writing to us through support with some specific details of your project and API call payload so that we can investigate? Thanks!
In a serverless world, in which functions need to complete under a time limit, a bulk update method would be ideal…
I’m in a situation right now where, as a routine daily operation, I need to update 20-odd records, and it takes around 1.5 s for those individual requests to return when parallelized (a for-of loop ends up taking 20 s!). Packing off an array of IDs and an object of field names and values in one call to an update endpoint would be much simpler and faster – at least I’d hope so.
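For context, the parallelized version looks roughly like this (a sketch only: the client setup and field names are placeholders for my project):

```ts
import { buildClient } from '@datocms/cma-client-node';

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

// Firing the ~20 updates in parallel returns in ~1.5 s;
// awaiting each one in a for-of loop takes ~20 s.
async function updateAll(ids: string[], fields: Record<string, unknown>) {
  await Promise.all(ids.map((id) => client.items.update(id, fields)));
}
```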
This is a big missing part of DatoCMS and I’m surprised it is not available. Bulk updates/inserts are a basic feature of Postgres, which I believe Dato is built on. If so, please add these. It’s a real pain updating a lot of data by doing some sort of loop; it invariably times out.
How do I upvote this request? Honestly, I’m shocked this is not available yet. It’s a massive, massive pain to migrate legacy content into Dato from another data source because you don’t support bulk insert/update or upsert, and I’m about to give up and just dump everything into Postgres. I’m busy all day writing hacky scripts to loop through data and insert/update records one by one into Dato. The strange thing is that I believe Dato is built on top of Postgres. Is this correct? If so, how can you possibly not offer bulk insert/update and upsert natively?
I’ll revive this request. An initial import without a dedicated bulk API is going to be much more expensive than it should be in terms of speed, error handling and API call efficiency. It is certainly a drawback for the product compared with the approach of competitors.
I just wanted to make sure you saw the workarounds above about just making several single API calls in parallel? Is that insufficient for your needs?
E.g. instead of “update {record1, record2, record3}” in a single call, just making three simultaneous parallel calls of “update record1”, “update record2”, and “update record3”. Libraries like sindresorhus/p-queue (a promise queue with concurrency control) should make that pretty easy while staying within the rate limit.
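Something along these lines should work (a rough sketch: the concurrency and rate-limit numbers are just examples to tune against your plan’s limits, and it assumes the `@datocms/cma-client-node` client):

```ts
import PQueue from 'p-queue';
import { buildClient } from '@datocms/cma-client-node';

const client = buildClient({ apiToken: process.env.DATOCMS_API_TOKEN! });

// Example limits only: at most 5 requests in flight, at most 20 started per second.
const queue = new PQueue({ concurrency: 5, interval: 1000, intervalCap: 20 });

async function bulkUpdate(updates: Array<{ id: string; fields: Record<string, unknown> }>) {
  return Promise.all(
    updates.map(({ id, fields }) => queue.add(() => client.items.update(id, fields))),
  );
}
```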
I know it’s really not as nice as a proper bulk update endpoint, but it’s a workaround in the meantime.
This is not currently planned for work, but I’ll post back here if that ever changes.
Your suggestion marginally addresses the speed issue but the rest of the concerns are still valid.
What I’m looking for is fundamentally different. For example, an API that not only lets me import several records in a single call (call efficiency), but possibly also lets me post records together with their relationships and/or with optional integrity checks (e.g. incomplete fields). This would greatly simplify massive imports…
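Just to make the idea concrete (purely hypothetical, this is not an existing DatoCMS endpoint or payload), a bulk import request could look something like this:

```ts
// Hypothetical payload shape for a bulk import endpoint (illustration only;
// nothing like this exists in the DatoCMS API today).
type BulkImportRequest = {
  items: Array<{
    item_type: string;                          // model to create the record in
    fields: Record<string, unknown>;            // field values
    relationships?: Record<string, string[]>;   // links to other records in the batch or to existing ones
  }>;
  options?: {
    upsert?: boolean;                           // update when a matching record already exists
    validation?: 'strict' | 'allow_incomplete'; // optional integrity checks, e.g. incomplete fields
  };
};
```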