File upload from Cloud functions

There seem to be some issues with large file uploads in the current implementation.
createUploadPath seems to download the file locally. Is there a way to circumvent this and work with streams, or just a remote URL?

I’ve also logged an issue for this on GitHub (https://github.com/datocms/js-datocms-client/issues/75).
Kind regards,
Glenn

Hi @glenn.bostoen, you can upload a file to DatoCMS from a remote URL, but yes, it will be downloaded locally first. That’s because the client must send a PUT request with the file content to our S3 bucket.
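
A simplified sketch of that flow (not the actual client source; the function name and temp path are illustrative):

```javascript
const fetch = require('node-fetch');
const fs = require('fs');
const { pipeline } = require('stream/promises');

// Sketch of what happens today: the remote file is fully downloaded
// to disk, then PUT to the presigned S3 URL.
async function uploadFromRemoteUrl(remoteUrl, presignedPutUrl) {
  const tmpPath = '/tmp/dato-upload-buffer'; // illustrative local buffer

  // 1. download the remote file locally
  const res = await fetch(remoteUrl);
  if (!res.ok) throw new Error(`download failed: ${res.status}`);
  await pipeline(res.body, fs.createWriteStream(tmpPath));

  // 2. PUT the whole content to the S3 bucket via the presigned URL
  const put = await fetch(presignedPutUrl, {
    method: 'PUT',
    body: fs.createReadStream(tmpPath),
    headers: { 'Content-Length': String(fs.statSync(tmpPath).size) },
  });
  if (!put.ok) throw new Error(`upload failed: ${put.status}`);
}
```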

There is also a limit of 1GB per uploaded file.

Is there a way to circumvent this […]?

Not at the moment. If the file is that big, is it feasible for you to store it outside DatoCMS and reference its URL in a string or JSON field instead?
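
For example (a sketch: the model ID and field name are hypothetical, but `client.items.create` is the regular record-creation call):

```javascript
const { SiteClient } = require('datocms-client');

const client = new SiteClient(process.env.DATO_API_TOKEN);

// Store the externally hosted file's URL in a plain string field.
// Model ID '123456' and field name 'videoUrl' are made up.
async function referenceExternalFile(fileUrl) {
  return client.items.create({
    itemType: '123456',
    videoUrl: fileUrl,
  });
}
```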

It would be nice to be able to send the link instead of downloading the file locally.
Currently I’m having issues with files of roughly 600 MB, so I’m assuming the current implementation isn’t optimized for serverless usage.

It would be nice to be able to send the link instead of downloading the file locally

Yes. Alternatively, we could chunk the file and do a multipart upload, so we don’t have to load the whole file in memory at once.
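
To illustrate the general idea (this sketch uses the AWS SDK directly against a bucket you control; it’s not something the DatoCMS client supports today, and the bucket/key names are made up):

```javascript
const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const fs = require('fs');

// Chunked multipart upload: parts are read and sent one at a time,
// so memory use stays bounded regardless of file size.
async function multipartUpload(filePath) {
  const upload = new Upload({
    client: new S3Client({ region: 'us-east-1' }),
    params: {
      Bucket: 'my-bucket',         // hypothetical
      Key: 'uploads/big-file.mp4', // hypothetical
      Body: fs.createReadStream(filePath),
    },
    partSize: 10 * 1024 * 1024, // 10 MB per part
  });
  await upload.done();
}
```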

I’ve moved the topic to feature requests. We’ll give priority to the ones that get the most upvotes.

Thank you for reporting this 🙂

Hi there,

We use it in serverless environments like this (see the sketch after the steps):

  1. BACKEND: Get id & url by calling client.uploadRequest.create({ filename })
  2. FRONTEND: PUT file using XHR to the url from step 1.
  3. BACKEND: Call client.uploads.create({ path })
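
In code, the three steps look roughly like this (a sketch: the endpoint wiring is up to you, step 2 runs in the browser, and treating the id from step 1 as the path for step 3 is an assumption based on the description above):

```javascript
const { SiteClient } = require('datocms-client');

const client = new SiteClient(process.env.DATO_API_TOKEN);

// 1. BACKEND: ask DatoCMS for an upload id and a presigned S3 URL
async function createUploadRequest(filename) {
  const { id, url } = await client.uploadRequest.create({ filename });
  return { id, url };
}

// 2. FRONTEND: PUT the file straight to S3 (fetch instead of XHR for brevity)
async function putFile(url, file) {
  const res = await fetch(url, { method: 'PUT', body: file });
  if (!res.ok) throw new Error(`S3 PUT failed: ${res.status}`);
}

// 3. BACKEND: register the uploaded file with DatoCMS,
//    using the id from step 1 as the path
async function finalizeUpload(path) {
  return client.uploads.create({ path });
}
```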

Martin

I’ll be using Vimeo Pro instead of the DatoCMS/Mux solution.
It seems to be more cost-efficient and supports the pull-based upload approach.
