Fetching more than 100 records

Hi there :wave:

We are storing a list of “Application strings” for i18n localization in our application.

These strings can be as simple as “Go to next page” or more complex strings like a product description.

Each “Page” fetches the strings that belong to that page on the server and then ships them in the HTML, so we can render localized HTML for SEO benefits. This is all working as expected, but we have already hit the limit of 100 records per request.

I would really like to avoid doing an initial request to ask how many records there are and then making n/100 follow-up queries to fetch my strings. The reason is that I am running my app on a serverless platform where execution time per function is limited. I also want to avoid having my users wait for these dependent queries to finish.
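
For context, a rough sketch of what our per-page fetch looks like today (the allApplicationStrings model, the page filter and the key/value fields are just placeholders for our schema; note the hard first: 100 cap):

// Sketch of the per-page fetch against the CDA GraphQL endpoint.
// Model and field names are placeholders for our actual schema.
const QUERY = `
  query StringsForPage($page: String) {
    allApplicationStrings(filter: { page: { eq: $page } }, first: 100) {
      key
      value
    }
  }
`;

async function fetchStringsForPage(page: string): Promise<Record<string, string>> {
  const res = await fetch("https://graphql.datocms.com/", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DATO_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query: QUERY, variables: { page } }),
  });
  const { data } = await res.json();
  // Anything beyond the first 100 matching records is silently cut off here.
  const strings: Record<string, string> = {};
  for (const s of data.allApplicationStrings) strings[s.key] = s.value;
  return strings;
}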

Can anyone think of an alternative solution to this :pray: ?

Hello @siggi

Unfortunately all of the CDA requests are paginated with a max of 100 records per request. You can request up to 500 records in a single call using the CMA, but I don’t know if that would be a good fit for you.
However, I didn’t quite understand the part of your use case where you have to fetch all of the strings on every page request: with your current schema, can’t you filter the strings to request only the ones used on that specific page?

We don’t fetch all the strings; we only request the strings required for that page, but again, those already exceed 100. I think 500 would be a more reasonable limit for us, but using the CMA is not really an option. Is there no way to raise the maximum to 500 for the CDA as well?

The only solution I can see right now is to do this at build time: fetch the strings, write them to a file or something, and read that file on each request.
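
Roughly something like this (a sketch only; the allApplicationStrings model, its key/value fields and the _allApplicationStringsMeta count query are placeholders for our schema):

import { writeFile } from "node:fs/promises";

const ENDPOINT = "https://graphql.datocms.com/";
const PAGE_SIZE = 100; // CDA hard limit per request

async function gql<T>(query: string): Promise<T> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DATO_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query }),
  });
  const { data } = await res.json();
  return data as T;
}

// Runs once at build time: page through every string and write them to disk,
// so the request-time code just reads the file instead of hitting the API.
async function cacheStrings() {
  const meta = await gql<{ _allApplicationStringsMeta: { count: number } }>(
    `{ _allApplicationStringsMeta { count } }`
  );
  const pages = Math.ceil(meta._allApplicationStringsMeta.count / PAGE_SIZE);

  const strings: Record<string, string> = {};
  for (let i = 0; i < pages; i++) {
    const page = await gql<{ allApplicationStrings: { key: string; value: string }[] }>(
      `{ allApplicationStrings(first: ${PAGE_SIZE}, skip: ${i * PAGE_SIZE}) { key value } }`
    );
    for (const s of page.allApplicationStrings) strings[s.key] = s.value;
  }

  await writeFile("strings.json", JSON.stringify(strings));
}

cacheStrings();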

@siggi unfortunately the 100-record limit per request can’t be raised for the CDA, sorry.

But fetching strings spread across more than 100 records for a single page seems like a potentially problematic setup. Since they are just static strings used for localisation, maybe storing them in a JSON field inside a single related record would be a better solution, so you don’t have to make several requests for the same page?
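
For example, something along these lines (a sketch only, assuming a hypothetical translationPage model with a slug field and a strings JSON field; depending on how the field is exposed, the JSON may arrive already parsed or as a string you still need to parse):

// Sketch: one record per page, with all of its strings in a single JSON field,
// so each page needs exactly one request no matter how many strings it uses.
// "translationPage", "slug" and "strings" are hypothetical names.
const query = `
  query PageStrings($slug: String) {
    translationPage(filter: { slug: { eq: $slug } }) {
      strings
    }
  }
`;

// The JSON field may come back already parsed or as a raw string:
function toRecord(raw: unknown): Record<string, string> {
  return typeof raw === "string" ? JSON.parse(raw) : (raw as Record<string, string>);
}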

Or, as you said, caching the strings at build time could also solve your problem.

@siggi just adding my 2 cents here :slight_smile:

I see that you would like to minimize the number of calls, but it’s not necessarily true that one single big call is faster than a few smaller ones. The opposite is actually often true.

If you go with GraphQL you can get the total record count straight from the first call, by adding something like this:

{
  _allArtistsMeta {
    count
  }
}

So with the first call you get the first 100 records plus the total count, which gives you the number of pages.

Then you can fetch all the remaining pages in parallel, so it should take only slightly more than two round trips to get all the records, am I wrong?
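
A rough sketch of the idea, reusing the allArtists / _allArtistsMeta names from the example above (the id and name fields are placeholders):

// Sketch: the first request grabs page 1 plus the total count, then the
// remaining pages are fetched in parallel (roughly two round trips total).
const ENDPOINT = "https://graphql.datocms.com/";
const PAGE_SIZE = 100;

async function gql(query: string): Promise<any> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.DATO_API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query }),
  });
  return (await res.json()).data;
}

async function fetchAllArtists() {
  // First call: first 100 records + total count in one query.
  const first = await gql(`{
    allArtists(first: ${PAGE_SIZE}) { id name }
    _allArtistsMeta { count }
  }`);

  const pages = Math.ceil(first._allArtistsMeta.count / PAGE_SIZE);

  // Second round: all remaining pages in parallel.
  const rest = await Promise.all(
    Array.from({ length: pages - 1 }, (_, i) =>
      gql(`{ allArtists(first: ${PAGE_SIZE}, skip: ${(i + 1) * PAGE_SIZE}) { id name } }`)
    )
  );

  return [first, ...rest].flatMap((d) => d.allArtists);
}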

Thanks for the input. I decided to go with another approach: I’m using JSON fields. It means my translation flow is a bit more complicated, but I can at least fetch everything in a single round trip.

I think you shouldn’t model a product description as an i18n string. Instead, you could use DatoCMS’s built-in localization support: the CDA can give you the localized data directly, see Content Delivery API - Localization - DatoCMS Docs.
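
For instance, a sketch of what that looks like on the CDA GraphQL endpoint (the allProducts model and its fields are placeholders; the locale and fallbackLocales arguments are the ones described in the linked localization docs):

// Sketch: ask the CDA for a specific locale instead of modelling translated
// product copy as separate i18n string records. "allProducts", "name" and
// "description" are placeholder names; "it" is the requested locale,
// falling back to English when a translation is missing.
const localizedQuery = `
  {
    allProducts(locale: it, fallbackLocales: [en]) {
      name
      description
    }
  }
`;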

Hi, just checking: is there an option to increase the CDA fetch limit to 500?

Hello @itsahmedkamal, no, sorry: at the moment the limit on the CDA is still 100 records per request.

ok - thanks for confirming.