Hi, with the following URL: https://api.themoviedb.org/3/discover/movie?api_key=######&language=en-US&year=2016&page=1&sort_by=popularity.desc
is it possible to get more than 20 results per page, or in some cases all results, to reduce the number of requests? At the moment I have to issue one request just to work out how many pages a given year has, and then re-issue it to iterate through each page and collect all the movies I need.
I don't want to exceed the rate limits, so I wondered which approach would be most efficient.
Answered by Travis Bell
on December 1, 2016 at 12:17
Hi @nwalker78
No, it is not possible to adjust the size of a page. We limit them to 20. The number of pages is always returned in the total_pages field, so you can iterate from 1 to the last page.
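A minimal PHP sketch of that pagination loop, assuming a placeholder API key, a conservative one-second pause between pages, and that allow_url_fopen is enabled (none of which is specified above):

```php
<?php
// Sketch: fetch every page of a year's discover results using total_pages.
// 'YOUR_API_KEY' is a placeholder; the 1-second sleep is an assumed, conservative pace.
$apiKey = 'YOUR_API_KEY';
$base   = 'https://api.themoviedb.org/3/discover/movie'
        . '?api_key=' . $apiKey
        . '&language=en-US&year=2016&sort_by=popularity.desc';

$allMovies  = [];
$page       = 1;
$totalPages = 1;

do {
    // Requires allow_url_fopen; swap in cURL if that is disabled.
    $json = file_get_contents($base . '&page=' . $page);
    $data = json_decode($json, true);

    $totalPages = $data['total_pages'];                 // reported on every page
    $allMovies  = array_merge($allMovies, $data['results']);

    $page++;
    sleep(1);                                           // stay well under the rate limit
} while ($page <= $totalPages);

echo count($allMovies) . " movies fetched across $totalPages pages\n";
```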
Answered by nwalker78
on December 1, 2016 at 14:11
Thank you for your prompt response. I don't have any problem doing it this way; I just wanted to make sure there wasn't a better approach before looping through all 500 pages for a given year. Is adding a sleep(rand(2,5)); delay in my iteration loop sufficient? On the upside, the data will only need fetching once and then updating periodically.
Answered by Travis Bell
on December 1, 2016 at 14:18
Our rate limits average out to 4 requests per second (burstable to a total of 40 every 10 seconds), so as long as you stay under that you won't have any problems. Even if you trip the rate limit, you can simply retry the request once your timer has been reset. This is explained here if you didn't see it.
Cheers.
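Putting the two replies together, here is a sketch of a request helper that paces calls at roughly 4 per second and backs off only when the limit is actually tripped. It assumes the API signals a tripped limit with HTTP 429 plus a Retry-After header and that ext-curl is available; treat both as assumptions rather than confirmed behaviour of the endpoint.

```php
<?php
// Sketch: GET a TMDB URL, pacing to ~4 requests/second and retrying when
// the rate limit is tripped. Assumes a 429 status with a Retry-After header.
function tmdbGet(string $url)
{
    while (true) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true);
        $response   = curl_exec($ch);
        $status     = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        curl_close($ch);

        $headers = substr($response, 0, $headerSize);
        $body    = substr($response, $headerSize);

        if ($status !== 429) {
            usleep(250000);                 // ~4 requests per second pacing
            return json_decode($body, true);
        }

        // Rate limit tripped: wait out the window the server asked for, then retry.
        $retryAfter = 1;
        if (preg_match('/Retry-After:\s*(\d+)/i', $headers, $m)) {
            $retryAfter = (int) $m[1];
        }
        sleep($retryAfter + 1);
    }
}
```

Inside the pagination loop from the earlier sketch, tmdbGet($base . '&page=' . $page) would replace the file_get_contents() call, making the random 2-5 second sleep unnecessary.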