Hi, with the following URL: https://api.themoviedb.org/3/discover/movie?api_key=######&language=en-US&year=2016&page=1&sort_by=popularity.desc
is it possible to get more than 20 results per page, or in some cases all results, to reduce the number of requests? At the moment I have to issue a request to work out how many pages a given year has, and then reissue the request for each page to get all the movies required.
I don't want to exceed the rate limits, so I wondered which approach would be most efficient.
Reply by Travis Bell
on December 1, 2016 at 12:17 PM
Hi @nwalker78
No, it is not possible to adjust the size of a page. We limit them to 20. The number of pages is always returned in the total_pages field, so you can iterate from 1 to the last page.
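A minimal PHP sketch of that loop, for illustration (file_get_contents, the placeholder API key, and the helper name fetchDiscoverPage are assumptions for the example; only the endpoint, the parameters, and the results/total_pages fields come from the discover response itself):

<?php
// Fetch one page of discover results for a year and decode the JSON body.
function fetchDiscoverPage(string $apiKey, int $year, int $page): array {
    $url = 'https://api.themoviedb.org/3/discover/movie'
         . '?api_key=' . urlencode($apiKey)
         . '&language=en-US&year=' . $year
         . '&sort_by=popularity.desc&page=' . $page;
    return json_decode(file_get_contents($url), true);
}

$apiKey = 'YOUR_API_KEY'; // placeholder, not a real key
$year   = 2016;

// The first request reports the total page count in total_pages.
$first  = fetchDiscoverPage($apiKey, $year, 1);
$movies = $first['results'];

// Iterate pages 2..total_pages to collect the remaining results (20 per page).
for ($page = 2; $page <= $first['total_pages']; $page++) {
    $data   = fetchDiscoverPage($apiKey, $year, $page);
    $movies = array_merge($movies, $data['results']);
}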
Reply by nwalker78
on December 1, 2016 at 2:11 PM
Thank you for your prompt response. I don't have any problem with doing it this way; I just wanted to make sure there wasn't a better way before looping through all 500 pages for a given year. So is adding a sleep(rand(2,5)); delay in my iteration loop sufficient? On the upside, the data will only need fetching once and then periodic updating.
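For concreteness, relative to the page loop sketched above, that delay would sit at the end of each iteration (the 2-5 second range is just the value from the question, not anything the API requires):

for ($page = 2; $page <= $first['total_pages']; $page++) {
    $data   = fetchDiscoverPage($apiKey, $year, $page);
    $movies = array_merge($movies, $data['results']);
    sleep(rand(2, 5)); // pause 2-5 seconds before the next request
}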
Reply by Travis Bell
on December 1, 2016 at 2:18 PM
Our rate limits average out to 4 requests per second (burstable to a total of 40 every 10 seconds), so as long as you stay under that you won't have any problems. Even if you trip the rate limit, you can try the request again once your timer has been reset. This is explained here if you didn't see it.
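Staying under 4 requests per second works out to roughly one request every 250 ms, so even a usleep(250000) between calls keeps a sequential loop inside the limit. Below is a minimal sketch of retrying after a tripped limit; the HTTP 429 status, the Retry-After header, the cURL-based helper, and the fetchWithRetry name are assumptions for the example, so check the rate-limiting page linked above for the exact behaviour:

<?php
// Fetch a URL with cURL, retrying if the rate limit is tripped.
// The 429 status and Retry-After header are assumptions for this sketch.
function fetchWithRetry(string $url, int $maxAttempts = 3): array {
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HEADER         => true,
        ]);
        $response   = curl_exec($ch);
        $status     = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        curl_close($ch);

        if ($status !== 429) {
            // Not rate limited: decode and return the JSON body.
            return json_decode(substr($response, $headerSize), true);
        }

        // Rate limited: wait until the window resets, then try again.
        $headers = substr($response, 0, $headerSize);
        $wait    = preg_match('/Retry-After:\s*(\d+)/i', $headers, $m) ? (int)$m[1] : 10;
        sleep($wait);
    }
    throw new RuntimeException('Rate limit retries exhausted');
}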
Cheers.