Hi,
using /search/multi I sometimes get duplicate results on consecutive pages. For example: Jon Baker, id: 1546757 is the last item in /search/multi?query=jon&page=7 and the first item /search/multi?query=jon&page=8. Is this a bug or should I always take this into account and filter out duplicates when using pagination?
Reply from ticao2 🇧🇷 pt-BR
on June 12, 2020 at 1:06PM
In order for someone to help you with API Request questions, it is critical that you post here the API Request you are using.
Remember to replace your key with THE_KEY, or something like that.
I made these two API requests and this ID 1546757 was not on pages 7 or 8.
https://api.themoviedb.org/3/search/multi?api_key=THE_KEY&query=jon&page=7
https://api.themoviedb.org/3/search/multi?api_key=THE_KEY&query=jon&page=8
You'd better post the API Requests you made.
Reply from Bene8493
on June 12, 2020 at 2:11PM
I'm sorry. These were the requests:
https://api.themoviedb.org/3/search/multi?query=Jon&page=7
https://api.themoviedb.org/3/search/multi?query=Jon&page=8
I use the Authorization header instead of the api_key query parameter. The problem is not specific to ID 1546757; it happens with many other people too. I just tested it again and there were no duplicates until page 11. Then, again, the last item from page 10 was the first item on page 11: "Jon Ekstrand", ID: 1077404.
Tested again with query: "The". This time there was a duplicate Movie on page 3:
https://api.themoviedb.org/3/search/multi?query=The&page=2
https://api.themoviedb.org/3/search/multi?query=The&page=3
Again, the last item from page 2 is the first item on page 3. There seem to be different duplicates each time I try, so it might not be easy to reproduce. Most of the time the first duplicate appears after page 5.
Reply from ticao2 🇧🇷 pt-BR
on June 12, 2020 at 2:43PM
I tried to reproduce the error and failed.
I checked 10 pages of a request.
Perhaps if I checked in 20 pages the error would arise.
I believe that only Travis Bell can have an answer.
So let's leave your question open and wait for him to see it.
Reply from Bene8493
on June 15, 2020 at 6:42AM
It would be nice if Travis could take a look at it. I wrote a Python script to reproduce it. So far it has found at least one duplicate every time I run it. The results seem to change quite often, but there always seem to be duplicates within the first 10 pages. Let me know if you need anything else.
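The script itself isn't included in the thread; a minimal sketch of such a reproduction script, assuming the v4 Bearer-token auth mentioned above (THE_KEY stays a placeholder, and `fetch_page`/`find_duplicates` are illustrative names, not part of the API):

```python
# Hypothetical reproduction script (not the poster's original). It gathers the
# result IDs of consecutive /search/multi pages and reports every ID that
# shows up on more than one page. THE_KEY is a placeholder for a real token.
import json
import urllib.parse
import urllib.request

BASE = "https://api.themoviedb.org/3/search/multi"
HEADERS = {"Authorization": "Bearer THE_KEY"}  # v4 auth header, as in the thread

def fetch_page(query, page):
    url = BASE + "?" + urllib.parse.urlencode({"query": query, "page": page})
    req = urllib.request.Request(url, headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        return [item["id"] for item in json.load(resp)["results"]]

def find_duplicates(pages):
    """pages: one list of IDs per page. Returns (id, first_page, later_page)."""
    seen, dups = {}, []
    for page_no, ids in enumerate(pages, start=1):
        for item_id in ids:
            if item_id in seen:
                dups.append((item_id, seen[item_id], page_no))
            else:
                seen[item_id] = page_no
    return dups

# Against the live API (needs a valid token):
#   pages = [fetch_page("jon", p) for p in range(1, 11)]
#   print(find_duplicates(pages))
```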
Reply from ticao2 🇧🇷 pt-BR
on June 15, 2020 at 10:34AM
@Bene8493
I sent a warning to Travis Bell.
This problem is far beyond my ability.
Thank you.
Reply from Travis Bell
on June 17, 2020 at 11:27AM
Hi @Bene8493, I've created a ticket to track this here. Unfortunately I don't have any time to look at this in the near future, but at least it's being tracked now.
Reply from Bene8493
on June 22, 2020 at 2:41PM
Thanks. FYI: this also happens in the v4 recommendations (/account/{account_id}/movie/recommendations). I got movie ID 393 on pages 2 and 4.
Reply from bicelis
on June 23, 2024 at 8:32AM
Hello :) I'm still experiencing this issue in 2024. @travisbell, is there a fix perhaps in progress? :)
Reply from Travis Bell
on June 25, 2024 at 2:05PM
As items shift around due to either changing data, or things like updated popularity scores during the day, we make no guarantee that there won't be duplicate items across multiple pages. This is because a lot of data is cached.
You can track which IDs have been "seen" by your app and skip ones that have already been returned.
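That workaround can be sketched as a small generator that wraps pagination and skips IDs it has already yielded (names like `paginate_unique` and `fetch_page` are illustrative, not part of the API):

```python
# Client-side dedup sketch: remember the IDs already returned, skip repeats.
# fetch_page is any callable returning the raw result dicts for a given page.
def paginate_unique(fetch_page, max_pages):
    seen = set()
    for page in range(1, max_pages + 1):
        for item in fetch_page(page):
            if item["id"] not in seen:
                seen.add(item["id"])
                yield item

# Canned pages that overlap at the boundary, as described in the thread:
fake_pages = {
    1: [{"id": 100}, {"id": 101}],
    2: [{"id": 101}, {"id": 102}],  # id 101 repeats on the next page
}
unique_ids = [item["id"] for item in paginate_unique(lambda p: fake_pages[p], 2)]
```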
Reply from Zsolt Bertalan
on October 25, 2024 at 5:53PM
That's a bigger problem than you believe. It's not the client's job to fix a bug on the server. I understand it's due to caching, but the caching needs to be more coordinated. The problem is more reproducible on frequently changing pages like popular or now-playing. It's also not only duplication: the problem is caused by swapping movies between pages, so for every duplicate there is a missing movie. And because the page caches are generated at different times, the frequently changing lists have duplicates and missing items when you hit the cloud cache for the first time, for example when starting an app for the first time.
Reply from ticao2 🇧🇷 pt-BR
on October 25, 2024 at 7:59PM
It may be better to clear the entire cache.
Generate a new list at a specific time of day.
Keep this new list in the cache for 24 hours.
And repeat this operation the next day.
Reply from Zsolt Bertalan
on October 26, 2024 at 11:09AM
And how would this fix the 'bug' on the server side? The problem is that the various pages represent different resources, and they are cached in the server cache at different times, even though they are related. Maybe a simplified example will help to understand it.
Imagine two pages with two items each. At midnight, Page 1 holds A and B initially, while Page 2 holds C and D. Page 1 is cached at uneven hours, Page 2 is cached at even hours. At 1 o'clock Page 1 is updated, and at this point B and C are swapped.
If I start an app at 1:30, Page 1 returns A and C, while Page 2, which has an old cache and will only update at 2 o'clock, will return C and D. So B is missing altogether, while C is duplicated. After 2 o'clock Page 2 will correctly return B, but by that time something else is swapped, like D and E between Page 2 and Page 3.
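The two-page example above can be written as a toy model (all names illustrative) showing how staggered cache refreshes produce exactly one duplicate and one missing item:

```python
# Toy model of the staggered-cache example: Page 1's cache refreshes at
# 1 o'clock, Page 2's at 2 o'clock, and B and C swap pages at 1 o'clock.
def cached_pages(hour):
    fresh = {1: ["A", "C"], 2: ["B", "D"]}  # truth after the 1 o'clock swap
    stale = {1: ["A", "B"], 2: ["C", "D"]}  # contents at midnight
    page1 = fresh[1] if hour >= 1 else stale[1]  # refreshed at 1 o'clock
    page2 = fresh[2] if hour >= 2 else stale[2]  # refreshed at 2 o'clock
    return page1 + page2

at_130 = cached_pages(1.5)  # C is duplicated, B is missing altogether
at_230 = cached_pages(2.5)  # both caches fresh: consistent again
```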
Currently I have popular, upcoming and now-playing movies in my app, which return 20 movies per page. When I open my app for the first time and scroll 10 pages of each, instead of 200 each I see 199, 190 and 193 movies, because there are 1, 10 and 7 swaps respectively that are not fully cached yet at this point. AFAIK there is no way to get the missing movies, because the server doesn't respect cache-control directives, which is understandable. I'm not sure what the exact logic of the CloudFront cache is; it would be interesting to know.
Reply from ticao2 🇧🇷 pt-BR
on October 27, 2024 at 11:20AM
Delete the entire cache.
The system generates a new list: 500 pages.
All 500 pages of this new list will be the pages sent for any search performed throughout the day.
The 500 pages will not be updated throughout the day.
The next day, a new list is generated with the score changes that occurred.
It would be something like "The most popular yesterday" or "Yesterday's Trending"
Reply from Zsolt Bertalan
on October 27, 2024 at 12:14PM
Yes, this is how it should work, but it's not. What is your point?
Reply from ticao2 🇧🇷 pt-BR
on October 27, 2024 at 12:19PM
There are several changes in votes or trending throughout the day.
Currently, these changes are reflected in each new API request.
My point is: do not apply the changes that occur throughout the day.