Talk:List of films with a 100% rating on Rotten Tomatoes


Television questions


I have just discovered that The New Edition Story has a 100% rating on Rotten Tomatoes. Since it is a miniseries and not a television series, should it be added to the list, or does it not count? Hitcher vs. Candyman (talk) 04:52, 2 June 2017 (UTC)Reply

Also, should Homicide: The Movie be removed from the list since it is considered an episode of the show Homicide: Life on the Street? Hitcher vs. Candyman (talk) 04:57, 2 June 2017 (UTC)Reply

Question


Are the entries on this list arranged in no particular order? Slightlymad 10:41, 15 October 2017 (UTC)Reply

Edit request


The Farthest has a 100% rating and 21 reviews from critics. Culloty82 (talk) 20:19, 15 November 2017 (UTC)Reply

Up for Deletion...again


This is the third time this article has been up for deletion. I think it should stay, as I find it useful. I feel like there should be a rule (maybe there is, I don't know) on repeatedly nominating something for deletion. How many times does it need to be ruled a 'keep' before people stop nominating it? Anyway, I think it should stay. Thoughts? — Preceding unsigned comment added by Fjf1085 (talkcontribs) 15:21, 22 January 2018 (UTC)Reply

After third AfD


The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


Consensus was reached for the following inclusion criteria: "only include films that has a Critics Consensus at Rotten Tomatoes". Thank you. Gaioa (t,c,l) 23:32, 31 January 2018 (UTC)Reply

Inclusion criteria


With Wikipedia:Articles for deletion/List of films with a 100% rating on Rotten Tomatoes (3rd nomination) resulting in no consensus and stating to continue discussion regarding the list's scope, I would like to do that here. I think there is enough consensus to implement inclusion criteria to list a film here because many editors, including myself, find the list rather indiscriminate. First, some numbers. There are 1,027 films on this list, ranging from 5 reviews to 167. The average number of reviews is 16.6, and films with this number of reviews or fewer do not have any kind of "critics' consensus" as far as I can tell. I'm not sure when exactly a film receives a "critics' consensus" summary, but it seems to be after a sufficient sample of reviews. Maybe 40? Can someone confirm? In any case, here are a few options to limit the list to:

  • A specific number of films
  • Films that have a specific number of reviews
  • Films that have a "critics' consensus" summary (which seems to exist after a reasonable sample)

We can also move the article to "List of top films with a 100% rating on Rotten Tomatoes" so a limited scope is indicated in the article title. Thoughts on the cutoff and doing the move? If we have too many disparate opinions about the cutoff, we can explore the possibility of a straw poll using the guideline at WP:POLL. Thanks, Erik (talk | contrib) (ping me) 21:48, 24 January 2018 (UTC)Reply

Please add new comments to the "Suggestion for inclusion guidelines" thread below as we both posted our thoughts at the same time. Erik (talk | contrib) (ping me) 21:51, 24 January 2018 (UTC)Reply

Suggestion for inclusion guidelines


I saw that this just closed its third AfD exercise with no consensus. I voted keep, and think this is a useful list, but the discussion raised good issues about how to keep it manageable. I'm assuming the people who had strong opinions are watching this page, so I won't bother posting a request for comment. Let's come up with a workable solution.

The main issue I see is that it's hard to maintain the vote count accurately. It's mostly an issue for newer movies less than a year old, because critics don't tend to retroactively review old movies. Per Rotten Tomatoes, they only count movies on their top ranked list [[1]] with at least 40 reviews, but don't explain why. I think it's to stop the problem of all movies being added to the list by default, and then having to be taken off as soon as the first negative review comes in. By the time you get to 40, you've got good consensus. There's an angry discussion above about making it 20, so the actual cutoff number is worth a discussion.

On a related note, we could discuss whether to grandfather in older movies that are never going to get to 40. If we go with 40, we cut off Rio Bravo (film) with 39, Stagecoach (film) with 39, and other classics. The nice thing about the older movies is that the vote count is going to be static, and so doesn't present a problem with article maintenance. The problem with this is that there are a lot of obscure movies mixed in with what we'd call known classics, and that would have to be addressed.

As for maintaining the number of reviews, this link will quickly show us whether a recent movie is still at 100, and we can eyeball the 100%s to see if the count is right. Maybe checking every week is enough. [[2]].

Thoughts? TimTempleton (talk) (cont) 21:48, 24 January 2018 (UTC)Reply

Hah, we did this at the same time! :) Let's merge our comments. Erik (talk | contrib) (ping me) 21:50, 24 January 2018 (UTC)Reply
  • I think if we are going to keep the list we should defer to Rotten Tomatoes. If they believe the film has enough reviews to formulate a critics consensus summary then it should probably be on the list. From the first ten films this would essentially result in retaining The Birth of a Nation and The Cabinet of Dr. Caligari and dropping the rest. Any other type of cut-off would be arbitrary IMO. Betty Logan (talk) 22:27, 24 January 2018 (UTC)Reply
I'll expand on your suggestion for anyone unfamiliar with the RT site. Each movie listing has a critics' consensus section. The section either summarizes the consensus, or simply says there is no consensus. I don't know what triggers them to add a consensus, but it's not the 40-review threshold they list on their overall rankings pages, because as you pointed out, The Birth of a Nation has a consensus with 38 reviews, but I noticed 2012's More than Honey, also with 38, is not showing a consensus.[[3]] They may weight the critics differently so that one film's 38 is considered more credible than another's. So the decision tree seems to be 1) cull or don't. My vote is to cull, since the current list is unmanageable, and it will only continue to get nominated for deletion. Then, if we vote to cull, the decision tree is which method we pick - review count or existence of critical consensus. (I can't think of a third option, but correct me if I'm wrong.) If that sums it up, then we'd have two voting sections: a section to vote cull or don't cull; and a section voting for the method of culling for those voting cull. I'll take a shot at it below. TimTempleton (talk) (cont) 00:44, 25 January 2018 (UTC)Reply

Voting


See above for discussion of voting. TimTempleton (talk) (cont) 00:55, 25 January 2018 (UTC)Reply

Decision 1: cull list or leave as is

  • Cull - unsupportable since the list is too long, and most new movies will start with positive reviews due to friendly critics getting advance screenings to create buzz. Too hard to maintain once they drop off. TimTempleton (talk) (cont) 00:53, 25 January 2018 (UTC)Reply
  • Cull for sure because the list is indiscriminate. There is very little meaning in listing a film with five reviews at 100%. Not sure about method yet. Erik (talk | contrib) (ping me) 00:57, 25 January 2018 (UTC)Reply
  • Cull As the nominator of the 3rd AfD, I cannot stop pointing at this article's grotesqueness. My honest opinion is still deletion, but since the community seems to disagree with me, I guess we'll have to work with what we have instead. One thing we agree on for certain: the list is too darn long. I don't know very much about RT to be honest, but my issue was with the article itself and its inclusion. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 08:55, 25 January 2018 (UTC)Reply


Decision 2: if voting cull, vote for method (review count, existence of critical consensus, other)


  • Review count of 40 (vote later struck; see below) - this is the number Rotten Tomatoes uses. Basing inclusion on whether consensus has been reached would require going to each new entry that was at 100% and constantly checking it. If we use the 40 figure, we can simply glance at the master list to see the review count and the current percentage status. [[4]] TimTempleton (talk) (cont) 00:53, 25 January 2018 (UTC)Reply

  • Review count of 20 - I struck my vote above and changed it. (The comments below as of the time of this comment, except for my concurrent one, are discussing the 40-review number.) If we use 20, that would make the 100% and 0% lists consistent yet still useful. 20 reviews turns out to be when a film becomes eligible for a Rotten Tomatoes Critics Consensus. TimTempleton (talk) (cont) 21:43, 26 January 2018 (UTC)Reply
  • If that is what RT themselves use, I absolutely support it. We talked on the AfD that any number of reviews we decide on would be arbitrarily invented by Wikipedians, but if we can find a number that is not our own, that's absolutely dandy. I can't confirm that they use the number 40, since I don't frequent RT, but I'll take your word for it. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 08:55, 25 January 2018 (UTC)Reply
  • Could we just go with whether a page has a "critics' consensus" summary? That should be roughly in the 40-review territory, right? Can't imagine it being available for fewer than 30 or 35 reviews. Erik (talk | contrib) (ping me) 12:52, 25 January 2018 (UTC) I take that back. I reviewed the list and found that Vampyr with 29 reviews had a critics' consensus. I didn't look any further. So I'm not sure now at which number of reviews every film will have such a summary. At the moment, I prefer requiring a specific number of reviews. 40 sounds good, and I don't mind 50. 40 would mean 77 films listed. 50 would mean 39 films listed. We could go with 40 for starters, as that would be a 92.5% reduction to the list. Erik (talk | contrib) (ping me) 19:08, 25 January 2018 (UTC)Reply
    Boogie has a critics summary and only 10 reviews so there seems to be no consistent threshold for when a film gets a summary. We should take a look at some of RT's "best film" genre lists and see if there is a threshold there. I am reluctant for us to impose a completely arbitrary limit because it will be the older classics and foreign films that are up for the chop. Betty Logan (talk) 21:49, 25 January 2018 (UTC)Reply
Here's the link to where Rotten Tomatoes says the movies are listed on their best-of lists if they have 40 reviews,[[5]] but they also include a note saying they use a weighted Bayesian formula that accounts for the variation in the number of reviews. (To see it, click the question mark at top next to where it says Sorted By Adjusted Score.) I agree that we'd miss out on classics if we use 40 reviews (ex: Stagecoach and Rio Bravo only have 39 reviews), which is why I wanted to start this discussion. There's also something else going on behind the scenes with their rankings. Note that Paddington 2 has 173 reviews, the most of any 100% movie ever, yet is not the top-ranked movie. I'm reaching out to Rotten Tomatoes support for their thoughts, but I doubt they'll give me any proprietary info. Hopefully they can at least explain why they chose 40 as the cutoff, and give me a general idea why a 100% movie with fewer reviews might be ranked higher than another movie with more reviews. TimTempleton (talk) (cont) 00:17, 26 January 2018 (UTC)Reply
It's likely that they use 40 as the number of reviews because it is an appropriate sample. In statistics, around 32 is considered the default sample. (Don't ask me why.) If we do use 40 as the cutoff, that would allow us to list 77 films. I don't think we should worry about including classics. There are even more classics further down the list with even fewer reviews. We need to keep the criteria simple. Any reason not to go ahead with 40 reviews? We could also include Rotten Tomatoes's Top 100 Movies of All Time as an external link. Erik (talk | contrib) (ping me) 03:35, 26 January 2018 (UTC)Reply
I would agree that 40 reviews is the only non-arbitrary review-count based threshold available. Having a statistical background I can answer at least one of your other questions too. Anything below 30 independent samples is not statistically meaningful—to be fair samples over 100 are preferred but statistical theory starts to fall apart once you drop below 30. Rotten Tomatoes has probably opted for 40 to be on the safe side. As for the Bayesian formula, the theory is that smaller samples have a larger error variance around the true value (which is defined as the average if you were to ask every eligible critic). The principle works like this: let's say the overall mean for all films is 60%, and a single critic comes along and gives Paddington a good review. Bayes theory says that the true value for Paddington is closer to 60% (the mean) than 100% (your sample figure based on one review). Say you have 100 critics all giving Paddington a good review; Bayes theory says that Paddington's true score is closer to 100% than 60%. Bayes theory can quantify the scale and predict a "true" value between the mean and your sample figure. However, that doesn't explain how The Wizard of Oz on 99% with 110 reviews beats Paddington on 100% from 174 reviews, but if I had designed this system I would have also normalised the means for each critic too: for example a critic that gives positive reviews to 40% of the films he reviews is a tougher marker than a critic who reviews 60% of films positively. It is very likely RT have done something similar. I don't think this is something we need to worry about. Betty Logan (talk) 04:20, 26 January 2018 (UTC)Reply
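For readers who want to see how such a weighted adjustment behaves, here is a minimal sketch of a generic Bayesian average ("shrinkage" toward the mean). It is purely illustrative: Rotten Tomatoes' actual adjusted-score formula is proprietary, and the prior weight m and the 60% global mean below are assumptions, the latter borrowed from the example in the comment above.

```python
# Illustrative only: a generic Bayesian-average ("shrinkage") score.
# RT's real adjustment is proprietary; the prior weight m is a made-up assumption,
# and the 60% global mean is taken from the example above.

def adjusted_score(positive: int, total: int, global_mean: float = 0.60, m: int = 25) -> float:
    """Pull the raw fresh ratio toward the global mean; the pull weakens as the sample grows."""
    raw = positive / total
    return (m * global_mean + total * raw) / (m + total)

print(round(adjusted_score(1, 1), 3))      # ~0.615 (one positive review stays near the 60% prior)
print(round(adjusted_score(100, 100), 3))  # 0.92 (100 positive reviews push the estimate toward 100%)
```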
It appears to me that we have established the consensus requested by the AfD. I am eager to start culling the list at once, but will keep my cool for a while more.
Also, I assume that whatever consensus is established here will apply to List of films with a 0% rating on Rotten Tomatoes and that it will likely be culled too. If anyone thinks that that article requires a separate consensus, I will start gathering such instantly. TBH, if that article is truly independent, I will nominate it for deletion and see where we end up. Thankful for cooperation, thankful for Wikipedia, "Gaioa" (t,c,l) 09:46, 26 January 2018 (UTC)Reply
timtempleton, shall we go ahead with 40 reviews as the cutoff criterion? All, do we want to solicit additional input? Gaioa, there is much less coverage about films with 0% ratings. If we did the 40-reviews cutoff, that would make for only six films from that list. I'm thinking perhaps we can have a section at Rotten Tomatoes that discusses this list (and links to it) and mentions the top five. Then that section could also mention the 0% rating films and name all six films in a sentence. Erik (talk | contrib) (ping me) 20:59, 26 January 2018 (UTC)Reply
I didn't realize there was an article listing the 0% positive movies. It would be nice to use the same cull criteria, but that would pretty much be the end of that list. I think both the 100% and 0% lists have equal value, but I'm speaking as a movie trivia junkie. I got in touch with the editor of Rotten Tomatoes, and he wrote "The 40-review minimum was chosen decades ago based on the amount of reviews there were to collect for new releases back then, and has stuck since. As there are more reviews now, we tend to wait until a wide-release's Tomatometer has stabilized well beyond the 40-review count before marking it Certified Fresh if it qualifies." On the other hand, I found out that 20 is when a movie becomes eligible for Rotten Tomatoes's Critics Consensus. I'm now thinking we could make the cutoff for the 100% and 0% pages at 20 reviews; they would be consistent, and we'd preserve more of the films on the 0% list also. I'm changing my vote above to 20 reviews. It will make the list harder to maintain, but will preserve more films on the list for interested readers, its true audience. As far as maintenance goes, these days the newer 100% and 0% Rotten Tomatoes movies get so much media attention, they will be easy to add and maintain going forward. Older movies (pre-2016 or so) will rarely have more reviews added, also simplifying list maintenance. I thought we'd have more participants so I might wind up pinging some of the voters from the last AfD for their thoughts. TimTempleton (talk) (cont) 21:43, 26 January 2018 (UTC)Reply
I would ping all of them to be fair and avoid the appearance of canvassing. Erik (talk | contrib) (ping me) 21:52, 26 January 2018 (UTC)Reply
I disagree about the necessity of including those who voted delete, since this isn't a delete vote, but will ping everyone anyway. Please disregard if uninterested. Johnpacklambert, Beemer69, Pinguinn, Rhododendrites, Ribbet32, Mangoe, FloridaArmy, The Rim of the Sky, Loriendrew, Roman Spinner, K.e.coffman, Dr. Blofeld, Jaguar, Smurrayinchester, Spanneraol, Dpm12 TimTempleton (talk) (cont) 23:54, 26 January 2018 (UTC)Reply
The rationale for setting the cull threshold at 20 reviews would be to ensure that all films with a critics summary are included, on the basis of what Rotten Tomatoes says. However, they have clearly provided incorrect information or changed their policy at some point, because as I pointed out above Boogie has a critics summary on just 10 reviews. Flicking through some of the others below 20 reviews I also found Rambling Rose with a summary on just 18 reviews. If we set the threshold at 20 reviews we would still be dropping films with critics summaries, so it would just be an arbitrary threshold. If the motivation here is to ensure we include films with critic summaries then that should be the criteria. Perhaps another option is to automatically include all films with 40 reviews or more and also include any films below that threshold with a critics summary. As for the list about the 0% films, I don't think we should concern ourselves with this just yet; this list should be sorted out first and then we can consider the options for the other list. Betty Logan (talk) 00:21, 27 January 2018 (UTC)Reply
There are some inconsistencies, as you point out, so we may never get it exactly right, but going forward it sounds like RT's critics consensus is more often than not going to be established at 20 reviews. If we choose a 20 cutoff, we do lose some older movies that were for whatever reason able to get critics consensus. Keep in mind also the manual work involved in going through the 745 movies that fell below 20 reviews but have 100% ratings, to see if they have consensus; still, it's better than going through the sub-40 movies (949 as of this moment). Let's see what others think. TimTempleton (talk) (cont) 22:08, 27 January 2018 (UTC)Reply
Alright, so the question right now is whether to put the cull at 20 reviews or 40 reviews. Both these numbers are used by RT themselves, so neither is invented by Wikipedians. Personally, I don't really have an opinion; I still want this article null and void tbh. But let's think through this and ask "which review count would be most meaningful to an encyclopedia?"
And try not to make arguments based on "my precious movie will be murdered" or similar. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 11:25, 28 January 2018 (UTC)Reply
My vote for 20 was because that's when a movie is supposed to get a critics consensus. It appears that other films below that threshold show critics consensus as well, so a hard number doesn't work as well as we'd like. Therefore, I support including all films with a critics consensus that are at 100%. And the best way to cull the list would be to start from the bottom and delete those with the fewest reviews that don't have critics consensus. I don't know what the final number is but we can always go from there. TimTempleton (talk) (cont) 20:02, 29 January 2018 (UTC)Reply
I agree completely on 20 as the cut off point.♦ Dr. Blofeld 21:12, 29 January 2018 (UTC)Reply
Other languages?
  • 20 for English-language films, 10 for other languages: There is some systemic bias on Rotten Tomatoes too; Anglophone films tend to have more reviews than films in other languages such as Russian, Hindi, Spanish, etc. If we set the bar at 20 for all languages, this list will evolve into an English-only 100% list. 86.97.130.20 (talk) 16:58, 27 January 2018 (UTC)Reply
    I'm sorry, but this is a ridiculous suggestion. If below 20 is too low for English-language films then it is too low for non-English films. You could just as well say we should have a lower threshold for classic films otherwise the list will be dominated by recent films. If you want to maximise the representation of classics and foreign language films then the answer is to simply not cull at all. Ultimately, Rotten Tomatoes is an English-language review aggregator so it will be dominated by recent English-language films, and built-in bias of the website isn't Wikipedia's problem. Betty Logan (talk) 21:55, 27 January 2018 (UTC)Reply
    Indeed. If there's systemic bias in RT, that's RT's problem. This is an objective list which is only supposed to convey raw information. We're not giving a handicap to foreign films just because "we" believe it is needed. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 11:25, 28 January 2018 (UTC)Reply
Perhaps that's an opportunity to consider adding an article entitled Foreign Films with 100% Reviews on Rotten Tomatoes. That would be a lot of manual work to check whichever of these 1,027 films you were unsure of. Also, I see lots of hatnotes flagging articles for not representing a "rest of world" perspective. In this context, "Foreign Film" really means non-US, so I'm not sure if that presents an issue. Maybe call it List of non-English language films with 100% Reviews on Rotten Tomatoes? But then, you run into the same list creep issue we're trying to address here. TimTempleton (talk) (cont) 22:08, 27 January 2018 (UTC)Reply
@Timtempleton: Please no! I nominated this for deletion questioning whether it was notable enough, and responses varied. I sincerely doubt there is any notability-related reason to split this article into two to counter language bias. Now, let's get back to the matter at hand, please. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 11:25, 28 January 2018 (UTC)Reply
PS: I'm not anti-diversity just for saying this. I wrote the article List of highest-grossing non-English films single-handedly.
It isn't necessary to split the article, we can organize it into two sections in this article titled English-language films and Other language films. Leave a note above each section notifying the editor of the consensus we gained here. Thoughts? 86.97.130.20 (talk) 13:59, 29 January 2018 (UTC)Reply
I still don't think it's a good idea. If RottenTomatoes is biased, why should it be Wikipedia's task to set it right? This is a list about RT, not a list that is "good". And we can't use arbitrary Wikipedia-invented rules to decide what should be included - that often counts as WP:OR.
Besides, who says RT is biased against foreign film? We at Wikipedia can't take your word for it, you must prove your point with sources. And regardless, this is not the place for determining and counteracting such bias - such criticism would belong to the article Rotten Tomatoes itself.
If you want to further your point, consider adding a (reliably sourced) entry about it to Rotten Tomatoes. Thankful for cooperation, thankful for Wikipedia, Gaioa (t,c,l) 16:31, 29 January 2018 (UTC)Reply
I also concur not to separate out by language. This list is not the equivalent of list of films considered the best; it is simply based on the movie-going population's fascination with having a "perfect" score. (To this end, the prose needs to be expanded to summarize the situations where movies with 100% ratings get knocked down and the resulting reactions.) Because of this narrow—but acceptable—scope and the cutoff criteria being developed, there is no need to engage here in spinning off or splitting lists. With that said, how about we go ahead with the 40-count cutoff plus movies that have a critics' consensus even with fewer than 40 reviews? If there are still too many, we can cull the ones that have fewer than 40 reviews. Erik (talk | contrib) (ping me) 17:22, 29 January 2018 (UTC)Reply

Final consensus


So consensus is to cull the list to those films with at least 20 reviews, plus those with fewer than 20 but which have achieved Rotten Tomatoes Critics Consensus? I know this will not satisfy everyone, but it seems to be the best middle ground. Another out-of-the-box thought I just had is that we can put a note on the talk page with a link to the current pre-culled list in the archives for anyone interested in seeing the original list as it stands today. Film buffs reading the talk page will find everything they want but the mainspace article won't be so long - win-win? There will likely no longer be any movie that will hit 100% positive but not get critics consensus at some point, so between the archive and the current list, every 100% film will be available for readers. TimTempleton (talk) (cont) 17:37, 31 January 2018 (UTC)Reply

@Timtempleton: Aye. I still find this article excessive by simply existing, but I'm glad we found consensus to shorten it. I'll save the list in its current state right away, and put it at Talk:List of films with a 100% rating on Rotten Tomatoes/old list. I'll also put a note in a comment above the list to make all quickies aware of this consensus.
But if we're sticking with only those with Critics Consensus, someone else will have to do the culling itself. I don't have time to look through all these films. I'm sure someone else would be glad to do so.
One more thing: let's add {{atop}} and {{abot}} around this discussion now when we're done here. Thank you. Gaioa (t,c,l) 19:03, 31 January 2018 (UTC)Reply
Honestly, there are SO MANY films right now to check for critical consensus that I don't think anyone has the energy for it. Would it be acceptable, as a one-time post-discussion action, to delete every film with fewer than 20 reviews and then gradually re-add those with critical consensus? Thank you. Gaioa (t,c,l) 19:22, 31 January 2018 (UTC)Reply
Good work on how you saved the old list - much better than my plan to link to the edit history. It seems that since the default sort order isn't by number of reviews, it will take a while to manually go through and edit out the sub-20-review entries, and then someone else would have to restore them in the right place chronologically, versus just taking a non-qualifying entry off in one step. Two steps versus one. Let me start to take a few off the top and see how long it takes me, and I'll comment back here. TimTempleton (talk) (cont) 22:43, 31 January 2018 (UTC)Reply
OK - it took me about five minutes to take ten names off, so I'll estimate about an hour to take 120 names off. Let me plug away with this unpopular task as time permits and see how far I can get. While doing this, I discovered that there are several older films with more than 20 reviews that don't have Rotten Tomatoes Critics Consensus. Also, an unfortunate side effect of this is that what's considered one of the most racist movies ever made - The Birth of a Nation - is now at the top of the list, the first thing readers see. An unintended consequence, but it is what it is. I also noticed there are many A Trip to the Moon reviews on the web that are not shown on Rotten Tomatoes. That may be the case for classics that didn't make the cut, such as 1903's The Great Train Robbery. Seems like Rotten Tomatoes has two tasks they should focus on - adding Critics Consensus for the 100% positive movies that have at least 20 reviews, and finding and adding additional reviews for the classics, so they can also meet their supposed threshold and be marked as having Critics Consensus. TimTempleton (talk) (cont) 22:59, 31 January 2018 (UTC)Reply
Nice. But do try not to fall for the WP:ILIKEIT fallacy, however tempting it may be. Maybe RT will react when they see their list is being trimmed.
Alright, I'm closing this discussion now! Thanks everyone! Thank you. Gaioa (t,c,l) 23:26, 31 January 2018 (UTC)Reply
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

Looks much better!


Looks much better now! Great job!♦ Dr. Blofeld 15:22, 1 February 2018 (UTC)Reply

Comment on cull


After taking a chunk of classic films out, I see that 2017's The New Edition Story and 2008's Boogie (2008 film) both made the cut, while All Quiet on the Western Front (1930 film), Babes in Toyland and Wuthering Heights did not, which suggests that the cull criteria we're using favor modern films, and many good classics will sadly be left out. I suppose that makes sense; as opposed to Wikipedia, Rotten Tomatoes exists to make money, and there are probably more opportunities to monetize newer movies than older ones. I do hope that their editor can review the older films, find enough reviews and/or establish consensus so we can add them back. It also suggests that we should look for other best-film lists besides List of films considered the best to add to the external links section, for balance. TimTempleton (talk) (cont) 19:11, 1 February 2018 (UTC)Reply

Classic and foreign films were always going to pay the price: Gone with the Wind couldn't even garner 100 reviews on Rotten Tomatoes while the most recent Star Wars film got over 300. Like I said above, if your aim is to maximise classic and foreign representation then don't cull the list. That said, I don't have a problem with the bias. If the bias is inherent in the subject that is not our problem, i.e. we don't expect our Hitler article to give over 50% of its coverage to telling us what a top bloke he was. The reality here is that Rotten Tomatoes is an English-language review aggregator that has only been around for 20 years, so it's not really a surprise that it is skewed towards recent English-language films. That is just the reality of the subject we are trying to cover. I recall somebody once complaining that every single film on the List of highest-grossing films was English-language; that is symptomatic of Hollywood dominance in the film marketplace, not a flaw in how Wikipedia tabulates the data. Betty Logan (talk) 20:03, 1 February 2018 (UTC)Reply
Yes, it's certainly biased towards newer films, though looking at the list there's a reasonable balance overall, so it's not much of a problem. I think the critical consensus criterion is the best way to regulate this list.♦ Dr. Blofeld 19:23, 1 February 2018 (UTC)Reply
Well, ladies and gentlemen, I'm surprised that you're surprised. This list is arbitrary. There's no rule within Rotten Tomatoes that they must include the classics or be focused on any direction of film. They appear to be biased in some aspects, yes, but let's remember that RT did not make up this list, Wikipedians did. RT just has a big bunch of films with various ratings, and it was editors like you and me that decided to make a list out of the highest ratings. And now, as some sort of secondary internal original research, we have discovered that our list has become unfavorable to some films which we like and think are more important than others, and that we must do something about it.
This is one of the reasons why I called for deletion recently - this doggone article is a directory, and no matter how we try to make it something else, that's just WP:OR. I suggest you all try to improve List of films considered the best to achieve a satisfactory film list, instead of trying to adjust the truth from RT. Unless someone is willing to relist this for deletion, in the wake of new issues? Thank you. Gaioa (t,c,l) 10:29, 2 February 2018 (UTC)Reply
It is not WP:OR. Nothing on the policy page applies in the slightest. You need to look at Wikipedia's policies and guidelines for working with lists. Editors can come to a consensus about what common selection criteria to apply. With that said, I am still in favor of a cutoff based on the number of reviews. This isn't any kind of celebratory list where we need to make sure we keep certain classics. Society at large is obsessed with this perfect score (as reflected in sources), and that's why we're listing these films. Again, there can also be prose about the arbitrary nature of it, like Get Out and Lady Bird "losing" the perfect score because of one review. Erik (talk | contrib) (ping me) 12:35, 2 February 2018 (UTC)Reply
Regardless of the cull inconsistencies, that doesn't take away from the fact that a 100% fresh rating on Rotten Tomatoes has become one of the most culturally relevant ways to demonstrate a movie's broad appeal. We don't want to throw the baby out with the bathwater. 100% Fresh is the equivalent of the Good Housekeeping seal of approval, and movie companies will use it to promote movies whenever they can. We'd be remiss not to showcase the relatively rare film that reaches this cultural milestone. Remember what Winston Churchill said about democracy: "It's the worst form of government, except for all the other ones." Besides the AFI 100 Movies list, which is just American films, I don't see a better, more comprehensive, more defensible list. TimTempleton (talk) (cont) 21:59, 2 February 2018 (UTC)Reply

Movies just missing the cut


I thought I'd post this here since there was a lot of interest updating Black Panther (film) before it got its first bad review. [[6]] TimTempleton (talk) (cont) 00:10, 10 February 2018 (UTC)Reply

Comment on page format


I removed the date from some Rotten Tomatoes citations because most do not have a date in the title.

But the actual page title has a date.

Should the citation title have the date or not? IUpdateRottenTomatoes (talk) 20:57, 15 March 2018 (UTC)Reply

Please give an example of what you refer to. If you mean accessdate then it's the date an editor verified the site and not a date shown on the site. See Template:Cite web#URL and don't remove accessdate but update it if you update the number of reviews. PrimeHunter (talk) 22:01, 15 March 2018 (UTC)Reply
PrimeHunter, I'm talking about, for example, "Paddington 2 (2017)" versus just "Paddington 2". IUpdateRottenTomatoes (talk) 22:26, 15 March 2018 (UTC)Reply
Just the year? I would omit that like we do now but either works. https://www.rottentomatoes.com/m/paddington_2/ says "PADDINGTON 2" in a large font and "2018" in a smaller font on the line below. The HTML title of the page is "Paddington 2 (2018) - Rotten Tomatoes". HTML titles are secondary if the page itself displays a title, and we also omit " - Rotten Tomatoes" from the title field. PrimeHunter (talk) 22:59, 15 March 2018 (UTC)Reply
I think in this case Paddington 2 will suffice. A date is only necessary if there is more than one Paddington 2 on Rotten Tomatoes. The citation provides sufficient information to locate the page in this case, although that may not be true in every case. There are some generic titles on Rotten Tomatoes where the date is necessary to provide a full unambiguous citation (see all these films called Revenge for example). Betty Logan (talk) 02:34, 16 March 2018 (UTC)Reply

cc or 20?


Hmm, I seem to be missing something. Above there's a great big section closed stating consensus to "only include films that has a Critics Consensus at Rotten Tomatoes". But I'm being told by two people now that the critics' consensus doesn't matter if there are 20 reviews? — Rhododendrites talk \\ 17:35, 25 March 2018 (UTC)Reply

Yes, it appears that Gaioa did not write the so-called closing outcome correctly. Let's re-verify what the criteria should be. Pinging others who were involved: Timtempleton, Betty Logan. For the record, I support the criteria of a film having at least 20 reviews or a critics' consensus. Erik (talk | contrib) (ping me) 19:06, 25 March 2018 (UTC)Reply
I think the closing consensus is premature. The discussion revolved mostly around what threshold the cut-off should be. Some of us favored 40 reviews, some 20 reviews, and some a critics summary. There was some consensus about retaining films with a critics summary (which Gaioa summed up) but we never reached a consensus either way about whether there should be a hard-count threshold as well, i.e. 20 or 40 reviews. I regarded the current situation as an intermediate holding position: nobody is arguing to retain films with under 20 reviews and without a critics summary, so it was OK to remove those films, and I worded the criteria to that effect. I honestly don't think the discussion above supports removing more than that at this stage (and I was one of the editors arguing for more stringent inclusion criteria). Betty Logan (talk) 19:27, 25 March 2018 (UTC)Reply

missing


https://www.rottentomatoes.com/m/fall_of_the_roman_empire/ Nergaal (talk) 17:57, 25 June 2018 (UTC)Reply

https://www.rottentomatoes.com/m/summer_1993 Krakatoa (talk) 06:03, 8 October 2018 (UTC)Reply

Documentaries


I noticed the list is dominated by documentaries. I don't know exactly how many, but I checked 10 films at random and 9 of them were documentaries. This might be worth discussing in the article if there are sources. Lizard (talk) 22:25, 9 October 2018 (UTC)Reply

Forgot to mention this seems to be a recent trend. The films I checked were all released within the last few years, so it wasn't exactly at random. But the point still stands. Lizard (talk) 22:31, 9 October 2018 (UTC)Reply
This states, "As most of you can probably tell, most of these entries are smaller movies that didn't necessarily fall on everyone's radar, with a good portion of them, like Work and Quest, being documentaries." We could include that. Erik (talk | contrib) (ping me) 00:12, 10 October 2018 (UTC)Reply

New columns


Here I removed the addition of "Av. Rating" and "Audience Score" and "Director" columns. First, I do not find "Av. Rating" to be necessary for the purpose of this list. For "Audience Score", per MOS:FILM#Audience response, we do not include user ratings because they are not a genuine representation of what audiences thought. Lastly, I do not find the "Director" column necessary either. I think the most important step is to exclude "Audience Score", but considering how much more would need to be maintained with "Av. Rating" and "Director" columns, we should come to a consensus about these two columns. My perspective at this time is that such a list should be fairly simple. If anything, I'd rather see a brief synopsis of each film rather than the average rating or the director. Erik (talk | contrib) (ping me) 20:13, 16 January 2019 (UTC)Reply

I mostly agree with this. The user ratings on Rotten Tomatoes can be manipulated so we shouldn't include them, and the director is very peripheral to the scope of this article. It might be interesting to note which director is the most represented on the list but we don't need a whole column for that. As for the average rating I have mixed feelings about this: on one hand it is peripheral information, but on the other I can see the argument that where you have a list of films all on 100% the average rating may be a useful discriminant if you want a qualitative ordering mechanism. Betty Logan (talk) 00:52, 17 January 2019 (UTC)Reply
Including the average rating could give more context, maybe. For example, if 20 reviewers give a film a 3/5 rating, that's kind of mediocre. However, it could very well end up as 100% positive if Rotten Tomatoes decides they're all more positive than negative. Audience score is pretty much worthless since it's easily manipulated. Director seems unnecessary. NinjaRobotPirate (talk) 05:52, 17 January 2019 (UTC)Reply

Formatting for films which also have a 100% audience review?


Could we add formatting (perhaps the film title in bold) for any entry that, as well as a 100% critic rating, also has a 100% audience rating? We could restrict it to those that meet the following criteria, to prevent newly released films from being included:

  • the release date must be at least 12 months ago
  • there must be at least 100 user ratings

At the moment, there is only one film that meets these criteria:

Name | Year | No. of reviews | Ref
O.J.: Made in America | 2016 | 53 |

What do you folks think of this?

If so, could someone amend the hatnote to show this, and amend the entry for O.J.: Made in America to show it in bold?

As this is the only one that currently meets the criteria (and I doubt that too many will meet the criteria in the future), it's quite easy to do a check on Rotten Tomatoes from time to time to make sure that it is still 100% for Audience Ratings!

Thanks 193.9.4.15 (talk) 14:18, 2 March 2020 (UTC)Reply

Hello, I understand the interest in expanding the table to indicate the film has a 100% audience score, but considering how such user scores lack good controls, they are susceptible to vote-stacking. Not to mention that any random person could easily tank a 100% audience score by submitting their own negative score. In other words, it's too readily subject to manipulation. I think it would be more worthwhile to do something like include CinemaScore or PostTrak grades, but I don't think this table needs additional data columns. Erik (talk | contrib) (ping me) 19:38, 2 March 2020 (UTC)Reply

Reduced ratings


It looks like because the Rotten Tomatoes staff is adding reviews for some older films, some of them are getting their 100% rating reduced. For example, Rear Window now has 99% with 70 reviews as seen here, whereas last December it had 100% with 66 reviews as seen here. I don't see any coverage about this, but it would be a good observation to add to this list article if a source covered this aspect. A similar observation to consider is whether 100% becomes even less likely now that many more critics qualify to be included, following the recent expansion. Erik (talk | contrib) (ping me) 22:33, 22 April 2020 (UTC)Reply

A Commons file used on this page or its Wikidata item has been nominated for deletion


The following Wikimedia Commons file used on this page or its Wikidata item has been nominated for deletion:

Participate in the deletion discussion at the nomination page. —Community Tech bot (talk) 22:05, 10 May 2020 (UTC)Reply

"Making Waves: The Art of Cinematic Sound" listed at Redirects for discussion


  A discussion is taking place to address the redirect Making Waves: The Art of Cinematic Sound. The discussion will occur at Wikipedia:Redirects for discussion/Log/2020 July 6#Making Waves: The Art of Cinematic Sound until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Nardog (talk) 00:30, 6 July 2020 (UTC)Reply

Citizen Kane


Why is Citizen Kane missing on this list? Taddah (talk) 19:54, 25 March 2021 (UTC)Reply

It has 99% now. Erik (talk | contrib) (ping me) 20:22, 25 March 2021 (UTC)Reply
Taddah, you were a month ahead of others in noticing that it was missing.   It's been covered in the past day or two, like here. Erik (talk | contrib) (ping me) 11:54, 28 April 2021 (UTC)Reply

Year statistics and recentism


I think this article could benefit from some more perspective. For instance, a visual statistic for the years up to 2005 with an unusually high number of 100% films (maybe make it a run chart or line graph for all the years, akin to what they use for the stock exchange or inflation rates?). Also, are there external sources we could use to point out the system's blatant recentism starting around 2005-2010? From that point on, the years are increasingly bloated with more and more 100% listings, and I doubt many professional critics would say that this phenomenon is somehow indicative of a trend of rising excellence in film releases overall. --2003:EF:170E:7F46:E8A1:8329:2F1F:12F7 (talk) 15:52, 5 April 2021 (UTC)Reply
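For anyone who wants to try the run chart suggested above, here is a minimal sketch, assuming the release years have already been pulled out of the article's table into a plain list (the years below are placeholder values, not real counts):

```python
# Illustrative only: count 100%-rated films per release year and draw a simple run chart.
# The `years` list is placeholder data; in practice it would be extracted from the article's table.
from collections import Counter
import matplotlib.pyplot as plt

years = [1939, 1939, 1954, 2016, 2017, 2017, 2017, 2018]  # placeholder, not real data
counts = Counter(years)
xs = sorted(counts)
ys = [counts[year] for year in xs]

plt.plot(xs, ys, marker="o")
plt.xlabel("Release year")
plt.ylabel("Films with a 100% rating")
plt.title("100% Tomatometer films per year (illustrative)")
plt.show()
```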

I haven't seen anything about recentism on Rotten Tomatoes. This list isn't really reflective of "rising excellency" anyway. It is a list of films where every single sampled review at least "liked" it. Just one dislike destroys the 100% score, no matter how many other likes there are. And if anything, that means a film is more likely to lose its 100% score with the more reviews it gets. So it's possible that films with a certain number of reviews are more likely to proliferate. Also remember that with the Internet, there are more film critics than ever before. The 100% score is just a cultural phenomenon that is of interest to readers. A list of films that have "universal acclaim" on Metacritic would be better, but there isn't coverage about that. Erik (talk | contrib) (ping me) 17:35, 5 April 2021 (UTC)Reply
No matter if RT, as a website, or the score is a "cultural phenomenon", it's still blatantly obvious recentism even just when you're looking at the bare figures. While the 100% scores are all based on *PROFESSIONAL* reviewers and not just some bloggers, audiences, or fans, a rising number of critics would actually *DECREASE* the chance for new films to reach 100%, and still we're seeing the exact opposite. Which is, in fact, another proof for recentism bias that I hadn't even thought of. --2003:EF:170E:7F90:71E5:2BD2:6C74:23A8 (talk) 18:31, 5 April 2021 (UTC)Reply
Okay, we need coverage from reliable sources about that, then. I follow a lot of movie news and haven't seen RT scrutinized in that way. Erik (talk | contrib) (ping me) 19:26, 5 April 2021 (UTC)Reply

Shoah and Pauline Kael


We know that Kael was not kind to the 9 hour epic, but RT won't open submissions until next month. Does that count? Espngeek (talk) 18:29, 24 August 2021 (UTC)Reply

Two films with 100% rating


Mary Poppins (1964) and The Railway Children (1970) also have 100% on Rotten Tomatoes. SECREngineNo592 (talk) 06:03, 7 March 2022 (UTC)Reply

Rotten Tomatoes shows Mary Poppins to have 96% here. As for The Railway Children here, while it is at 100%, it has only 13 reviews, and we need at least 20 reviews or a critics' consensus. Erik (talk | contrib) (ping me) 16:39, 7 March 2022 (UTC)Reply

Rewording


It seems like there is a misunderstanding of the criteria for this list, and it depends on whether the language is about including or about excluding.

  • One can say to include a film with a critics' consensus or to include a film with at least 20 reviews, and a film could meet one criterion or the other, or both, to be included.
  • If one says to exclude a film with no critics' consensus or to exclude a film with fewer than 20 reviews, these are also separate criteria. On one hand, it can be said to remove the film if it has no critics' consensus. On the other hand, it can be said to remove the film if it has fewer than 20 reviews. So the exclusion sentence cannot be the direct opposite of the inclusion sentence.

Better to say:

  • If a film has at least 20 reviews and has a critics' consensus, include it.
  • If a film has at least 20 reviews even though it has no critics' consensus, include it.
  • If a film has fewer than 20 reviews but has a critics' consensus, include it.
  • If a film has fewer than 20 reviews and has no critics' consensus, exclude it.

Thanks, Erik (talk | contrib) (ping me) 13:27, 9 November 2022 (UTC)Reply
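Expressed as a single test, the four cases above collapse into one OR condition. A minimal sketch, with parameter names of my own invention (nothing here is an actual Wikipedia or Rotten Tomatoes data structure):

```python
# Illustrative only: the inclusion test described above as a single predicate.
# The parameter names are invented for this sketch.

def qualifies_for_list(review_count: int, has_critics_consensus: bool) -> bool:
    """Include a 100%-rated film if it has at least 20 reviews or a critics' consensus."""
    return review_count >= 20 or has_critics_consensus

print(qualifies_for_list(25, True))    # True  (at least 20 reviews and a consensus)
print(qualifies_for_list(25, False))   # True  (at least 20 reviews, no consensus)
print(qualifies_for_list(13, True))    # True  (fewer than 20 reviews but has a consensus)
print(qualifies_for_list(13, False))   # False (fewer than 20 reviews and no consensus)
```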

Attica ratings quoted are inconsistent and irreconcilable.


The list shows the 100% rating of the 2022 documentary "Attica" as being based on 56 reviews. But the wiki article about "Attica" states that "On the review aggregator website Rotten Tomatoes, 98% of 48 critics' reviews are positive". Therefore FIRSTLY either the listing or the article sentence is wrong. But SECONDLY, it is simply impossible for 98% positivity (when 48) to become 100% positivity (when 56). This can make one start to wonder just how accurate the whole listing is, so I recommend that the "Attica" entry (and/or article) be resolved by a Rotten Tomatoes expert ASAP. Pete Hobbs (talk) 02:40, 5 August 2023 (UTC)Reply
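A quick arithmetic check of the irreconcilability claim, assuming the two quoted figures are what Rotten Tomatoes actually showed at those times:

```python
# Illustrative check: 98% of 48 reviews implies one negative review,
# and adding more reviews can never remove it, so the score cannot climb back to 100%.
positive_at_48 = round(0.98 * 48)            # 47 positive, so 1 negative review
best_possible_at_56 = (56 - 1) / 56          # that negative review still counts at 56 reviews
print(positive_at_48)                        # 47
print(round(best_possible_at_56 * 100, 1))   # 98.2, still short of 100%
```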

>2005


WAAAAY TOO MANY films from 2005 on. How so? 2A00:20:D048:1308:A120:11F:B866:38CA (talk) 00:33, 2 January 2024 (UTC)Reply

More movies, more people watching them, more people interested in talking about them? Does not seem odd to me. Beach drifter (talk) 00:35, 2 January 2024 (UTC)Reply

Variety coverage


Sharing this reliable source to further establish the notability of this list. Erik (talk | contrib) (ping me) 14:55, 20 April 2024 (UTC)Reply