Do Game Review Scores Matter?
jasoncart writes "This piece on Ferrago discusses the use of review scores and the role they play in gamers' purchasing decisions. Ultimately, according to columnist Ben Parfitt, review scores are pointless." From the article: "Few things wind me up more than when what appeared to be a well-balanced and thoughtful gaming discussion descends into a successive barrage of review scores and Gamerankings ratings."
Well, duh. (Score:1, Interesting)
For game review figures I mostly only care about very rough breakdowns... The five-star system is good that way because it gives you a rough idea of "Terrible", "Poor", "Okay", "Good", "Great" without getting all hung up on exactly where game X falls on a 100-point scale.
On the other hand, numerical scores are fun for trying to find the worst games ever.
Re:Take them with a grain of salt (Score:2, Interesting)
Answer: NO! (Score:3, Interesting)
Re:Reviewers are robots. (Score:2, Interesting)
Read the reviews of the game, but in the end ignore the scores, because the assumption seems to be that unless a game requires a computer faster than any ever built to render it, it will not display well.
Re:Reviewers are robots. (Score:5, Interesting)
You can be sure that at Gamers.com [gamers.com] there are no bought reviews. In fact, in certain situations we've even received hate email from developers and publishers for not praising their game. We report what we consider to be fair and just, despite all the buzz.
Of course, no one should trust a single reviewer's opinion on a particular title. That's why we have a special section (Newest Games [gamers.com]) in our forums to let new games fall under even more scrutiny from our members.
Between reading a review and following up on other gamers' opinions, one should have a rough idea of whether the game is worth purchasing.
Independent reviews are better (Score:4, Interesting)
Professional reviews are useful for a very general overview of a game, but usually it is easier to find important details at places like GameFAQs. People posting on their own without a profit motive are more likely to mention that a game is really short or overly linear, for example. Of course the noise-to-signal ratio is very high, but the information is there for people with a little patience.
Do film review scores matter? (Score:2, Interesting)
I personally thought Kill Bill Vol. 1 and 2 were so pretentious... oh, but they were cool movies, and how dare I speak out against them. People seem to think you are challenging them if you do not agree with their movie viewing habits.
Look, the Kill Bill films were shit IMHO. OK, deal with it, love it, move on.
Game reviews, on the other hand, or music reviews. If you look at the Lemon Jelly website, they are talking about their mixed reviews: some say it is their best, some say it is tripe.
You gotta wonder how much their subjective reviews call for actual thinking. I mean, you can say anything is shit and back it up with the argument "well, I thought it was shit."
Except for kill bill movies, they were shit.
Plus you have the game reviews in 'official' magazines... why did I put official in quotes?
I thought about a system where you decide what is important to you beforehand, and then each person gets a different game score based on their profile.
For instance, if you really hate niggles in gameplay, then GTA:VC/SA might have scored less for you in an online mag (or a digital ink mag).
Basically, you are asking does subjectiveness matter... and I can't be bothered to talk about that on slashdot.
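The per-profile scoring idea above can be sketched in a few lines of Python. Everything here is made up for illustration: the category names, the reviewer's marks, and the two reader profiles are hypothetical, assuming the reviewer scores each category out of 10 and each reader weights categories by how much they care.

```python
# Sketch of personalized review scores: one set of per-category marks
# from the reviewer, weighted differently by each reader's profile.
def personal_score(category_scores, profile):
    """Weighted average of category scores using a reader's weights."""
    total_weight = sum(profile.values())
    return sum(category_scores[c] * w for c, w in profile.items()) / total_weight

# Reviewer's per-category marks for a hypothetical game (out of 10).
scores = {"graphics": 9, "story": 8, "gameplay_niggles": 4}

# A reader who really hates gameplay niggles weights that category heavily...
niggle_hater = {"graphics": 1, "story": 1, "gameplay_niggles": 3}
# ...while an easygoing reader barely counts it.
easygoing = {"graphics": 2, "story": 2, "gameplay_niggles": 1}

print(personal_score(scores, niggle_hater))  # lower personalized score
print(personal_score(scores, easygoing))     # higher personalized score
```

Same review, two different scores: the niggle-hater sees 5.8, the easygoing reader sees 7.6.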
Meta sites and knowing the reviewers (Score:5, Interesting)
Sites like Game Rankings [gamerankings.com] give you the review numbers from a number of sites, so you get a fairly good idea of where a game sits. Movies [movies.com] does the same for films. That, plus active reviewers on the site, gives you a fair idea of what is good.
Knowing your reviewers is the other way to get good information. If you regularly read a particular reviewer you'll get a good idea about what they like and what they don't like. This is easier with films than games, but still possible.
Reviews are definitely subjective, but they are still a useful way to make your money and time go further, and with a bit of thought they are well worth looking at.
Creating a meaningful average (Score:4, Interesting)
The article points out the flawed logic inherent in the system of averaging random scores assigned on a 0-10 scale. Is it possible for a game with an 8.1 average rating to be better than one with a 9.1 average rating? Yes. Everyone is entitled to their own opinions. The author cites an example where, in his opinion, a game with an 8.1 rating is more enjoyable than a game with a 9.1. Apparently the author was chastised for expressing his opinion. This is a downside to averaging: it can lead to groupthink.
But what can we do to combat groupthink? Consider the following simple ranking systems:
- Binary: thumbs up or thumbs down
- Tristate: thumbs down, "recommended", or thumbs up
At first glance it appears that any one of these systems would work adequately if used consistently and then averaged over at least 30 reviewers. The average scores should then in theory be meaningful, right? Well, unfortunately we have to note the key words there: "used consistently". If the reviewers cannot agree on a format, then you have to reduce it to the lowest common denominator. Similarly, many reviewers would simply ignore the "recommended" option in favor of the extremes. This suggests that perhaps the best option is to average the binary review scores.
But wait! What if the system gets flooded with artificial reviewers? This happened in recent memory when Sony admitted to inventing a fictitious reviewer to gush about the movie "A Knight's Tale". What if those artificial reviewers get included in the average? That is a serious problem, but it's easily addressed with moderation: examine each reviewer's track record before adding them to the mix, and then pull any reviewer who is consistently out of touch with reality.
Recommendation: find a bunch of games you like and a bunch of games you dislike. To be thorough, you want at least 30 in each category. Seek out critics who agree with your tastes on at least 2/3 of the titles. Average the opinions of these critics when a new release comes out. If the result comes out at least 2/3 (0.67), then you'll probably like the game.
Addendum: for better results, you can assign weights to certain critics and then perform a weighted average. For example, you might observe that critic A agrees with you 90% of the time, while critic B agrees only 80% and critic C only 70%. In this example, if only C dislikes the game, then your result will be greater than 2/3 (favorable); however, if A dislikes it, then the result will be less than 2/3 (unfavorable). Keep in mind that to be statistically meaningful you need at least 30 reviewers, and remember that if you get burned by a critic, you can always mod him down. In fact, you could in theory set up a dynamic system that continuously adjusts the weights of reviewers based upon how well they match your opinions.
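A minimal Python sketch of the weighted-critic average described above, using the hypothetical agreement rates from the example (critics A, B, C at 90%, 80%, 70%):

```python
# Weighted average of binary critic verdicts: each critic's vote counts
# in proportion to how often they have agreed with your taste.
def weighted_verdict(weights, likes, threshold=2/3):
    """weights: critic -> agreement rate; likes: critic -> True/False.
    Returns (score, favorable), where favorable means score >= threshold."""
    total = sum(weights.values())
    score = sum(weights[c] for c in weights if likes[c]) / total
    return score, score >= threshold

weights = {"A": 0.9, "B": 0.8, "C": 0.7}

# Only critic C dislikes the game: 1.7 / 2.4 ~ 0.708 -> favorable.
print(weighted_verdict(weights, {"A": True, "B": True, "C": False}))

# Critic A dislikes it: 1.5 / 2.4 = 0.625 -> unfavorable.
print(weighted_verdict(weights, {"A": False, "B": True, "C": True}))
```

The same function covers the unweighted recommendation scheme as well: set every critic's weight to 1 and it reduces to a plain fraction of favorable reviews.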
A note on resolution: If you're able to get tristate or better "resolution" in your reviewers, more power to you! In fact, I encourage this. However, on a practical note I think it will be difficult to find enough reviewers with a high enough common denominator. Of course, this does not prevent you from assigning special weights to the differing rating systems used by various reviewers. Be creative! Invent your own system. :)
Pipe dream: it's my personal pipe dream to have a website where everybody can register their opinions on various topics. Each person could then seek out (or be matched to) other individuals with similar tastes. People with less time to devote to reviewing things would defer their opinions to others. Eventually this would trickle up to a small set of individuals making recommendations.
The author is guilty of what he's writing about (Score:5, Interesting)
Games should always be considered on their individual merits, on the qualities they offer and the accomplishments they boast. This can never be distilled into a percentage or a ranking out of ten. Hold games up to examination and this evidently becomes the case. For instance, when I reviewed San Andreas I gave it a 90%+ review score. I would not, on the other hand, award as high a score to something like Castle of Shikigami 2 on the GameCube, though personally I feel it is the better game. It would score lower because it is less technically accomplished, far smaller in scope, and offers far less variety. I still prefer it, however, because what it does it does extremely well, and when push comes to shove I would rather play it than San Andreas. That's not to say I think it's more accomplished - I simply prefer it.
I'd ask the writer of the article this: why the hell did you rate GTA: San Andreas better? This IS the problem with these scores. GTA gets a better score simply because the conventional wisdom says it is a more accomplished game, and NOT because the reviewer actually likes it better. He admits it in the article for all to see. Hype = high scores, and even someone who is writing an article about how the scores don't work is swayed by it.
This is how a game like Katamari Damacy gets lost in the Half-Life 2s and Halo 2s of the world. Conventional wisdom says that a strange Japanese game with no real storyline, blocky graphics, and simple gameplay is not as "accomplished" as a sci-fi FPS. The $20 price tag alone almost screams "inferior game." But a high price, polished graphics, a long development cycle, a sweeping advertising campaign, and a big booth at E3 are not what make a good game.
Get rid of score inflation (Score:5, Interesting)
Like school grades, game scores seem to suffer from inflation among fans of particular games: anything below an 8/10 is perceived as "crap."
In reality, I own games that I would rate as a 6/10 which are still enjoyable. These games may be merely average, but if certain aspects are present, they can still be enjoyable. "Buffy the Vampire Slayer: Chaos Bleeds" would fall into that category. The game received scores in the 6.5/10 range all over, and it's a score I would agree with. The camera is lousy, and the controls are inferior to the original in almost all respects. Despite this, the story is entertaining, the voice acting is pretty good (with the exception of the knock-off Willow), and the subject matter is entertaining to me. It is a 6.5 game, and I don't believe anything to the contrary, but it's still entertaining.
Dead or Alive 3 is another great example. It's probably a 7/10 game. The graphics are beautiful, yes, but the game wasn't really a substantial change from DOA2. Weakening the counters improved the battle system, but the new characters were universally dull (except for Hitomi), and the game was otherwise nothing more than the second. It doesn't really deserve anything spectacular as far as scores are concerned, but it's a favorite with my friends and me when we get together at my place.
EGM was one of the few magazines I discovered that was willing to take this stand: a 5/10 game was AVERAGE. You might enjoy it if it had a particular point that really appealed to you. If you were a huge RPG fan, a 6/10 RPG would be worth buying if you'd already finished the last three 8/10 games. A 6/10 was not crap. Games at 3/10 and below were crap. And a game had to be spectacular to get into the 9 range. Unfortunately, people don't seem willing to accept that scale; everything needs to be between a 6 and a 10. The problem is that it just dilutes the actually worthwhile games. GamePro was notorious for this. They gave straight 4.5/5 and 5/5 scores to Star Fox 64. The game was good, but it was not worthy of that level of score. When compared to something that truly was, it served to make the worthy game's scores "lesser."
Do scores matter? In EGM's case, I'd certainly agree. Back when I still kept up with that sort of thing for professional reasons (I was an assistant manager at a game store), they were generally pretty trustworthy. In a case like GamePro's, which unfortunately seems to be more the standard than the exception, the scores are completely inconsequential. At that point, I learn to just ignore the score and read between the lines of the reviewer's euphemisms.
Re:Reviewers are robots. (Score:3, Interesting)
The upside to that is that if you don't get angry when a game is poorly received, it means you weren't passionate enough about making it.
From the outside, it seems that Rockstar did a good job in this respect -- the second in the series, Grand Theft Auto II, received low marks [ign.com] in various reviews, but they seem to have taken them as feedback and produced a winner with GTA III.
Between reading a review and following up on other gamers' opinions, one should have a rough idea of whether the game is worth purchasing.
I might twist that slightly -- given the number of games available these days, I'd say that professional and player reviews most often give me an idea as to whether it's even worthwhile to try the demo.
____________________________________________
Inago Rage - A demo worth downloading(!) [dejobaan.com]
$4.99? I can top that. (Score:3, Interesting)
http://www.netjak.com/review.php/537
Re:Reviewers are robots. (Score:3, Interesting)
And you have to stop halting my experience with full-screen ads on every page just because I refuse to let you put a tracking cookie on my computer. The "Click here to skip this ad" link is also barely visible in Firefox.
I know you need to feed the bandwidth family, but please come up with a less obtrusive advertising scheme. I won't be going back to gamers.com because of this (it simply takes too long to see if it is worth my time), so you won't be getting my eyeball revenue anyway.
Just my hopefully constructive criticism.