I’m not a fan of review aggregators, and by extension I’m not a fan of BasedGamer. Don’t mistake my lack of support for open opposition; I simply wasn’t compelled to donate the minimum $10 to validate my “based” quality. Although steep, the $50,000 crowdfunding goal seemed tenable, since $2,000 had already been covered by Jennie, meaning the project needed the support of at most 4,800 people. And since the BasedGamer team had stated outright in their Reddit AMA that the money would be used to hire contractors, I didn’t indulge in financial-scam conspiracies. The light that sparkled off BasedGamer’s clichéd and nebulous infographics simply didn’t inspire me, and the optimistic underscoring beneath Jennie’s processed speech did anything but open my wallet.
But these superficial flaws hold little weight next to the red herring that Jennie & co. have presented as BasedGamer’s foundation. And now that the project has been funded thanks to some sizable last-minute donations, we should carefully consider its politics and the role that review aggregators play in videogame consumption. To do that, let’s dive in headfirst and parse BasedGamer’s raison d’être:
“In the present, the leading aggregator for video game reviews focuses more on critic review scores rather than community review scores. With no transparency into how a review score is determined, as well as focusing on only a pre-selected group of review critics to determine a score, the gamer is more likely to be exposed to aggregated reviews with hidden agendas, or publisher abuse.”
For the last decade, Metacritic has been the only review aggregator any major gaming news site talks about, the only one named in seedy contracts like those for Destiny and Fallout: New Vegas. And while developers, publishers, and PR marketing firms scour past titles on Metacritic to decipher which news sites are most influential, alternative aggregators like ReviewTrax (which also uses a four-point scale, rates sites and authors, and doesn’t convert letter grades into numeric scores) are left in the dust.
Reaching out to emerging blogs and YouTube channels brings new perspectives, but every perspective is susceptible to subconscious bias and ulterior motives. Didn’t we just learn about the YouTube review scandal surrounding Middle-earth: Shadow of Mordor? This raises the question: are current review aggregators flawed by design, hosts to hidden agendas and publisher abuse? And if they are, what makes BasedGamer, another review aggregator, any different? As far as we know, it is still a value-driven platform that measures reviews without providing a standardized review methodology. On what basis does a reviewer definitively measure “good” entertainment, “very bad” visuals, an “okay” storyline, or “excellent” gameplay? How does the site propose to make user-submitted opinions transparent, i.e. to surface the explicit and implicit factors that shape a user’s score? And how can BasedGamer decide whether a game deserves a score that diverges from Metacritic’s when it doesn’t know how Metacritic conjures its Metascore magic?
Since Metacritic’s Metascore is generated algorithmically and weighted only by critic reviews, BasedGamer goes the opposite way, relying on popular opinion and defining its scores through user-submitted reviews and ratings. But a game isn’t good simply because many people enjoy it; functional quality is inherent. Beyond that, personal preference, knowledge, and trends all color user ratings. As we mull over these considerations, we can begin to ask how BasedGamer will level the playing field between industry and community, and how it will come to terms with the fact that there is no standardized methodology, no general practice or control group, for conducting videogame reviews.
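To make the contrast concrete: neither Metacritic’s outlet weights nor BasedGamer’s tallying method are public, but the two philosophies reduce to a weighted mean versus a plain mean. The sketch below uses invented scores and invented weights purely for illustration:

```python
# Illustrative only: neither Metacritic's actual weights nor
# BasedGamer's aggregation method are public. All numbers invented.

def weighted_mean(scores, weights):
    """Metacritic-style aggregation: each outlet carries a hidden
    weight, so influential outlets move the final score more."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def simple_mean(scores):
    """User-style aggregation: every submitted rating counts equally."""
    return sum(scores) / len(scores)

critic_scores = [90, 70, 60]      # hypothetical outlet scores
critic_weights = [3.0, 1.0, 1.0]  # hypothetical hidden weights

print(weighted_mean(critic_scores, critic_weights))  # 80.0
print(simple_mean(critic_scores))                    # ~73.3
```

The same three reviews yield an 80 or a 73 depending on the aggregation philosophy, which is exactly why an opaque weighting scheme matters.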
In a 30-page study of Metacritic, Mike Zuurman of the XeNTaX Foundation hypothesizes that critic and user reviews differ significantly, given that the scoring method differs between users and “professional reviewers”, and “considering the scoring method of Metacritic is suggestive of statistical flaw.” It’s a lot of information to swallow, but one of the great points Zuurman raises in the study’s conclusion is that there is no formal control or evaluation method that critics abide by. The conventional critic or user does not have the formal grounding in ludology to make significant criticism; their acumen is guided by intuition and experience. As obvious as this is, many gamers fail to account for it when they let Metacritic or any other review aggregator influence a game purchase. Since sites haven’t standardized review methodology across the community, the best they can do is employ tastemakers to pass judgment and give the illusion of a trend, one that becomes real only if enough people are influenced.
If BasedGamer doesn’t address this concern, it will be doomed to perpetuate an illusion of its own: the site will offer a counterpoint, an alternative value judgment running parallel to the mainstream (Metacritic). As its userbase grows and emerging developers and publishers harness BasedGamer’s marketing power, the site’s stated goal may be co-opted by dominant trends and information overload, much as happened to the consumer business-review site Yelp. Removing percentiles won’t fix the lack of evaluation methodology among professional and amateur reviewers. Letting users rate reviews won’t prevent bias or tampering. Most importantly, creating a new review aggregator won’t guarantee that you’ll enjoy your next game purchase.
But it’s worth a shot. And if you’re still distraught that BasedGamer is happening, find consolation in this: you don’t need a site to validate your taste in videogames. You can decide for yourself.