Loodo Score FAQs
You can find an in-depth explanation of how our Loodo Scores are calculated here.
How can I get in touch with feedback or questions?
Please email us at firstname.lastname@example.org
What sources go into the Loodo Score?
We currently include ratings from the following sources:
Apple App Store
BoardGameGeek (digital games section)
We’re also aware of the following sources that we may add in the future:
If you know of other good critics let us know by emailing email@example.com! Also let us know if you think we missed a review from one of our included critics for a certain game — game names sometimes differ across sites.
How much weight does each source get in the Loodo Score?
It’s hard to determine how much weight to give each source: how much is a critic’s opinion worth vs. the average of user ratings in the App Store? For now we’ve taken a simple approach, but we’re likely to change this and are very open to feedback here.
The Apple App Store and Play Store both receive the most weight.
BGG user ratings (collectively) are half of the Stores’ weight.
Each critic is half of BGG (so a quarter of a Store).
So if, for example, a game is iOS-only, has a BGG score, and has been reviewed by two critics, the Apple App Store would get 50% weight, BGG would get 25%, and each critic would get 12.5%.
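The relative weights above can be sketched in a few lines of code. This is an illustrative sketch, not our actual implementation — the function name, the 1.0 / 0.5 / 0.25 weight constants (app store : BGG : critic), and the example inputs are all just restatements of the scheme described above:

```python
def loodo_score(store_ratings, bgg_rating=None, critic_ratings=()):
    """Weighted average of whichever sources exist, each on a 0-100 scale.

    Relative weights: each app store = 1.0, BGG users (collectively) = 0.5,
    each individual critic = 0.25. Missing sources simply contribute nothing.
    """
    weighted = []  # (weight, score) pairs for the sources that exist
    for score in store_ratings:          # Apple App Store / Google Play
        weighted.append((1.0, score))
    if bgg_rating is not None:           # BGG user ratings, collectively
        weighted.append((0.5, bgg_rating))
    for score in critic_ratings:         # each individual critic
        weighted.append((0.25, score))
    total = sum(w for w, _ in weighted)
    return sum(w * s for w, s in weighted) / total

# iOS-only game with a BGG score and two critic reviews:
# the weights normalize to 50% / 25% / 12.5% / 12.5%, as in the example above.
print(loodo_score([80], bgg_rating=70, critic_ratings=[90, 60]))  # 76.25
```

Because the weights are normalized by whatever sources are present, a game reviewed by no critics or missing a BGG score still gets a sensible score from its remaining sources.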
How often do you update your data?
All of the data have been updated within the past two months as of the time of this post, and we plan to refresh the data again within the next several weeks. We’ll also make sure that any newly released games are added to the site quickly (within a day or two, once there are actually ratings for the game to include). Our longer-term updating cadence will depend on how much change we see between updates.
Do you include your own reviews in the Loodo Score?
No. You can see our opinions in our reviews, but we want the Loodo Scores to be unaffected by our personal views.
Why are your long-form review ratings out of 10 but the Loodo Scores are out of 100?
For one, we want to make sure the distinction between the two is clear. We know we still have more work to do on that, but one way is to put the numbers on different scales: if you’re talking about a score of 10 or lower, it’s always our personal opinion, never an aggregated Loodo Score.
Secondly, we think anyone who claims to subjectively evaluate something down to 1% accuracy (which is implied by a 100-point scale) is probably being overprecise (I see you, 10th grade English teacher). Every five-star system we’ve seen uses half stars as well, so is effectively a ten-point scale. So we landed on ratings out of 10 for our subjective reviews. When combining scores from multiple sources, however, we think there is value in having more granularity.
Is the Loodo Score a rating of the overall game? Or just the app?
This may come across as a bit of a cop-out, but our Loodo Scores include whatever was in the mind of the players or critics whose ratings we’re aggregating. That means they are almost certainly a combination of app quality and underlying game quality.
For our long-form reviews (with our personal opinions), we are aiming to just evaluate the quality of the adaptation, not the underlying game.
How do you treat games that are “straight to mobile,” in other words, games that never had a physical version they were based on?
These games are pretty uncommon today as far as we can tell. As we’ve built out the site, we’ve found that “bulk” additions of games resulted in us listing a few of these. For now, they have no specific identifier or filter on our website, except that they are listed as “N/A” for the “BGG Original Game Score.” On our pages, that entry is still a hyperlink, which defaults to a BGG search for the game.
We do think these games are a fun, special case and would like to take some more time to check them out in the future — but as of this writing, we have not personally played any of them.
There’s also the somewhat grey area of “games that are like board games but not quite” where something like Hearthstone might be an example. We are still deciding how to treat games like this but for now, for the most part, we have not listed them.
How do you handle games available on multiple platforms?
Our app store ratings are, of course, specifically about the app versions of games. For critic reviews, though, we are often less sure which platform they used to play a game: some make this clear in the review, but many do not. Loodo Scores therefore probably include some “platform mixing,” with some critics playing on mobile while others play on Steam or elsewhere, and we certainly recognize that the platform a game is played on can in some cases have a big impact on the experience. We’re working on ways to better identify critics’ platforms and filter accordingly.
Why didn’t you share your code and the exact details of your scoring system? How do we know you’ve done this correctly or fairly?
We would like to do this at some point! We aren’t holding back our calculations due to concerns about keeping them “proprietary” or anything like that — the fact is that they are still rather messy, and getting them in a form that can be shared and actually understood is time-consuming. If you have questions about any of the specific calculations we are happy to provide more detail.
Why don’t you adjust game scores to be a normal distribution centered around 50, with half of games above that and half below?
Some argue that if scores can go up to 100, the “right” way to distribute scores is to have the average be 50. The problem is that most people don’t think that way (and most rating systems that collect user ratings don’t end up that way) — if you tell someone a game is a 60/100, the typical person will not think that’s an above-average quality game. We think what we ended up with (an average score of 74) is likely to better match people’s intuition about what a certain score means.
Are overall ratings even the right way to think about games? Could they do more harm than good?
We’ve thought quite a bit about this, and while we agree there are limitations and downsides to numerical ratings, we believe they are beneficial overall.
First, to flesh out the criticism a bit more: some worry that ratings cause developers to become overly focused on satisfying critics (by fulfilling the perceived requirements for a high rating), at the expense of doing what is actually “right” or best for the players of the game. This is a specific instance of a more general critique of metrics: this isn’t a forum for political debates, but similar arguments are made against using standardized testing in education, or optimizing for particular business metrics, or looking only at wealth or GDP when assessing wellbeing across countries. People point to the aphorism that “not everything that counts can be counted, and not everything that can be counted counts.”
As in those other contexts, the right answer is probably best summed up by another aphorism: “everything in moderation.” No, the opinions of players in the App Store and a handful of critics are not the only determinant of whether a game is worth playing — but they aren’t irrelevant, either.
Specifically in the context of mobile board games: there are a lot of good games out there, but we think there are a lot of potential players who haven’t tried them yet. In our view, discovery is one of the main barriers: if you don’t already know which game you want to play, how can you find out? Today, people basically have to know someone, or follow one of the bloggers in this space (which means they are already likely heavily involved in board games), or just take a gamble on what is often a $10 app (per player unless you’re on a family sharing account). That’s a lot of research and a lot of money to shell out, and we think that turns off a lot of people who would actually enjoy these games.
Loodo Scores are one way to make mobile board games more discoverable to potential new players. We would never claim that the top game in our list will be everyone’s favorite, or that the exact ordering of games is correct — but we do feel confident that games high on the list have a consensus of being high quality.
Are there other metrics you’re thinking about including?
We’re so glad you asked! (No, no one asked this)
We recognize the limitations of a one-dimensional game rating (see above), and are very interested in ways to offer richer information for potential players while still keeping things as simple as possible. We have a couple of thoughts along those lines: a “love-it-or-hate-it” rating, and “if you like this, you might like that”-type recommendations.
Love-it-or-hate-it (we’re taking name recommendations) would be a measure of how similar ratings are across sources. If you see a game with an average score, is that because everyone agrees it’s average or because some love it and some hate it? If it’s the latter, that’s probably a game where it’s worth reading some of the reviews in more depth to understand whether the pros or cons matter more to you.
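One simple way to measure this kind of agreement is the spread of the per-source scores around their average. This is just a sketch of the general idea, assuming all sources are on the same 0–100 scale; the function name is hypothetical and we haven’t settled on an actual formula:

```python
import statistics

def polarization(source_scores):
    """Population standard deviation of a game's per-source scores (0-100 scale).

    Low values mean the sources broadly agree; high values suggest a
    love-it-or-hate-it game where reading individual reviews pays off.
    """
    return statistics.pstdev(source_scores)

# Both games average 70, but they are very different situations:
print(polarization([68, 70, 72]))  # everyone agrees it's about average (low)
print(polarization([95, 40, 75]))  # lovers and haters (high)
```

Two games with the same Loodo Score could then be distinguished by this second number, which is exactly the situation described above.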
Different people will like different games, of course: there is not some single ranking of overall quality that everyone will agree with. We’d like to be able to offer recommendations that are a bit more personalized. One way would be to adopt something like the approach in this excellent blog post, which looks at player rating similarity across games.
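To make the similarity idea concrete, here is a minimal sketch of one common approach: represent each game as a vector of ratings from the same set of players, and compare games by cosine similarity. The data and function are illustrative assumptions, not the linked post’s exact method (real systems usually also mean-center each player’s ratings and handle missing entries):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two games' rating vectors, where
    position i holds the same player's rating of each game."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical ratings from four players for three games:
game_a = [9, 8, 7, 9]
game_b = [8, 9, 6, 9]   # rated similarly to game_a by the same players
game_c = [3, 9, 2, 8]   # a more divisive pattern

# Players who liked game_a tended to like game_b more than game_c:
print(cosine_similarity(game_a, game_b) > cosine_similarity(game_a, game_c))  # True
```

Given such similarities, “if you like this, you might like that” falls out naturally: recommend the games whose rating patterns are closest to ones a player already enjoys.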