A transparent look at the Brussels Score algorithm
We don't just show the highest-rated restaurants. We look for places that do better than expected for their type — taking into account cuisine, location, and how many reviews they have.
Google ratings are useful but not perfect. A 4.5-star restaurant near Grand Place isn't the same as a 4.5-star spot in a quieter neighbourhood. Busy tourist areas get more reviews from visitors, while neighbourhood spots get reviews from regulars.
Our algorithm tries to answer: "How good is this restaurant compared to similar places?"
Every restaurant gets a score from 0 to 100 based on multiple factors. Here's how each component works:
The Google rating weighted by confidence. A 4.5★ with 400 reviews often outranks a 4.9★ with 25 reviews because we can trust the data more.
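One common way to implement this kind of confidence weighting is Bayesian shrinkage toward a city-wide prior. Here is a minimal sketch of the idea — the prior mean and prior weight are illustrative assumptions, not the project's actual constants:

```python
def confidence_weighted_rating(rating: float, n_reviews: int,
                               prior_mean: float = 4.0,
                               prior_weight: int = 50) -> float:
    """Shrink the raw rating toward a city-wide prior mean.

    With few reviews the prior dominates; with many reviews the raw
    rating dominates. prior_mean and prior_weight are assumed values.
    """
    return (n_reviews * rating + prior_weight * prior_mean) / (n_reviews + prior_weight)
```

Under these assumed constants, a 4.5★ with 400 reviews lands around 4.44 while a 4.9★ with 25 reviews lands at 4.30 — the well-reviewed restaurant comes out ahead, as described above.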
Our machine learning model predicts what rating a restaurant "should" have based on its cuisine, price, and location. If the actual rating is higher than expected, that's a strong signal of quality. This is our key differentiator for finding hidden gems.
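The residual idea can be sketched without the ML model itself: any predictor of "rating given cuisine, price, and location" works the same way. Below is a deliberately simple stand-in — a group-mean baseline over hypothetical `cuisine`/`area` fields — not the project's actual model:

```python
from collections import defaultdict

def expected_by_group(restaurants):
    """Expected rating per (cuisine, area) group: the group's mean rating.

    A stand-in for the real ML model; the project predicts this
    value from cuisine, price, and location instead.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for r in restaurants:
        key = (r["cuisine"], r["area"])
        sums[key][0] += r["rating"]
        sums[key][1] += 1
    return {k: total / count for k, (total, count) in sums.items()}

def rating_residual(r, expected):
    """Positive residual = rated better than similar places."""
    return r["rating"] - expected[(r["cuisine"], r["area"])]
```

A restaurant with a positive residual is outperforming its peers — that is the "hidden gem" signal.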
Limited opening hours or days are often the mark of a local gem. If a restaurant can survive with restricted opening times, it must be doing something right.
Non-chain restaurants get a bonus. We want to surface unique local spots, not franchises you can find anywhere.
Restaurants from immigrant communities get a bonus when located in their community's traditional neighbourhood. A Turkish restaurant in Saint-Josse is more likely to be authentic than one near Grand Place.
Iconic Brussels establishments like Fin de Siècle, Potverdoemmeke, and other legendary spots that define the city's food culture also receive a bonus.
Small bonuses for: "Chez [Name]" family restaurants (+2%), specific regional cuisines like "Sichuan" vs generic "Chinese" (+1%).
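Taken together, the bonuses can be sketched as a multiplier on the base score. Only the +2% ("Chez …") and +1% (specific cuisine) figures come from this write-up; every other percentage, field name, and the cuisine list are placeholder assumptions:

```python
ICONIC = {"Fin de Siècle", "Potverdoemmeke"}        # plus other legendary spots
SPECIFIC_CUISINES = {"sichuan", "cantonese"}        # hypothetical example list

def bonus_multiplier(r: dict) -> float:
    """Combine the small bonuses into one score multiplier."""
    m = 1.0
    if r.get("limited_hours"):                      # local-gem signal (assumed +3%)
        m *= 1.03
    if not r.get("is_chain", False):                # independent bonus (assumed +5%)
        m *= 1.05
    if r.get("authentic_neighbourhood"):            # cuisine matches area (assumed +4%)
        m *= 1.04
    if r["name"] in ICONIC:                         # iconic status (assumed +10%)
        m *= 1.10
    if r["name"].startswith("Chez "):               # family restaurant: +2% per the doc
        m *= 1.02
    if r.get("cuisine", "").lower() in SPECIFIC_CUISINES:  # +1% per the doc
        m *= 1.01
    return m
```

Using multipliers rather than flat points keeps the bonuses proportional, so they nudge rather than dominate the base score.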
We track Michelin stars, Bib Gourmand, Gault&Millau, and Reddit mentions — but they don't affect rankings. These are shown as badges in the UI for reference.
Some factors reduce a restaurant's score:
Restaurants within 150m of Grand Place with mediocre ratings get penalized. Tourist traps thrive on foot traffic, not quality.
A perfect 5.0★ with only 20 reviews is probably friends and family. We use Bayesian confidence weighting to discount ratings with too few reviews.
Chains like Bavet, Exki, and Pizza Hut get penalized. We want to surface unique local spots.
A count of 1,500+ reviews in a tourist area suggests mass-market appeal over quality.
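The penalties can be sketched the same way as the bonuses. The 150 m radius, the 1,500-review cutoff, the chain names, and the friends-and-family pattern come from the text above; the Grand Place coordinates are approximate, and every multiplier and remaining threshold is an assumption:

```python
import math

GRAND_PLACE = (50.8467, 4.3525)          # approximate lat/lon
CHAINS = {"Bavet", "Exki", "Pizza Hut"}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(a))

def penalty_multiplier(r: dict) -> float:
    """Combine the penalties into one score multiplier (<= 1.0)."""
    m = 1.0
    d = haversine_m(r["lat"], r["lon"], *GRAND_PLACE)
    if d < 150 and r["rating"] < 4.3:               # tourist-trap zone (4.3 cutoff assumed)
        m *= 0.80
    if r["rating"] >= 4.9 and r["n_reviews"] <= 25: # friends-and-family pattern (25 assumed)
        m *= 0.90
    if r["name"] in CHAINS:                          # chain penalty (assumed -25%)
        m *= 0.75
    if r["n_reviews"] >= 1500 and d < 500:           # mass-market in a tourist area (500 m assumed)
        m *= 0.90
    return m
```

Note the tourist-trap penalty is conditional on a mediocre rating — proximity to Grand Place alone is not penalized.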
Restaurants are sorted into three tiers based on their Brussels Score, displayed with colored markers on the map:
Top picks — the best Brussels has to offer. About 15% of restaurants. These are the places worth going out of your way for.
Recommended. About 18% of restaurants. Solid spots that locals recommend.
Not recommended. About 67% of restaurants. Tourist traps, chains, or places we can't confidently recommend yet.
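Given the rough 15% / 18% / 67% split above, tier assignment reduces to a percentile lookup. A minimal sketch (the project may instead use fixed score cutoffs; the exact mechanism isn't stated here):

```python
def tier(score: float, all_scores: list) -> str:
    """Assign a tier from a score's percentile among all restaurants."""
    frac_above = sum(s > score for s in all_scores) / len(all_scores)
    if frac_above < 0.15:
        return "top pick"          # ~15% of restaurants
    if frac_above < 0.33:
        return "recommended"       # next ~18%
    return "not recommended"       # remaining ~67%
```

Percentile-based tiers keep the split stable as new restaurants are added, whereas fixed cutoffs would let tier sizes drift.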
When you see "+0.3 stars above expected", here's what it means:
Restaurants that score above expected are doing something special compared to similar places.
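The badge itself is just the formatted residual — actual rating minus the model's expected rating. The exact display format below is an assumption:

```python
def stars_above_expected(actual: float, expected: float) -> str:
    """Format the residual as shown in the UI, e.g. 4.6 vs 4.3 -> '+0.3'."""
    return f"{actual - expected:+.1f}"
```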
No ranking is perfect. Here's where ours falls short:
The entire codebase is open source. You can see exactly how the scoring works, suggest improvements, or build your own version.
Inspiration: Lauren Leek's London Food Dashboard — using ML residuals to identify undervalued restaurants.
Questions? Open an issue or contribute to the project.