I swear, every time Real Madrid puts four goals past a decent team, the whole rating ecosystem just loses its mind. You watch the game, you see a good, solid performance, and then you check the scores afterward and wonder if the analysts were watching a completely different fixture on Mars. This Osasuna game was a perfect example, and I had to go through the motions of checking everything just to prove my gut feeling.

CA Osasuna vs Real Madrid player ratings: Did Real Madrid deserve those incredibly high scores?

The Pre-Game Setup and The Initial Watch

I committed to the Osasuna match from the jump. My usual game-day routine: coffee, the sofa, and every single minute watched from kickoff to the final whistle. I always try to watch the game without external noise first, just the pure eye test. No stats, no live Twitter feeds, nothing. Just the 22 guys running around. Look, it was a solid win, no question. Madrid’s attack clicked beautifully at moments.

Vini Jr. was absolutely electric. Give him an 8.5 or 9 and I nod my head. He earned it, running rings around their defenders, looking sharp, and scoring goals. But here’s where the confusion started creeping in during the first half. The opening 45 minutes weren’t exactly a defensive masterclass. Osasuna were aggressive, they had their chances, and frankly, Lunin pulled off a couple of important saves that stopped the game from getting ugly before the second half even started.

When Rodrygo got his score, I thought, ‘Good, solid 7 for him, decent link-up play, sharp finish.’ But overall, it felt like a professional win where the attackers were brilliant and the defense was merely okay, maybe a little exposed at times.

The Post-Match Shock: Pulling the Aggregated Numbers

As soon as the final whistle blew, I did my usual routine. This is where the work starts. I pulled up three major sports sites—you know the big names—and two aggregated statistical platforms just to compare notes. I expected Vini to be sitting pretty, maybe Tchouaméni around 7.5 because he was tidying up well in front of the back four. What I saw instead made me spit out my lukewarm coffee. The scores were ridiculous.

  • Nacho Fernandez scoring an average of 7.8 across the board? Seriously? I remembered seeing him dragged out of position at least twice early on, leaving huge gaps. His distribution was fine, but a 7.8 suggests a defensive stalwart performance. He wasn’t that.
  • Carvajal getting an 8.0 or higher nearly everywhere. He was fine. Just fine. He wasn’t shutting down the flank single-handedly; he was just doing his job against a mid-table side that faded badly late. An 8.0 is a match-winning performance for a full-back. He did not achieve that.
  • Bellingham, naturally, got a 9.0+ average on several sites, mostly because he scored. If you strip away the goal, how many key passes did he actually register? How much defensive pressure did he relieve? He looked tired at times.
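If you want to sanity-check this kind of herding yourself, a few lines of Python do the job: pull each outlet's score per player and look at the spread, not just the average. When five supposedly independent raters land within a tenth of a point of each other, that uniformity is itself a red flag. Every outlet name and number below is invented purely for illustration, not the actual scores from this match.

```python
from statistics import mean, pstdev

# Hypothetical ratings from five outlets -- all numbers are made up
# to illustrate the check, not real published scores.
ratings = {
    "Forward A":  [9.0, 8.5, 9.0, 8.5, 9.0],
    "Defender B": [7.8, 7.9, 7.7, 7.8, 7.8],
    "Defender C": [8.0, 8.1, 8.0, 8.2, 8.0],
}

for player, scores in ratings.items():
    avg = mean(scores)
    spread = pstdev(scores)  # population std dev across outlets
    flag = "  <- suspiciously uniform" if spread < 0.2 else ""
    print(f"{player}: avg {avg:.2f}, spread {spread:.2f}{flag}")
```

A genuinely debated performance produces a visible spread; a reputation-driven score tends to cluster tightly, because everyone is anchoring on the same scoreline and the same badge.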

The numbers were clearly inflated, and it immediately triggered my skepticism. It felt like these outlets were rating the four goals scored and the prestige of the Real Madrid badge, not the actual individual player contribution over 90 minutes. They saw the final scoreline, and the ratings just automatically shot up.


Digging Deeper: Comparing Stats vs. My Eye Test

This is the part where I started getting annoyed and turned my focus to the underlying data. I spent forty minutes going through the micro-stats on a couple of reliable trackers I trust, breaking down defensive duels and progressive carries. I wanted to find proof that these absurdly high scores were justified, but the data simply wasn’t supporting the media narrative.

I focused heavily on the defensive midfield and the backline. Camavinga was doing heavy lifting, quietly sweeping up. He had a successful tackle rate near 85% and was crucial in possession recovery. His average score? Maybe a 7.2. This disparity confirmed my suspicion: the unsung heroes who actually stabilize the team rarely get the credit when the attack is flying high.

Then you look at a forward who scored once, registered one key pass, and spent twenty minutes walking around, and they get an 8.5 simply because they wear a famous number. It doesn’t balance out in any realistic rating system.

The Final Verdict on Ratings Bias

I realized what was happening—it’s always the same thing. It’s the media narrative engine. They know which players move clicks and which scores generate engagement. They slap high scores on the big names because it validates the storyline: Real Madrid is unstoppable, their stars are demi-gods, etc. It’s pure reputation bias, and it drives me absolutely nuts when I’m trying to conduct an objective review.

I went back and created my own spreadsheet, manually adjusting scores based purely on defensive output, successful passes in the final third, and minutes played without making critical errors. My system rated Vini Jr. high, sure, but it knocked down Nacho, Carvajal, and even Bellingham by almost a full point each, dragging the overall team rating down into the high 7s, not the absurd 8.3 average the press was peddling.
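The spreadsheet logic above can be sketched in code. To be clear, the weights, stat names, and cutoffs below are my own guesses at what such an adjustment might look like, not the author's actual formula: start from the published rating, reward measurable defensive and final-third output, and penalize critical errors.

```python
def adjusted_rating(base, defensive_actions, final_third_pass_pct,
                    minutes_played, critical_errors):
    """Nudge a published rating toward measurable output.

    All weights are illustrative assumptions, not a real rating model.
    """
    score = base
    # Reward the defensive work aggregators tend to ignore
    score += 0.05 * defensive_actions
    # Reward accurate final-third passing above a 75% baseline
    score += 0.01 * (final_third_pass_pct - 75)
    # Penalize critical errors; reward error-free minutes slightly
    score -= 0.5 * critical_errors
    score += 0.2 * (minutes_played / 90)
    # Clamp to the usual 1-10 rating scale
    return round(min(max(score, 1.0), 10.0), 1)

# A 7.8 with light defensive output and one early error drops to 7.6
print(adjusted_rating(7.8, defensive_actions=2, final_third_pass_pct=78,
                      minutes_played=90, critical_errors=1))
```

The point isn't the exact weights; it's that any system anchored to output rather than reputation pulls those inflated defender scores back toward earth.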


My conclusion is always the same when this happens: these websites don’t rate the individual performance; they rate the expected performance of a superstar when his team wins big. They are playing it safe, prioritizing clicks over accuracy. And that’s exactly why I keep sharing my own process—to prove that if you look closely, the emperor is often wearing fewer clothes than advertised.
