Stephen Brook is a distinguished author of books on travel, wine, and other subjects, as well as an award-winning wine journalist. He has compiled anthologies, and written numerous books on wine, especially the wines of Bordeaux and California. He contributes articles regularly to four international wine magazines, lectures on wine, and is in demand as a guest judge at wine competitions around the world. As a lover and keen observer of the wines of Bordeaux, Stephen has been dismayed by a trend amongst eminent critics when it comes to pronouncing their scores for each new vintage, tasted during ‘en primeur week’ in the region each April, noting that “Scores have inflated to a preposterous level.” So he set out to prove just how preposterous in this article.
Bordeaux: Every Wine’s A Winner
by Stephen Brook, 2015

This is the time of year when importers solicit your orders for 2014 Bordeaux. Their recommendations are often supported by the scores of esteemed wine critics, just as a film poster is peppered with the stars of reviewers. This year I decided to be first out of the blocks, so I posted my scores for the top châteaux of Bordeaux on an American website (hosemasterofwine) back in January. Of course I hadn’t tasted the wines – nobody had back then – but I saw no reason to let that stop me. I simply predicted what I thought the scores would be. It’s not that difficult. No critic is likely to score – to take a random example – Château Rahoul at 97/100, however good the wine may taste. Conversely, no one is going to give Lafite a score of 88/100, even if the sample poured seemed lacklustre on the day.
Personally, I’ve thought for twenty years that it’s madness to score wines so early: wines that will still spend another year in barrel, that may or may not be final blends, that may or may not have press wine incorporated, and that may be subjected to fining and/or filtration. And that’s quite apart from the issue of the integrity of the sample. This year during the primeur tastings I was offered samples from a new barrel, even though the final wine is never aged entirely in new oak. Samples are tweaked, and campaigns to have their authenticity independently verified are stamped on by the proud Bordelais. Moreover, I am often disappointed by wines that are meant to blow me away. Mouton, for example, can often underwhelm me, although experience tells me that once it is bottled and rested, it is likely to show splendidly.
What’s a poor critic to do? Give a middling score and suffer the consequences? Or give a high score and hope everyone else does the same? Guess. So how did I do? I compared my scores with those from Neal Martin (NM), Gavin Quinney (GQ), Wine Spectator (WS), Tim Atkin (TA), and James Suckling (JS). Let’s take a look at Lafite:
| SB 94-96 | NM 94-96 | WS 94-97 | GQ 95-96 | JS 97-98 | TA 97 |
Not a perfect match, but close. Quite often I find my scores a point or two lower than those of some others. Which does rather surprise me since no one is claiming that 2014 is a truly great vintage, merely a very good one. Let’s try Léoville-Poyferré:
| SB 93-95 | NM 93-95 | GQ 93-95 | WS 92-95 | JS 93-94 |
Close to unanimous. So let’s cross the Dordogne and see how Lafleur is doing:
| SB 95-97 | GQ 95-97 | NM 94-96 | JS 97-98 | TA 98 |
Not quite Bingo! But close enough. Next: Jonathan Maltus’s Le Dôme:
| SB 92-94 | NM 92-94 | GQ 92-95 | WS 90-93 |
I call that a result. The lesser châteaux are sometimes harder to predict, but here’s Pédesclaux in Pauillac:
| SB 86-88 | GQ 88-90 | WS 88-91 |
Kirwan in Margaux:
| SB 90-92 | GQ 90-91 | WS 89-92 | JS 90-91 |
The figures show near unanimity for other wines, including those from Châteaux Montrose, Calon-Ségur, Clerc-Milon, and La Fleur-Pétrus. And what does this all prove? First, that most tasters give scores that are close to identical, and second, that those scores are not difficult to predict. Moreover, they tend to fall within an increasingly narrow band. Of James Suckling’s scores, the lowest is 90-91 (for just three châteaux). Since only a handful of famous châteaux ever receive more than 97, the scores generally range from 92 to 97. (Jancis Robinson, to be fair, scores far more conservatively.) And what does that tell the consumers? That all 2014 Bordeaux are excellent. In short, scores have become meaningless. In my book, a score over 90 (or 18, for the OAPs among you) suggests an outstanding wine. If there are no mediocre or underperforming wines any more, then wine critics, as I think I have demonstrated with my predictive scores, can stay home and scribble their scores from the comfort of their chaises longues.
I quite understand that consumers, bewildered by the ever-growing choice of worldwide wines, need guidance, and scores can indeed be part of that guidance. But if 91 now means ‘acceptable’ and 93 ‘pretty good’, then the consumer will remain bewildered and unaided. A well-known and highly informed British wine writer once gave a talk at a wine conference in the USA, at which he invited the audience to ‘score’ works of art he illustrated with a slide show. The point was well made: scores may be a necessary evil, but they are intrinsically absurd. Today that same writer is one of the worst offenders when it comes to routinely stratospheric scoring.
Is the object of such scoring to suck up to the rich and powerful of the wine world? Are wine critics now merely adjuncts of the PR industry? In the 1980s wine writers were even banned from certain regions – Oz Clarke and Champagne being a prime example – because they wrote truths that were unpalatable to local syndicats. That seems inconceivable today. No one dares offend. Yes, there are a few scores below 90, and I predicted some of them. Croizet-Bages:
| SB 85-87 | NM 85-87 | GQ 86-88 | WS 87-90 |
| SB 89-91 | NM 87-89 | GQ 89-92 | WS 90-93 |
But would anyone dare to give those properties 95/100, even if they did produce a spectacular wine? No, because wine critics are now scoring brands and not the wines they purport to taste.