decent 21 rating for food from Zagat. Adds Gold, a college-trained statistician: “Zagat, of all the major sources, probably has the lowest levels of reliability” because of its self-selected survey base, which provides little to no information on the people who actually cast ballots.
At this point, OpenTable may be the most reliable of the sites that aggregate restaurant ratings. Site administrators send review surveys only to those diners who have honored their OpenTable reservation, and the diners have approximately 30 days to fill out the forms. This process guarantees two things, says Ann Shepherd, vice president of marketing for OpenTable: 1) that every review is actually based on a meal eaten at the restaurant; 2) that the meal was eaten recently, while the memory of it is still fresh.
Those are two promises that you will never hear from Zagat, a guide that looks destined to follow so many other print publications into oblivion.
ZAGAT’S 2010 MAKOTO BLURB shows why the guides are “refrigerator-magnet poetry,” “dubious,” good only for a “lonely traveler... searching for a good place to sup before masturbating himself to sleep.”
Grade Creepiness: As first pointed out in SmartMoney magazine in 2007, Zagat food grades have spiked dramatically over the years for restaurants in the New York guide. Same goes for the D.C. area: In the 1992 guide, only 13 restaurants earned grades of 25 points or higher (out of a possible 30). In 2010, more than 60 restaurants topped the 25-point mark. Even more startling, 72 percent of D.C. restaurants with actual ratings in the latest guide earned a grade of 20 or higher, which means that nearly three-quarters of our eateries are “very good” or better. No one sucks here anymore.
The Method, Man: Readers have no idea how many votes are required before the results are deemed statistically sound enough to merit a Zagat grade (Zagat’s Barbalato told me that Makoto edged Inn at Little Washington, 28.9024 to 28.8495, but would not say how many votes were cast). What’s more, they have no idea if the grade represents the votes of the restaurateur’s spouse and 100 close friends or a statistically sound sample of D.C. area diners. All they know is this: The voters are self-selected, a pool almost guaranteed to skew results. Readers don’t even know if voters actually ate at the restaurants in question. OpenTable, by contrast, posts reviews only from diners who have honored their online reservation.
Cut-and-Paste Prose: Zagat editors take a ransom-note approach to writing descriptions of the restaurants in their guides. Compare that to consumer-oriented sites, such as Yelp or DonRockwell, where amateur critics can relate their entire dining experiences without fear that an editor will place a diss about “uncomfortable seating” right next to some yahoo’s Pollyanna piffle about a “wonderful experience.” Zagat is refrigerator-magnet poetry at a time when people want the Library of Congress at their fingertips.
You’ve Been Duped: For those who purchased the 2010 edition of the Washington, D.C./Baltimore Zagat guide, the Makoto entry might seem familiar. For good reason. It’s the exact same entry as last year’s. The exact same awkwardly phrased copy. The exact same dubious grades. And yet you’d be hard-pressed to learn from the introduction that the 2010 guide is merely an update. Here’s the truth: The even-year Zagat guides are based on the surveys from the previous year. Do you know how much a restaurant can change in two years, let alone two months? Would you trust a review from a year ago on Yelp?
Solitary Confinement: Zagat bases your estimated check on a single dinner and drink, plus tip, which is fitting for a guide that imagines you dining alone. Social network sites want to create a community around a common interest in food, which explains not only Myspace Local’s recent move into online restaurant reviews but also local restaurateurs’ embrace of Facebook and