In 1999, a small company called Defog.com was incubated (funded by software reseller SoftChoice) on the Upper West Side of Manhattan. Their tagline was "Find What You Have in Mind," and their goal was to show you where people just like you were going for dinner, drinks, or to socialize. "If you're a one-legged Puerto Rican gay nurse living in SoHo, we can show you what all the other one-legged Puerto Rican gay nurses living in SoHo are enjoying," was an actual pitch from the company.
In June of that year, the company dispatched teams of 'Defogers' armed with Palm Pilots (with which the company had a sponsorship deal) to interview diners and bar-goers throughout Manhattan and Toronto, Canada. The Defogers asked patrons to fill out information about themselves on the Palm Pilots and then rate the location they were currently in. The ratings were not numerical; instead they used what the company called Relaticons, or Relatable Icons. The company understood that one person's "small" bar might be someone else's "big" bar, depending on their experience and taste. So when asked to describe how big a location was, respondents didn't get choices like Small, Medium, Large, or a number between one and five; they got Relaticons like GARAGE, ARENA, or TENNIS COURT, things whose size people intrinsically understand. The same applied to noise level and a dozen other characteristics of the location. When people went to the website, they would sign in and either go to MeMatch, which would show them what was popular among their peer group (one-legged gay nurses, for instance), or go to MoodMatch, which was an opportunity to try something they typically might not choose but were in the mood for. (There was also a flame icon, which worked much like Google's I'm Feeling Lucky button: a simple random generator.)
While the company was swept away in the tech crash of 2002, what Defog was trying to do was develop the best possible way to ask you what you would like. Hundreds of times a day we ask ourselves what we would like: what movie, song, taco, guy, car, suit, hair, flower, and on and on. That's why we take advice from our friends: they're kind of like us, and we know what they like almost as well as what we like ourselves. The only thing better than asking a friend is asking yourself. And that is what's at the heart of online review websites: I don't have any experience with this, who does, what was their experience, and who are they? (Though that last point is sorely missed by almost all online review sites today.)
The same year that Defog was founded, another review website was started called Epinions.com. It was a place where people could gather around the proverbial watercooler and share stories about the products and services they used in their daily lives. Within three years of its founding, however, all but one of the five founders had departed for other opportunities, leaving Epinions to flounder for most of the 2000s (it was acquired by Shopping.com and then by eBay). The company was shuttered in 2014 and removed completely from the internet in 2018.
How could a website that gave people a platform to provide their opinions about things (which everyone loves to do, especially online), have failed so utterly? After all, the founders, many of whom left 8- and 9-figure unvested options at their previous companies, which included Netscape, @Home, Morgan Stanley, AOL, Yahoo, and McKinsey, had a lot riding on its success. The answer is that in their eyes, there was no viable business model. They saw no way to monetize their online opinion website without affecting the relevance and integrity of the reviews, and to their credit, instead of pushing a business model that involved pay-to-play reviews and ratings, they just let the website bump along in a state of suspended animation for a decade and a half.
Fast forward to 2018. The company Yelp, which provides a platform for reviews and commentary on everything from burger joints to eye surgeons, has annual revenue of $800,000,000 and is valued at over a billion dollars. USA Today anointed the website Healthgrades "the first comprehensive physician rating and comparison database" after it was purchased for a quarter of a billion dollars in 2010, and WebMD, which has a physician rating component, recently sold for $2.3B.
So how did Epinions and Defog miss the boat so badly and fail to capitalize on their first-mover advantage 20 years ago? For the answer to that, we need to look more closely at these rating websites: what they offer and don't offer, what they do and don't do. And in this case, we wanted to take a closer look at how the online physician rating websites work, or don't.
CHICKEN WINGS
If you sampled a random 10 people and asked them if they "liked" Domino's Pizza chicken wings, perhaps three would say yes, five would say no (because, after all, they are wings from a pizza place, and any self-respecting chicken wing opinion-giver likely has a list of chicken wing choices to offer before they arrive at the Domino's offering), and two would say they don't eat chicken wings at all. So the number of people who "like" Domino's chicken wings in this case would be three out of ten. Domino's chicken wings would get reported as unliked by 7 of 10 people, which is pretty bad.
Now, we know that two of those respondents don't eat chicken wings at all, so it's likely they have never eaten Domino's chicken wings, rendering their participation in the survey misleading. And of the other five who said they didn't like Domino's chicken wings, it's likely that at least one of them has never had them either. It is also very likely that if the respondents were asked, "If someone was ordering Domino's and you were in the mood for chicken wings, would you order chicken wings from Domino's?" maybe 40% would say yes, and depending on how fussy they are about chicken wings, they might all say yes. It's not that they would never eat Domino's chicken wings, they might explain; it's that the wings aren't their favorite. So now two of the five naysayers would eat Domino's chicken wings if they were on the menu, and two of the respondents don't eat any chicken wings at all, which means their votes were tainted. Count it up: three who like them, two who would order them anyway, and two whose votes shouldn't count against them, and suddenly, without anything materially changing, the Domino's chicken wing score has gone from 3/10 to 7/10. Or if you're keeping score at Yelp, Healthgrades, and WebMD, it went from a 1.5/5 to a 3.5/5. And in the world of online reviews, that's a gargantuan difference. And it is this type of completely unqualified surveying/reviewing that goes on at all the review websites that solicit user opinions.
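For readers who like to see the arithmetic laid out, here is a minimal sketch of that back-of-the-napkin math. The counts are the hypothetical ones from the example above, not real survey data.

```python
# The hypothetical 10-person chicken wing survey from the example above.
respondents = (
    ["likes them"] * 3            # would say yes
    + ["dislikes them"] * 5       # would say no
    + ["doesn't eat wings"] * 2   # arguably shouldn't be counted at all
)

# Naive score: anyone who didn't say "yes" counts against the wings.
naive_score = respondents.count("likes them") / len(respondents)          # 3/10

# Adjusted score: two of the five "no" votes would order the wings anyway,
# and the two non-wing-eaters' votes are treated as not counting against.
would_order_anyway = 2
not_countable = respondents.count("doesn't eat wings")
favorable = respondents.count("likes them") + would_order_anyway
adjusted_score = (favorable + not_countable) / len(respondents)           # 7/10

print(f"naive: {naive_score:.0%}, adjusted: {adjusted_score:.0%}")
# naive: 30%, adjusted: 70% -- roughly 1.5/5 versus 3.5/5 on a star scale
```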
You can't know whether the information being provided online is accurate because you don't know the details of the person providing it. And while that's merely an inconvenience when it comes to restaurants and dry cleaners, it can mean quality of life when it comes to physicians. That's why the Defog model was so unique. They knew that consumers needed to be confident that the people providing the information on bars and restaurants were just like them, or as close to them as the company's data could get.
You wouldn’t take advice from your neighbor’s grandma about concert tickets, so you shouldn’t choose your surgeon based on some random person’s online review. (That said, online reviews are a decent way to get a feel for a physician’s practice, and an opportunity to spot any common complaints or issues that the physician has not addressed).
Before the tech explosion of the mid-90s, the restaurant review company Zagat (with their trademark burgundy guidebooks) was doing booming business. They were in dozens of cities around the world and catered to foodies and non-foodies alike. After all, it was a wonderful change from the stuffy newspaper reviewers, who had more in common with the rarified air of CEOs and celebrities than they did with the general public. Zagat was a place where you could hear what other people were saying about the restaurants you were interested in, saving you from having to experiment for yourself. And people absolutely loved seeing their reviews in print (even if they weren't credited). It was an amazing business model, and one that pre-dated what the new tech companies were about to implement on their way to billion-dollar valuations: content created by the audience who then pays to consume it. (If you think that just because you're not physically paying to visit a website you're not paying, you're wrong; more about that later.)
At Zagat, they simply had to review all the submissions and choose which ones they wanted to showcase in their books. Can you see the inherent conflict there? The reviews were supplied by the people who ate at the restaurants, but what made it into the Zagat guide was chosen by one or two people at the corporate office. The comments were more the opinion of the Zagat staffer than any real crowdsourced revelation. The staffer decided which positive reviews to include or keep out and which negative reviews to include or omit. Millions of people a day were making their dining decisions based on Ronnie from Hoboken's decisions about what went into a review and what stayed out. And Ronnie from Hoboken hates chicken wings. And because of that, now you hate chicken wings (especially from Domino's), all because Ronnie had a bias against chicken wings in 1997.
MAYOR FOUR GUYS
The location-based app Foursquare was founded in 2009 and quickly became a global hit. In 2010 President Obama joined Foursquare so he and his staff could leave a "Tip" (a short 200-word review and one photo) about the places they ate in their travels around the world. If you are a member of Foursquare and you have the most visits to a location of all the members, you are anointed the "Mayor" of that establishment and you receive special offers. You are also pinged and reminded to keep going back to that establishment so you can keep your Mayoral position. Foursquare also lets users rate venues by asking whether they "like" the venue, how trendy it might be, how noisy and clean it is, and whether the venue takes credit cards or has outdoor seating. All of this user-supplied information feeds into a database that spits out a final Rating, between 0.1 and 10, for each venue.
With 60 million users, close to $100,000,000 in annual revenue, and its tenth anniversary approaching, Foursquare discovered something that the guys at Epinions either didn't, or didn't look for: a way to monetize all the data they collect (though it took them seven years). To their credit, the Foursquare team has adeptly pivoted to make their business about "location intelligence," and the selling of all that intelligence is big business. Asif Khan, founder of the Location Based Marketing Association, says there are 5,000 businesses looking to profit from location intelligence; "by 2019, total technology spend will reach $43 billion and advertisers will shell out $21 billion on location-based strategies."
Foursquare even has their own enterprise website at enterprise.foursquare.com, which does nothing but monetize their data and their technology. They also list no fewer than six ways in which they monetize that technology and the data they collect about when people leave home (this has sparked debate about user privacy; thieves know that if you've checked into Denny's you're not at home), when they check into a location, and when they leave: Foursquare for Business; the Foursquare API; 10 levels of Super Users who can be directly targeted by companies; Foursquare Brands, which allows companies to create pages and leave Tips on them; Specials, which are regular promotions offered by venues; and Cross Site linking, which partners with other social media companies to cross-promote.
This all raises the question that the Epinions guys surely asked themselves but answered differently: How does all this monetization of collected data change the usefulness of the reviews? Just as studying processes in the brain is complicated by the fact that the very act of observing a process can change it, having so many disparate money-making angles in play pushes the importance of accurate information toward the back of the line.
It also means there are influences on the ratings that come directly from the introduction of commerce into the equation. If you're the Mayor of Chili's in Bakersfield, Chili's is offering you Specials (as they are called at Foursquare), and you keep going back to keep your Mayoral status, are your reviews of that location not biased? In other words, if it's in your best interest to keep showing up at the location to keep your Mayoralty, that, by definition, is a conflict of interest. Just like Ronnie's bias against chicken wings and his subsequent denigrating of the institutions that serve them. Imagine Foursquare for hospitals: "Congratulations Brad, you're the Mayor of Cedars-Sinai. Ain't lupus grand! Now here's a Special 10% off your next steroid treatment." Isn't that just as inappropriate as surgeon reviews on Yelp?
Online rating websites have managed to grow to great heights by concentrating on the commercial nature of their varied businesses, which is what businesses do. And this is all fine and dandy when we're talking about cheeseburgers and house painters. But what about online review websites that focus on physicians? Many of the review sites (Yelp, Healthgrades, RateMDs, Vitals, and WebMD) specialize in doctor rankings. (Yelp pretty much specializes in whatever people want to rate.) Should the criteria for ranking pepperoni pizza be the same as the criteria for ranking the people who can determine whether you live or die? That's a tricky question and one that is hotly debated on a daily basis.
IN BUSINESS FOR THEIR BUSINESS
"When I'm speaking to a surgeon, someone who is highly educated and in most cases a pretty level-headed person, there is one word that will get them to go berserk, and that word is Yelp," says Tim Davis, who contacts surgeons' offices for Surgeons.BEST, a new physician review website that rates surgeons on four factual criteria (board certification, experience, schooling, and transparency) and also includes peer reviews, all accomplished with an algorithm that weights different schooling, experience, and certifications.
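Surgeons.BEST has not published the details of its algorithm, but the general idea of a weighted score over factual criteria is easy to illustrate. The sketch below uses invented weights and point scales purely for illustration; it is not the company's actual method.

```python
# Hypothetical weighted scoring over factual criteria. The criteria names
# come from the article (board certification, experience, schooling,
# transparency); the weights and point scales are invented for illustration.

WEIGHTS = {
    "board_certified": 0.35,
    "years_experience": 0.25,
    "schooling": 0.20,
    "transparency": 0.20,
}

def score_surgeon(profile: dict) -> float:
    """Return a 0-100 score built from factual inputs, not patient opinions."""
    points = {
        "board_certified": 100 if profile["board_certified"] else 0,
        # Cap experience credit at 25 years so a long career saturates.
        "years_experience": min(profile["years_experience"], 25) / 25 * 100,
        # Assume schooling has already been mapped to a 0-100 tier elsewhere.
        "schooling": profile["schooling_tier"],
        # Share of requested records/prices the practice actually discloses.
        "transparency": profile["disclosure_rate"] * 100,
    }
    return sum(WEIGHTS[k] * points[k] for k in WEIGHTS)

example = {"board_certified": True, "years_experience": 12,
           "schooling_tier": 80, "disclosure_rate": 0.9}
print(round(score_surgeon(example), 1))  # 81.0
```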
"Almost to a one, the surgeons I speak to detest Yelp," says Davis. And it's easy to see why. Just go to Yelp.com. It's an assault on the eyes. It looks like a website from the Netscape Navigator days, with its tri-color text and cramped, crowded listings. Beyond the aesthetics, one of the first things you'll notice at Yelp is the Request an Appointment/Quote, Find a Table, Start an Order, and Get Cash Back buttons. This is how Yelp makes money. They incentivize businesses to get on Yelp not only to curate their online reviews but also to drive business to their locations. But Yelp doesn't really care whether your location or service gets additional business or not. Sure, they would like it, and many businesses do see an increase in sales, but that's not why Yelp is in business. And this confuses both those looking online for reviews and those businesses that use Yelp to drive sales.
Terry Koosed, President and CEO of LA-based Bel Air Internet, tells the story of Yelp allegedly extorting him by removing many of his better reviews after he declined an ad sales call:
“Just as we were hovering close to reaching our next goal of 1000 five star reviews, a Yelp sales associate reached out to see if we’d like to advertise with them. We declined for various reasons, including the fact that our Yelp ratings speak volumes on their own – far more than any paid advertisement ever could.
We didn’t give it a second thought until we noticed that – suddenly – dozens of our five-star reviews were being removed from our page daily, until almost 200 had been filtered out over the course of just a few weeks. Plus, Yelp seemed to have a targeted strategy. Of the hundreds of our reviews that Yelp eliminated, they disproportionately targeted our five-star ones, effectively lowering my company’s overall rating almost a full star within a day.”
This is what Yelp co-founder Jeremy Stoppelman said about these types of allegations (they have been made many times around the world): “There has never been any amount of money you could pay us to manipulate reviews. We do have an algorithm that highlights the most useful and reliable reviews on our site which is about 75% of contributed content. I started Yelp to solve my own need of finding a great doctor, obviously we needed to protect consumers against fake reviews and spam to make sure the site is actually helpful.”
Outside of the ridiculous notion that Stoppelman started Yelp to "find a great doctor" ("Hey babe, I need to find a great doctor. I think I'll spend 17 months building a website and another 3 years populating said website. Keep your fingers crossed I don't die before I'm done!"), it's an interesting response because Stoppelman doesn't actually say Yelp doesn't treat listings differently based on their advertiser status. He says no amount of money would get Yelp to "manipulate reviews."
When Google returns paid advertisers at the top of the first page of search results, do we refer to those listings as having been "manipulated"? No. We don't expect non-paying customers to be listed in the paid content section of a Google search page, i.e., at the top. That's not "manipulating" anything. So why would Yelp recast a claim that it treats paying customers differently than non-paying customers (just as Google does, and Yelp does) as "manipulating reviews," when displaying paying customers first is standard practice in the industry? Stoppelman responded to an allegation that was never made (also known as a strawman). Hiding behind an algorithm is a strange defense, too. An algorithm can easily be written so that Establishment A gets only 200 of its 5-star reviews returned on a query because it is a non-advertiser, while Establishment B gets 500 returned because it is a paid advertiser. Algorithms don't grow in the wild; they're constructed by humans. Mr. Koosed is believable; Mr. Stoppelman, if you go by his defense, is not.
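To make that point concrete, here is a hypothetical review filter, a sketch and emphatically not Yelp's actual code: a perfectly ordinary, human-written rule that caps the five-star reviews shown for non-advertisers while leaving advertisers untouched, and that could still be described, truthfully, as an algorithm that decides which reviews to highlight.

```python
# A hypothetical review filter -- not Yelp's actual algorithm -- showing how
# a human-written rule can treat advertisers and non-advertisers differently
# without ever "editing" a single review.

from dataclasses import dataclass

@dataclass
class Review:
    stars: int
    text: str

def visible_reviews(reviews: list[Review], is_advertiser: bool,
                    five_star_cap_for_non_advertisers: int = 200) -> list[Review]:
    """Return the reviews a visitor would actually see."""
    if is_advertiser:
        return reviews                      # advertisers: show everything
    shown, five_star_shown = [], 0
    for r in reviews:
        if r.stars == 5:
            if five_star_shown >= five_star_cap_for_non_advertisers:
                continue                    # quietly "filter" the rest
            five_star_shown += 1
        shown.append(r)
    return shown

# Same 1,000 five-star reviews; only the advertiser flag differs.
reviews = [Review(5, "Great!") for _ in range(1000)]
print(len(visible_reviews(reviews, is_advertiser=True)))   # 1000
print(len(visible_reviews(reviews, is_advertiser=False)))  # 200
```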
That being said, why shouldn't Yelp elevate or promote paid advertisers on its website? Google does this billions of times a minute. No one on page 3 of a Google search is suing for not being in the paid advertiser area of the front page; it's understood that those who pay go to the top. That's how referral sites work. If Mr. Koosed thought he could showcase all the benefits of choosing his company over the competition and use Yelp to drive sales (which he was very successful at) at no cost, he was fooling himself. At some point you're either going to have to start paying to play (with dollars or data) or the ground will shift; these are not charities, after all. It's naïve for a business owner to think that some guy developed this platform that's getting him tons of customers simply because he altruistically wanted the world to find their "best doctor." Mr. Koosed probably wishes he had paid the modest advertising fee to keep his 5-star reviews intact, but he fell victim to the first rule of online anything: if you're not paying, you're the product.
Derek Brown, the founder of the Columbia and three other bars in Washington, DC said, “Getting mad about a bad Yelp review is like getting mad at people at an S&M convention for beating each other. The medium is designed to be a complaint factory.”
That may be true, but this does not bode well for an industry that claims to aim for unbiased reporting. As the guys at Epinions decided, when you make the overall business decision to monetize your reviews for profit, you had better be able to do so without violating the mantra of which physicians are well aware: Do No Harm. The guys at Epinions couldn’t, so they didn’t. Times have changed.
THE WERE SCALE
People are many times more likely (there is no scientific study determining how much more likely) to go online to denigrate a business than they are to write a positive review. That's simply human nature. If we paid for and received what we believe was promised to us, then our opinion is that we got what we paid for. Consumers don't go online and congratulate every business in their daily lives that gives them what they paid for. So the negative reviews will always be part of the landscape.
With that said, there are still thousands of physicians with low online review scores getting new patients (and crappy restaurants getting new diners), so it would seem that people have built in a sense of wariness about online reviews, or at the very least have begun to apply critical thinking along the lines of the Were Scale. The Were Scale is the measure you take when you read a review and its author misuses were/where (or their/they're, our/are, etc.), or gives off some other tell suggesting that this perhaps isn't someone with the faculties or attention to detail to be giving a thumbs up or thumbs down on items or services you're interested in. You might not take those reviews as seriously as you would one that is well written and thoughtful. Go to Bestbuy.com and shop for TVs and you'll quickly see it in action. The reviews at Best Buy run the gamut on the Were Scale, and you can easily see how some reviews can be immediately discounted and others elevated. (A close relative of the Were Scale is the CAPSLOCK SCALE, and it's a safe bet that you can ignore anyone who writes IN ALL CAPS.)
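Readers apply the Were Scale informally, but the same heuristics are easy enough to write down. Below is a toy credibility scorer with a handful of invented penalty rules (homophone confusions, all-caps shouting); it is an illustration of the idea, not a real quality or spam filter.

```python
# A toy "Were Scale" scorer: penalize the tells the article mentions
# (were/where-style confusions, ALL CAPS) and return a rough 0-1 credibility
# score. The rules and weights are invented for illustration only.

import re

HOMOPHONE_TELLS = [
    r"\bshould of\b", r"\bcould of\b", r"\bwould of\b",
    r"\byour welcome\b", r"\btheir is\b",
]

def were_scale(review: str) -> float:
    score = 1.0
    text = review.lower()
    for pattern in HOMOPHONE_TELLS:
        if re.search(pattern, text):
            score -= 0.2                      # each tell costs a fifth
    letters = [c for c in review if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.7:
        score -= 0.5                          # the CAPSLOCK SCALE
    return max(score, 0.0)

print(were_scale("Quiet room, friendly front desk, fair prices."))   # 1.0
print(were_scale("you should of seen the wait, WORST DOCTER EVER"))  # 0.8
print(were_scale("TERRIBLE!!! NEVER GOING BACK!!!"))                  # 0.5
```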
While judging people on their grammar and penmanship might seem unfair, short of knowing who a reviewer actually is (or, as with Amazon's Verified Purchasers, knowing at least that they bought the thing), it is the only way to determine heuristically whether a certain review or opinion has relevance to you. Remember the Zagat reviews? The authors of those reviews were nameless, faceless people you knew literally nothing about. You may very well have gone to a sushi place highly recommended in the Zagat guide because a grandmother from Albuquerque who has still never eaten sushi submitted a review that said "I love this place" after she had the french fries off the children's menu. At least we've evolved to the Were Scale, where we can try to gain a little insight into the state of mind of the reviewer. (Hopefully Ronnie from Hoboken used the Were Scale when he was sifting through all the reviews to include in Zagat, but we don't know Ronnie, so we don't know where he himself falls on the Were Scale!)
As a restaurant review tool Zagat was almost useless, but it was successful because it was all we had at the time. (Imagine a pre-internet Zagat guide that included who the reviewers were: their age, gender, occupation, likes, dislikes, and previous reviews. That would have put the original Zagat out of business overnight.)
So we know people give more credence to some reviews than to others. And even using the Were Scale, we are still all victims of our own biases. If a reviewer is from a town or city we don't appreciate, we might discount that review or elevate another from a city we do like. ("Those fancy-pants in New York City are so out of touch with reality.") Reviews of auto repair shops written by women might be discounted more (by both men and women) than those written by men. But those biases are not going away; they are intrinsic to how we make decisions in our daily lives, so they will keep being listened to, whether they distort our decision-making or not.
BETTER INFORMED?
Self-selection bias is what happens when the people supplying the data choose themselves, skewing the sample before any analysis even begins. That's typically what you get when you go to the online physician review websites: many of the folks on there ("the community") are there to complain. Really, all those websites should be called Physician Complaint Websites. If you went to something called PhysicianComplaints.com and saw complaints, you would think, "That seems about right." But you don't go to PhysicianComplaints.com; you go to Yelp. Or Healthgrades. Or WebMD. And so when you see those complaints you think, "Wow, this doctor has complaints." That's self-selection bias, and it's doing you no favors.
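Self-selection bias is easy to demonstrate with a small simulation. The sketch below assumes an invented patient population in which most visits go fine and the unhappy patients are far more likely to post; the numbers are made up, but the gap between true satisfaction and the posted average is the point.

```python
# A small simulation of self-selection bias. The population and posting
# probabilities are invented; the point is the gap between the true average
# and the average of what actually gets posted.

import random

random.seed(0)

# True experience of 10,000 patients on a 1-5 scale: mostly fine visits.
true_ratings = random.choices([1, 2, 3, 4, 5],
                              weights=[3, 5, 12, 40, 40], k=10_000)

# Who bothers to post? Assume angry patients post far more often.
POST_PROBABILITY = {1: 0.50, 2: 0.30, 3: 0.05, 4: 0.02, 5: 0.03}
posted = [r for r in true_ratings if random.random() < POST_PROBABILITY[r]]

true_avg = sum(true_ratings) / len(true_ratings)
posted_avg = sum(posted) / len(posted)
print(f"true average: {true_avg:.2f}, posted average: {posted_avg:.2f}")
# Roughly 4.1 in reality versus roughly 2.8 on the review page.
```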
A study by the Mayo Clinic found that physicians who had negative online reviews did not have significantly different Press Ganey Patient Satisfaction Survey numbers (PG PSS) than did physicians who had no negative online reviews. The report stated, “These findings are important, not only in disassociating formal PG PSS scores from online review comments but also in emphasizing that physicians need to be cognizant of their reputation both online and in-person. These data reveal no statistical difference in formal PG PSS scores between those physicians who have negative online reviews and those who do not. More importantly, there is a stark chasm between patient perception of the physician-related performance and non–physician-specific variables. More specifically, the adverse component of the negative online reviews is not necessarily related to physician-specific interactions with the patient, but rather to the non–physician-related complaints.”
In other words: (1) doctors with negative online reviews are not necessarily any worse than doctors without them, and (2) doctors are being dinged online for things they aren't in direct control of: front desk staff, parking, insurance, referrals, and more. This should make physicians feel a little better, but it won't. It won't because you've never heard of the Press Ganey Patient Satisfaction Survey, and Press Ganey doesn't have a cool website called Yelp! (they dropped the exclamation point sometime in 2012).
Physicians who practice in hospitals face an even steeper uphill battle, according to Dr. Tara Lagu, an academic hospitalist in the Center for Quality of Care Research and Department of Medicine at Baystate Medical Center in Massachusetts: "Emergency medicine doctors cannot make their waiting rooms less crowded. Hospitalists and surgeons find it frustrating that it is nearly impossible to change the workings of any large organization such as a hospital. This can be discouraging because an interaction that we feel like we can't control — a rude parking attendant or nasty receptionist — can ruin the whole visit for a patient." The next time you start to write a negative review of a physician, make sure it concerns something the doctor was in fact directly in control of. You will be surprised how often it turns out there were mitigating circumstances.
PUSHING BACK
Physicians are teaming up to try to get physician reviews removed from Yelp and other sites. In 2018 a group called Physicians Working Together, a non-partisan physicians' organization, organized a petition at Change.org to remove Yelp reviews of doctors. More than 33,000 physicians have signed the petition, but because these websites aren't actually in business to provide accurate information anyway (remember where that priority fits?), the petition will not change anything. It would also render Mr. Stoppelman's supposed initial goal of creating a place where he could find "a great doctor" a failure.
Writing at thehealthcareblog.com, Colorado physician Dr. Niran Al-Agba details a high-profile case in Germany in which a dermatologist won the right to have her profile removed from Jameda, an online doctor ratings website (they really need an exclamation point, and maybe italics: Jameda!), because she argued that the pure anonymity of the site inspired hateful speech against her. Dr. Al-Agba also notes that just a few years prior, a similar case was rejected because the court ruled that "patients have a right to be well informed."
Dr. Al-Agba writes:
“Patient advocates would argue rating sites for physicians improve transparency for consumers. Physicians would counter with the argument that a medical clinic is not like a restaurant, hair salon, or shopping mall. We engage in a highly personal way with the public that is quite different from sitting down for a meal. The larger concern is whether or not Yelp.com patrons are actually “well-informed” by reading online physician reviews.
After a little research, it appears the answer is no. I used a local medical community as an example. The reviews overall are not very good; on average the medical clinics are 3.0/5.0 stars. Some reviews extol on physical appearance of the physician, be they female or male. One reviewer discusses being offended by seeing a transgender physician, an element which has little to do with the provision of medical services.”
A study by the Hospital for Special Surgery (HSS) reported issues with the ratings at Healthgrades.com, RateMDS.com and Vitals.com. “Although it is debatable whether these websites in their current form truly capture patient satisfaction and objectively evaluate the delivery of care, they represent a potential tool for both payors and healthcare systems to gauge how surgeons are assessed by their patients,” Dr. Anil Ranawat, a senior investigator and sports medicine surgeon at HSS, said in a statement.
"Historically, three key qualities — affability, availability, and ability, known as the 'three A's' — have been suggested to promote a successful surgical career and favorable interactions with patients." The report says the three A's are at least as relevant today as they were pre-internet, if not more so.
One way physicians can stem negative reviews before they happen is to ensure the patient fills out a survey before leaving the office. That way, any issues can be addressed immediately, before the patient has time to talk to others and/or get riled up. Unfortunately, for many practices, time, budget, and logistical constraints make this infeasible. If you're a patient and you feel something is not right, you'll have a much better chance at resolution if you bring it up with the office as soon as you learn of it and give the practice the opportunity to remedy it. Unfortunately, however, many people don't want a solution; they just want to vent.
A PICKLE
Physicians are in a tricky place. They can't stack the deck with positive reviews, because that looks smarmy in its own right. They can't be exempt from online criticism. And they have to weigh how much of a flame war they want to get into with unjustified, malicious, or simply untruthful reviewers.
At the end of the day (and this may leave physicians feeling better, or it may not), the solution does not rest with the physician; it rests with the consumer. Consumers need to become more aware that the platforms they rely on for physician information are for-profit businesses, in business specifically to monetize the content that consumers add to the website. If the physician has no listing, they can't get bad reviews (or, conversely, get new customers through the portal), and Yelp can't sell them advertising. If businesses can't be sold advertising, the whole project goes away. And that includes companies like Foursquare, which are now exclusively data-tech sales companies and not consumer review companies.
In the mid-aughts there was an amusing TV commercial for the upmarket executive search website The Ladders that depicted what happens when 'everyone is an expert,' or, as their tagline put it, "When you let everybody play, nobody wins." It showed two white-clad professional tennis players about to start a match when suddenly a shabbily dressed man from the stands jumps onto the court with a racket, literally runs over one of the professionals, and starts batting balls everywhere. He is joined by dozens and then hundreds more as the court becomes a comedy of bodies and balls, with nothing getting accomplished. Asking people for their opinions about something they are not qualified to know much about is exactly what review sites are all about. And while that's just fine when the discussion is about sushi, it becomes problematic when we're talking about serious issues like healthcare.
"Third-party independent physician review websites do virtually nothing to ensure that reviews are from the people who have actually seen the doctor," says Laura Mikulski, Vice President of Business Development & Physician Relations at Physician Referral Marketing. This is very true. Just as Facebook doesn't really care what ends up in its feed (because, again, Facebook is not a news organization; its job is to attract your attention 24/7 and monetize your data), the online physician review sites' job isn't to verify that the information is correct; it is to monetize that information and data. Once people realize that what these companies want is accurate information about them, not accurate information for the people and businesses on their websites, things may begin to change. But if Facebook is any harbinger of things to come, it will be a long and slow road, and will very likely take government intervention.
So are physician review websites hurting or helping doctors and patients? Based upon all the research, it's safe to assume that if online reviews are the sole basis for making a decision on care, they are not particularly useful, or at least don't provide an accurate indicator of expected care, as the PG PSS comparison has shown. The good news for physicians is that consumers are becoming more wary of negative commentary online, as eventually everyone, in all walks of life, will have had someone denigrate them online (social media being the number one culprit). So many industries are now being reviewed and crowdsourced (Angie's List, HomeLight, A Place for Mom) that it is almost impossible not to have been to a review/referral website, and the more consumers interact with these sites, the more they will learn to expertly interpret the reviews. Think of it as artificial intelligence: the more it learns, the smarter it gets, and the smarter it gets, the more it learns.
As with anything else in the world, low-information consumers (those who rely heavily on online reviews alone) will make poorer decisions than those who seek a broad spectrum of impressions from third-party sources coupled with their own research and instincts. In other words, dumb consumers will be negatively influenced and smart ones will not. Don't be a dumb consumer. Do your research, get input from across the board, and make an informed and educated decision.
After all, it’s only your health.