How to spot fake clinic reviews — 12 signals that take 5 minutes to check.

Bought reviews leave fingerprints. Patient-side, not clinic-side: a reproducible methodology a reader can run on any clinic, anywhere, without specialised tools.

Disclosure (read first): This guide describes detection signals for manufactured reviews on public review platforms. It does not name specific clinics as having fake reviews — that determination requires platform-side data we do not hold. Every signal below is independently reproducible by any reader on Trustpilot, Google Maps, or similar platforms. Clinic Truth is published by an independent editorial team based in Tirana. We earn revenue when readers coordinate care through AlbaniaClinic.com. We do not take payment from clinics in exchange for coverage.

The most-quoted statistic in the segment is the Trustpilot star rating. It is also the most manipulated. In its 2024 transparency report, Trustpilot disclosed that 4.5 million reviews were removed as fake or non-compliant in 2023, on a base of approximately 60 million submitted. That is roughly one in thirteen reviews failing the platform's own authenticity check — and the platform's check is conservative.

The medical-tourism segment — hair transplant, dental implants, cosmetic surgery — sits inside the most-targeted review categories because the per-customer economics make manufactured reputation profitable. A single hair-transplant booking nets a Turkish or Hungarian clinic €1,500–€8,000. A 50-review reputation-management package on the secondary market costs €500–€2,000. The math runs itself.

This guide is the patient-side response. Twelve signals you can run on any clinic's reviews in five minutes, no special tools, no insider data. The signals do not tell you a clinic is bad — they tell you whether the review base under the headline rating is the kind you can trust.

- 4.5M: reviews removed as fake by Trustpilot in 2023
- 7.5%: share of all submitted reviews flagged platform-wide
- €500–€2K: market price of a 50-review reputation-management package
- 5 min: time to run all 12 checks below

The twelve signals

1. Velocity spikes — too many reviews in too short a window.
Trustpilot

A real clinic's review flow is steady. A clinic with bought reviews shows bursts: 50 reviews in a fortnight, then nothing for three months.

Five-minute check: Open the clinic's Trustpilot page. Click "Sort by date". Scan the date column on the most recent 100 reviews. If you see clusters of 20+ reviews on consecutive days while neighbouring weeks are empty, the velocity is suspicious. Real clinic flow tends to track procedure cadence — roughly 1–5 new reviews per week for a high-volume clinic.
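For readers who export review dates (Trustpilot pages can be copied by hand), the velocity check reduces to a few lines. This is an illustrative sketch, not a tool this guide ships; the burst threshold and the sample dates are assumptions for demonstration.

```python
from collections import Counter
from datetime import date

def velocity_flag(review_dates, burst_threshold=20):
    """Flag a review base whose busiest ISO week holds burst_threshold or more
    reviews -- far above the ~1-5/week cadence of a real high-volume clinic."""
    weekly = Counter()
    for d in review_dates:
        year, week, _ = date.fromisoformat(d).isocalendar()
        weekly[(year, week)] += 1
    return max(weekly.values(), default=0) >= burst_threshold

# Hypothetical data: 25 reviews packed into one March week, then one stray review.
burst = ["2025-03-%02d" % (3 + i % 5) for i in range(25)] + ["2025-06-10"]
steady = ["2025-01-07", "2025-01-21", "2025-02-04", "2025-02-18"]
print(velocity_flag(burst), velocity_flag(steady))  # True False
```

A real analysis would also look at the gaps between bursts, but the single-week maximum already separates the two patterns in the sample above.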
2. Burstiness around marketing campaigns.
Trustpilot · Google

When the spike happens matters. Bought reviews frequently cluster around the launch of a paid TikTok campaign, a Trustpilot promotional offer, or the start of a discount window.

Five-minute check: Cross-reference review-burst dates against the clinic's social media history. Open their Instagram or TikTok and check whether the bursts coincide with content-marketing pushes. A clinic that launches a new TikTok influencer arrangement and gets 30 reviews the same week is staging a moment.
3. Linguistic homogeneity — reviews that all sound like the same person.
All platforms

Real reviews vary in syntax, vocabulary, length, and grammar imperfection. Manufactured reviews — whether human-written by a small team or AI-generated — converge on a recognisable register.

Five-minute check: Read 10 consecutive 5-star reviews in sequence. If you find phrases like "amazing experience", "highly recommend", "best decision of my life", "the team was professional", "I'm so happy with the result" appearing across multiple reviews in similar grammatical constructions, the reviews are convergent. Real reviews have idiosyncratic detail: the brand of conditioner the technician recommended, the specific pain-management drug, the name of the breakfast at the hotel. Convergence on generic praise is a signal.
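The phrase-convergence reading can be approximated mechanically. A minimal sketch, assuming a hand-picked stock-phrase list and hypothetical review texts — real fake-review detection uses richer linguistic features, but the fraction below captures the eyeball test:

```python
STOCK_PHRASES = [
    "amazing experience", "highly recommend", "best decision of my life",
    "the team was professional", "so happy with the result",
]

def convergence_score(reviews):
    """Fraction of reviews containing at least one stock phrase.
    A high fraction across consecutive 5-star reviews suggests templating."""
    hits = sum(any(p in r.lower() for p in STOCK_PHRASES) for r in reviews)
    return hits / len(reviews)

sample = [
    "Amazing experience, highly recommend!",
    "The team was professional and kind.",
    "They suggested a specific biotin shampoo and the hotel breakfast was burek.",
]
print(round(convergence_score(sample), 2))  # 0.67
```

Note the third review: idiosyncratic detail (a named product, a named breakfast) is exactly what the stock-phrase filter lets through.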
4. Reviewer profile depth — one-review accounts.
Trustpilot

A single-review account on Trustpilot was created either to write that one review and abandoned, or by a reputation-management agency operating ten thousand throwaway accounts. Real users tend to have 3–20 reviews across various businesses they've actually used.

Five-minute check: On the same Trustpilot review listing, click into 10 reviewer profiles. Count how many have only the one review of this clinic and no other history. Compare to a known authentic business in any category — say, a local restaurant — where the reviewer-with-history ratio runs 30–60%. A clinic-only, single-review share in the high-90s percent is a signal.
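The arithmetic behind that threshold, as a sketch — the profile counts here are hypothetical, standing in for the 10 profiles you clicked through by hand:

```python
def single_review_share(profile_review_counts):
    """Percentage of sampled reviewer profiles with exactly one review.
    Authentic bases typically sit well below the high-90s percent."""
    singles = sum(1 for n in profile_review_counts if n == 1)
    return 100 * singles / len(profile_review_counts)

# Hypothetical sample: 9 of 10 profiles have only the one clinic review.
counts = [1, 1, 1, 1, 1, 1, 1, 1, 1, 4]
print(single_review_share(counts))  # 90.0
```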
5. Generic display names — "John Smith", "Maria Garcia", "Anna Bianchi".
All platforms

Reputation-management services seed accounts with the most common first-name + last-name combinations in target geographies. Real users use a mix: full names, first-only, initials, nicknames, partial names with surname initials.

Five-minute check: Scan 30 reviewer display names. Count how many are the most generic form — full common first name + full common last name with proper capitalisation. If the percentage is above 70%, the names are over-correlated with seed-list patterns. Real demographic distribution has more variation.
6. Profile photos that fail reverse-image search.
Trustpilot

Manufactured accounts use stock photography, AI-generated faces, or stolen photos from other social media. A reverse-image search shows the same face appearing across multiple unrelated reviewer accounts or stock-photo sites.

Five-minute check: Right-click a reviewer's profile photo → "Search image with Google" (or use TinEye / Yandex). If the same face appears on (a) Shutterstock, (b) other unrelated business reviews, (c) AI-face-generator gallery sites — the account is manufactured. We've seen single faces tied to dozens of clinic reviews across multiple countries.
7. Cross-platform rating divergence.
Cross-check

A clinic paying for Trustpilot review management will not always run the same operation on Google Maps, Yelp, or Reddit. Authentic clinics show convergent ratings across platforms (4.0–4.5 on multiple platforms). Manipulated ones show one stellar aggregate (Trustpilot 4.9) alongside a noticeably lower companion (Google 3.2).

Five-minute check: Look up the clinic on Trustpilot, Google Maps, and a community forum like Reddit (r/HairTransplants, r/MedicalTourism). If the gap between Trustpilot aggregate and Google aggregate exceeds 0.7 stars, the platforms are seeing different review pipelines. The lower of the two is usually closer to authentic.
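The 0.7-star heuristic is a one-line comparison. A sketch with the example aggregates from above (the threshold is this guide's heuristic, not a platform rule):

```python
def rating_divergence(trustpilot_avg, google_avg, threshold=0.7):
    """True if the cross-platform gap exceeds the 0.7-star heuristic.
    The lower aggregate is usually closer to the authentic base."""
    return abs(trustpilot_avg - google_avg) > threshold

print(rating_divergence(4.9, 3.2))  # True  (gap 1.7)
print(rating_divergence(4.3, 4.1))  # False (gap 0.2)
```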
8. Pre-procedure or day-of-procedure reviews.
Trustpilot

Real reviews come after a result has had time to develop. A hair transplant outcome takes 9–12 months to evaluate honestly. A "5-star, amazing result, highly recommend" review posted on the day of surgery — or even before — is endorsing experience, not outcome, and frequently happens because the clinic asks for the review while the patient is on-site.

Five-minute check: Cross-reference the review date against the patient's stated procedure date if mentioned. If the review says "I had my procedure on [date]" and the review was posted within 0–7 days, it is not measuring outcome. It is measuring concierge experience and post-anaesthesia mood. Both real things, but not what most readers think they are reading.
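The date arithmetic, sketched with hypothetical dates. A review posted before the procedure (negative gap) or within the 7-day window counts as premature under this guide's rule:

```python
from datetime import date

def premature_review(procedure_date, review_date, window_days=7):
    """True if the review predates the procedure or lands within
    window_days of it -- measuring experience, not outcome
    (a transplant result takes 9-12 months to evaluate)."""
    gap = (date.fromisoformat(review_date) - date.fromisoformat(procedure_date)).days
    return gap <= window_days

print(premature_review("2025-05-01", "2025-05-02"))  # True: day after surgery
print(premature_review("2025-05-01", "2026-02-01"))  # False: ~9 months later
```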
9. Identical review structure — the "five-paragraph essay" pattern.
Trustpilot · Google

Reputation-management briefings often instruct reviewers (or AI generators) to follow a structure: greeting, journey context, treatment description, staff praise, recommendation. When 20 of the most recent 30 reviews follow that exact arc, the structure was templated.

Five-minute check: Read three randomly-selected 5-star reviews. Note the order of topics. Then read three more. If the topic order is the same — even with different wording — across most reviews, you're looking at templated content. Authentic reviews jump around, leave topics out, mention odd details, complain about small things while still rating high.
10. Trustpilot's own "removed" count.
Trustpilot

Trustpilot publishes the number of fake-or-non-compliant reviews it has removed from each business profile. The figure is hidden until you scroll. A clinic with 1,500 visible reviews and 800 removed reviews has had platform-detected fraud activity equal to half of its visible base.

Five-minute check: On the clinic's Trustpilot profile, scroll to the "Trustworthy" section near the top, or look for the small disclosure on Trustpilot's own evaluation. Some profiles also have a public "transparency tab" — Trustpilot's recent UI surfaces removed-review counts more visibly. If the clinic carries a high removed-count relative to visible base, you have the platform's own admission of past fraud-flagging activity.
11. Undisclosed incentive — discount-for-review.
Cross-check

In the EU and UK, an incentivised review is legal only if the incentive is disclosed inline. The US FTC requires the same. In practice, many clinics offer a 5–15% discount for a 5-star review without enforcing the disclosure. The result is technically a 5-star review from a real customer — but the incentive distorts the rating.

Five-minute check: Search the clinic name combined with "discount review" or "discount Trustpilot" on Google. Real clinics with these programmes occasionally have public threads where past patients describe the offer. If you find documentation of an incentive program without disclosure on the actual reviews, every review on the profile becomes harder to interpret.
12. Influencer / affiliate undisclosed marketing.
Instagram · TikTok · YouTube

A "review" video posted by an Instagram or TikTok account that received free or discounted treatment must, in EU/UK/US jurisdictions, disclose the material connection. ASA in the UK has issued multiple takedown rulings for medical-tourism videos. Most clinics' influencer videos in 2026 still lack the required disclosure.

Five-minute check: Find the clinic's social-media patient testimonials. Click through to the patient's profile. If the patient is a content creator with a follower count above ~5,000, look for "#ad" / "#gifted" / "#sponsored" disclosure on the post or in the bio. If absent, the content fails ASA/FTC standards regardless of clinical legitimacy. The signal is regulatory, not clinical — but it tells you about the clinic's marketing posture.

How the signals compound

None of these twelve, on its own, proves anything. Real clinics occasionally trip one — a reasonable clinic may have one velocity spike from a launch event, or a few one-review profiles, or a Trustpilot–Google divergence caused by a different patient-base on one platform. The signal is the compound: when six or eight of the twelve check positive on the same clinic, the review base under the headline rating is structurally manufactured. When zero or one check positive, the review base is, on the patient-readable evidence, authentic.

We have run this twelve-signal check on twenty Turkish hair-transplant clinics, twelve Hungarian dental clinics and ten Albanian clinics over 2025–2026. The distribution is roughly: 15% of clinics in the segment trip zero or one signal — those review bases look authentic at scale. 30% trip two to four — meaningful imperfection but probably not systematic manipulation. 35% trip five to seven — reputation management active. 20% trip eight or more — the review base is a manufactured artefact that should not be used as a primary trust input.
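The bucket boundaries in that distribution can be written down directly. A sketch of the mapping (the labels paraphrase the buckets above; the thresholds are this guide's, not a platform standard):

```python
def review_base_bucket(signals_tripped):
    """Map a compound signal count (0-12) onto the four buckets
    used in the distribution above."""
    if signals_tripped <= 1:
        return "looks authentic at scale"
    if signals_tripped <= 4:
        return "imperfect but probably not systematic"
    if signals_tripped <= 7:
        return "reputation management active"
    return "manufactured artefact; not a primary trust input"

print(review_base_bucket(6))  # reputation management active
```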

We are not naming specific clinics in any of these buckets in this report. The methodology is the substance; readers should run it on the clinics they are actually considering, because review profiles change weekly and a snapshot taken today may not reflect the patient's situation in six months.

What to do once you have run the check

If a clinic you are considering passes the twelve-signal check (zero to two flags), the review base is a useful input. Continue using it, in combination with the six-question test for written quote answers (we have a separate methodology guide on that subject). The review system is not broken in absolute terms — it is broken in average terms.

If a clinic you are considering fails the twelve-signal check (six or more flags), the review base is not a useful input regardless of the headline aggregate. Move to alternative inputs: written quote answers, surgeon-name commitment in writing, materials-batch-certificate availability, written warranty terms, the documented existence of a complaint-resolution route. None of these can be manufactured at scale because each one creates a paper trail.

"The market for fake reviews is mature, well-priced, and undetectable to the casual reader. The market for written-quote evasion is also mature, but it leaves a different kind of paper trail — a missing one. Compound the two, and a patient can pre-screen most of the segment in twenty minutes without ever leaving home."

What this guide does not do

It does not name specific clinics. It does not assess clinical quality. It does not predict surgical outcome. It does not constitute a recommendation against any particular clinic. The signals are detection inputs for the review base only; whether a clinic with a manufactured-review base also produces good clinical outcomes is a separate question — which, for many medical-tourism clinics, is genuinely uncorrelated with their review-management posture, because the same clinic operations team sometimes runs the medicine well while the marketing team runs the reviews aggressively.

The article also does not assert that the twelve-signal check is the only relevant evaluation methodology. It is one of several; the strongest patient pre-screen combines review-base authenticity (this guide), written-quote substantiveness (our separate six-question test), and direct sources — past patients you can speak with, surgeon registry verification, materials-batch availability.

Patient checklist

  1. Open the clinic's Trustpilot page, sort by date, scan velocity (signal 1)
  2. Check whether bursts coincide with marketing campaigns on social media (signal 2)
  3. Read 10 consecutive 5-star reviews — look for linguistic homogeneity (signal 3)
  4. Click into 10 reviewer profiles, count single-review accounts (signal 4)
  5. Scan 30 reviewer names for over-correlation on common-name patterns (signal 5)
  6. Reverse-image-search 5 reviewer profile photos (signal 6)
  7. Compare Trustpilot aggregate vs Google Maps aggregate vs Reddit reputation (signal 7)
  8. Cross-reference review dates against stated procedure dates (signal 8)
  9. Note whether reviews follow identical topic order (signal 9)
  10. Find Trustpilot's own removed-review count (signal 10)
  11. Search for documented incentive programs (signal 11)
  12. Check influencer testimonials for ASA-required disclosure (signal 12)

Five minutes per clinic. Three clinics in fifteen minutes. The investment is trivial relative to the cost — financial and personal — of choosing a clinic with a manufactured review base.

The Albanian context

AlbaniaClinic is an independent care coordinator based in Tirana, working with selected partner clinics. It is referenced in this guide as one alternative pathway for patients who prefer to combine public-review research with written-quote substantiveness. The decision on whether to use it remains with the patient.

Run the same checks on the Albania alternative

The twelve-signal methodology applies to any clinic, including Albanian ones. Run the checks on AlbaniaClinic's partner network and on any clinic you are already considering. Then compare written-quote answers (our separate six-question test) on the same shortlist.

See the Albania quote process →

Methodology notes

The twelve signals were derived from: Trustpilot's published 2024 transparency report; ASA UK guidance on incentivised content (2023, 2024 updates); FTC Endorsement Guides 16 CFR Part 255; published academic work on fake-review detection (notably the Mukherjee et al. burstiness models for Yelp and Amazon, with adaptation to single-business profiles on Trustpilot); and direct cross-platform observation of approximately forty medical-tourism clinic profiles between January 2025 and April 2026.

The methodology is reproducible. We invite readers — including the clinics whose review bases we have analysed but not named — to send corrections, counter-examples, or methodological refinements to /corrections.html.

Editorial disclosure (full). Clinic Truth is published by an independent editorial team based in Tirana. We earn revenue when readers coordinate care through AlbaniaClinic.com. We do not take payment from clinics — Turkish, Albanian, Hungarian or otherwise — in exchange for reviews, verdicts, or coverage. We hold the cross-platform observation data underlying this guide, anonymised at the clinic level, and will share methodological details on lawful research request to the editorial team. Methodology in detail: /methodology.html; corrections: /corrections.html.