The Prestige Paradox: Circular Citation Cartels & Reviewer Mills Exposed | ResearchFace

The Prestige Paradox

Inside the Circular Citation Economy and the Rise of “Reviewer Mills”

How self-citation cartels and coerced peer review are hijacking the metrics that decide careers, funding, and institutional prestige.

🔎 Forensic Feature — Citation Integrity

In the high-stakes world of academic publishing, a “Seal of Approval” from Clarivate’s Journal Citation Reports is the ultimate currency. For a researcher, it is a ticket to tenure; for a publisher, it is a licence to scale. But the metrics we trust are being hijacked by sophisticated internal networks.

As the boundaries of “prestige” are pushed by high-volume Open Access models, a landmark investigation has forced the industry to confront an uncomfortable reality. Originally sparked by the forensic work of Professor M. Ángeles Oviedo-García and subsequently updated to reflect evolving editorial corrections, the case of the Multidisciplinary Digital Publishing Institute (MDPI) serves as a masterclass in how modern publishing “whitelists” can be gamed.

The Indexing Shield and the “Replacement” Record

The journey of this investigation itself mirrors the complexity of research integrity. The original 2021 study by Oviedo-García, which analysed 53 MDPI journals, underwent a significant formal correction in 2023. The record was retracted and replaced to refine the conclusions drawn from the data, ensuring the critique remained grounded in verifiable, cited sources rather than broad generalisations.

This “corrected” lens reveals a persistent structural vulnerability: MDPI journals sit on elite whitelists, yet they exhibit citation patterns that deviate sharply from industry leaders. This creates a “Prestige Paradox” where a journal’s formal rank may no longer reflect its actual adherence to traditional editorial rigour.

The Smoking Gun: Self-Citation and the “Reviewer Mill”

The core of the forensic audit focuses on the S-Rate (Self-Citation Risk). The data shows that MDPI’s self-citation rates are not only significantly higher than the top journals in their categories but are also heavily fuelled by an internal “Citation Cartel.”
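In rough terms, an article-level S-Rate can be computed as the share of a paper's outgoing references that point back to the citing journal's own publisher. The sketch below is a minimal illustration of that idea; the function name, toy data, and publisher labels are hypothetical and do not reproduce the study's actual methodology.

```python
def s_rate(cited_publishers: list[str], own_publisher: str) -> float:
    """Fraction of outgoing references that cite journals from the
    same publisher -- a simple proxy for an article's S-Rate."""
    if not cited_publishers:
        return 0.0
    internal = sum(1 for p in cited_publishers if p == own_publisher)
    return internal / len(cited_publishers)

# Toy example: 6 of 10 references point back to the same publisher.
refs = ["MDPI"] * 6 + ["Elsevier", "Springer", "Wiley", "IEEE"]
print(f"S-Rate: {s_rate(refs, 'MDPI'):.0%}")  # → 60%
```

Aggregating this ratio across a journal's output, and comparing it against category leaders, is what turns a single number into a risk signal.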

However, the “smoking gun” moved from theory to fact in early 2024. Following a volunteer-led investigation by Predatory Reports, MDPI confirmed that a “Reviewer Mill” had been operating within its own system. This was not just a matter of authors citing themselves; it was a coordinated effort by peer reviewers to force citations into the record.

The Anatomy of a Reviewer Mill

  • Templated Reports: Investigators identified reviewers using “copy-paste” peer-review templates to speed up the process.
  • Coerced Citations: Reviewers were caught stating that authors “should cite recently published articles” and then providing DOIs for their own work or other MDPI papers.
  • The Scale: The initial breach affected 84 published papers across 23 different journals.

This admission by MDPI confirms the “Behavioural Lens” we use at ResearchFace: when a system prioritises speed and volume, it creates an environment where reviewers can treat peer review as a personal “citation mint.”
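The coercion pattern described above is, in principle, machine-detectable: cross-reference the DOIs a reviewer asks authors to add against the reviewer's own publication record. The sketch below is a simplified illustration of that check; the function name and all DOIs are hypothetical, and a real screening pipeline would also need to resolve co-authors and self-archived variants.

```python
def flag_coerced_citations(suggested_dois: set[str],
                           reviewer_dois: set[str]) -> set[str]:
    """Return DOIs the reviewer suggested during peer review that also
    appear in the reviewer's own publication record -- a coercion flag."""
    return suggested_dois & reviewer_dois

# Hypothetical review report: three suggested DOIs, two of which
# belong to the reviewer's own back catalogue.
suggested = {"10.3390/xxxx1", "10.1000/other1", "10.3390/xxxx2"}
own = {"10.3390/xxxx1", "10.3390/xxxx2", "10.3390/xxxx3"}
print(sorted(flag_coerced_citations(suggested, own)))
```

A match is not proof of misconduct on its own, but repeated matches across many reports by the same reviewer are exactly the "templated" signature the investigators describe.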

Peer Review at the Speed of Light

One of the primary drivers of this vulnerability is speed. MDPI’s value proposition to researchers is turnaround: a first decision is often rendered in under 30 days. While this looks like efficiency, the discovery of “Reviewer Mills” suggests it may actually be a “Speed Trap.”

When peer review is compressed into days, the “gatekeeper” function breaks down. In the case uncovered by Oviedo-García, a “dubious review report” published alongside a clinical article became the thread that unravelled the entire network. It revealed reviewers who were more interested in boosting their own H-indices than in verifying the science.

The Publisher’s Response: Transparency or Damage Control?

In a bid to maintain its standing, MDPI has initiated several corrective measures:

  • Post-Publication Audits: Initiating a second round of peer review for 30 of the most affected papers.
  • Reviewer Sanctions: Identifying and contacting ten specific reviewers responsible for the “cartel” behaviour.
  • Citation Cleansing: Undertaking a thorough review of 47 articles to strip out irrelevant or coerced citations.

While MDPI’s willingness to investigate its own flaws is a step toward transparency, the incident underscores the core philosophy of the RI² Index: we cannot rely on a publisher’s “Internal Audit” alone. We need independent, institutional-level metrics that flag these anomalies before they infect the global record.

Institutional Accountability and the RI² Lens

The “MDPI Case” is no longer just about one publisher; it is about the S-Rate (Self-Citation Inflation) and the D-Rate (Delisted Journal Exposure). As major databases like Web of Science begin to delist high-impact MDPI titles (such as IJERPH), universities are realising the high cost of ignoring integrity risks.

  • 🟡 The Amber Warning
    High S-Rates and “Reviewer Mill” activity should have placed these journals on an “Amber Watch List” years ago.
  • 🔴 The Red Flag
    The delisting of “Tier 1” MDPI journals confirms that even the most prestigious Open Access titles can fall into the “Red Flag” category overnight.
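A watch-list scheme like this can be reduced to a simple decision rule: hard events (delisting, a confirmed reviewer mill) trigger a red flag, while a sustained anomalous S-Rate triggers amber. The sketch below illustrates that rule only; the threshold and tier names are assumptions for the example, not the published RI² methodology.

```python
def integrity_tier(s_rate: float, delisted: bool,
                   reviewer_mill_confirmed: bool) -> str:
    """Map simple risk signals to a watch-list tier.
    The 0.30 S-Rate threshold is illustrative, not an RI2 parameter."""
    if delisted or reviewer_mill_confirmed:
        return "RED"
    if s_rate > 0.30:  # assumed amber threshold
        return "AMBER"
    return "GREEN"

# A journal with an inflated S-Rate but no hard event sits on amber watch.
print(integrity_tier(s_rate=0.42, delisted=False,
                     reviewer_mill_confirmed=False))  # → AMBER
```

The point of the ordering is that amber is predictive while red is reactive: a journal should accumulate amber scrutiny long before a delisting forces the issue.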

Conclusion: A New Standard for “Prestige”

The Oviedo-García investigation and the subsequent “Reviewer Mill” scandal prove that the academic community can no longer outsource its ethics to a static list of indexed journals.

The “Reviewer Mill” incident is a wake-up call. It shows that the “Human Lens” — the psychological and professional incentives of reviewers and editors — is just as important as the data itself. At ResearchFace, we will continue to use the RI² Index to monitor these structural vulnerabilities, ensuring that “Prestige” is something earned through rigour, not manufactured through a mill.

Editor’s Note: This article has been updated to reflect the formal correction of the Oviedo-García (2021) study and the March 2024 MDPI “Reviewer Mill” investigation results. ResearchFace remains committed to the most accurate and corrected version of the scientific record.

📚 Sources

[1] Oviedo-García, M. Á. (2021). Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation, 30(3), 405–419. https://doi.org/10.1093/reseval/rvab020

[2] Oviedo-García, M. Á. (2023). Correction to: Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation, 32(2), 543. https://doi.org/10.1093/reseval/rvad014

[3] Slone, C. (2024, March 6). Addressing reviewer misconduct in scholarly publishing: MDPI’s proactive steps. Editors’ Café. https://editorscafe.org/details.php?id=1

Next in the Investigative Pipeline…

👥 “Custom-Baked Papers”

They hold real PhDs, publish under their own names in legitimate journals, and moonlight as ghost-authors for the bespoke fraud market. This profile-driven investigation examines the labour force behind custom-baked research—underemployed academics, freelance statisticians, and specialist editors who have turned scientific writing into a gig economy. The question is no longer who is buying. It is who is building.
