The Experts Behind the Record

Reviewers

The scholars and researchers who assess whether AI-generated scholarship meets the standard — building the only systematic public record of AI capability in academic writing. No registration. No deadlines. Just your expertise.

8 Reviews Published
30+ Disciplines
~25 min Per Review
What Reviewers Found

This Is What Expert Review Reveals

AI articles look authoritative. Expert reviewers show exactly where that confidence breaks down. These are real findings from published reviews on this platform.

Civil and Environmental Engineering  ·  claude-sonnet-4-5-20250929

“In the introduction, the manuscript mentions that errors of more than 50% are not uncommon, which I don't believe is accurate for most of the equations in the literature. A statement like this definitely requires references to support it, which the manuscript lacks.”

From review of:
📄 → Read the full article & review
Civil and Environmental Engineering  ·  GPT-5.1

“The manuscript identifies some key references relevant to the topic, such as ASCE 61, but it misses several important references with test results.”

From review of:
📄 → Read the full article & review
Civil and Environmental Engineering  ·  GPT-5.1

“The material presented in the "Proposed Novel Insights and Future Directions" section has already been extensively discussed and developed in numerous studies over the past decade, diminishing the originality of this section.”

From review of:
📄 → Read the full article & review
Education, Demography & Human Geography  ·  GPT-5.1

“I have two observations about the use of sources. First, citations are consistently paraphrased, with almost no direct quotations, even in places where a succinct definition of a key concept might benefit from being quoted verbatim. Second, the citation pattern is highly uniform: sources are almost always cited at the end of sentences or paragraphs. In other words, the text predominantly employs an information-prominent pattern rather than an author-prominent pattern, or a mix of both.”

From review of:
📄 → Read the full article & review

Many reviewers choose to remain anonymous — and we fully respect that. Findings above are drawn from published reviews. Named profiles appear below as reviewers opt into public attribution.

The Process

How Reviewing Works

No sign-up. No deadlines. Browse the article archive, find your area of expertise, and submit your evaluation.

Step 01

Read

Browse the article archive, filter by discipline, and choose an unreviewed article in your area of expertise.

Step 02

Evaluate

Assess the article for accuracy, reasoning, citation integrity, and methodology. Flag hallucinations, errors, and gaps.

Step 03

Document

Submit your review via our .edu-verified form. Choose to publish under your name or anonymously. It goes live alongside the article.

About 20–30 minutes

Review as many or as few articles as you like. No quotas, no commitments.

Read the guidelines first

Reviewer Guidelines →

Why Contribute

What Reviewers Gain

Reviewing AI-generated research is a genuinely novel scholarly exercise — and one that matters beyond this platform.

01

Pioneer Status

You are not just checking an AI’s work. You are contributing to the only systematic public record of whether AI can do genuine scholarship — and your evaluation is a permanent part of that record.

02

A Published Contribution

Every review is published alongside the article — attributed by name or anonymously, your choice. A permanent, citable record of your expert evaluation for your CV and portfolio.

03

Forensic AI Analysis

This is closer to forensic analysis than traditional peer review — hunting hallucinations, fabricated citations, and confident-sounding errors. Many reviewers find it genuinely engaging.

04

Real Data for a Historic Question

Your findings become part of a permanent, accumulating dataset tracking whether AI is approaching the threshold of genuine scholarly contribution. The better AI gets, the more significant the early record becomes.

Reviewer Recognition

Your Work Gets Credited

Every reviewer chooses their own attribution. Anonymous or named — both options are permanent, public, and linked directly to the articles you evaluated.

Anonymous Reviewer

Engineering, Computing & Technology

1 review published

Anonymous Reviewer

Engineering, Computing & Technology

1 review published

Anonymous Reviewer

Engineering, Computing & Technology

1 review published

Anonymous Reviewer

Fundamental Sciences

1 review published

Your Name Here

Your Discipline

Opt into attribution
Anonymous option

Your review is published and linked to the article. Your identity stays private — permanently.

Named option

Your name appears on this page and on every article you reviewed. Permanently citable.

Start Today

Your Expertise Belongs
in This Record

If you hold a .edu email address and have domain expertise in any academic field, you can evaluate an article today.

We sincerely thank our reviewers for their time, insight, and invaluable contributions to advancing trustworthy AI-assisted scholarship.