peer-review
by K-Dense-AI

The peer-review skill helps you write formal, evidence-based manuscript and grant reviews. Use it to assess methodology, statistics, reproducibility, ethics, and reporting standards like CONSORT, STROBE, or PRISMA, with constructive feedback that authors and editors can act on.
This skill scores 74/100, which makes it an acceptable listing with clear caveats. Directory users get a real, non-placeholder workflow for structured manuscript and grant peer review, but should expect some adoption friction because the repository lacks companion scripts, references, or an install command and relies mainly on a long SKILL.md for guidance.
- Strong triggerability: the description and 'When to Use This Skill' section clearly target manuscript review, grant review, methodology assessment, statistics, and reporting standards.
- Good operational depth: the body is substantial (22k+ chars) with 9 H2s, 20 H3s, and explicit checklist-style evaluation content, which reduces guesswork for agents.
- Useful install decision signal: it is not a stub or placeholder, and the scope includes concrete standards like CONSORT, STROBE, and PRISMA.
- No supporting files or install command: the repo has no scripts, references, resources, or setup guidance, so users must rely on the markdown alone.
- The skill is specialized to formal scientific review; it is less suited for general claim-checking or broad scientific critique, which the description itself redirects to other skills.
Overview of peer-review skill
The peer-review skill helps you write a formal, evidence-based review of a manuscript or grant proposal instead of producing a generic critique. It is a good fit when you need a structured peer review, clear methodological judgment, and constructive comments that a submitter, editor, or team can act on.
Use this peer-review skill when the job is to assess study design, statistical validity, reproducibility, ethics, and reporting quality. It is especially useful if you are reviewing work against checklist-style standards such as CONSORT, STROBE, or PRISMA, or if you need a review that reads like a real scientific evaluation rather than a loose summary.
What peer-review is best for
This skill works best for:
- Journal manuscript reviews
- Grant or fellowship application reviews
- Reviewer-style feedback on methods, claims, and reporting
- Revision guidance for authors who need specific fixes, not just opinions
It is less useful for casual fact-checking, broad evidence appraisal, or scoring frameworks that need separate quantitative rubrics.
Why this peer-review skill is different
The main advantage is structure: it pushes the review toward criteria-based judgment, not paragraph-by-paragraph reaction. That matters when you need to decide whether a paper is sound enough to advance, and where the biggest risks are: weak controls, unclear analysis, unsupported conclusions, missing reporting details, or poor reproducibility.
What users usually want
Most users want the peer-review skill to do three things well:
- Identify the most decision-relevant strengths and weaknesses
- Separate major concerns from minor editorial notes
- Phrase criticism in a professional, useful tone
If your goal is to improve the manuscript, the skill should help you produce comments that are specific enough for authors to revise.
How to Use peer-review skill
Install the peer-review skill
Install it with:
npx skills add K-Dense-AI/claude-scientific-skills --skill peer-review
After install, treat this as a workflow skill, not a one-line prompt. The quality of the review depends on how well you define the document type, field, review standard, and level of rigor expected.
Start with the right input
A strong peer-review prompt should include:
- Document type: manuscript, revision, protocol, or grant
- Domain: clinical, biology, engineering, social science, etc.
- Review target: journal-style review, internal critique, or author revision memo
- Any required checklist: CONSORT, STROBE, PRISMA, journal rubric
- Your tolerance for tone: strict, balanced, or highly constructive
Example of a useful input:
Review this clinical manuscript as a journal peer-review. Focus on trial design, statistical validity, reporting completeness, and whether conclusions exceed the data. Keep major and minor comments separate and make the feedback actionable.
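The input components above can be sketched as a small helper that assembles a review prompt. This is a minimal sketch: the function name and template wording are illustrative assumptions, not part of the skill itself.

```python
# Illustrative sketch: build a peer-review prompt from the components listed
# above. The helper name and template wording are assumptions, not the skill's API.
def build_review_prompt(doc_type, domain, target, checklist, tone):
    return (
        f"Review this {domain} {doc_type} as a {target}. "
        f"Assess it against {checklist}. "
        f"Separate major and minor comments, keep the tone {tone}, "
        "and make every recommendation specific enough to act on."
    )

prompt = build_review_prompt(
    doc_type="manuscript",
    domain="clinical",
    target="journal peer-review",
    checklist="the CONSORT checklist",
    tone="constructive",
)
print(prompt)
```

Filling every slot up front mirrors the checklist above, so the skill does not have to guess the document type, domain, or standard.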
Read the repo in the right order
For the peer-review guide, start with SKILL.md and read the sections that define when to use the skill and how the review should be structured. If your local copy includes supporting files, check README.md, AGENTS.md, metadata.json, and any rules/, resources/, references/, or scripts/ folders first because they usually contain the operational details that affect output quality.
If those files do not exist, do not assume the skill is incomplete; this repository appears lightweight, so the main instructions may live almost entirely in SKILL.md.
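If you are unsure which of those files exist in your local copy, a quick inventory like the following can tell you. The repo path is an assumption; point it at wherever the skill was installed.

```python
# Hedged sketch: list which of the suggested files and folders are present
# in a local copy of the skill repo. The repo path is an assumption.
from pathlib import Path

def inventory(repo_path):
    repo = Path(repo_path)
    files = [name for name in ("SKILL.md", "README.md", "AGENTS.md", "metadata.json")
             if (repo / name).is_file()]
    dirs = [name for name in ("rules", "resources", "references", "scripts")
            if (repo / name).is_dir()]
    return files, dirs
```

If `inventory` returns only `['SKILL.md']` and no folders, treat SKILL.md as the complete instruction set rather than assuming the skill is broken.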
Workflow that produces better reviews
A practical peer-review workflow is:
- Identify the article type and review purpose.
- Tell the model what standards matter most.
- Ask for a structured review with priorities.
- Request revision-ready language, not just criticism.
If you want the best output, ask for sections such as overall assessment, major concerns, minor concerns, and author-facing recommendations. That format makes the review easier to use in editorial or revision workflows.
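The section structure suggested above can be represented as a small skeleton. The section names mirror the text; the rendering helper and sample comments are purely illustrative placeholders.

```python
# Illustrative sketch: a minimal reviewer-report skeleton using the sections
# suggested above. The sample comments are placeholders, not real review text.
SECTIONS = ("Overall assessment", "Major concerns", "Minor concerns",
            "Recommendations to authors")

def render_report(content):
    """Render a dict of section -> list of comments, in the fixed order."""
    lines = []
    for section in SECTIONS:
        lines.append(f"## {section}")
        for comment in content.get(section, []):
            lines.append(f"- {comment}")
    return "\n".join(lines)

report = render_report({
    "Overall assessment": ["Placeholder summary judgment."],
    "Major concerns": ["Placeholder design concern."],
})
```

Keeping the sections in a fixed order is what makes the review easy to drop into an editorial or revision workflow, even when some sections are empty.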
peer-review skill FAQ
Is peer-review the same as a generic critique?
No. A generic critique reacts to a document informally, while this peer-review skill is designed for structured scientific evaluation. It is more useful when the output must resemble an actual reviewer report with explicit criteria, priorities, and a professional tone.
When should I not use peer-review?
Do not use it when you only need to judge whether claims are true, when you need a scoring model rather than a narrative review, or when the task is not tied to manuscript- or grant-style evaluation. In those cases, a claims-evaluation or rubric-based skill is usually a better fit.
Is this peer-review skill beginner-friendly?
Yes, if you can describe the document and the review goal clearly. The main risk for beginners is asking for “a review” without naming the field, the standard, or the audience. A good prompt makes it much easier for the skill to focus on the right issues.
What should I compare it with?
Use peer-review when the output should feel like a formal reviewer report. Use other skills if you need evidence verification, claim stress-testing, or numerical evaluation. The decision point is whether you need review-style judgment or a different analytical frame.
How to Improve peer-review skill
Give it better manuscript context
The strongest peer-review results come from context that changes the review criteria. Tell it whether the paper is observational, experimental, clinical, qualitative, or theoretical, and whether the target is a top-tier journal or an internal pre-review. That changes what counts as a major flaw.
Ask for the right kind of criticism
If you want useful output, specify what matters most:
- methods and controls
- statistics and interpretation
- reporting completeness
- novelty and significance
- ethics and reproducibility
This prevents the review from spending too much time on style when the real problem is design or analysis.
Provide constraints for a sharper review
The peer-review skill improves when you state constraints up front:
- “Focus on major concerns only”
- “Keep tone constructive for author revision”
- “Flag unsupported claims and overreach”
- “Separate scientific issues from writing issues”
Those constraints make the review more decision-useful and reduce generic commentary.
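One simple way to apply the constraints above is to append them to the base request. This is a sketch under assumptions: the helper name and formatting are illustrative, and the constraint strings come straight from the list above.

```python
# Illustrative sketch: append up-front constraints to a base review request.
# The constraint strings come from the list above; the helper is an assumption.
def with_constraints(base_prompt, constraints):
    lines = [base_prompt, "", "Constraints:"]
    lines += [f"- {c}" for c in constraints]
    return "\n".join(lines)

constrained = with_constraints(
    "Review this manuscript as a journal peer-review.",
    ["Focus on major concerns only",
     "Separate scientific issues from writing issues"],
)
```

Stating constraints as an explicit list, rather than burying them in prose, makes it harder for the review to drift back into generic commentary.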
Iterate after the first pass
After the first review, ask for a second pass that tightens weak areas. For example:
- “Turn these comments into a reviewer report for a high-impact journal.”
- “Shorten the minor comments.”
- “Make the recommendations more actionable for authors.”
- “Re-rank the concerns by severity.”
That iteration usually improves the final peer-review output more than asking for a longer initial review.
