Cesar Hidalgo (77) · Mar 22, 2026
Proposal: Designing Human Peer Review for JAIGP

As JAIGP evolves, one of the key features we need to design carefully is human peer review. This is not just a detail, but an opportunity to rethink a system with well-known issues.

Peer review today faces at least two well-known problems. First, it is overloaded: the number of submissions keeps growing, while the pool of willing reviewers may not be growing at the same pace. Second, it relies on uncompensated labor in the context of for-profit publishing models. This creates understandable frustration.

At JAIGP, we can explore any model we would like. Here are a few ideas to get the conversation going.

1. Compensate reviewers.

One option is to compensate reviewers directly. Authors would pay a peer review fee, but instead of going to the journal, it would be passed through to the reviewers via an escrow account, released once the reviews are completed. The journal would remain open access, and fees would go directly to reviewers.

This is simple, but it raises immediate questions about incentives. If reviewers are paid per review, how do we avoid rewarding speed over quality? We could try to mitigate this by:

- Limiting the number of reviews per reviewer per month
- Using AI tools to evaluate the quality of submitted reviews, and asking authors whether the reviews were useful
- Displaying these scores or badges alongside published papers

Still, this does not fully resolve the incentive problem, but it might get us to a point that is good enough.

2. At what price?

We could set a fixed rate per review (e.g. $300 per review). This is simple and predictable. It could also lead to some scholars becoming professional reviewers: at 12 reviews a month, a person would take home $3,600, which is a good income in many countries. But we could also consider an auction system where reviewers bid on papers they want to review. An auction can help discover a price, but it could also create a race to the bottom, where lower-quality reviewers underbid others (if review quality is hard to observe).

3. What do we do with the reviews?

In traditional journals, reviews are mostly invisible. At most, authors see them, and occasionally editors (although some journals publish reviews). At JAIGP, AI reviews are already open, but what about human peer reviews? We could:

- Keep them private, as in most journals
- Publish them anonymously alongside the paper
- Treat reviews themselves as citable or attributable contributions

Publishing reviews could increase transparency and accountability, but it could also discourage candid feedback, especially for critical reviews. Should authors and reviewers be able to choose the level of transparency? For example:

- An author opts into open reviews
- A reviewer chooses whether to sign their review

Or should JAIGP adopt a single, consistent standard for all papers?

So we have many open questions. This is very much a proposal in progress, and I would love to hear from many of you before implementing anything. We should come up with this prompt together (I can take care of the debugging prompts that we will need in between :-). If peer review is one of the core institutions of science, then JAIGP is a chance to prototype alternatives. Help us out!
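To make the escrow pass-through in idea 1 concrete, here is a minimal sketch of how the flow could work: the author's fee is held per submission and released to a reviewer only when their review is completed, with a monthly cap per reviewer to discourage churning out rushed reviews. All names, the $300 fee, and the 12-review cap are illustrative assumptions taken from the examples above, not a spec.

```python
from collections import defaultdict

REVIEW_FEE = 300          # fixed rate per review (the example in idea 2)
MONTHLY_REVIEW_CAP = 12   # limit on reviews per reviewer per month (idea 1)

class Escrow:
    def __init__(self):
        self.held = {}                      # submission_id -> funds still held
        self.completed = defaultdict(int)   # (reviewer, month) -> reviews done
        self.payouts = defaultdict(int)     # reviewer -> total paid out

    def deposit(self, submission_id, n_reviewers):
        # The author pays the full fee up front; the journal keeps nothing.
        self.held[submission_id] = REVIEW_FEE * n_reviewers

    def release(self, submission_id, reviewer, month):
        # Pay only for completed reviews, and enforce the monthly cap.
        if self.completed[(reviewer, month)] >= MONTHLY_REVIEW_CAP:
            raise ValueError(f"{reviewer} hit the {MONTHLY_REVIEW_CAP}-review cap")
        if self.held.get(submission_id, 0) < REVIEW_FEE:
            raise ValueError("insufficient escrowed funds for this submission")
        self.held[submission_id] -= REVIEW_FEE
        self.completed[(reviewer, month)] += 1
        self.payouts[reviewer] += REVIEW_FEE

escrow = Escrow()
escrow.deposit("paper-1", n_reviewers=3)       # author escrows 3 x $300
escrow.release("paper-1", "alice", month="2026-03")
print(escrow.payouts["alice"])  # 300
print(escrow.held["paper-1"])   # 600
```

The cap addresses the speed-over-quality worry only crudely; the AI quality scores and author-usefulness ratings mentioned above would have to be layered on top of a mechanism like this.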