Journals with open-discussion forums lend themselves well to peer
review exercises for training early career scientists.
The peer review process is an integral part of the scientific endeavor, yet most reviewers have no formal training. Instead, they typically learn informally, by observing what reviewers write about their own manuscripts as authors or through advice from an advisor. Resources are available from publishers and scientific associations, such as Nature Research's Focus on Peer Review master class (Nature Masterclasses, 2023), the American Chemical Society's Reviewer Lab (ACS Reviewer Lab, 2023), and Wiley's Peer Review Training (Wiley, 2023). There are also published articles by researchers describing strategies and tips, such as “Learning the Ropes of Peer Reviewing” (Pain, 2008), “How to write a thorough peer review” (Stiller-Reeve, 2018), “‘Refereeing Template': A Guide to Writing an Effective Peer Review” (Berlinguette et al., 2021), and “The Golden Rule of Reviewing” (McPeek et al., 2009). Gratifyingly, there is a growing number of outlets that help recognize reviewers' behind-the-scenes contributions to the peer review process, such as Publons (now Web of Science) and reviewer awards by journals. These resources are valuable, but structured implementation of these tips and templates is required to train early career scientists.
Here, we describe a framework to apply this peer-reviewing advice to a
workshop for trainees. For instance, instructors can run peer reviewer
training workshops within their groups or classrooms to provide formal
schooling in this important process. Research outlets like
Authors of data publications benefit from rigorous peer review, especially
in an open-access, interactive forum like that of
Since
We, the authors of this paper, have collective experience with manuscripts
published as preprints in
I am an assistant professor, and my research group is composed of young
researchers new to the peer review process. To help provide transparency to
the process of publishing research, I ran a workshop within my research
group (2 PhD, 3 MSc, and 2 BSc students) using an
I gained my first review experience as a participant in a collaborative
student review of an
As an author of several large geospatial data publications, I have found that a
group-review assignment generates considerably more discussion
than a single-party review within the allotted time. My initial
We reflect on our respective experiences as an instructor, trainee, and
author to offer recommendations for a workshop using open-discussion forums
to provide peer review guidance for early career scientists. The workshop
could be embedded into a senior undergraduate or graduate course and count
towards credit, or it could be conducted within a research group. The workshop is best
suited to a group of 20 or fewer participants to ensure adequate time for
discussion and feedback. The instructor chooses a recently posted discussion
paper and plans three to four group interactions around the manuscript. The goal of
the primary exercise is to submit an open review comment, reflecting the
concerted efforts of the students and compiled by the instructor (who has an
account with the open-discussion journal). Throughout the workshop, the
students read the manuscript and come together to brainstorm on the merits
– or lack thereof – of the science (and data products) presented. We
recommend that the instructor provide different tasks for which the
trainees can volunteer. Examples of tasks related to peer review for
We also recommend an additional session within the workshop in which students are asked to develop potential applications of the data relevant to their interests. This element goes beyond the fundamental components of dataset review and focuses on developing students' creativity, as well as their technical abilities and understanding of statistical methods and other analytics. Considering potential applications, even as a proof of concept, can also encourage closer examination of the precision, accuracy, and quality control of the dataset and manuscript under review.
The outcomes of the workshop are for early career scientists to learn how to ask critical questions, how to formulate suggestions for improvement using a teaching tone, and how to summarize a research article. In sum, the goals are to take part in the peer review process, to learn about the iterative process of the scientific method, and to appreciate the value of constructive criticism.
There is an intrinsic benefit when experienced scientists invest in
the future of the peer review process. If all reviewers first complete a
training program, we collectively raise the quality of
peer review. Overall, exposure to both the review process
and the concept of openly shared, quality-assured data is important in
training the next generation of scientists as well as promoting critical
thinking among our trainees. We see a win–win situation for the trainee and
the author involved. The concept of open data is necessary to advance
knowledge more effectively, and participating in all aspects of the open-data
review process – as a reviewer, student trainee, and author – ensures
the continued availability of high-quality datasets in
NBD wrote the manuscript with significant contributions from KCS and SPC. All authors have been involved in a review-training exercise in the past, either as an instructor, a student, or an author.
The contact author has declared that none of the authors has any competing interests.
Publisher's note: Copernicus Publications remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
We acknowledge the reviewers who provided excellent critical feedback. We also thank David Carlson for his support and for connecting the authors together.
This paper was edited by Sam Illingworth and reviewed by Mathew Stiller-Reeve and Mengze Li.