The research ecosystem increasingly faces challenges of efficiency and impact, and metaresearch, the study of research itself, has become an important field. As a growing community that takes interdisciplinary approaches to its research questions, the field needs a venue for communicating its findings.
There are established journals in allied fields, such as Scientometrics and Quantitative Science Studies, which focus on quantitative measurement of science systems. Then there are journals, such as Research Policy and Research Evaluation, that examine science and technology from a qualitative perspective. However, with scientific publishing grappling with monumental challenges—including accessibility for both readers and authors, efficiency, and the authenticity of peer review—it is imperative that the research community look beyond these journals and support innovation.
Enter MetaROR, a platform designed to reimagine how metaresearch is reviewed, communicated, and curated. Founded by the Research on Research Institute (RoRI) (where I am a research fellow) and the Association for Interdisciplinary Meta-Research and Open Science (AIMOS), MetaROR uses a “publish-review-curate” (PRC) model and accepts various forms of scholarship: papers, policy briefs, blog posts, and work in other formats.
Last December, in Cape Town, during the Second Global Diamond Summit, I had a chance to chat about MetaROR with my colleague at RoRI, André Brasil, who is one of the researchers closely involved in its design and day-to-day operations. After the conference, we corresponded, developing our conversation into the Q&A below, in which we explore the platform’s vision, challenges, and potential.
Moumita Koley (MK): Let's start with the basics. What is MetaROR, and how does its “publish-review-curate” model work?
André Brasil (AB): MetaROR, short for MetaResearch Open Review, is a platform designed to implement a special flavor of the “publish-review-curate” model in the metaresearch environment. It is a joint initiative of the Research on Research Institute (RoRI) and the Association for Interdisciplinary Meta-Research and Open Science (AIMOS), adopting a model that works in three stages. First, authors publish their work using the preprint servers or institutional repositories of their preference to make their findings immediately accessible to the community. Second, they submit their work to MetaROR, which arranges the open peer review process, with reviews published transparently alongside the original work. Finally, in the “curate” phase, an editor produces an assessment report based on the reviews as well as their experience and view of the manuscript under analysis.
There is no binary decision: we neither accept nor reject submissions. Instead, we provide authors with a constructive view of their work. This model not only accelerates the dissemination of knowledge but also encourages a more transparent and collaborative review culture.
MK: What motivated the creation of MetaROR? How does it address gaps in the offerings of existing journals catering to the metaresearch community?
AB: MetaROR was born out of the recognition that the existing scholarly publishing system faces a crisis. Reviewers, for instance, are often overwhelmed by requests, and publishing an article can take a very long time—often exceeding a year, even under ideal conditions. This inefficiency is compounded by a lack of transparency in the system. When you review for a journal, that feedback often remains hidden, especially if the manuscript is rejected, meaning valuable insights never see the light of day.
There’s also growing discomfort within the academic community regarding the voluntary nature of reviewing, especially given the high fees many publishers charge authors. This has created a sense of imbalance, where reviewers contribute their expertise for free, yet authors and readers face significant financial barriers to publishing and reading.
Preprinting has emerged as a promising alternative, providing a faster way to share knowledge. However, preprints lack the peer review that can help ensure and improve the quality of research. That’s where the “publish-review-curate” model comes in. It adds a transparent peer review layer to preprints and culminates in a curation process that highlights the strengths and weaknesses of a piece of work. This approach contributes to the shared knowledge ecosystem by making the review process more constructive and visible.
In this context, focusing on the metaresearch community is a deliberate choice. The challenges we face—inefficiency, lack of transparency, and inequity—are common to all disciplines. But improving scholarly communication is a substantial object of study for us. For instance, at the Research on Research Institute, we have a research project investigating the peer-review system. In that sense, MetaROR is also a platform where we can experiment and materialize our findings, allowing us to contribute to creating a model that can inspire similar initiatives in other fields.
MK: Platforms like MetaROR often struggle to sustain themselves. How do you plan to fund the platform, given the reliance on a diamond model and potential dependency on volunteers?
AB: Sustainability is a significant challenge for any open access initiative, and MetaROR is no exception, of course. Today, the platform relies on institutional support from RoRI and AIMOS as founding organizations. MetaROR is, as mentioned, a research project with the necessary support to develop the platform and the concepts behind it. However, we are also planning for the long term. We have already developed and are refining a governance plan to ensure that MetaROR can eventually operate independently of its founding organizations. This plan focuses on creating a sustainable framework for the future, built on diversified funding streams and community-driven governance.
A key element of MetaROR’s journey is its vibrant volunteer community. As is traditional in scholarly communication, researchers contribute their time as editors and reviewers. MetaROR leverages this tradition but also goes a step further. Our open review system provides contributors with tangible recognition for their work. Reviews on MetaROR are openly accessible and attributed to the reviewer so that they are part of the knowledge-creation process, offering visibility and career benefits to reviewers—something that traditional journals often fail to do.
MetaROR’s free-to-publish, free-to-read model also resonates with the values of openness and equity. I find this approach more motivating as a reviewer, knowing that my efforts are open and transparent, contributing to an open and inclusive system in which the scientific community is the one that benefits.
MK: Open peer review can be intimidating for early career researchers, particularly when identities are revealed. How does MetaROR address these concerns while ensuring transparency?
AB: This is a valid concern, and we’ve taken steps to strike a balance between transparency and inclusivity. On MetaROR, reviews are always open, but reviewers can choose whether to disclose their identities or remain anonymous. This flexibility allows those who might feel vulnerable—early career researchers, for instance—to participate in the review process without fear of repercussion. At the same time, we actively promote a constructive and respectful review culture, providing clear guidelines and training to reviewers to ensure that the feedback is fair, professional, and supportive.
MK: Traditional journals often rely on editorial boards comprising respected researchers to build trust. MetaROR, as a platform-driven model, lacks this “personal touch.” How do you instill a sense of reliability and trust in the community?
AB: Actually, we do not lack this “personal touch”; quite the opposite. We have a noteworthy editorial team led by Ludo Waltman and Kathryn Zeiler as editors in chief. Just as an example, Ludo was editor in chief of the Journal of Informetrics and later became one of the founders of the journal Quantitative Science Studies, so he is experienced and recognized within that community. You are entirely correct in saying that trust is built significantly by the people behind an initiative, and that is no different at MetaROR.
However, we recognize another significant challenge around editorial boards that we are working to address. At MetaROR, we welcome research from multiple research-on-research disciplines, from philosophy of science to higher education studies, science and technology studies, research policy, and more. We know these disciplines have distinct theoretical backgrounds and publishing frameworks that must be respected. Part of choosing a journal to publish your work in involves identifying with its profile and knowing your work aligns with its editorial perspective. At MetaROR, our editors are scholars already representing various research-on-research disciplines. As we grow, we want to expand this, building safe and relatable communities within the platform where authors can be sure their work is welcomed, understood, and valued.
MK: Does MetaROR provide a space for data deposition, and how do you ensure that datasets are easily accessible?
AB: MetaROR does not serve as a direct repository for data. This relates, in part, to our focus on the review and curate parts of PRC. We want authors to continue using the institutional repositories and preprint servers they favor, whether out of personal preference or disciplinary tradition. What we do, regardless of the source of the submission, is display an edited version of the manuscript on our platform, including any links to datasets, which we strongly encourage authors to deposit in trusted, field-specific repositories. Furthermore, we are also working with COAR, the Confederation of Open Access Repositories, to implement its COAR Notify service. This allows different platforms to communicate with each other, so our reviews and assessment reports are made available to the original preprint servers, and any data connections are integrated, ensuring that they are easy to locate and access.
MK: Will MetaROR assign DOIs to contributions? How does the platform ensure that published work is discoverable and integrated into the broader scholarly ecosystem? This seems especially important given the confusion that can arise around an article’s different versions, their citations, and its total citation count.
AB: Yes, MetaROR will assign individual DOIs to each review and assessment report. This is a crucial step to ensure the proper citation of these contributions and to formally recognize the time and effort reviewers invest in their work.
Regarding versioning, MetaROR views every review report as a valuable scholarly contribution in its own right, so new DOIs will be assigned in the case of a second round of reviews. For example, if I submit a manuscript to MetaROR and you review it, your review will receive its own DOI. If, after receiving the curation report and reading the reviews, I decide to produce a revised version of my manuscript and resubmit it (although this is not mandatory at all), and you review the second version, your new review will receive a new DOI. Importantly, all these DOIs are linked back to the DOI of the original submission, issued by the original source, such as SocArXiv, OSF (Open Science Framework), or Zenodo. This way, we can create a transparent and navigable system of contributions.
One challenge we are addressing with the broader community involves the management of preprint DOIs. Not all preprint servers generate versioned DOIs for the preprints deposited there, which can create complications when tracking the specific version of a reviewed manuscript. For systems like MetaROR, it is critical to point back to the exact version of the submission that was assessed. While this remains a challenge, the good news is that there is increasing collaboration across the scholarly ecosystem to improve these practices. We’re actively engaged in those efforts and optimistic about progress.
MK: Where do you see MetaROR in the next five to ten years? What impact do you hope it will have on the field of metaresearch and open science practices? Do you see a need to index MetaROR in scholarly databases such as Scopus and Web of Science? Will the ongoing debate regarding eLife (which, like MetaROR, uses the PRC model) affect MetaROR?
AB: These are challenging questions, as they touch on several complex and interconnected issues. I’ll try to address the key points succinctly, although each could warrant a much longer discussion.
Starting with the question of indexing and the recent challenges faced by eLife, it’s important to note that eLife is one of our key partners in the development of MetaROR. Together with eLife and Kotahi, the platform powering MetaROR, we are not only advancing the publish-review-curate model but also building the necessary infrastructure to support it. Naturally, what happened with eLife’s indexation is a concern for us, because it reflects resistance among traditional actors to moving beyond the binary accept/reject model. At MetaROR, we believe the idea that research can be neatly categorized as either “good” or “bad” is fundamentally flawed. Every piece of research has strengths and weaknesses, and our evaluation systems must reflect these shades of gray. By adopting the particular flavor of PRC we use, MetaROR aims to make these nuances visible, contributing to a fairer and more constructive review process. We do hope the whole system can evolve in that direction.
Regarding indexing in databases such as Web of Science and Scopus, I have mixed feelings. On one hand, being indexed is important in the current system because evaluations, funding decisions, and even career progression often rely on these databases. On the other hand, I am an advocate for open and community-driven databases like OpenAlex, OpenAIRE, and national current research information systems. These resources provide more transparent, accessible, and equitable alternatives for evaluation. Ideally, the reliance on proprietary indexing systems will diminish if we move toward broader adoption of such alternatives. For now, indexation in Web of Science or Scopus would be an advantage, but I hope and expect that it will no longer be a critical factor in the future.
As for the long-term vision for MetaROR, I see it evolving to meet the needs of the scholarly system. Currently, we are focused on implementing and refining the PRC model. At the same time, we’re also partnering with traditional journals in the metaresearch community to enhance the efficiency of the publishing process. For example, an author could submit their manuscript to MetaROR for review. After receiving constructive feedback and making potential improvements, they could submit the revised manuscript to a partner journal. The partner journal could then utilize the existing MetaROR reviews as part of its decision process, significantly speeding up the overall system. In this way, MetaROR supports researchers who still need to publish in traditional journals, acknowledging the realities of the current academic ecosystem.