
Small But Mighty Platform Shows What Modular Publishing Can Accomplish

ResearchEquals doesn’t yet have a large user base. But by making it easy to share outputs at all stages of the research process and supporting the curation of high-quality content, it stands out among its peers.

By Abbey Lewis

ResearchEquals is a modular open access publishing platform that allows users to publish all products of the research process, regardless of the stage of the project or the type of item. Each individual item published through ResearchEquals is assigned a DOI, making all of the content on the platform reliably accessible and citable.

The platform’s small user population may give researchers pause. But its flexibility across disciplines and its inclusive concept of research outputs make it easy to share one’s work at any stage, and its user-curated collections effectively surface high-quality content. I recommend it.

Product Overview/Description

Although structurally lightweight, ResearchEquals solves some complex problems of both traditional and open access publishing. The platform makes all research outputs free to publish and access, relieving financial pressure, reducing publication bias, increasing process transparency, and equalizing the value of all output types. Users can add references and backlinks from other sources and connect outputs published anywhere, creating a robust representation of the research process, its products, and the ways in which it has influenced or been influenced by other knowledge.

ResearchEquals’ founder, Chris Hartgerink, describes the platform as a place where researchers can publish and curate the research process from start to finish. The features and functions available reflect Hartgerink’s broad conception of what a researcher is and what could be considered research. ResearchEquals places description, curation, and even dissemination of research in the hands of researchers themselves, unrestricted by disciplinary norms or publishing conventions.

User Experience

Depending on how much detail they wish to include, users can quickly add outputs to ResearchEquals or curate collections. The Content pane offers rich options for structure through headings, tables, lists, embedded files, and even emoji. Users can upload files of up to 100 MB, with no limit on the total number of files. When drafting an output, I felt confident that what I was seeing would be consistent with the final published look. I could easily add elements that supported many varieties of structure and media (Figure 1).

FIGURE 1: A screenshot of the full content pane with the menu open for selecting heading and content block types

One notable exception to this smooth functionality is the “References” pane, which is not editable in draft mode. Instead, when the output is published, references are extracted from hyperlinks in the content pane, condensing any citation, regardless of style, down to a URL or DOI. If users want to list sources such as print books or archival materials or use citation styles that do not incorporate a hyperlink, they will need to make changes to include some kind of URL—for example, a link to a library catalog or finding aid.
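The extraction step described above can be sketched in a few lines. The snippet below is purely illustrative, not ResearchEquals' actual implementation: it assumes the published-output pipeline does something like scanning draft content for DOI patterns and bare hyperlinks, reporting each DOI once even when it arrives wrapped in a doi.org link.

```python
import re

# DOI pattern loosely based on Crossref's recommended regex;
# modern DOIs begin with "10." followed by a registrant code.
DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[-._;()/:a-zA-Z0-9]+')
URL_PATTERN = re.compile(r'https?://[^\s)\]>"]+')

def extract_references(content: str) -> list[str]:
    """Collect DOIs and bare URLs from draft content.

    URLs that embed a DOI (e.g., doi.org links) are reported
    once, as DOIs, so each source appears a single time.
    """
    dois = set(DOI_PATTERN.findall(content))
    urls = [u for u in URL_PATTERN.findall(content)
            if not DOI_PATTERN.search(u)]
    return sorted(dois) + sorted(urls)

# A hypothetical draft mixing a DOI link with a library finding aid.
draft = ("See https://doi.org/10.1234/example-doi and the finding aid "
         "at https://library.example.edu/aid/42 for details.")
print(extract_references(draft))
```

This also illustrates the workaround the review suggests for print or archival sources: as long as the citation carries some resolvable link, such as a catalog or finding-aid URL, it survives the condensing step.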

Outputs are only editable in draft mode. Once an item is published, it is immediately assigned a DOI and cannot be edited further. In this way, ResearchEquals is similar to traditional publishing, introducing a permanent product into the information landscape.

“Collections” are user-curated links between outputs, whether published by the same researcher(s) or others. Outputs added to a collection do not necessarily need to reside within ResearchEquals, and users can submit outputs for a collection that they do not own, as long as that collection is publicly visible. Users can add detailed descriptions for collection submissions in a comment field, which includes a drop-down menu of headings and blocks similar to those available for outputs (Figure 2). The owner of the collection is able to accept, decline, or edit any submissions. Upon being made public, collections are also assigned a DOI.

FIGURE 2: A screenshot of the collection submission window, showing the menu for adding heading and content blocks to the comment field

Users can browse outputs, organizations, people, or collections, which are listed newest to oldest without additional categorization. They can also link to outputs or collections using the DOI, listed as a persistent identifier under the Metadata tab. DOIs generated through ResearchEquals resolve to the item’s page on the platform.

In March 2026, ResearchEquals implemented a much-needed basic metadata search feature.

Accessibility

I checked the top-level pages of the site using accessibilitychecker.org, which indicated adherence to WCAG 2.2 best practices. The platform provides researchers the elements they need to create accessible content, including headings, numbered lists, and alt text for images. That said, there is significant variation in user adoption of these features, in both content created within ResearchEquals and files that have been uploaded or embedded.

Contracting and Pricing Provisions

ResearchEquals launched in 2022 with a “pay to close” business model that allowed researchers to reserve increasing rights to their material for increasing fees, up to €549.99 for “all rights reserved.” In the four years the model was in place, no users chose to pay. Now the site offers four licensing options free to researchers and readers: Creative Commons Attribution 4.0 International, Creative Commons CC0 License, GNU General Public License Version 3.0, and MIT License (Figure 3).

FIGURE 3: A screenshot of the output metadata pane, showing the four available license types

Hartgerink has also experimented with service models that charged for certain features, such as the ability to curate a collection, but, today, all of the platform’s features are free to use and access.

ResearchEquals’ development was supported by grants from the Mozilla Foundation and the Shuttleworth Foundation. Further shifts in the business model are likely to occur as the platform seeks to sustain itself through service revenue.

Competitive or Related Products

Since its launch, ResearchEquals has been mentioned alongside Octopus, another modular publishing platform. Both encourage publication of outputs throughout the research process, but Octopus does so through eight discrete types that loosely align with components of traditional academic papers. Following Hartgerink’s inclusive ideas of what might be considered research, ResearchEquals has 38 output types ranging from conceptual (idea, premise) to concrete (medical ethical review, reproducibility report), with preprint as the default. Octopus encourages open peer review of all research outputs, while ResearchEquals emphasizes the curation of collections within the platform as a means of recognizing quality content.

Other notable modular platforms include Zenodo, the repository from OpenAIRE that offers options to close access, and Figshare, which allows larger file sizes suitable for data.

Within the busy realm of modular publishing, ResearchEquals is distinguished by its expansive recognition of research outputs and pliable format for users to render and curate them.

Critical Evaluation

Modular publishing is an emerging area of the research information landscape. Like many forms of open access publishing, it exposes tension between the norms of traditional publishing, the professional and academic needs of researchers, and the benefits of making research broadly available. ResearchEquals’ greatest strength is its dedication to increasing access to research. The features and functions available to users through the platform all prioritize that value.

Curation might be easily overlooked as one of the platform’s important offerings, but some notable users have produced excellent collections. Annual Reviews has created topical collections of articles published across its journals on Climate Change, Refugees, Economic Empowerment, and Global Health (note: Annual Reviews is the publisher of Katina). Hannah Harrison and Philip Loring have created a collection of what they call Pubcasts, podcast-style audio recordings based on peer-reviewed articles; the protocol for creating a Pubcast is a separate ResearchEquals output.

Even though it lacks the formality and decisiveness of peer review, curation provides ResearchEquals a necessary quality control measure. At the time of writing, ResearchEquals has just over 300 outputs, but the ease of publishing there could lead to it becoming overrun with low-quality content. I contributed to this problem when trying to learn more about the platform; my experiment will live on ResearchEquals forever, complete with its own DOI. Although the interface is uncomplicated, a little more help text could prevent this kind of “research.” Still, my mistake is unlikely to be curated into a collection that implies it has research value.

The permanence of a DOI, usually thought of as a benefit, could be a drawback when publishing research without sufficient quality checks. The control ResearchEquals gives researchers over publication should be paired with a mechanism to prevent flawed or inaccessible information from being disseminated. Curating knowledge is complex work, evidenced by guidelines like Findability, Accessibility, Interoperability, and Reuse (FAIR) and Ownership, Control, Access, and Possession (OCAP) that protect the integrity of information and the rights of people. The assumption on ResearchEquals is that researchers themselves are best equipped to undertake that challenge. That may be true in many cases, but there are reasons to be cautious before making outputs available.

The low number of outputs on ResearchEquals might give researchers pause when it comes to placing their own work on the platform. Tim Provenzano correctly raised this issue in his Katina review of Octopus, which has seen few users engage in open peer review, a feature the platform is intended to facilitate.

ResearchEquals, however, is not designed to facilitate peer review, but to facilitate access and collection, functions that aren’t affected by the total number of users. A researcher could use the platform, for example, to curate data, images, code, and other outputs into a collection for a conference presentation, providing attendees with a single DOI to access varied and robust information about their work. The limited outputs on ResearchEquals do inhibit its usefulness for discovering research, but it’s not intended as a discovery tool. Its main purpose, increasing accessibility for all research outputs, is accomplished through its broad recognition of what those outputs can be and the control it gives researchers over description and curation.

The platform’s flexibility to meet the needs of researchers should appeal to potential users from many disciplines, most obviously in the sciences, but also in the humanities and social sciences. Aside from practical concerns, researchers’ interest in the platform may be informed by some underlying philosophical questions: How should editorial or evaluative measures be used in modular publishing? What does it mean if all outputs are equally publishable? How does engagement with internal or external audiences convey research quality? Hartgerink acknowledges that some changes to the platform are necessary to establish a self-sustaining service model and seems eager to implement features that will encourage adoption of the platform on institutional scales.

Recommendation

Modular publishing and open access publishing more generally are in flux while developers learn which features encourage widespread adoption of their platforms. For researchers who want to share their work and are willing to take responsibility for publication, curation, and identifying interested audiences, ResearchEquals could have many applications.

Note: This review has been updated to correct an error in the description of the implementation of ResearchEquals’ basic metadata search feature: this occurred in 2026, not 2022.
