A “Meta” Analysis of Systematic Review Software

October 14, 2021

Update: Our systematic literature review of systematic literature review software was published! Check it out here. (May 2, 2022)

Over the past few weeks, the research team at Nested Knowledge has been carefully answering an important question: how does our software stack up against other web-based tools for executing systematic literature reviews? Our team figured the best way to answer that question was a detailed, systematic review of various systematic review software. Yeah, you read that right. A systematic review of systematic review software. Very meta. This initiative was spearheaded by Kathryn Cowie, our dedicated project manager, with help from Asad Rahmatullah, our previous implementation specialist, and guidance from NK founder Kevin Kallmes. Together, we set out to see how NK’s systematic lit review tools, AutoLit and Synthesis, stacked up against the rest.

As mentioned, the objective of this project was to compare software used to aid healthcare-related systematic literature reviews. We compared Nested Knowledge, Covidence, DistillerSR, Rayyan, EPPI-Reviewer, Cadima, and other software tools that aid SRs and MAs. We identified these tools by reviewing references, reviewing library guides, and searching the Systematic Review (SR) Toolbox. We excluded all statistical software, posters and abstracts related to this topic, and software limited to animal science or environmental sciences research. We also excluded unspecialized, general-purpose tools such as EndNote and Microsoft Excel.

Once we began to review the various systematic review tools available online, we devised a way to organize and rank them based on shared functionality (or the lack thereof). We grouped features into the following categories:

  1. Managing Information: Was there a unique protocol, were there distinct user roles, and could activity be monitored/audited?
  2. Finding Articles: Could users search databases, manually import PDFs, and retrieve full texts?
  3. Choosing Articles: Could abstracts be screened, was there an adjudication feature, and was any machine learning involved?
  4. Data Mining: Would a user be able to tag or label an article of interest, and if so, could that data be extracted in a qualitative or quantitative fashion?
  5. Data Presentation: How could the results of the systematic literature review be visualized, such as with a PRISMA flow diagram?
  6. Access: Were these tools free and open to use? Were they updatable (“living”)? Any options to collaborate?


While this process took time, we reached a few conclusions. Among the 22 systematic review tools we investigated, Nested Knowledge, DistillerSR, and EPPI-Reviewer Web offered the highest density of systematic review-focused features. The most commonly offered feature was screening, while extraction was lacking across the board. The notion of updatable or living reviews, which has been strongly advocated for in the scientific community, was similarly underprovided by the tools our team reviewed, with the exception of Nested Knowledge. As of publishing this blog, Nested Knowledge is the only fully updatable or “living” review tool for the full life-cycle of systematic review and meta-analysis.

We hope that our findings will promote further development of existing tools, as well as new ones that can be updatable. Given the rapid influx of new information in scientific literature, investigators need a solid foundation on which to build new research, rather than repeating a mundane process from scratch. Feel free to reach out to our team with any questions, concerns, or ideas!
