ASReview: smart source screening software for academia and beyond


It marks the start of many doctoral programmes, and a considerable challenge for all sorts of researchers: the systematic review. This type of structured literature research takes up a great deal of time and manpower. However, there is hope on the horizon: the new machine learning-based Automated Systematic Review application can spare researchers a great deal of work. The technical side of things is already up and running - the developers recently won the Victorine Initiative award in recognition of their efforts. The next step for the team is to make the programme suitable for end users.

You wouldn't necessarily expect to see them all together at the same table: software developers, information specialists from the university library and researchers from various faculties. But here they are, hunched over their overworked laptops in the new ASReview Innovation Lab at Bolognalaan 101 to test the new software in the spirit of real cooperation. How's the installation process going? Can you enter search queries and import files? How user-friendly is the front end? Crucially, the researchers have asked the testers to share their 'light-bulb moments' and frustrations.

Machine learning

The test marks a tense and exciting moment. It was only two years ago that professor of Statistics Rens van de Schoot turned to Associate Professor Daniel Oberski in despair and blurted out: 'I'm selecting from tens of thousands of sources; can't we get a machine to do that?' Daniel, who works in the Applied Data Science research focus area, couldn't see why not. The two men submitted a successful application to the Research IT innovation fund. Information and Technology Services (ITS) decided to allocate two engineers to the project, who then successfully developed a learning programme. Rens: 'As a researcher, I now only have to review 5-10% of the sources. I mark each title 'yes' or 'no' to indicate whether it's relevant to my research project. The machine then starts learning right away and gains a better understanding of what I'm looking for with every iteration. Once I've helped it get started, the computer will automatically screen the other 90-95%.'
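
For illustration, here is a minimal sketch, in plain scikit-learn, of the kind of active-learning loop Rens describes: label a handful of records, train a model, let it queue up the unseen record it considers most promising, and repeat. This is an assumption-laden illustration rather than ASReview's actual code; the `screen` function, its parameters, and the choice of a TF-IDF plus naive Bayes model are all inventions for this example.

```python
# Sketch of an active-learning screening loop (not ASReview's real code).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

def screen(abstracts, ask_expert, seed_labels, budget):
    """Rank unlabelled records by predicted relevance, one query at a time.

    abstracts   -- list of title/abstract strings
    ask_expert  -- callback: index -> 1 (relevant) or 0 (irrelevant)
    seed_labels -- dict {index: 0/1}; must contain at least one of each class
    budget      -- total number of records the expert is willing to label
    """
    X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
    labels = dict(seed_labels)

    while len(labels) < min(budget, len(abstracts)):
        # Retrain on everything the expert has labelled so far.
        seen = sorted(labels)
        model = MultinomialNB().fit(X[seen], [labels[i] for i in seen])

        # Show the expert the unseen record the model rates most likely
        # to be relevant, so relevant sources surface early.
        unseen = [i for i in range(len(abstracts)) if i not in labels]
        probs = model.predict_proba(X[unseen])[:, 1]
        nxt = unseen[int(np.argmax(probs))]
        labels[nxt] = ask_expert(nxt)  # expert marks 'yes' or 'no'

    return labels
```

In a sketch like this, the remaining 90-95% of records simply never reach the expert: once the model stops surfacing new relevant titles, screening can stop and the rest is left to the machine's ranking.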

ASReview Innovation Lab

Tested by experts

Researchers from the Faculty of Law, Economics and Governance provided datasets for the tests. Lars Tummers, professor of Public Management & Behaviour, is supervising the testing process. He conducts a lot of meta-analyses and systematic reviews as part of his work. 'For example, we might do research on the stereotyping of civil servants, or on interventions to improve the well-being of care professionals. Finding out what's been written on these subjects involves going through tens of thousands of sources. My PhD students and I would normally select the most relevant ones ourselves, using Scopus. That way, you know you've gone through every available source. Still, it's a lot of work and there's always a chance you'll miss an important publication. The computer can also overlook something and return a false negative. That's why it's so important not to limit yourself to an automatic database search: I always get experts to check the results of the source screening. They'll be sure to notice if any relevant titles are missing.'
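
As a sketch of that expert check, one could compare the machine's shortlist against an expert-curated list of relevant titles and flag anything the screening missed. The helper names and example titles below are invented for illustration; they are not part of ASReview or of the team's actual validation procedure.

```python
# Illustrative check of machine screening against expert judgement.
def missed_titles(machine_selected, expert_relevant):
    """Expert-identified relevant titles the machine failed to return."""
    return sorted(set(expert_relevant) - set(machine_selected))

def recall(machine_selected, expert_relevant):
    """Share of expert-identified relevant titles the machine found."""
    found = set(machine_selected) & set(expert_relevant)
    return len(found) / len(expert_relevant)

# Hypothetical titles, loosely based on the topics Tummers mentions.
machine = {"Stereotyping of public servants", "Nudging civil servants"}
expert = {"Stereotyping of public servants", "Well-being of care workers"}

print("Recall:", recall(machine, expert))  # prints 0.5
for title in missed_titles(machine, expert):
    print("Possible false negative:", title)
```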

Giving back to society

Jonathan de Bruin, ICT developer on the ITS Research Engineering team, supervised the process. As he explains, this approach certainly isn't standard procedure for researchers. 'We'd gathered enough material for a scientific publication six months ago. Under normal circumstances, the researcher would then publish and move on. In this case, we want our research to benefit society. ASReview is open-source software, and the programme has plenty of non-academic applications too; for example, it can be used to search for case law, news items on specific subjects or patents. That's why we're continuing the tests and building a user-friendly interface. That takes teamwork. As far as I'm concerned, this is the university of the future: researchers keep going so that others can also benefit from the results, even once the project is no longer interesting from a scientific perspective.'

So, how did the trial go?

It went well: the test yielded loads of tips, suggestions and reports of minor software errors, which really helped the developers. Rens is pleased: 'One participant who had recently published a review paper applied our software to her original data. One hour in, she told us the programme was returning only relevant abstracts! That would normally take weeks, so it was a real breakthrough for our team. We'd never dreamt we'd get that kind of result.'

Interested in testing the software and identifying areas for improvement? Get in touch with Rens van de Schoot.