Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Location: Room 212
Date: Tuesday, 18/Jun/2019
11:15am - 12:30pm  Workshop A-04
Room 212 
 
ID: 164 / Workshop A-04: 1
Workshop session
Topics: Roadmap of our Profession
Keywords: Systematic Reviews; LibGuides; Instruction

Surveying the Systematic Review Support Landscape: A Content Analysis of LibGuides

Katharine Alix Hayden, Zahra Premji, Helen Pethrick, Jennifer Lee, Heather Ganshorn

University of Calgary, Canada

Our role as librarians is shifting from advising and support towards teaching systematic review methodology to students, researchers, and faculty. Specifically, we teach how to conduct comprehensive systematic search strategies (i.e. data collection) during instruction sessions or workshops and, more often, in one-on-one consultations. Librarians are finding it difficult to keep up with the ever-increasing demand for assistance and instruction. Researchers and students also frequently seek librarians’ guidance on all aspects of systematic review methodology, including managing the data, data extraction, and quality assessment.

Librarians often develop online pathfinders or research guides, called LibGuides, in response to this increased demand for guidance and assistance. LibGuides is a web-based content management system that is simple and user-friendly to set up and is currently used by academic libraries worldwide. Funded by a Teaching and Learning grant from our university, we were interested in learning how academic libraries use their LibGuides to build capacity for systematic reviews. We wanted to identify best-practice LibGuides that provide online instructional support (videos, tutorials, written materials) for conducting systematic reviews.

We will discuss the results of our recent content analysis of 19 academic libraries’ LibGuides that focused on systematic reviews. The LibGuides were from Australia, Canada, the United Kingdom, and the United States. The guides were analyzed for the type of resource: educational (internal), educational (external), tools (educational), tools (informational), service, or informational, within each phase of the systematic review methodology. We discovered interesting trends, which we will use as a springboard for discussion during the workshop.

The aim of our workshop is to engage with participants on ways to add more instructional resources and content to systematic review LibGuides, and to develop guides that will help build capacity for systematic reviews in their own institutions.

Learning outcomes: Participants will have a greater understanding of how LibGuides support systematic reviews. They will be able to analyze systematic review LibGuides/pathfinders to determine the type of content. They will be able to create and redevelop their own systematic review LibGuides to be more instructional.

Type of interactivity: Interactivity is woven throughout the workshop. We will first ask participants about their experiences designing and developing LibGuides. Participants will be asked to reflect on their own systematic review LibGuides, as well as on the results of our content analysis. We will also share participants’ best-practice LibGuides (provided to the presenters in advance). We will then hold a Knowledge Café, in which small groups of 5 or 6 participants will discuss how to design and develop a LibGuide for systematic reviews that focuses on education, not only information. Participants will move between groups twice, then come back together as a large group for a final exchange of ideas. The final group discussion will focus on the key elements needed to develop a LibGuide that can build capacity for systematic reviews.

Level: Introductory

Target audience: Participants with an interest in further developing their online resources/pathfinders/LibGuides for supporting systematic reviews.

Preparation for the session: Yes (details will be communicated by the presenters via email prior to the conference).

Biography and Bibliography
Presenters:
K. Alix Hayden (MLIS MSc PhD), Nursing & Kinesiology Librarian
Zahra Premji (PhD MLIS), Research and Learning Librarian
Collaborators:
Helen Pethrick (BHSc / BA Student), Research Assistant
Jennifer Lee (MISt), Chemistry, Computer Science, Environmental Design, Mathematics & Statistics, Physics & Astronomy Librarian
Heather Ganshorn (MLIS), Director, Science & Engineering Library

The presenters/collaborators are health science librarians at Libraries and Cultural Resources, University of Calgary, Alberta, Canada. They provide extensive consultation to faculty and students conducting systematic reviews. In addition, Dr. Premji co-teaches a graduate course on systematic reviews, and Dr. Hayden provides extensive support for an undergraduate nursing course on systematic reviews. The presenters/collaborators are also co-authors on numerous knowledge synthesis studies and have worked with University of Calgary researchers and students, as well as with other organizations including the World Health Organization and the 5th International Consensus Statement on Sport Concussions.
 
2:00pm - 3:15pm  Workshop B-04
Room 212 
 
ID: 246 / Workshop B-04: 1
Workshop session
Topics: Technology Uptake
Keywords: systematic review, machine learning, crowdsourcing, artificial intelligence, technology

Human and artificial intelligence: new technologies and processes to find studies for systematic reviews (2 x 75 min)

James Thomas¹, Anna Noel-Storr², Claire Stansfield¹

¹EPPI-Centre, UCL, United Kingdom; ²Radcliffe Department of Medicine, University of Oxford, United Kingdom

BACKGROUND: The large and growing number of research publications, coupled with poor search precision, can make identifying all studies eligible for inclusion in a systematic review both challenging and time-consuming. Machine learning and text mining technologies have great potential, but may best be considered aids to human effort rather than replacements. Emerging approaches to finding research are not limited to technological solutions, though: new human processes, including ‘crowdsourcing’, are showing that it is possible to make the study identification process more efficient.

AIMS: To present, and for participants to gain hands-on experience with, some of the latest automation and crowdsourcing tools that support study identification in systematic reviews. To consider critically the evidence base that supports the use of these tools. To discuss their use as a group, and how users might contribute to their further development and evaluation.

CONTENT: We will outline, and gain hands-on experience of, the ways in which new technologies are being applied to searching and study selection in systematic reviews. We will provide overviews of:

  1. Current applications for searching, including approaches that aim to improve sensitivity and/or precision, or to aid database translation;
  2. Current applications for study selection, including approaches that aim to reduce the number needed to screen or to expedite quality assurance;
  3. Living systematic reviews: how new technologies can be used to maintain the currency of a given review, or suite of reviews;
  4. How some study identification tasks can be carried out at scale, outside the scope of individual reviews, making study identification much more efficient and reducing duplication of effort globally.

We will also summarise and discuss the current evidence base to consider as a group how mature particular technologies are, whether they are ready for use, or what additional development and evaluation is necessary.

Learning outcomes: Participants should be able to:

  1. Differentiate some of the ways that new technologies and processes, including machine learning, text mining and crowdsourcing, help with study identification;
  2. Be familiar, and have interacted, with some of the latest tools that use these new technologies and processes;
  3. Be developing a critical awareness of the evidence base and the issues that need to be borne in mind when using these tools;
  4. Have an introductory understanding of how some of the new technologies work.

Type of interactivity: Most of the time will be devoted to hands-on experience with tools, and discussion about their use. Please bring a laptop / tablet with you to try the online tools for yourself. We will adopt the following pattern of activity for each technology we cover:

  1. Introductory presentation to include: how the technology works, how it can be used, and what evidence is available to support its use;
  2. Individual and paired hands-on experience with using the tool;
  3. Group discussion (with feedback) on the strengths and weaknesses, acceptability and usability of the tool.

For those who attended our EAHIL workshop in 2018, this year’s workshop will additionally cover crowdsourcing and provide up-to-the-minute overviews of the latest technologies and their evaluations. A new theme will be a focus on human-machine interaction: rather than expecting the machine to do all the work, we consider how human and machine together can achieve more than either operating alone.

Level: Intermediate

Target audience: Information specialists, librarians, and review authors; also of relevance for commissioners and users of reviews

Preparation for the session: No

Biography and Bibliography
James Thomas is Professor of Social Research and Policy at the EPPI-Centre, UCL, London. His research is centred on improving policy and decision-making through the use of research. He has written extensively on research synthesis, including meta-analysis and methods for combining qualitative and quantitative research in mixed method reviews. He also designed EPPI-Reviewer, software which manages data through all stages of a systematic review, which incorporates machine learning/AI. He is principal investigator of the Evidence Reviews Facility for the Department of Health and Social Care, England, a large programme of policy-relevant systematic reviews with accompanying methodological development. James is co-lead of Cochrane ‘Project Transform’ which is implementing new technologies and processes to improve the efficiency of systematic reviews. He is also co-investigator on a major Collaborative Award from the Wellcome Trust, led by Susan Michie (UCL), to develop novel technologies to organise, synthesise and present the behavioural science literature.

Anna Noel-Storr has worked for Cochrane since 2008 as an information specialist for the Cochrane Dementia and Cognitive Improvement Group, based at the University of Oxford. During that time she has played a leading role in the development and implementation of crowdsourcing in health evidence production. This began with the 'Trial Blazers' study, for which she won the Thomas C Chalmers Award in 2013. Since then, she has led a number of initiatives exploring the role of crowdsourcing and citizen science in systematic review production and evidence synthesis. She currently leads Cochrane Crowd, a component of Cochrane ‘Project Transform’. This work involves the development of a crowd platform offering willing contributors a range of micro-tasks to dive into, all of which are designed to enhance Cochrane’s content and speed up the review production process without any compromise on the exceptionally high quality expected of Cochrane systematic reviews.

Claire Stansfield is an Information Scientist at the EPPI-Centre, UCL Institute of Education, London and is involved in developing and applying research methods for systematic literature searching across a range of policy areas in health promotion, public health, social care and international development. She also supports research groups internationally to learn and use literature searching methods for systematic reviews, particularly within the international development field.

Thomas J, Noel-Storr A, Marshall I, Wallace B, McDonald S, Mavergames C, Glasziou P, Shemilt I, Synnot A, Turner T, Elliott J; Living Systematic Review Network. Living systematic reviews: 2. Combining human and machine effort. J Clin Epidemiol. 2017 Nov;91:31-37. doi: 10.1016/j.jclinepi.2017.08.011
Thomas-Human and artificial intelligence-246_a.pdf
 
3:45pm - 5:00pm  Workshop B-04 Cont'd: Human and artificial intelligence
Room 212 
Date: Wednesday, 19/Jun/2019
5:00pm - 6:00pm  SIG 6: SIG meeting EVLG

The aim of the European Veterinary Libraries Group (EVLG) is to unite all those who are interested in and/or employed in the animal health information field. It also aims to develop and encourage cooperation between libraries in veterinary medicine, and to provide a forum for exchanging ideas and discussing mutual problems.

Room 212 

 
Conference: EAHIL Workshop 2019