Issue title: Evaluating online health information sources using a mixed methods approach: Part 3
Guest editors: Vera Granikov and Pierre Pluye
Article type: Research Article
Authors: El Sherif, Reem [a],[*] | Langlois, Alexis [b] | Pandu, Xiao [b] | Nie, Jian-Yun [b] | Thomas, James [c] | Hong, Quan Nha [c] | Pluye, Pierre [a]
Affiliations: [a] Department of Family Medicine, McGill University, Montréal, QC, Canada | [b] Recherche appliquée en linguistique informatique, Université de Montréal, Montréal, QC, Canada | [c] EPPI-Centre, Department of Social Science, UCL Institute of Education, University College London, England, UK
Correspondence: [*] Corresponding author: Reem El Sherif, 5858 Chemin de la Côte-des-Neiges, Suite 300, Montréal, QC, Canada. Tel.: +1 514 632 3616; E-mail: [email protected].
Abstract: Mixed studies reviews include empirical studies with diverse designs (qualitative, quantitative and mixed methods). To make the process of identifying relevant empirical studies for such reviews more efficient, we developed a mixed filter that included different keywords and subject headings for quantitative (e.g., cohort study), qualitative (e.g., focus group), and mixed methods studies. It was tested on six journals from three disciplines. We measured precision (the proportion of retrieved documents that were relevant), sensitivity (the proportion of relevant documents that were retrieved), and specificity (the proportion of non-relevant documents not retrieved). Records were coded before applying the filter and compared with the retrieved records, and descriptive statistics were computed; the results suggest the mixed filter has high sensitivity but lower precision and specificity (both close to 50%). Next, building on the success of the filter, we developed an automated text classification system that selects empirical studies in order to facilitate systematic mixed studies reviews. Several algorithms were trained and validated on 8,050 database records that had previously been categorized manually. Decision trees performed best, surpassing the accuracy of the filter by 30% when full-text documents were used. This algorithm was then adapted into an online tool that researchers can use to analyze their bibliographies and categorize records as “empirical” or “non-empirical”.
Keywords: Bibliographic database, systematic review, information retrieval, empirical research, search filter, automated classifier
DOI: 10.3233/EFI-190347
Journal: Education for Information, vol. 36, no. 1, pp. 101-105, 2020
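As an illustration of the two steps summarized in the abstract, the sketch below shows (1) how precision, sensitivity, and specificity can be computed by comparing manually coded records with a filter's output, and (2) how a decision-tree classifier can be trained on text features to label records as empirical or non-empirical. This is a minimal sketch, not the authors' implementation or data: the example records, the hypothetical "retrieved" vector, and the use of scikit-learn with a TF-IDF representation are assumptions made for illustration only.

# Minimal illustrative sketch (not the authors' code or data): hypothetical records,
# scikit-learn, and a TF-IDF representation are assumptions made for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical manually coded records: (text, label) with 1 = empirical, 0 = non-empirical.
records = [
    ("A prospective cohort study of 1,200 patients with type 2 diabetes ...", 1),
    ("Focus group interviews were conducted with family caregivers ...", 1),
    ("An editorial on current trends in consumer health informatics ...", 0),
    ("A narrative commentary on the quality of online health information ...", 0),
]
texts, labels = zip(*records)

# Step 1: evaluate a search filter against the manual coding.
retrieved = [1, 1, 1, 0]  # hypothetical filter output (1 = record retrieved as empirical)
tn, fp, fn, tp = confusion_matrix(labels, retrieved).ravel()
precision = tp / (tp + fp)      # proportion of retrieved documents that are relevant
sensitivity = tp / (tp + fn)    # proportion of relevant documents that were retrieved
specificity = tn / (tn + fp)    # proportion of non-relevant documents not retrieved
print(f"precision={precision:.2f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")

# Step 2: train a decision-tree classifier on text features, analogous in spirit to
# (but far smaller than) the 8,050 manually categorized records described in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.5, random_state=0, stratify=labels
)
vectorizer = TfidfVectorizer()
clf = DecisionTreeClassifier(random_state=0)
clf.fit(vectorizer.fit_transform(X_train), y_train)
accuracy = clf.score(vectorizer.transform(X_test), y_test)  # accuracy on held-out records
print(f"held-out accuracy={accuracy:.2f}")

In practice, the choice of text representation and classifier would be validated against the manually categorized training set, as the abstract describes for the comparison of several algorithms.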