A comparative analysis of active learning for biomedical text mining

Usman Naseem, Matloob Khushi, Shah Khalid Khan, Kamran Shaukat, Mohammad Ali Moni

Research output: Contribution to journal › Article › peer-review


Abstract

An enormous amount of clinical free-text information, such as pathology reports, progress reports, clinical notes and discharge summaries, has been collected at hospitals and medical care clinics. These data provide an opportunity to develop many useful machine learning applications if they can be transformed into a learnable structure with appropriate labels for supervised learning. The annotation of these data has to be performed by qualified clinical experts, which limits their use due to the high cost of annotation. Active learning (AL), an underutilised machine learning technique that can label new data, is a promising candidate for addressing this high labelling cost. AL has been applied successfully to labelling in speech recognition and text classification; however, there is a lack of literature investigating its use for clinical purposes. We performed a comparative investigation of various AL techniques using machine learning (ML)- and deep learning (DL)-based strategies on three unique biomedical datasets. We investigated the random sampling (RS), least confidence (LC), informative diversity and density (IDD), margin, and maximum representativeness-diversity (MRD) AL query strategies. Our experiments show that AL has the potential to significantly reduce the cost of manual labelling. Furthermore, pre-labelling performed using AL expedites the labelling process by reducing the time required for annotation.
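As a concrete illustration of one of the query strategies compared in the abstract, the sketch below shows a least-confidence (LC) query in Python. It is a minimal sketch under stated assumptions, not the paper's implementation: the scikit-learn classifier, the function name least_confidence_query, and the placeholder data names are all assumptions introduced for illustration.

```python
# Minimal least-confidence (LC) active-learning sketch.
# Assumptions: scikit-learn is available and the texts are already
# vectorised; the paper's own models and datasets are not used here.
import numpy as np
from sklearn.linear_model import LogisticRegression

def least_confidence_query(model, X_unlabelled, batch_size=10):
    """Return indices of the batch_size least-confident unlabelled examples."""
    probs = model.predict_proba(X_unlabelled)   # shape (n_samples, n_classes)
    confidence = probs.max(axis=1)              # probability of the top class
    return np.argsort(confidence)[:batch_size]  # lowest confidence first

# Usage sketch: fit on a small labelled seed set, then query the pool.
# X_seed, y_seed and X_pool are placeholders for real, vectorised data.
# model = LogisticRegression(max_iter=1000).fit(X_seed, y_seed)
# query_idx = least_confidence_query(model, X_pool, batch_size=20)
# The examples X_pool[query_idx] would then be sent to clinical experts
# for annotation, after which the model is retrained on the enlarged set.
```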

Original language: English
Article number: 23
Pages (from-to): 1-18
Number of pages: 18
Journal: Applied System Innovation
Volume: 4
Issue number: 1
DOIs
Publication status: Published - 15 Mar 2021

