ECOVNet: A highly effective ensemble based deep learning model for detecting COVID-19

Nihad K. Chowdhury, Ashad Kabir, Md. Muhtadir Rahman, Noortaz Rezoana

Research output: Contribution to journal › Article › peer-review



The goal of this research is to develop and implement a highly effective deep learning model for detecting COVID-19. To achieve this goal, in this paper, we propose an ensemble of Convolutional Neural Networks (CNNs) based on EfficientNet, named ECOVNet, to detect COVID-19 from chest X-rays. To make the proposed model more robust, we have used one of the largest open-access chest X-ray data sets, named COVIDx, containing three classes—COVID-19, normal, and pneumonia. For feature extraction, we have applied an effective CNN structure, namely EfficientNet, with ImageNet pre-training weights. The generated features are passed into custom fine-tuned top layers, followed by a set of model snapshots. The predictions of the model snapshots (which are created during a single training) are consolidated through two ensemble strategies, i.e., hard ensemble and soft ensemble, to enhance classification performance. In addition, a visualization technique is incorporated to highlight the areas that distinguish the classes, thereby enhancing the understanding of the key image regions associated with COVID-19. The results of our empirical evaluations show that the proposed ECOVNet model outperforms state-of-the-art approaches and significantly improves detection performance, with 100% recall for COVID-19 and an overall accuracy of 96.07%. We believe that ECOVNet can enhance the detection of COVID-19 disease, and thus underpin a fully automated and efficacious COVID-19 detection system.
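The two ensemble strategies named in the abstract can be illustrated with a minimal sketch: a soft ensemble averages the class probabilities produced by the model snapshots before taking the argmax, while a hard ensemble lets each snapshot cast one vote (its own argmax) and takes the majority. The array values below are illustrative softmax outputs, not actual ECOVNet predictions.

```python
import numpy as np

# Hypothetical predictions from 3 model snapshots over 4 X-ray images and
# 3 classes (COVID-19, normal, pneumonia). Shape: (snapshots, images, classes).
snapshot_probs = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.6, 0.3, 0.1]],
    [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.2, 0.2, 0.6], [0.5, 0.4, 0.1]],
    [[0.8, 0.1, 0.1], [0.3, 0.6, 0.1], [0.4, 0.1, 0.5], [0.2, 0.7, 0.1]],
])

def soft_ensemble(probs):
    # Average class probabilities across snapshots, then pick the top class.
    return probs.mean(axis=0).argmax(axis=1)

def hard_ensemble(probs):
    # Each snapshot votes for its own top class; the majority vote wins.
    votes = probs.argmax(axis=2)            # shape: (snapshots, images)
    n_classes = probs.shape[2]
    counts = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes), 0, votes)
    return counts.argmax(axis=0)            # shape: (images,)

print(soft_ensemble(snapshot_probs))  # [0 1 2 1]
print(hard_ensemble(snapshot_probs))  # [0 1 2 0]
```

Note that the two strategies can disagree (the last image above): soft voting is sensitive to each snapshot's confidence, while hard voting counts only the winning class per snapshot.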
Original language: English
Article number: e551
Pages (from-to): 1-25
Number of pages: 25
Journal: PeerJ Computer Science
Publication status: Published - 26 May 2021

