
Dataset Information


Self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation.


ABSTRACT: Although deep learning-based computer-aided diagnosis systems have recently achieved expert-level performance, developing a robust model requires large, high-quality data with annotations that are expensive to obtain. This poses a conundrum: chest X-rays collected annually cannot be utilized because they lack labels, especially in deprived areas. In this study, we present a framework named distillation for self-supervision and self-train learning (DISTL), inspired by the learning process of radiologists, which improves the performance of a vision transformer through simultaneous self-supervision and self-training via knowledge distillation. In external validation on data from three hospitals for the diagnosis of tuberculosis, pneumothorax, and COVID-19, DISTL offers gradually improved performance as the amount of unlabeled data increases, eventually exceeding a fully supervised model trained with the same amount of labeled data. We additionally show that the model obtained with DISTL is robust to various real-world nuisances, offering better applicability in clinical settings.
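The abstract describes the DISTL idea only at a high level: a student vision transformer learns from unlabeled chest X-rays by distilling from a teacher (self-supervised consistency) while also self-training on the teacher's confident pseudo-labels. The sketch below is a minimal, hedged illustration of that general recipe, not the authors' code; the backbone, temperatures, loss weights, confidence threshold, and EMA momentum are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of combining knowledge
# distillation with self-training on unlabeled data, as outlined in the
# DISTL abstract. All hyperparameters and module names are assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallBackbone(nn.Module):
    """Stand-in classifier; the paper uses a vision transformer."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(), nn.Linear(224 * 224, 256), nn.GELU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def ema_update(teacher, student, momentum: float = 0.996):
    # Teacher weights track an exponential moving average of the student.
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(momentum).add_(s, alpha=1.0 - momentum)

def distl_step(student, teacher, optimizer, view_a, view_b,
               tau_t: float = 0.04, tau_s: float = 0.1,
               pseudo_thresh: float = 0.9):
    """One unlabeled-data update: distillation loss + confident pseudo-labels."""
    with torch.no_grad():
        t_probs = F.softmax(teacher(view_a) / tau_t, dim=-1)

    s_logits = student(view_b)
    # Self-supervised consistency: student matches the teacher's sharpened
    # prediction made on a different augmented view of the same image.
    distill_loss = F.kl_div(
        F.log_softmax(s_logits / tau_s, dim=-1), t_probs, reduction="batchmean"
    )

    # Self-training: hard pseudo-labels used only where the teacher is confident.
    conf, pseudo = t_probs.max(dim=-1)
    mask = conf > pseudo_thresh
    st_loss = (F.cross_entropy(s_logits[mask], pseudo[mask])
               if mask.any() else s_logits.new_zeros(()))

    loss = distill_loss + st_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)
    return loss.item()

if __name__ == "__main__":
    student = SmallBackbone()
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)
    opt = torch.optim.AdamW(student.parameters(), lr=1e-4)
    # Two augmented views of an unlabeled chest X-ray batch (dummy tensors here).
    xa, xb = torch.randn(8, 1, 224, 224), torch.randn(8, 1, 224, 224)
    print(distl_step(student, teacher, opt, xa, xb))
```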

SUBMITTER: Park S 

PROVIDER: S-EPMC9252561 | biostudies-literature | 2022 Jul

REPOSITORIES: biostudies-literature


Publications

Self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation.

Park Sangjoon S, Kim Gwanghyun G, Oh Yujin Y, Seo Joon Beom JB, Lee Sang Min SM, Kim Jin Hwan JH, Moon Sungjun S, Lim Jae-Kwang JK, Park Chang Min CM, Ye Jong Chul JC

Nature Communications, 2022 Jul 04; 1



Similar Datasets

| S-EPMC10963853 | biostudies-literature
| S-EPMC11232177 | biostudies-literature
| S-EPMC8242586 | biostudies-literature
| S-EPMC7856273 | biostudies-literature
| S-EPMC8583247 | biostudies-literature
| S-EPMC9455583 | biostudies-literature
| S-EPMC10794204 | biostudies-literature
| S-EPMC9018897 | biostudies-literature
| S-EPMC10287612 | biostudies-literature
| S-EPMC11601481 | biostudies-literature