
Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation

AMIA ... Annual Symposium proceedings, 2021, Vol.2021, p.668-676 [Peer Reviewed Journal]

2021 AMIA - All rights reserved.

  • Title:
    Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation
  • Author: Lau, Wilson ; Aaltonen, Laura ; Gunn, Martin ; Yetisgen, Meliha
  • Subjects: Humans ; Language ; Natural Language Processing ; Radiography ; Radiology ; Support Vector Machine
  • Is Part Of: AMIA ... Annual Symposium proceedings, 2021, Vol.2021, p.668-676
  • Description: Selecting a radiology examination protocol is a repetitive and time-consuming process. In this paper, we present a deep learning approach to automatically assign protocols to computed tomography examinations by pre-training a domain-specific BERT model (BERTrad). To handle the high data imbalance across exam protocols, we used a knowledge distillation approach that up-sampled the minority classes through data augmentation. We compared the classification performance of the described approach with n-gram models using Support Vector Machine (SVM), Gradient Boosting Machine (GBM), and Random Forest (RF) classifiers, as well as the BERTbase model. SVM, GBM, and RF achieved macro-averaged F1 scores of 0.45, 0.45, and 0.6, while BERTbase and BERTrad achieved 0.61 and 0.63, respectively. Knowledge distillation boosted performance on the minority classes and achieved an F1 score of 0.66. (An illustrative sketch of the distillation loss follows this record.)
  • Publisher: United States: American Medical Informatics Association
  • Language: English
  • Identifier: EISSN: 1942-597X
    PMID: 35308920
  • Source: GFMER Free Medical Journals
    MEDLINE
    PubMed Central
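
The abstract describes up-sampling minority protocol classes through data augmentation and then training with knowledge distillation. As a minimal illustration only, the sketch below implements a standard soft-label distillation loss (temperature-scaled teacher targets blended with hard-label cross-entropy) in PyTorch; the paper's exact loss, models, and hyperparameters are not given in this record, so the function name, temperature, weighting, and toy data here are all assumptions.

```python
# Minimal sketch of a soft-label knowledge-distillation loss of the kind the
# abstract describes; the paper's exact formulation may differ. The temperature
# and alpha values below are illustrative assumptions, not reported settings.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,  # assumed value
                      alpha: float = 0.5) -> torch.Tensor:  # assumed value
    """Blend a soft-target KL term (teacher) with hard-label cross-entropy."""
    # Soften both distributions with the temperature, then match them with KL.
    soft_preds = F.log_softmax(student_logits / temperature, dim=-1)
    soft_targets = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_preds, soft_targets,
                         reduction="batchmean",
                         log_target=True) * temperature ** 2
    # Standard cross-entropy against the gold protocol labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage: a batch of 4 examples over 3 hypothetical protocol classes.
student = torch.randn(4, 3, requires_grad=True)
teacher = torch.randn(4, 3)
gold = torch.tensor([0, 2, 1, 0])
loss = distillation_loss(student, teacher, gold)
loss.backward()
print(float(loss))
```

The temperature squaring keeps the soft-target gradient magnitude comparable to the hard-label term, which is the usual convention for this style of distillation.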
