Entity-Enriched Neural Models for Clinical Question Answering
Authors | Wei-Hung Weng, Peter Szolovits, Bhanu Pratap Singh Rawat, Preethi Raghavan |
Journal/Conference Name | WS 2020 |
Paper Category | Artificial Intelligence |
Paper Abstract | We explore state-of-the-art neural models for question answering on electronic medical records and improve their ability to generalize better on previously unseen (paraphrased) questions at test time. We enable this by learning to predict logical forms as an auxiliary task along with the main task of answer span detection. The predicted logical forms also serve as a rationale for the answer. Further, we also incorporate medical entity information in these models via the ERNIE architecture. We train our models on the large-scale emrQA dataset and observe that our multi-task entity-enriched models generalize to paraphrased questions ~5% better than the baseline BERT model. |
Date of publication | 2020 |
Code Programming Language | Unspecified |
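The abstract describes a multi-task setup: the main task is answer span detection over the clinical note, and an auxiliary task predicts the question's logical form, with both heads sharing one encoder. The following is a minimal numpy sketch of that joint objective, not the authors' implementation: the random "encoder" output stands in for BERT/ERNIE token representations, and the head weights, template count, and auxiliary-loss weight `lam` are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_entropy(probs, gold_idx):
    # Negative log-likelihood of the gold label under the predicted distribution.
    return -np.log(probs[gold_idx] + 1e-12)

# Stand-in for the shared encoder (BERT/ERNIE): one vector per token.
seq_len, hidden = 12, 16
H = rng.standard_normal((seq_len, hidden))

# Main task: answer span detection via start/end pointers over tokens.
w_start = rng.standard_normal(hidden)
w_end = rng.standard_normal(hidden)
p_start = softmax(H @ w_start)
p_end = softmax(H @ w_end)

# Auxiliary task: classify the question's logical-form template
# from a pooled sequence representation.
n_templates = 5  # hypothetical number of logical-form templates
W_lf = rng.standard_normal((hidden, n_templates))
p_lf = softmax(H.mean(axis=0) @ W_lf)

# Joint loss: span loss plus a weighted logical-form loss.
gold_start, gold_end, gold_lf = 3, 5, 2  # toy gold labels
lam = 0.5  # hypothetical weight on the auxiliary task
loss = (cross_entropy(p_start, gold_start)
        + cross_entropy(p_end, gold_end)
        + lam * cross_entropy(p_lf, gold_lf))
print(f"joint loss: {loss:.4f}")
```

In training, gradients from both cross-entropy terms flow into the shared encoder, which is the mechanism the abstract credits for better generalization to paraphrased questions; the predicted logical form doubles as a rationale for the answer.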