Title
Scalable few-shot learning of robust biomedical name representations

Author

Abstract
Recent research on robust representations of biomedical names has focused on modeling large amounts of fine-grained conceptual distinctions using complex neural encoders. In this paper, we explore the opposite paradigm: training a simple encoder architecture using only small sets of names sampled from high-level biomedical concepts. Our encoder post-processes pretrained representations of biomedical names, and is effective for various types of input representations, whether domain-specific or unsupervised. We validate our proposed few-shot learning approach on multiple biomedical relatedness benchmarks, and show that it allows for continual learning, where we accumulate information from various conceptual hierarchies to consistently improve encoder performance. Given these findings, we propose our approach as a low-cost alternative for exploring the impact of conceptual distinctions on robust biomedical name representations.

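Illustration (not part of the published record): the sketch below shows, in rough terms, the kind of setup the abstract describes, a small feed-forward encoder that post-processes pretrained name vectors and is trained few-shot so that names sampled from the same high-level concept are pulled together. The class name, dimensions, triplet loss, and toy data are all placeholder assumptions for illustration, not the authors' implementation.

# Hypothetical sketch (not the paper's code): a small feed-forward encoder that
# post-processes pretrained biomedical name vectors, trained few-shot with a
# triplet loss so names from the same high-level concept cluster together.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PostProcessingEncoder(nn.Module):
    """Maps pretrained name vectors into a space shaped by concept membership."""

    def __init__(self, dim_in: int = 768, dim_out: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, dim_out),
            nn.ReLU(),
            nn.Linear(dim_out, dim_out),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalize so cosine similarity can be used downstream.
        return F.normalize(self.net(x), dim=-1)


# Toy few-shot setup: a handful of names per high-level concept, each name
# represented by a (placeholder) pretrained vector.
n_concepts, shots, dim_in = 10, 5, 768
pretrained = torch.randn(n_concepts, shots, dim_in)  # stand-in for real name vectors

encoder = PostProcessingEncoder(dim_in)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
triplet = nn.TripletMarginLoss(margin=0.2)

for step in range(200):
    # Anchor and positive come from the same concept, the negative from another.
    c_pos = torch.randint(n_concepts, (1,)).item()
    c_neg = (c_pos + torch.randint(1, n_concepts, (1,)).item()) % n_concepts
    i, j = torch.randperm(shots)[:2].tolist()
    k = torch.randint(shots, (1,)).item()

    anchor = encoder(pretrained[c_pos, i].unsqueeze(0))
    positive = encoder(pretrained[c_pos, j].unsqueeze(0))
    negative = encoder(pretrained[c_neg, k].unsqueeze(0))

    loss = triplet(anchor, positive, negative)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
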
Language
English

Source (book)
Proceedings of the 20th Workshop on Biomedical Language Processing

Publication
Association for Computational Linguistics, 2021

ISBN
978-1-954085-40-4

DOI
10.18653/v1/2021.bionlp-1.3

Volume/pages
p. 23-29