Concept information
Preferred term
UmlsBERT
Definition(s)
- "contextual embedding model capable of integrating domain knowledge during pre-training. It was trained on biomedical corpora and uses the Unified Medical Language System (UMLS) clinical metathesaurus" (source: https://github.com/gmichalo/UmlsBERT)
Broader concept(s)
Bibliographic citation(s)
- Michalopoulos, G., Wang, Y., Kaka, H., Chen, H., & Wong, A. (2021). UmlsBERT: Clinical domain knowledge augmentation of contextual embeddings using the Unified Medical Language System Metathesaurus. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 1744-1753. doi:10.18653/v1/2021.naacl-main.139
based on
has design country
- Canada
has for input language
has repository
implements
is encoded in
is executed in
has for license
In other languages
- French
URI
http://data.loterre.fr/ark:/67375/LTK-LS5TZ62L-B