LOTERRE
Concept information

Preferred term

UmlsBERT  

Definition(s)

  • "contextual embedding model capable of integrating domain knowledge during pre-training. It was trained on biomedical corpora and uses the Unified Medical Language System (UMLS) clinical metathesaurus" (source: https://github.com/gmichalo/UmlsBERT)

Bibliographic citation(s)

  • Michalopoulos, G., Wang, Y., Kaka, H., Chen, H., & Wong, A. (2021). UmlsBERT: Clinical domain knowledge augmentation of contextual embeddings using the Unified Medical Language System Metathesaurus. Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 1744–1753. https://doi.org/10.18653/v1/2021.naacl-main.139

has design country

  • Canada

URI

http://data.loterre.fr/ark:/67375/LTK-LS5TZ62L-B

Last modified: 6/22/23