LOTERRE

Concept information

Preferred term

SapBERT  

Definition(s)

  • "Pre-trained model that self-aligns the representation space of biomedical entities." (Liu et al., 2021, p. 4228). See the usage sketch below.
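
A minimal usage sketch of what this definition describes: embedding biomedical entity names with the model and comparing them by cosine similarity, so that synonymous surface forms land close together in the self-aligned space. It assumes the publicly released checkpoint name on the Hugging Face Hub (cambridgeltl/SapBERT-from-PubMedBERT-fulltext) and the transformers/PyTorch libraries; none of these details appear in this record.

import torch
from transformers import AutoModel, AutoTokenizer

# Assumed checkpoint name; not stated in this concept record.
MODEL_ID = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# Two synonymous names and one unrelated name.
names = ["covid-19", "coronavirus infection", "myocardial infarction"]
inputs = tokenizer(names, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Use the [CLS] token output as the entity representation (one vector per name).
embeddings = outputs.last_hidden_state[:, 0, :]

# Cosine similarity of "covid-19" against the other two names; the synonym
# pair is expected to score higher than the unrelated pair.
sims = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1:], dim=-1)
print(dict(zip(names[1:], sims.tolist())))
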

Broader concept(s)

Bibliographic citation(s)

  • Liu, F., Shareghi, E., Meng, Z., Basaldella, M., & Collier, N. (2021). Self-alignment pretraining for biomedical entity representations. In K. Toutanova, A. Rumshisky, L. Zettlemoyer, D. Hakkani-Tur, I. Beltagy, S. Bethard, R. Cotterell, T. Chakraborty, & Y. Zhou (Eds.), Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 4228–4238). Association for Computational Linguistics. doi:10.18653/v1/2021.naacl-main.334

based on

has design country

  • United States
  • United Kingdom

has for input language

is an application of

implements

is encoded in

has for license

In other languages

  • English: Self-aligning pretrained BERT

URI

http://data.loterre.fr/ark:/67375/LTK-W63LC46L-T

Last modified: 6/20/24