Concept information

Preferred term

BioMed-RoBERTa  

Definition(s)

  • A RoBERTa language model pretrained on biomedical texts.
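
A minimal usage sketch (not part of the Loterre record): loading the model for masked-token prediction with the Hugging Face transformers library. The Hub identifier allenai/biomed_roberta_base and the example sentence are assumptions based on the model's authors (Gururangan et al., 2020), not on this record.

    # Sketch: load BioMed-RoBERTa for masked-token prediction.
    # The Hub identifier "allenai/biomed_roberta_base" is an assumption,
    # not confirmed by this concept record.
    from transformers import pipeline

    fill = pipeline("fill-mask", model="allenai/biomed_roberta_base")

    # Example biomedical sentence; <mask> is RoBERTa's mask token.
    for pred in fill("The patient was treated with <mask> for hypertension."):
        print(pred["token_str"], round(pred["score"], 3))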

Broader concept(s)

Bibliographic citation(s)

  • Gururangan, S., Marasović, A., Swayamdipta, S., Lo, K., Beltagy, I., Downey, D., & Smith, N. A. (2020). Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks. arXiv:2004.10964 [cs]. http://arxiv.org/abs/2004.10964

based on

  • RoBERTa

has application field

has design country

  • United States

has for input language

implements

is encoded in

is executed in

URI

http://data.loterre.fr/ark:/67375/LTK-TNGSLL0L-5

Download this concept: RDF/XML | TURTLE | JSON-LD

Last modified: 6/20/24
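
A minimal sketch of fetching one of these serializations programmatically, assuming the Loterre endpoint honours HTTP content negotiation on the ARK URI (an assumption; the page only advertises the three download formats):

    import requests

    # Request the Turtle serialization of the concept, assuming the ARK URI
    # supports content negotiation (not confirmed by this page).
    uri = "http://data.loterre.fr/ark:/67375/LTK-TNGSLL0L-5"
    resp = requests.get(uri, headers={"Accept": "text/turtle"})
    resp.raise_for_status()
    print(resp.text)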