LOTERRE
Concept information

Preferred term

DistilBERT  

Definition(s)

  • "a general-purpose pre-trained version of BERT, 40% smaller, 60% faster, that retains 97% of the language understanding capabilities." (Sahn et al., 2020 ).

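The following is a minimal usage sketch, not part of the Loterre record: it loads DistilBERT through the Hugging Face transformers library. The checkpoint name "distilbert-base-uncased" is one of the publicly released distilled checkpoints and is assumed here purely for illustration.

# Minimal sketch (illustrative, assumes the transformers library and the
# "distilbert-base-uncased" checkpoint are available).
from transformers import DistilBertTokenizer, DistilBertModel

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained("distilbert-base-uncased")

# Encode a sentence and obtain contextual embeddings from the distilled encoder.
inputs = tokenizer("DistilBERT is a distilled version of BERT.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
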
Broader concept(s)

Synonym(s)

  • Distilled BERT

Bibliographic citation(s)

  • Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2020). DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter. https://arxiv.org/abs/1910.01108v4

based on

has application field

has design country

  • United States

has for input language

implements

is encoded in

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-ZK7T7Z7V-N

Last modified 6/20/24