LOTERRE

Concept information

Preferred term

DistilBERT

Definition(s)

  • "a general-purpose pre-trained version of BERT, 40% smaller, 60% faster, that retains 97% of the language understanding capabilities." (Sanh et al., 2020)
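DistilBERT is produced by knowledge distillation: a smaller student model is trained to match the temperature-softened output distribution of the BERT teacher. A minimal sketch of the soft-target loss is below; the function names and the temperature value are illustrative, not taken from this record.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target cross-entropy: the student is penalized for
    diverging from the teacher's softened probability distribution."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(p_teacher, p_student))
```

In the actual training objective this soft-target term is combined with the usual masked-language-modeling loss; the sketch shows only the distillation component.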

Synonym(s)

  • Distilled BERT

has design country

  • United States

URI

http://data.loterre.fr/ark:/67375/LTK-ZK7T7Z7V-N

Last modified 6/22/23