LOTERRE
Concept information

Preferred term

ELECTRA  

Definition(s)

  • Pre-trained language model whose pre-training corrupts the input "by replacing some tokens with plausible alternatives sampled from a small generator network" and then trains the model as a discriminator to predict whether each token was replaced or left unchanged (Clark et al., 2020); see the sketch below.
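
A minimal, non-authoritative sketch of this replaced-token-detection setup follows. It assumes the Hugging Face transformers library and the publicly released google/electra-small-discriminator checkpoint (neither is part of the Loterre record): the discriminator head emits one score per input token, and a positive score is read as "this token was replaced by a generator sample".

    import torch
    from transformers import ElectraForPreTraining, ElectraTokenizerFast

    # Assumed checkpoint: the small ELECTRA discriminator released with the paper.
    NAME = "google/electra-small-discriminator"
    tokenizer = ElectraTokenizerFast.from_pretrained(NAME)
    model = ElectraForPreTraining.from_pretrained(NAME)

    sentence = "The chef cooked the meal"
    inputs = tokenizer(sentence, return_tensors="pt")

    with torch.no_grad():
        # One logit per token; a positive logit is interpreted as "replaced".
        logits = model(**inputs).logits

    flags = (logits > 0).long().squeeze().tolist()
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    for token, flag in zip(tokens, flags):
        print(f"{token:12s} {'replaced' if flag else 'original'}")

During pre-training this discriminator is trained jointly with the small generator that produces the replacements; per Clark et al. (2020), the generator is discarded afterwards and only the discriminator is fine-tuned on downstream tasks.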

Broader concept(s)

Synonym(s)

  • Efficiently Learning an Encoder that Classifies Token Replacements Accurately

Bibliographic citation(s)

  • Clark, K., Luong, M.-T., Le, Q. V., & Manning, C. D. (2020). ELECTRA: Pre-training text encoders as discriminators rather than generators. arXiv:2003.10555 [cs]. http://arxiv.org/abs/2003.10555

has application field

has design country

  • United States

has for input language

implements

is encoded in

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-QB1X83XF-5

Download this concept: RDF/XML, TURTLE, JSON-LD

Last modified: 6/20/24
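
The concept record itself can be retrieved in the serializations listed above. As a hedged sketch, the snippet below assumes the Loterre server honours standard HTTP content negotiation on the ARK URI (an assumption, not stated in this record); if it does not, the page's download links can be used instead.

    import requests

    CONCEPT_URI = "http://data.loterre.fr/ark:/67375/LTK-QB1X83XF-5"

    # Ask for JSON-LD; "text/turtle" or "application/rdf+xml" would target the
    # other listed serializations, assuming content negotiation is supported.
    response = requests.get(
        CONCEPT_URI,
        headers={"Accept": "application/ld+json"},
        timeout=30,
    )
    response.raise_for_status()
    print(response.text)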