LOTERRE


Concept information

Preferred term

OPT  

Definition(s)

  • A suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters, intended to be shared fully and responsibly with researchers (after Zhang et al., 2022).

Synonym(s)

  • Open Pre-trained Transformers

Bibliographic citation(s)

  • Zhang, S., Roller, S., Goyal, N., Artetxe, M., Chen, M., Chen, S., Dewan, C., Diab, M., Li, X., Lin, X. V., Mihaylov, T., Ott, M., Shleifer, S., Shuster, K., Simig, D., Koura, P. S., Sridhar, A., Wang, T., & Zettlemoyer, L. (2022). OPT: Open Pre-trained Transformer Language Models (arXiv:2205.01068). arXiv. doi:10.48550/arXiv.2205.01068

has design country

  • United States

URI

http://data.loterre.fr/ark:/67375/LTK-CQ6NWPMQ-3

Download this concept: RDF/XML, Turtle, JSON-LD

Last modified: 6/20/24
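Since the concept is published at a dereferenceable ARK URI in several RDF serializations, it can be fetched programmatically via HTTP content negotiation. A minimal sketch, assuming Loterre honors standard `Accept` headers for the listed formats (the MIME types below are the conventional ones, not confirmed by this page):

```python
import urllib.request

CONCEPT_URI = "http://data.loterre.fr/ark:/67375/LTK-CQ6NWPMQ-3"

# Conventional MIME types for the serializations listed above (assumption).
FORMATS = {
    "RDF/XML": "application/rdf+xml",
    "Turtle": "text/turtle",
    "JSON-LD": "application/ld+json",
}

def build_concept_request(uri: str, mime: str = "application/ld+json") -> urllib.request.Request:
    """Build a content-negotiation request for a Loterre concept.

    The Accept header selects the serialization; pass the request to
    urllib.request.urlopen() to actually download the data.
    """
    return urllib.request.Request(uri, headers={"Accept": mime})

req = build_concept_request(CONCEPT_URI, FORMATS["Turtle"])
print(req.full_url)            # → http://data.loterre.fr/ark:/67375/LTK-CQ6NWPMQ-3
print(req.get_header("Accept"))  # → text/turtle
```

The request object is built separately from the network call, so the negotiation logic can be inspected or tested without hitting the server.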