LOTERRE

Concept information

Preferred term

MiniLM  

Definition(s)

  • "simple and fast pre-trained models for language understanding and generation" (source: https://github.com/microsoft/unilm/tree/master/minilm)
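As a rough illustration of how such a model is typically used, the sketch below loads a MiniLM checkpoint and encodes one sentence with the Hugging Face transformers library. The checkpoint name "microsoft/MiniLM-L12-H384-uncased" is an assumption drawn from the repository linked above, not part of this entry.

```python
# Minimal usage sketch (assumes: pip install torch transformers, and that the
# checkpoint "microsoft/MiniLM-L12-H384-uncased" is available on the Hugging Face Hub).
from transformers import AutoModel, AutoTokenizer

model_name = "microsoft/MiniLM-L12-H384-uncased"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer(
    "MiniLM is a small, fast pre-trained language model.",
    return_tensors="pt",
)
outputs = model(**inputs)
# Contextual token embeddings: (batch, sequence_length, hidden_size=384 for this checkpoint)
print(outputs.last_hidden_state.shape)
```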

Broader concept(s)

Bibliographic citation(s)

  • Wang, W., Wei, F., Dong, L., Bao, H., Yang, N., & Zhou, M. (2020). MiniLM: Deep self-attention distillation for task-agnostic compression of pre-trained transformers. arXiv:2002.10957 [cs]. http://arxiv.org/abs/2002.10957
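The cited paper compresses a large Transformer by having a small student mimic the self-attention behaviour of the teacher's last layer. The PyTorch sketch below is a minimal reading of that "deep self-attention distillation" objective, under stated assumptions: one KL term on the attention distributions and one on the value relations, computed on the last layer only. Function and argument names are illustrative, not taken from the paper's released code.

```python
import torch
import torch.nn.functional as F

def relation_logits(x, y):
    """Scaled dot-product relation logits: x @ y^T / sqrt(head_dim)."""
    return torch.matmul(x, y.transpose(-1, -2)) / x.size(-1) ** 0.5

def minilm_distillation_loss(t_q, t_k, t_v, s_q, s_k, s_v):
    """Sketch of deep self-attention distillation (last layer only).

    All tensors are per-head projections of shape [batch, num_heads, seq_len, head_dim];
    teacher and student must use the same number of heads, but head_dim may differ.
    """
    # (1) Attention-distribution transfer: KL between teacher and student softmax(QK^T / sqrt(d))
    t_attn = F.softmax(relation_logits(t_q, t_k), dim=-1)
    s_log_attn = F.log_softmax(relation_logits(s_q, s_k), dim=-1)
    l_at = F.kl_div(s_log_attn, t_attn, reduction="batchmean")

    # (2) Value-relation transfer: same KL on softmax(VV^T / sqrt(d))
    t_vr = F.softmax(relation_logits(t_v, t_v), dim=-1)
    s_log_vr = F.log_softmax(relation_logits(s_v, s_v), dim=-1)
    l_vr = F.kl_div(s_log_vr, t_vr, reduction="batchmean")

    return l_at + l_vr
```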

base of

has application field

has for input language

implements

is encoded in

is executed in

has for license

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-NV2360BM-D

Last modified: 6/22/23