Thomas Wolf
Co-founder at Hugging Face
Verified email at polytechnique.edu
Title · Cited by · Year
(Counts marked * may include citations to merged articles.)
Transformers: State-of-the-art natural language processing
T Wolf, L Debut, V Sanh, J Chaumond, C Delangue, A Moi, P Cistac, ...
Proceedings of the 2020 conference on empirical methods in natural language …, 2020
Cited by 12950* · 2020
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
V Sanh, L Debut, J Chaumond, T Wolf
arXiv preprint arXiv:1910.01108, 2019
Cited by 6205* · 2019
Multitask prompted training enables zero-shot task generalization
V Sanh, A Webson, C Raffel, SH Bach, L Sutawika, Z Alyafeai, A Chaffin, ...
arXiv preprint arXiv:2110.08207, 2021
Cited by 1180 · 2021
BLOOM: A 176B-parameter open-access multilingual language model
T Le Scao, A Fan, C Akiki, E Pavlick, S Ilić, D Hesslow, R Castagné, ...
arXiv preprint arXiv:2211.05100, 2022
Cited by 1095 · 2022
Transfer learning in natural language processing
S Ruder, ME Peters, S Swayamdipta, T Wolf
Proceedings of the 2019 conference of the North American chapter of the …, 2019
Cited by 639 · 2019
TransferTransfo: A transfer learning approach for neural network based conversational agents
T Wolf, V Sanh, J Chaumond, C Delangue
arXiv preprint arXiv:1901.08149, 2019
Cited by 491 · 2019
Datasets: A community library for natural language processing
Q Lhoest, AV del Moral, Y Jernite, A Thakur, P von Platen, S Patil, ...
arXiv preprint arXiv:2109.02846, 2021
Cited by 414* · 2021
Movement pruning: Adaptive sparsity by fine-tuning
V Sanh, T Wolf, A Rush
Advances in neural information processing systems 33, 20378-20389, 2020
Cited by 355 · 2020
Two-dimensional superconductivity at a Mott insulator/band insulator interface LaTiO3/SrTiO3
J Biscaras, N Bergeal, A Kushwaha, T Wolf, A Rastogi, RC Budhani, ...
Nature communications 1 (1), 89, 2010
Cited by 336 · 2010
StarCoder: may the source be with you!
R Li, LB Allal, Y Zi, N Muennighoff, D Kocetkov, C Mou, M Marone, C Akiki, ...
arXiv preprint arXiv:2305.06161, 2023
Cited by 298* · 2023
A hierarchical multi-task approach for learning embeddings from semantic tasks
V Sanh, T Wolf, S Ruder
Proceedings of the AAAI conference on artificial intelligence 33 (01), 6949-6956, 2019
Cited by 259 · 2019
Natural language processing with transformers
L Tunstall, L Von Werra, T Wolf
" O'Reilly Media, Inc.", 2022
Cited by 229 · 2022
Diffusers: State-of-the-art diffusion models
P Von Platen, S Patil, A Lozhkov, P Cuenca, N Lambert, K Rasul, ...
GitHub repository, https://github.com/huggingface/diffusers, 2022
Cited by 193 · 2022
Zephyr: Direct distillation of LM alignment
L Tunstall, E Beeching, N Lambert, N Rajani, K Rasul, Y Belkada, ...
arXiv preprint arXiv:2310.16944, 2023
Cited by 115 · 2023
The Stack: 3 TB of permissively licensed source code
D Kocetkov, R Li, LB Allal, J Li, C Mou, CM Ferrandis, Y Jernite, M Mitchell, ...
arXiv preprint arXiv:2211.15533, 2022
Cited by 105 · 2022
Large-scale transfer learning for natural language generation
S Golovanov, R Kurbanov, S Nikolenko, K Truskovskyi, A Tselousov, ...
Proceedings of the 57th Annual Meeting of the Association for Computational …, 2019
Cited by 100 · 2019
Open LLM Leaderboard
E Beeching, C Fourrier, N Habib, S Han, N Lambert, N Rajani, ...
Hugging Face, 2023
Cited by 98 · 2023