Huggingface's Transformers: State-of-the-art Natural Language Processing

Wolf Thomas, Debut Lysandre, Sanh Victor, Chaumond Julien, Delangue Clement, Moi Anthony, Cistac Pierric, Rault Tim, Louf Rémi, Funtowicz Morgan, Davison Joe, Shleifer Sam, Von Platen Patrick, Ma Clara, Jernite Yacine, Plu Julien, Xu Canwen, Scao Teven Le, Gugger Sylvain, Drame Mariama, Lhoest Quentin, Rush Alexander M. arXiv 2019

[Paper] [Code]
Tags: Applications, Has Code, Model Architecture, Pretraining Methods, Reinforcement Learning, Tools, Training Techniques, Transformer

Recent progress in natural language processing has been driven by advances in both model architecture and model pretraining. Transformer architectures have facilitated building higher-capacity models, and pretraining has made it possible to effectively utilize this capacity for a wide variety of tasks. Transformers is an open-source library with the goal of opening up these advances to the wider machine learning community. The library consists of carefully engineered state-of-the-art Transformer architectures under a unified API. Backing this library is a curated collection of pretrained models made by and available for the community. Transformers is designed to be extensible by researchers, simple for practitioners, and fast and robust in industrial deployments. The library is available at https://github.com/huggingface/transformers.
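As a minimal sketch of the unified API the abstract describes (not taken from the paper itself; the checkpoint name is an example choice, and any model from the community hub could be substituted), loading a pretrained model and its tokenizer and running inference looks roughly like this:

```python
# Illustrative sketch: the library's Auto* classes resolve the right
# architecture and tokenizer from a single checkpoint name.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Example checkpoint; any compatible model on the hub works the same way.
model_name = "distilbert-base-uncased-finetuned-sst-2-english"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Tokenize an input sentence and run a forward pass.
inputs = tokenizer(
    "Transformers makes pretrained models easy to use.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to a human-readable label.
predicted = model.config.id2label[logits.argmax(dim=-1).item()]
print(predicted)
```

The same two `from_pretrained` calls work across architectures, which is the point of the unified API: swapping models is a one-string change rather than a code rewrite.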

Similar Work