Torchscale: Transformers At Scale

Ma Shuming, Wang Hongyu, Huang Shaohan, Wang Wenhui, Chi Zewen, Dong Li, Benhaim Alon, Patra Barun, Chaudhary Vishrav, Song Xia, Wei Furu. arXiv 2022

[Paper]    
Tags: Applications, Efficiency And Optimization, Language Modeling, Large Scale Training, Model Architecture, Pretraining Methods, Tools, Training Techniques, Transformer

Large Transformers have achieved state-of-the-art performance across many tasks. Most open-source libraries for scaling Transformers focus on improving training or inference with better parallelization. In this work, we present TorchScale, an open-source toolkit that allows researchers and developers to scale up Transformers efficiently and effectively. TorchScale implements several modeling techniques that improve modeling generality and capability, as well as training stability and efficiency. Experimental results on language modeling and neural machine translation demonstrate that TorchScale can successfully scale Transformers to different sizes without tears. The library is available at https://aka.ms/torchscale.
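Since this page only summarizes the toolkit, the snippet below is a minimal sketch of how a model might be instantiated with TorchScale's config-driven API, following the usage pattern shown in the project repository. The `deepnorm` flag is included as an assumed example of the library's stability-oriented options; exact configuration fields may differ across versions.

```python
# Minimal sketch of building a decoder-only Transformer with TorchScale.
# Assumes torchscale is installed (pip install torchscale); field names such
# as vocab_size and deepnorm follow the repository's documented examples and
# may vary between releases.
from torchscale.architecture.config import DecoderConfig
from torchscale.architecture.decoder import Decoder

# deepnorm enables DeepNet-style normalization, one of the techniques the
# toolkit provides for stable large-scale training (assumed config flag).
config = DecoderConfig(vocab_size=64000, deepnorm=True)
model = Decoder(config)
print(model)
```

Encoder and encoder-decoder variants follow the same pattern via `EncoderConfig`/`Encoder` and `EncoderDecoderConfig`/`EncoderDecoder`, so switching architectures is a matter of swapping the config class rather than rewriting model code.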

Similar Work