
TinyLlama: An Open-Source Small Language Model

Zhang Peiyuan, Zeng Guangtao, Wang Tianduo, Lu Wei. arXiv 2024


We present TinyLlama, a compact 1.1B-parameter language model pretrained on around 1 trillion tokens for approximately 3 epochs. Building on the architecture and tokenizer of Llama 2, TinyLlama leverages advances contributed by the open-source community (e.g., FlashAttention and Lit-GPT) to achieve better computational efficiency. Despite its relatively small size, TinyLlama demonstrates remarkable performance on a range of downstream tasks, significantly outperforming existing open-source language models of comparable size. Our model checkpoints and code are publicly available on GitHub at https://github.com/jzhang38/TinyLlama.
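Because TinyLlama keeps the Llama 2 architecture and tokenizer, the released checkpoints can be used as drop-in causal language models with standard tooling. Below is a minimal sketch of loading and sampling from a checkpoint with the Hugging Face `transformers` library; the hub id is an assumption based on the project's public releases, not something stated in the abstract, so check the GitHub repository for the exact checkpoint names.

```python
# Minimal sketch: load a TinyLlama checkpoint and generate text.
# Assumption: the checkpoint is mirrored on the Hugging Face Hub under
# the id below; see https://github.com/jzhang38/TinyLlama for the
# authoritative list of released checkpoints.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)  # Llama 2 tokenizer
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a prompt and sample a short continuation.
inputs = tokenizer("TinyLlama is a compact language model that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the model shares Llama 2's tokenizer and layer layout, it also works with Llama-compatible inference stacks without architecture-specific changes.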
