
Speculative Contrastive Decoding

Yuan Hongyi, Lu Keming, Huang Fei, Yuan Zheng, Zhou Chang. arXiv 2023

[Paper]    
Efficiency And Optimization · Ethics And Bias · RAG

Large language models (LLMs) exhibit exceptional performance on language tasks, yet their auto-regressive inference is limited by high computational requirements and is sub-optimal due to exposure bias. Inspired by speculative decoding and contrastive decoding, we introduce Speculative Contrastive Decoding (SCD), a straightforward yet powerful decoding approach that leverages predictions from smaller language models (LMs) to achieve both decoding acceleration and quality improvement. Extensive evaluations and analyses on four diverse language tasks demonstrate the effectiveness of SCD, showing that decoding efficiency and quality can jointly benefit from one smaller LM.
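The idea combines two uses of the same small LM: it drafts several tokens cheaply (as in speculative decoding), and its predictions are contrasted against the large LM's to form the acceptance target (as in contrastive decoding). Below is a minimal, illustrative sketch in Python. The toy stand-in models, the plausibility threshold `alpha`, the contrastive weight `beta`, and the draft length `gamma` are assumptions for illustration, not the paper's released implementation.

```python
# Minimal sketch of a speculative-contrastive decoding step. The "models"
# here are random toy distributions standing in for real LM forward passes;
# alpha, beta, and gamma are illustrative hyperparameters (assumptions).
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 16  # toy vocabulary size


def amateur_probs(context):
    """Stand-in for the small LM: next-token distribution given a context."""
    logits = rng.standard_normal(VOCAB) + 0.1 * len(context)
    e = np.exp(logits - logits.max())
    return e / e.sum()


def expert_probs(context):
    """Stand-in for the large LM (in practice, one forward pass scores
    all drafted positions at once)."""
    logits = rng.standard_normal(VOCAB) + 0.2 * len(context)
    e = np.exp(logits - logits.max())
    return e / e.sum()


def contrastive_probs(p_exp, p_ama, alpha=0.1, beta=0.5):
    """Contrastive target: boost tokens the expert prefers over the amateur,
    restricted to tokens the expert itself finds plausible."""
    mask = p_exp >= alpha * p_exp.max()           # plausibility constraint
    scores = np.log(p_exp) - beta * np.log(p_ama)
    scores[~mask] = -np.inf
    e = np.exp(scores - scores[mask].max())
    return e / e.sum()


def scd_step(context, gamma=4):
    """Draft gamma tokens with the amateur, then accept/reject each against
    the contrastive distribution (standard speculative-sampling rule)."""
    drafts, p_amas = [], []
    ctx = list(context)
    for _ in range(gamma):
        p = amateur_probs(ctx)
        tok = int(rng.choice(VOCAB, p=p))
        drafts.append(tok)
        p_amas.append(p)
        ctx.append(tok)

    accepted = []
    for i, tok in enumerate(drafts):
        p_exp = expert_probs(list(context) + accepted)
        p_tgt = contrastive_probs(p_exp, p_amas[i])
        if rng.random() < min(1.0, p_tgt[tok] / p_amas[i][tok]):
            accepted.append(tok)                  # draft token kept
        else:
            # Resample from the residual so outputs still follow p_tgt.
            resid = np.maximum(p_tgt - p_amas[i], 0)
            resid = resid / resid.sum() if resid.sum() > 0 else p_tgt
            accepted.append(int(rng.choice(VOCAB, p=resid)))
            break                                 # rejection ends the round
    return accepted


print(scd_step(context=[1, 2, 3]))
```

The residual resample on rejection is what makes the accepted sequence distributed according to the contrastive target rather than the amateur's draft distribution; that is the standard speculative-sampling guarantee, here applied with a contrastive target instead of the large LM's raw distribution.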

Similar Work