TPPoet: Transformer-Based Persian Poem Generation Using Minimal Data and Advanced Decoding Techniques

Panahandeh Amir, Asemi Hanie, Nourani Esmaeil. arXiv 2023

[Paper]    
Model Architecture · Pretraining Methods · Reinforcement Learning · Training Techniques · Transformer

Recent advances in language models (LMs) have demonstrated significant efficacy in tasks related to the arts and humanities. While LMs have exhibited exceptional performance across a wide range of natural language processing tasks, notable challenges remain in applying them to small datasets and in their ability to replicate more creative human capacities. In this study, we address these challenges by training a Persian classical poetry generation model using a transformer architecture on a specialized dataset, with no pretraining. Additionally, we propose a novel decoding method to enhance coherence and meaningfulness in the generated poetry, effectively managing the tradeoff between diversity and quality. The results of our training approach and the proposed decoding method are evaluated through a comprehensive set of automatic and human evaluations, which demonstrate a superior capability to generate coherent and meaningful poetry compared to other decoding methods and an existing Persian large language model (LLM).
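The paper's own decoding method is not detailed in this summary, but the diversity-quality tradeoff it targets is the standard axis along which common decoding strategies differ. Below is a minimal, illustrative Python sketch (not the authors' method) contrasting greedy decoding, top-k sampling, and nucleus (top-p) sampling over a single next-token logits vector; all function names and parameter values are assumptions for illustration.

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()          # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

def greedy(logits):
    # Always pick the most likely token: high local quality, no diversity,
    # and prone to repetitive output over long generations.
    return int(np.argmax(logits))

def sample_top_k(logits, k=50, temperature=1.0, rng=None):
    # Restrict sampling to the k most likely tokens: bounds how unlikely a
    # sampled token can be while still allowing variation between runs.
    rng = rng or np.random.default_rng()
    scaled = logits / temperature
    top = np.argpartition(scaled, -k)[-k:]     # indices of the k largest logits
    probs = softmax(scaled[top])
    return int(top[rng.choice(len(top), p=probs)])

def sample_top_p(logits, p=0.9, temperature=1.0, rng=None):
    # Nucleus sampling: keep the smallest set of tokens whose cumulative
    # probability exceeds p, so the candidate pool shrinks when the model
    # is confident and widens when it is not.
    rng = rng or np.random.default_rng()
    probs = softmax(logits / temperature)
    order = np.argsort(probs)[::-1]            # tokens by descending probability
    cum = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cum, p)) + 1  # size of the nucleus
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()     # renormalize over the nucleus
    return int(keep[rng.choice(len(keep), p=kept)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=32000)            # hypothetical vocabulary-sized logits
    print(greedy(logits), sample_top_k(logits, rng=rng), sample_top_p(logits, rng=rng))
```

Lowering k, p, or the temperature pushes output toward the safe, repetitive end of the spectrum; raising them increases diversity at the cost of coherence, which is the tradeoff the paper's decoding method aims to manage.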
