ProSwitch: Knowledge-Guided Instruction Tuning to Generate Professional and Non-Professional Styled Text

Zong Chang, Chen Yuyan, Lu Weiming, Shao Jian, Zhuang Yueting. arXiv 2024

[Paper]    
Applications · Fine Tuning · Language Modeling · Pretraining Methods · Training Techniques

Large Language Models (LLMs) have demonstrated efficacy in various linguistic applications, including text summarization and controlled text generation. However, their capacity to switch between styles via fine-tuning remains underexplored. This study concentrates on textual professionalism and introduces a novel methodology, named ProSwitch, which equips a language model with the ability to produce both professional and non-professional responses through knowledge-guided instruction tuning. ProSwitch unfolds across three phases: data preparation, which gathers domain knowledge and a training corpus; instruction tuning, which optimizes language models with multiple levels of instruction formats; and comprehensive evaluation, which assesses the professionalism discrimination and reference-based quality of the generated text. Comparative analysis of ProSwitch against both general and specialized language models shows that our approach outperforms the baselines in switching between professional and non-professional text generation.
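The abstract describes building style-conditioned instruction-tuning data. As a rough illustration, the sketch below assembles one knowledge-guided training record whose instruction requests either a professional or a non-professional answer; the Alpaca-style instruction/input/output fields, prompt wording, and example content are illustrative assumptions, not the format used in the paper.

```python
# Hypothetical sketch of one knowledge-guided, style-switch instruction-tuning record.
# Field names, prompt templates, and the knowledge snippet are assumptions for
# illustration only, not the ProSwitch paper's actual data format.

import json


def build_example(question: str, knowledge: str, answer: str, professional: bool) -> dict:
    """Pack a QA pair into an instruction-tuning record whose instruction
    asks for either a professional or a non-professional style of answer."""
    style = "professional" if professional else "non-professional"
    instruction = (
        f"Using the reference knowledge below, answer the question in a {style} style.\n\n"
        f"Knowledge: {knowledge}"
    )
    return {"instruction": instruction, "input": question, "output": answer}


if __name__ == "__main__":
    record = build_example(
        question="What does an elevated ALT level indicate?",
        knowledge="ALT is a liver enzyme; elevated levels may indicate hepatocellular injury.",
        answer="Elevated alanine aminotransferase is commonly associated with hepatocellular injury.",
        professional=True,
    )
    print(json.dumps(record, indent=2))
```

Under this assumed setup, the same question and knowledge snippet would appear twice in the corpus, once paired with a professional reference answer and once with a non-professional one, so the tuned model learns to switch style on instruction.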
