Examining The Emergence Of Deductive Reasoning In Generative Language Models

Belcak Peter, Lanzendörfer Luca A., Wattenhofer Roger. arXiv 2023

[Paper]    
GPT, Model Architecture, Pretraining Methods, Training Techniques, Transformer

We conduct a preliminary inquiry into the ability of generative transformer models to reason deductively from the premises they are given. We observe notable differences in performance between models from different training setups and find that deductive reasoning ability increases with scale. Further, we find that performance generally does not degrade with the length of the deductive chain needed to reach the conclusion, with the exception of OpenAI's GPT-3 and GPT-3.5 models. Our study covers a wide range of transformer-decoder models, from 117 million to 175 billion parameters.
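The core evaluation idea, measuring answer accuracy as a function of the deduction-chain length, can be illustrated with synthetic implication chains. The sketch below is a minimal hypothetical setup, not the paper's actual benchmark or code: `build_chain`, `accuracy_at_depth`, and the generic `model(prompt) -> str` callable are all illustrative assumptions.

```python
import random
import string

def build_chain(length: int) -> tuple[str, str]:
    """Build a synthetic deductive chain of implications.

    For length 2 this might yield premises
    'If B then C. If A then B. A is true.' with conclusion 'C'.
    (Hypothetical helper; the paper's dataset construction may differ.)
    """
    symbols = random.sample(string.ascii_uppercase, length + 1)
    rules = [f"If {a} then {b}." for a, b in zip(symbols, symbols[1:])]
    random.shuffle(rules)  # avoid revealing the order of rule application
    premises = " ".join(rules) + f" {symbols[0]} is true."
    return premises, symbols[-1]

def accuracy_at_depth(model, depth: int, trials: int = 50) -> float:
    """Fraction of trials where the model produces the correct final fact.

    `model` is any prompt -> completion callable (an assumed interface).
    """
    correct = 0
    for _ in range(trials):
        premises, conclusion = build_chain(depth)
        prompt = (
            f"{premises}\n"
            "Which other statement must be true? Answer with a single letter."
        )
        answer = model(prompt)
        # Crude string check, sufficient for a sketch with single-letter facts.
        correct += answer.strip().upper().startswith(conclusion)
    return correct / trials
```

Sweeping `depth` over, say, 1 through 5 and plotting `accuracy_at_depth` per model would surface the scale and chain-length effects the abstract describes.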

Similar Work