Numeracy From Literacy: Data Science As An Emergent Skill From Large Language Models

Noever David, McKee Forrest. arXiv 2023

[Paper]    
Fine Tuning, GPT, Model Architecture, Pretraining Methods, Transformer

Large language models (LLMs) such as OpenAI's ChatGPT and GPT-3 offer unique testbeds for exploring how literacy translates into numeracy. Publicly available transformer models from eighteen months earlier, and a thousand times smaller, failed to perform basic arithmetic. The statistical analysis of four complex datasets described here combines arithmetic manipulations that cannot be memorized or encoded by simple rules. The work examines whether next-token prediction extends beyond sentence completion into the realm of actual numerical understanding. For example, the work highlights cases of descriptive statistics on in-memory datasets that the LLM either loads from memory or generates randomly using Python libraries. The resulting exploratory data analysis showcases the model's capabilities to group or pivot categorical sums, infer feature importance, derive correlations, and predict unseen test cases using linear regression. To extend the model's testable range, the research deletes and appends random rows so that recall alone cannot explain the emergent numeracy.
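The exploratory data analysis the abstract describes can be sketched in conventional code for comparison. The following is a minimal, hypothetical example of the same operations (group-by sums, correlation, linear regression on an unseen case, and row deletion/appending to defeat pure recall); the dataset, column names, and parameters are illustrative assumptions, not the paper's actual test data.

```python
import numpy as np
import pandas as pd

# Illustrative in-memory dataset, analogous to those the paper has the
# LLM load or generate randomly (names and values are assumptions).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "category": rng.choice(["A", "B"], size=100),
    "x": rng.normal(size=100),
})
df["y"] = 2.0 * df["x"] + rng.normal(scale=0.1, size=100)

# Delete and append random rows so memorized outputs alone could not
# reproduce the statistics (mirrors the paper's recall control).
df = df.drop(df.sample(5, random_state=0).index)
df = pd.concat([df, df.sample(5, random_state=1)], ignore_index=True)

# Group-by / pivot of categorical sums
sums = df.groupby("category")["y"].sum()

# Correlation between features
corr = df["x"].corr(df["y"])

# Simple linear regression, then predict an unseen test case
slope, intercept = np.polyfit(df["x"], df["y"], 1)
prediction = slope * 1.5 + intercept
```

The point of the comparison is that each step is a numerical operation rather than a text-completion pattern, which is why success on such tasks is taken as evidence of emergent numeracy.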

Similar Work