---
library_name: transformers
pipeline_tag: text-generation
license: llama3
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B
---

# Llama 3 Finetuned Historical Model (1880 - 1910)

This model was finetuned from [Llama 3 8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) using [DoRA](https://arxiv.org/abs/2402.09353) adapters. It was trained on 10M words from the Gutenberg corpus attributed to the time period 1880 - 1910.

### Model Sources

- **Repository:** https://github.com/comp-int-hum/historical-perspectival-lm
- **Paper (arXiv):** https://arxiv.org/abs/2504.05523
- **Paper (Hugging Face):** https://huggingface.co/papers/2504.05523

## Downloading the Model

Load the model like this:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the finetuned checkpoint in half precision
model = AutoModelForCausalLM.from_pretrained(
    "Hplm/dora_llama_model_1880_1910",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained("Hplm/dora_llama_model_1880_1910")
```

## License

Built with Meta Llama 3 and released under the [Meta Llama 3 license](https://www.llama.com/llama3/license/).

## Citation

```
@article{fittschen_diachroniclanguagemodels_2025,
  title = {Pretraining Language Models for Diachronic Linguistic Change Discovery},
  author = {Fittschen, Elisabeth and Li, Sabrina and Lippincott, Tom and Choshen, Leshem and Messner, Craig},
  year = {2025},
  month = apr,
  eprint = {2504.05523},
  primaryclass = {cs.CL},
  publisher = {arXiv},
  doi = {10.48550/arXiv.2504.05523},
  url = {https://arxiv.org/abs/2504.05523},
  urldate = {2025-04-14},
  archiveprefix = {arXiv},
  journal = {arxiv:2504.05523[cs.CL]}
}
```
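## Example Generation

The loading snippet above can be extended to sample period-style text. A minimal sketch; the prompt text and sampling parameters here are illustrative, not recommendations from the paper:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Hplm/dora_llama_model_1880_1910"

# Load tokenizer and model in half precision (as in the loading example)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Illustrative prompt; sampling settings are arbitrary defaults
prompt = "The steam engine"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Since the base model is a plain (non-instruct) Llama 3 checkpoint finetuned on 1880 - 1910 text, free-form continuation of a short prompt is the natural usage pattern rather than chat-style interaction.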