A model fine-tuned to perform word-sense disambiguation (WSD) on Latin and Ancient Greek preverbs.

Evaluation results:
  • Precision: 0.8538
  • Recall: 0.8358
  • Micro F1: 0.8447

For more details, please see:

Farina, A., & Ciletti, M. (2025). Probing Preverbs: Evaluating Large Language Models on Latin and Ancient Greek Preverbed Motion Verbs. In Proceedings of Historical Languages and AI, March 5–6, 2026, Humboldt University Berlin, Germany. https://github.com/farina-andrea/preverb_semantics_LLMs
  • Developed by: Michele Ciletti
  • License: apache-2.0
  • Finetuned from model: unsloth/llama-3.3-70b-instruct-unsloth-bnb-4bit

This Llama model was trained 2× faster with Unsloth and Hugging Face's TRL library.
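Below is a minimal inference sketch using the standard `transformers` loading path. The repo name is taken from this card; the prompt wording is purely illustrative, as the actual prompt template used for fine-tuning is documented in the paper and repository linked above.

```python
# Minimal inference sketch. Assumptions: standard transformers loading works
# for this repo; the prompt below is illustrative, not the training template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MikCil/PREMOVE_llama3.3-70b_float16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # tensor type listed on this card
    device_map="auto",           # ~71B params: multi-GPU or offloading needed
)

# Hypothetical WSD-style query for a Latin preverbed motion verb.
messages = [{
    "role": "user",
    "content": (
        "Sentence: 'Caesar exercitum trans flumen traduxit.' "
        "Which sense does the preverb 'trans-' express in 'traduxit'?"
    ),
}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```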

Model size: 71B params (Safetensors, BF16).
