Towards EnergyGPT: Building an LLM That Speaks Energy
Why We Fine-Tuned LLaMA 3.1 on Two Decades of Energy Literature
ODIS · September 13, 2025 · 7 min read
General-purpose LLMs fall short in the energy sector: hallucinated references, confused units, vague thermodynamics. We built EnergyGPT by fine-tuning LLaMA 3.1-8B on 2.17 billion tokens of curated energy text, and the results show meaningful gains in technical depth, coherence, and domain relevance.
Amal Chebbi, Ph.D., Babajide Kolade, Ph.D., PE