EPlus-LLM

EPlus-LLMv2

Natural Language Interface for Automated Building Energy Modeling via LLMs
A prototype project exploring the use of fine-tuned large language models to automate building energy modeling from natural language input.

Illustration of EPlus-LLMv2 for automated building energy modeling

🎉 News

  • ⚡️ [2025/01/01]: A prompting-based method for automated building energy modeling has been released. Paper here.
  • 🔥 [2024/05/16]: We implemented the first natural language-based automated building energy modeling by fine-tuning a large language model (LLM). Paper here.

🚀 Key Features

  • Scalability: Auto-generates EnergyPlus models across varying building geometries and internal loads.
  • Accuracy & Efficiency: Achieves 100% modeling accuracy while reducing manual modeling time by over 95%.
  • Interaction & Automation: A user-friendly human-AI interface for seamless model creation and customization.

πŸ—οΈ Target Users

The platform is designed for engineers, architects, and researchers working in building performance, sustainability, and resilience. It is especially useful during early-stage conceptual design, when modeling decisions have the greatest impact.

🚀 Quick Start

The following code snippet shows how to load EPlus-LLM and auto-generate a building energy model.

Open In Colab

# ⚠️ Please make sure you have a GPU available.
# ⚠️ Please make sure your EnergyPlus version is 9.6 so the generated model runs successfully.
# ⚠️ Download the v1_nextpart.idf file from the EPlus-LLM repo and place it in your current working directory.
import torch
from transformers import (
    AutoModelForSeq2SeqLM, 
    AutoTokenizer,
)

# Load the EPlus-LLM model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("EPlus-LLM/EPlus-LLMv1"
                                              # , force_download=True # If you cannot download the model
                                              )

# Move the model to the GPU when one is available
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Generation config
# Note: temperature and top_p only take effect when do_sample=True;
# with the default greedy decoding they are ignored.
generation_config = model.generation_config
generation_config.max_new_tokens = 2000
generation_config.temperature = 0.1
generation_config.top_p = 0.1
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id

# Please provide your input here: a description of the desired building
# For more details, please refer to the paper: https://doi.org/10.1016/j.apenergy.2024.123431
prompt = "Simulate a building that is 30.00 meters long, 15.00 meters wide, and 3.50 meters high. The window-to-wall ratio is 0.28. The occupancy rate is 8.00 m2/people, the lighting level is 6.00 W/m2, and the equipment power consumption is 8.80 W/m2."
inputs = tokenizer(prompt, return_tensors="pt", truncation=False).to(device)
generated_ids = model.generate(input_ids=inputs.input_ids,
                               attention_mask=inputs.attention_mask,
                               generation_config=generation_config)

# Decode and restore IDF formatting: the fine-tuned model emits "_" for spaces
# and "|" for line breaks, so undo both substitutions.
generated_output = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
generated_output = generated_output.replace("_", " ")
generated_output = generated_output.replace("|", "\n")

# Load the remaining part of the IDF file
file_path = "v1_nextpart.idf" # File is in the repo, please download.
output_path = "v1_final.idf"
with open(file_path, 'r', encoding='utf-8') as file:
    nextpart = file.read()

# Combine the fixed IDF template with the generated building description
final_text = nextpart + "\n\n" + generated_output
with open(output_path, 'w', encoding='utf-8') as f:
    f.write(final_text)

# The auto-generated building energy model is saved as an IDF file
print(f"Building Energy Model Auto-Generated: {output_path}")

πŸ“ Citation

If you find our work helpful, please cite our papers:

@article{jiang2024EPlus-LLM,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {EPlus-LLM: A large language model-based computing platform for automated building energy modeling},
  journal   = {Applied Energy},
  volume    = {367},
  pages     = {123431},
  year      = {2024},
  month     = {Aug},
  doi       = {10.1016/j.apenergy.2024.123431}}

@article{jiang2025prompting,
  author    = {Gang Jiang and Zhihao Ma and Liang Zhang and Jianli Chen},
  title     = {Prompt engineering to inform large language models in automated building energy modeling},
  journal   = {Energy},
  volume    = {316},
  pages     = {134548},
  year      = {2025},
  month     = {Feb},
  doi       = {10.1016/j.energy.2025.134548}}

@article{jiang2025EPlus-LLMv2,
  author    = {Gang Jiang and Jianli Chen},
  title     = {Efficient fine-tuning of large language models for automated building energy modeling in complex cases},
  journal   = {Automation in Construction},
  volume    = {175},
  pages     = {106223},
  year      = {2025},
  month     = {Jul},
  doi       = {10.1016/j.autcon.2025.106223}}