
Connecting Energyplus simulation with OpenAI Gym Environment through Python API

asked 2022-08-12 00:43:07 -0600 by Andaman, updated 2022-08-12 10:41:40 -0600

I'm trying to implement an OpenAI Gym environment (for reinforcement learning training) around an EnergyPlus building simulation. I recently made an effort to achieve this by connecting Python with EnergyPlus through the Python API. My current understanding is that an E+ simulation is run with a few lines of code like the ones below

from pyenergyplus.api import EnergyPlusAPI

# Command-line style arguments, equivalent to the energyplus CLI flags
ARGS = [
    '--weather', weather_file_path,
    '--output-directory', output_dir,
    '--readvars',      # run ReadVarsESO after the simulation
    idf_file_path,     # the IDF model file comes last
]

api = EnergyPlusAPI()
state = api.state_manager.new_state()    # one state object per simulation run
api.runtime.run_energyplus(state, ARGS)  # blocks until the simulation finishes

and the only way I can interact with the simulation at each timestep is through callback functions provided by the Runtime API.
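For reference, here is a minimal sketch of that callback mechanism (the calling point, variable name/key, and actuator specification below are placeholders for illustration, not values from my actual model):

from pyenergyplus.api import EnergyPlusAPI

api = EnergyPlusAPI()
state = api.state_manager.new_state()

# Output variables must be requested before the run so they can be read via the API
api.exchange.request_variable(state, "Zone Mean Air Temperature", "ZONE ONE")

def on_timestep_end(s):
    # Handles can only be resolved once the simulation data structures are set up
    if not api.exchange.api_data_fully_ready(s):
        return
    # Read a sensor value (placeholder variable/key)
    temp_handle = api.exchange.get_variable_handle(
        s, "Zone Mean Air Temperature", "ZONE ONE")
    temp = api.exchange.get_variable_value(s, temp_handle)
    # Write an actuator value (placeholder actuator specification)
    act_handle = api.exchange.get_actuator_handle(
        s, "Schedule:Constant", "Schedule Value", "HEATING SETPOINT SCH")
    api.exchange.set_actuator_value(s, act_handle, 21.0)

# The callback must be registered before run_energyplus() is called
api.runtime.callback_end_zone_timestep_after_zone_reporting(
    state, on_timestep_end)
api.runtime.run_energyplus(state, ARGS)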

The OpenAI Gym environment interface requires us to implement a "step" function in a class, which will be called repeatedly during RL model training. Pseudo-code of what the flow inside a step function should look like with the EnergyPlus Python API is shown below

def step(self, action):
    set_eplus_actuators_at_start_timestep(action)
    new_state = get_eplus_sensor_values_at_end_timestep(.....)
    reward = calculate_reward_or_penalty(new_state)
    return new_state, reward

However, since each interaction with a running EnergyPlus simulation can only happen inside those callback functions, the step function above cannot be implemented with typical single-process, single-threaded programming: EnergyPlus owns the main loop and calls into my code, rather than the other way around.

Any suggestion on how to solve this?


Comments

I had trouble with this too, but never took the time to try and figure it out. I'm not sure it's possible, since you have to pass callback functions to calling points before running the simulation. These callback functions aren't necessarily static, though: you can bind an object to them, and keep track of data and take actions through that object.

I have 2 repos for RL (PyTorch) + EnergyPlus. They are a mess, but maybe they'll help: https://github.com/mechyai/RL-EmsPy and https://github.com/mechyai/rl_bca

mechyai  ( 2022-08-17 22:01:19 -0600 )
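A minimal sketch of that stateful-callback pattern (placeholder names only, not taken from either repo):

from pyenergyplus.api import EnergyPlusAPI

class EplusController:
    """Keeps state across timesteps; its bound method is the E+ callback."""

    def __init__(self, api):
        self.api = api
        self.history = []       # data collected across timesteps
        self.next_action = 0.0  # action to apply at the next calling point

    def timestep_callback(self, state):
        if not self.api.exchange.api_data_fully_ready(state):
            return
        # ... read sensors, store observations, apply self.next_action ...
        self.history.append(self.api.exchange.current_sim_time(state))

api = EnergyPlusAPI()
state = api.state_manager.new_state()
controller = EplusController(api)

# The bound method is passed as the callback, so it carries the object's state
api.runtime.callback_begin_system_timestep_before_predictor(
    state, controller.timestep_callback)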

This function https://github.com/mechyai/RL-EmsPy/b... decorates a standard callback function; you can pass it an 'observation' and an 'actuation' method for a given calling point.

mechyai  ( 2022-08-17 22:02:19 -0600 )
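Generically, such a wrapper might look like the sketch below (an illustration of the idea only, not the RL-EmsPy implementation):

def make_calling_point_callback(api, observation_fn, actuation_fn):
    """Build one EnergyPlus callback from an observation function and an
    actuation function (generic illustration with hypothetical helpers)."""
    def callback(state):
        if not api.exchange.api_data_fully_ready(state):
            return
        obs = observation_fn(state)   # e.g. read sensor values
        actuation_fn(state, obs)      # e.g. set actuator values
    return callback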

Thanks for your comments. I will definitely check your repos!

I'm currently trying to separate the OpenAI Gym env from E+ using multiprocessing. By using a Pipe to make each process pause and wait for input from the other, I might be able to sync the runtime between both processes.

Andaman  ( 2022-08-23 22:27:24 -0600 )
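A rough sketch of that Pipe idea (not the poster's actual code): run EnergyPlus in a child process whose timestep callback sends the observation over the Pipe and then blocks until the Gym process replies with an action. The observation contents and arguments are placeholders.

import multiprocessing as mp
from pyenergyplus.api import EnergyPlusAPI

def run_eplus(conn, args):
    """Child process: run EnergyPlus; the callback trades data over the Pipe."""
    api = EnergyPlusAPI()
    state = api.state_manager.new_state()

    def callback(s):
        if not api.exchange.api_data_fully_ready(s):
            return
        obs = {"sim_time": api.exchange.current_sim_time(s)}  # placeholder observation
        conn.send(obs)        # hand the observation to the Gym process
        action = conn.recv()  # block until the Gym process sends the next action
        # ... apply `action` via api.exchange.set_actuator_value(...) ...

    api.runtime.callback_begin_system_timestep_before_predictor(state, callback)
    api.runtime.run_energyplus(state, args)
    conn.send(None)  # signal that the simulation has finished

# Gym side: reset() starts the process and does obs = parent_conn.recv();
# step(action) does parent_conn.send(action) followed by parent_conn.recv().
parent_conn, child_conn = mp.Pipe()
proc = mp.Process(target=run_eplus, args=(child_conn, ['--weather', 'in.epw', 'in.idf']))
proc.start()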

1 Answer


answered 2023-03-06 10:24:11 -0600 by antoineg

You can find a working example in rllib-energyplus. It integrates the EnergyPlus Python API with OpenAI Gym (actually Gymnasium, a fork by the original authors of Gym; Gym isn't maintained anymore) and Ray RLlib.
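For a rough idea of the shape of such an integration (an illustrative sketch only, not the actual rllib-energyplus code): the environment below runs EnergyPlus in a background thread and uses two queues to pass observations and actions to step(); the spaces, observation, and reward are placeholders.

import threading
import queue
import numpy as np
import gymnasium as gym
from pyenergyplus.api import EnergyPlusAPI

class EnergyPlusEnv(gym.Env):
    """Sketch: EnergyPlus runs in a background thread; queues sync it with step()."""

    def __init__(self, eplus_args):
        self.eplus_args = eplus_args
        self.observation_space = gym.spaces.Box(-np.inf, np.inf, shape=(1,))  # placeholder
        self.action_space = gym.spaces.Box(0.0, 1.0, shape=(1,))              # placeholder
        self.obs_queue = queue.Queue(maxsize=1)
        self.act_queue = queue.Queue(maxsize=1)

    def _callback(self, state):
        if not self.api.exchange.api_data_fully_ready(state):
            return
        obs = np.array([self.api.exchange.current_sim_time(state)])  # placeholder observation
        self.obs_queue.put(obs)        # hand the observation to reset()/step()
        action = self.act_queue.get()  # block until step() provides an action
        # ... apply `action` with self.api.exchange.set_actuator_value(...) ...

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.api = EnergyPlusAPI()
        self.state = self.api.state_manager.new_state()
        self.api.runtime.callback_begin_system_timestep_before_predictor(
            self.state, self._callback)
        self.sim_thread = threading.Thread(
            target=self.api.runtime.run_energyplus,
            args=(self.state, self.eplus_args))
        self.sim_thread.start()
        return self.obs_queue.get(), {}

    def step(self, action):
        self.act_queue.put(action)   # unblock the EnergyPlus callback
        obs = self.obs_queue.get()   # wait for the next timestep's observation
        reward = 0.0                 # placeholder reward computed from obs
        terminated = not self.sim_thread.is_alive()
        # A real implementation also needs end-of-run handling so step()
        # cannot block forever after the final timestep; omitted for brevity.
        return obs, reward, terminated, False, {}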


