Question-and-Answer Resource for the Building Energy Modeling Community

Reinforcement Learning Workflow with EnergyPlus

asked 2020-02-05 05:05:36 -0500

Eko123

updated 2020-02-05 07:26:05 -0500

I am currently working on a machine learning project and was wondering whether there is a way to use reinforcement learning in combination with EnergyPlus. I would like to use EnergyPlus as the training environment for my model. The neural network will be written in Python using TensorFlow, and I want it to control temperature, humidity, etc. by changing inputs in the IDF. Does anyone have experience with tasks like this? How do I link EnergyPlus to Python? I did some research, but I currently don't know which interface fits best. Is this possible with a Functional Mockup Unit (FMU), or is there another way?
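For context, the setup described above usually treats each exchange with EnergyPlus as one step in a standard RL interaction loop. Below is a minimal sketch of that loop in plain Python; the `EnergyPlusStub` class, its fake zone dynamics, and the comfort-based reward are illustrative assumptions standing in for a real EnergyPlus coupling, not an actual interface:

```python
import random

class EnergyPlusStub:
    """Stand-in for an EnergyPlus co-simulation interface (illustrative, not a real API)."""

    def __init__(self, target_temp=21.0):
        self.target = target_temp  # comfort setpoint used for the reward
        self.temp = 15.0           # initial zone air temperature

    def step(self, setpoint):
        """Apply a heating setpoint, advance one timestep, return (observation, reward)."""
        # Fake zone dynamics: temperature drifts toward the setpoint with some noise.
        self.temp += 0.5 * (setpoint - self.temp) + random.uniform(-0.2, 0.2)
        reward = -abs(self.temp - self.target)  # penalize deviation from comfort
        return self.temp, reward

def run_episode(env, policy, steps=24):
    """Generic RL loop: observe, act, receive reward, accumulate return."""
    obs, total = env.temp, 0.0
    for _ in range(steps):
        action = policy(obs)           # a TensorFlow network would go here
        obs, reward = env.step(action)
        total += reward
    return total

# Trivial rule-based "policy" as a placeholder for a trained network.
naive_policy = lambda obs: 22.0
score = run_episode(EnergyPlusStub(), naive_policy)
```

In a real workflow, `EnergyPlusStub.step` would be replaced by whatever coupling mechanism is chosen (FMU, file-based exchange, or a library), while the surrounding loop stays the same.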

Any help is appreciated.


1 Answer


answered 2020-02-06 03:16:05 -0500

updated 2020-02-14 04:04:49 -0500

RL with EnergyPlus is a nice topic :) I have also worked on this. In my case, I am doing a PhD in which I built a framework for doing exactly that on multiple buildings; you can contact me if you need further information (it is not yet publicly disclosed). To answer your question about linking E+ with Python, you can either:

  • write your own code to interact with the model classes, or
  • use the Eppy Python library (or another library you find out there), or fall back to the first option and write one yourself.
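For the first option, it helps that EnergyPlus IDF files are plain comma- and semicolon-delimited text, so even a small standalone script can rewrite a field between training runs. Here is a hedged sketch under that assumption; the `set_idf_field` helper and the toy IDF fragment are generic examples, not taken from a real model:

```python
import re

def set_idf_field(idf_text, object_name, new_value):
    """Replace the first numeric field of a named IDF object.

    IDF objects are comma-separated fields terminated by a semicolon;
    this skips the object's name field and rewrites the next field.
    """
    pattern = re.compile(
        r'(\b' + re.escape(object_name) + r'\s*,\s*[^,;]*,\s*)([^,;]*)'
    )
    return pattern.sub(lambda m: m.group(1) + str(new_value), idf_text, count=1)

# Toy IDF fragment (illustrative, not a complete model).
idf = """ScheduleTypeLimits,
  Temperature,
  20.0,
  30.0;
"""

# The lower limit 20.0 is rewritten to 22.0; the rest is untouched.
updated = set_idf_field(idf, "ScheduleTypeLimits", 22.0)
```

A library such as Eppy does the same kind of thing through a proper object model, which is safer for large files, but a script like this keeps the dependency footprint minimal.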

In my experience, Eppy was not able to automate things very well because it is not fully developed and does not support conda environments well, so I did it with my own scripts. That said, it is worth noting that Eppy has helped other users who, like you, wanted to interact with the model through code. You can give any of these a try and see how it works for you.
I have added contact details for anyone interested in reinforcement learning and multi-building energy simulation (UBEMs):
Email: francisco.pires.costa at
Cheers.




Sounds really interesting. I would like to know more about that framework and your PhD. I already checked Eppy, but as you said, it doesn't automate well. I am currently looking at pyEP and EPS, but it really seems like I will have to code something myself and make some changes to the existing tools. Did you manage to get any satisfying results so far?

Eko123 ( 2020-02-06 05:05:09 -0500 )

I'm in an MIT PhD program, and it took considerable effort just to script the framework itself and present it in a novel manner. Now I'm connecting it with instruments and checking how RL performs; the RL results are very promising so far. Cheers.

EngFranc ( 2020-02-10 10:36:00 -0500 )

Hello, this topic sounds interesting to me. I want to ask some questions about using RL + E+ environments to train DRL-based agents. How can I contact you? Cheers.

Elektro ( 2020-02-13 07:21:33 -0500 )

Hi @Elektro, it seems this forum does not support private messages, so if you (or anyone else interested in, or working on, reinforcement learning and Urban Building Energy Modeling, i.e. simulating multiple EnergyPlus buildings) want to get in touch, check the contact details I have added to my answer.

EngFranc ( 2020-02-14 04:00:48 -0500 )

Thanks. I will send you an email.

Elektro ( 2020-02-14 04:26:11 -0500 )




Seen: 203 times

Last updated: Feb 14