Question-and-Answer Resource for the Building Energy Modeling Community

Reinforcement Learning Workflow with EnergyPlus

asked 2020-02-05 05:05:36 -0600

Eko123

updated 2020-02-05 07:26:05 -0600

I am currently working on a machine learning project and was wondering if there is a way to use reinforcement learning in combination with EnergyPlus. I would like to use EnergyPlus as an environment for training my model. The neural network is going to be written in Python using TensorFlow. I want the network to control temperature, humidity, etc. by changing inputs in the IDF. Does anyone have experience with tasks like this? How do I link EnergyPlus to Python? I did some research, but I currently don't know which interface fits best. Is this possible with a Functional Mock-up Unit (FMU), or something else?

Any help is appreciated.



Hi, I'm wondering if you have any updates on this project/workflow? I'm also working on similar things. I checked pyEp, but it didn't work.

yapan  ( 2020-05-07 00:54:58 -0600 )

I switched to MATLAB for my task, so I don't have any updates on the Python workflow. But it seems that there is a Python API by now.

Eko123  ( 2020-12-04 06:58:12 -0600 )
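For reference, the Python API mentioned above ships with EnergyPlus itself (the `pyenergyplus` package, with the state-based API in EnergyPlus 9.4 and later) and works through runtime callbacks rather than FMU wrapping. A minimal sketch of that structure, assuming EnergyPlus is installed and on the Python path; the file names and the toy policy are placeholders:

```python
# Sketch: hooking an agent into an EnergyPlus run via the bundled Python API.
# Assumes EnergyPlus >= 9.4; "weather.epw" / "model.idf" are placeholders.

def choose_setpoint(zone_temp_c):
    """Toy 'policy': a trained RL agent would map observations to actions here."""
    return 26.0 if zone_temp_c > 24.0 else 20.0

try:
    from pyenergyplus.api import EnergyPlusAPI

    api = EnergyPlusAPI()
    state = api.state_manager.new_state()

    def timestep_callback(state_arg):
        # In a real setup: read sensor handles, call choose_setpoint(),
        # and write the result back through an actuator handle.
        # (Handle lookup must wait until warmup is done; omitted here.)
        pass

    api.runtime.callback_begin_system_timestep_before_predictor(
        state, timestep_callback)
    # api.runtime.run_energyplus(
    #     state, ["-w", "weather.epw", "-d", "out", "model.idf"])
except ImportError:
    pass  # EnergyPlus not installed; the callback structure is the point.
```

The key design point is that the agent lives inside the simulation loop: EnergyPlus calls back into Python every timestep, so no external co-simulation middleware is needed.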

How did you manage to integrate MATLAB for this? I'd prefer Python but will use whatever is easiest to get to the RL problem.

mechyai  ( 2021-02-17 22:24:31 -0600 )

Sorry for the late answer; I didn't check this post for a while. There is a co-simulation toolbox for MATLAB and Simulink. I integrated it via Simulink and the toolbox, but it is also possible to do it in MATLAB directly. Check this link.

Eko123  ( 2021-03-16 05:03:53 -0600 )

@Eko123 Thank you!

mechyai  ( 2021-03-16 11:53:39 -0600 )

2 Answers


answered 2020-02-06 03:16:05 -0600

updated 2020-12-04 03:32:04 -0600

RL with EnergyPlus is a nice topic :) I have also worked on this.
In my case, I'm doing a PhD for which I built a framework that does exactly that across multiple buildings; you can contact me if you need further information (it is not yet publicly disclosed).
To answer your question about linking E+ with Python, you can:

  • write your own code to interact with the model classes, or
  • use the Eppy Python library (or another library you find out there), or fall back to the first option and write something yourself.

In my experience, Eppy was not able to automate things very well: it is not fully developed and does not support conda environments well, so I ended up using my own scripts. That said, it has helped other users who, like you, wanted to interact with the model through code.
Give any of these a try and see how it works for you.
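To illustrate the "write your own code" route above: an IDF file is essentially a list of comma-separated objects terminated by semicolons, so a small standard-library parser is enough to read and tweak fields programmatically. This is a toy sketch, not a full IDF parser (it ignores macros and other special cases):

```python
# Minimal sketch of parsing IDF objects with only the standard library.
import re

def parse_idf(text):
    """Return a list of (object_type, [fields]) tuples from IDF text."""
    text = re.sub(r"!.*", "", text)          # strip '!' comments
    objects = []
    for chunk in text.split(";"):            # ';' terminates each object
        fields = [f.strip() for f in chunk.split(",") if f.strip()]
        if fields:
            objects.append((fields[0].upper(), fields[1:]))
    return objects

sample = """
  Zone, Living Room, 0, 0, 0, 0;   ! a zone object
  Timestep, 6;
"""
objs = parse_idf(sample)
# objs[0] is ("ZONE", ["Living Room", "0", "0", "0", "0"])
```

For anything beyond quick experiments, a maintained library (Eppy, or the official Python API in newer EnergyPlus versions) is the safer choice, but a parser like this is often enough to let an RL loop rewrite a few fields between runs.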
Contact details for anyone interested in reinforcement learning and multi-building energy simulation (UBEMs):
Research Team:
Email: francisco.pires.costa at
Cheers.

UPDATE: I will cite this thread in my PhD dissertation (which is being delivered this week), where a new UBEM paradigm and a corresponding implementation will be presented, enabling users to integrate data-science workflows such as reinforcement learning with current UBEM implementations. I will post a link to the public version here. This is the beginning of a very interesting way to model cities; anyone interested is welcome to contact me. Cheers.




Sounds really interesting. I would like to know more about that framework and your PhD. I already checked Eppy, but as you said, it doesn't automate well. I am currently looking at pyEp and EMS, and it really seems like I will have to write something of my own and modify the existing tools. Did you manage to get any satisfying results so far?

Eko123  ( 2020-02-06 05:05:09 -0600 )

I'm in an MIT PhD program, and it took me considerable effort just to script the framework itself and present it in a novel way.
Now I'm connecting it with instruments and checking how RL performs; the RL results are very promising so far. Cheers.

EngFranc  ( 2020-02-10 10:36:00 -0600 )

Hello, this topic sounds interesting to me. I want to ask some questions about using RL with E+ environments to train DRL-based agents. How can I contact you? Cheers.

Elektro  ( 2020-02-13 07:21:33 -0600 )

Hi @Elektro, it seems it is not possible to contact other users via private message on this forum. So if you, or anyone else interested in or working on this (both reinforcement learning and Urban Building Energy Modeling, i.e. simulating multiple EnergyPlus buildings), want to get in touch, check the contact details I have added to my answer.

EngFranc  ( 2020-02-14 04:00:48 -0600 )

Thanks. I will send you an email.

Elektro  ( 2020-02-14 04:26:11 -0600 )

answered 2021-03-16 06:45:19 -0600

JohanGustafsson


Sorry for the late answer; I did not see this until now. We have developed a development platform using

(DesignBuilder + Energyplus) + BCVTB + Python/Keras

This way we can perform RL/ML on EnergyPlus models. I would be very happy to let other people use the platform and open-source the entire work on GitHub or similar! Please send me an email if you want to get involved!
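Whatever the coupling layer (BCVTB here, or an FMU, or the Python API), the Python side typically exposes the simulation through a Gym-style reset/step interface that the Keras agent trains against. A structural sketch of that loop, with a stand-in environment whose toy dynamics would be replaced by the actual EnergyPlus data exchange:

```python
# Structural sketch of the RL training loop around an EnergyPlus
# co-simulation. MockEnergyPlusEnv is a stand-in: a real version would
# send the setpoint to E+ and read back zone conditions each timestep.
import random

class MockEnergyPlusEnv:
    """Gym-style interface: reset() -> obs, step(action) -> (obs, reward, done)."""
    def reset(self):
        self.t = 0
        self.zone_temp = 22.0
        return [self.zone_temp]

    def step(self, action_setpoint):
        # Stand-in dynamics: temperature drifts toward the setpoint.
        self.zone_temp += 0.5 * (action_setpoint - self.zone_temp)
        self.t += 1
        reward = -abs(self.zone_temp - 21.0)   # comfort penalty
        done = self.t >= 96                    # e.g. one day of 15-min steps
        return [self.zone_temp], reward, done

env = MockEnergyPlusEnv()
obs = env.reset()
total, done = 0.0, False
while not done:
    action = 21.0 + random.uniform(-1, 1)  # a trained Keras policy goes here
    obs, reward, done = env.step(action)
    total += reward
```

Keeping the environment behind this interface is what makes the setup portable: the agent code does not care whether observations come from BCVTB sockets, an FMU, or direct API callbacks.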




Johan, this is fantastic! I am in need of a framework like this for research that I am doing. I will be contacting you, thank you! (please check your spam)

mechyai  ( 2021-03-16 11:57:04 -0600 )



Seen: 1,015 times

Last updated: Dec 04 '20