Question-and-Answer Resource for the Building Energy Modeling Community

Reinforcement Learning Workflow with EnergyPlus

asked 2020-02-05 05:05:36 -0500

Eko123

updated 2020-02-05 07:26:05 -0500

I am currently working on a machine learning project and was wondering if there is a way to use reinforcement learning in combination with EnergyPlus. I would like to use EnergyPlus as an environment for training my model. The neural network will be written in Python using TensorFlow. I want the network to control temperature, humidity, etc. by changing inputs in the IDF. Does anyone have experience with tasks like this? How do I link EnergyPlus to Python? I did some research, but I currently don't know which interface fits best. Is this possible with a Functional Mock-up Unit (FMU), or this?

Any help is appreciated.


Comments

Hi, I'm wondering if you have any updates on this project/workflow? I'm also working on similar things. I checked pyEp, but it didn't work.

yapan (2020-05-07 00:54:58 -0500)

I switched to MATLAB for my task, so I don't have any updates regarding the workflow with Python. But it seems that there is a Python API by now.

Eko123 (2020-12-04 06:58:12 -0500)

How did you manage to integrate MATLAB for this? I'd prefer Python but will use whatever is easiest to get to the RL problem.

mechyai (2021-02-17 22:24:31 -0500)

Sorry for the late answer; I didn't check this post for a while. There is a Co-Simulation Toolbox for MATLAB and Simulink. I integrated it via Simulink and the toolbox, but it is also possible to do it in MATLAB directly. Check this link.

Eko123 (2021-03-16 05:03:53 -0500)

@Eko123 Thank you!

mechyai (2021-03-16 11:53:39 -0500)

4 Answers


answered 2020-02-06 03:16:05 -0500

updated 2020-12-04 03:32:04 -0500

RL with EnergyPlus is a nice topic :) I have worked on this as well.
In my case, I am doing a PhD for which I built a framework that does exactly this across multiple buildings; you can contact me if you need further information (it is not yet publicly available).
To answer your question about linking E+ with Python, you can:

  • write your own code to interact with the model classes, or
  • use the Eppy Python library (or another library you find out there), falling back on the first option if those don't fit.

In my experience, Eppy was not able to automate things very well because it is not fully developed and does not support conda environments well, so I did it with my own scripts.
That said, it is worth noting that Eppy has helped other users who, like you, wanted to interact with the model through code.
You can give either approach a try and see how it works for you.
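As an illustration of the "write your own code" route, here is a minimal, hypothetical sketch that edits a value in an IDF file with plain text manipulation. The object and schedule name are made up for the example; real control workflows usually actuate a running simulation rather than rewriting the IDF between runs:

```python
import re

def set_setpoint(idf_text, schedule_name, new_value):
    """Replace the numeric value of a named Schedule:Constant object
    in raw IDF text (fields are comma-separated, objects end with ';')."""
    pattern = re.compile(
        r"(Schedule:Constant\s*,\s*" + re.escape(schedule_name) +
        r"\s*,\s*[^,;]*,\s*)([-\d.]+)(\s*;)",
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: m.group(1) + str(new_value) + m.group(3),
                       idf_text)

# A made-up IDF fragment for demonstration purposes.
idf = """Schedule:Constant,
  Heating Setpoint,
  Temperature,
  20.0;
"""

print(set_setpoint(idf, "Heating Setpoint", 22.5))
```

An RL loop built this way would rewrite the IDF, launch an EnergyPlus run, and parse the output files for each episode, which is slow; the libraries discussed in this thread exist largely to avoid that restart-per-step overhead.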
Contacts for anyone interested in reinforcement learning and multi-building energy simulation (UBEMs):
Research Team: http://in3.dem.ist.utl.pt/team/member...
URL: luxmagna.org
Email: francisco.pires.costa at tecnico.ulisboa.pt
Cheers.

UPDATE: I will cite this thread in my PhD dissertation (which is going to be delivered this week), where a new UBEM paradigm and a corresponding implementation are proposed that enable users to integrate data-science workflows such as reinforcement learning with current UBEM implementations. I will post a link to the public version here. This is the beginning of a very interesting way to model cities; anyone interested can contact me. Cheers.


Comments


Sounds really interesting. I would like to know more about that framework and your PhD. I already checked eppy, but as you said, it doesn't automate well. I am currently looking at pyEp and EPS, but it really seems like I will have to write something of my own and modify the existing tools. Did you manage to get any satisfying results so far?

Eko123 (2020-02-06 05:05:09 -0500)

I'm in an MIT PhD program, and it took me considerable effort just to script the framework itself and present it in a novel way.
Now I'm connecting it with instruments and checking how RL performs; the RL results are very promising so far. Cheers.

EngFranc (2020-02-10 10:36:00 -0500)

Hello, this topic sounds interesting to me. I want to ask some questions regarding the use of RL + E+ environments to train DRL-based agents. How can I contact you? Cheers.

Elektro (2020-02-13 07:21:33 -0500)

Hi @Elektro, it seems this forum does not support private messages, so if you or anyone else is interested in or working on this (both reinforcement learning and Urban Building Energy Modeling, i.e. simulating multiple EnergyPlus buildings), check the contacts I have added to my answer.
Cheers

EngFranc (2020-02-14 04:00:48 -0500)

Thanks. I will send you an email.

Elektro (2020-02-14 04:26:11 -0500)

answered 2021-03-16 06:45:19 -0500

JohanGustafsson

Hi,

Sorry for the late answer; I did not see this until now. We have developed a development platform using

(DesignBuilder + Energyplus) + BCVTB + Python/Keras

This way we can perform RL/ML on EnergyPlus models. I would be very happy to let other people use the platform and open-source the entire work on GitHub or similar! Please send me an email if you want to get involved!

Johan

johan@zyax.se


Comments

Johan, this is fantastic! I am in need of a framework like this for research that I am doing. I will be contacting you, thank you! (please check your spam)

mechyai (2021-03-16 11:57:04 -0500)

answered 2022-03-14 17:10:14 -0500

mechyai

For my own research, I have created a Python framework using the E+ EMS Python API to help manage creating an RL environment with a running E+ simulation. My code and its documentation can be found in my GitHub repo: https://github.com/mechyai/RL-EmsPy.

This is still a work in progress, but the EmsPy, BcaEnv, and MdpManager classes in my repo are working well, and I am running PyTorch deep RL models.
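To make the agent–environment pattern concrete, here is a minimal sketch of the Gym-style observe/act/reward loop that frameworks like this manage. The `ToyZoneEnv` class below is a made-up stand-in (a one-zone, first-order thermal model), not EmsPy's actual API; in a real workflow, `step()` would exchange observations and actuator values with a running EnergyPlus simulation through the API's callbacks:

```python
class ToyZoneEnv:
    """Hypothetical stand-in for an EnergyPlus co-simulation:
    a single zone with first-order heat losses and an on/off heater."""

    def __init__(self, setpoint=21.0, outdoor=5.0):
        self.setpoint = setpoint   # desired zone temperature [degC]
        self.outdoor = outdoor     # outdoor temperature [degC]
        self.temp = 15.0           # current zone temperature [degC]

    def reset(self):
        self.temp = 15.0
        return self.temp

    def step(self, heating_power):
        # Simple heat balance: loss to outdoors plus heater gain per timestep.
        self.temp += 0.1 * (self.outdoor - self.temp) + 5.0 * heating_power
        reward = -abs(self.temp - self.setpoint)  # comfort penalty
        return self.temp, reward

env = ToyZoneEnv()
obs = env.reset()
for _ in range(50):
    # Trivial bang-bang policy; a DRL agent (e.g. in PyTorch) would replace this.
    action = 1.0 if obs < env.setpoint else 0.0
    obs, reward = env.step(action)
print(f"final zone temperature: {obs:.1f} degC")
```

RL libraries generally only need this reset/step contract; the frameworks mentioned in this thread exist precisely to hide the EnergyPlus plumbing behind it.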


answered 2022-08-11 07:56:27 -0500

antoineg

updated 2022-08-11 07:58:17 -0500

I can cite two projects that aim to simplify setting up a reinforcement learning training environment with EnergyPlus. They avoid the need for a co-simulation tool (i.e. BCVTB) and integrate with recent RL training frameworks.

Disclaimer: I'm a maintainer of the first and the author of the second.

