Question-and-Answer Resource for the Building Energy Modeling Community

How do you run your EnergyPlus modeling jobs? (Cloud, local?)

asked 2019-02-24 08:54:19 -0500

Kevin

Hi,

I'm a cloud computing guy, and thanks to one of our customers I'm new to this community and the energy modeling space. I'm happy to learn more about EnergyPlus.

I'm wondering how you guys usually run your modeling jobs?

Do you compute on your local machine, combined with e.g. MATLAB? Or do you have bigger modeling jobs running on e.g. Azure Batch?

Furthermore: would it help if you had an interface where you could upload / specify IDF / EPW files and get the job computed at the best possible price?

What are the biggest downsides of your current solution?

Cheers, Kevin


2 Answers


answered 2019-02-24 10:13:56 -0500

updated 2019-02-25 12:58:08 -0500

One option you can consider for cloud runs is OpenStudio Server. It's a Docker-based instance that includes OpenStudio, EnergyPlus, and Radiance. OpenStudio is an energy modeling platform that uses EnergyPlus and Radiance as simulation engines. While the primary use case for OpenStudio Server is running OpenStudio Analyses, you can set up a workflow that loads in external IDF files, bypassing the OpenStudio model format (OSM). I can point you to an example if you are interested in that.

The OpenStudio Parametric Analysis Tool algorithmic mode uses OpenStudio Server deployed on Amazon EC2, but it can also be deployed on other cloud services or on an internal organization network.

Updated Answer Below:

If you use OpenStudio Server, mentioned above, it can handle algorithmic sampling (using R), run management, and results visualization. If, however, you want to handle the parametric analysis elsewhere, you can also just run individual datapoints with the OpenStudio CLI. An OpenStudio Workflow (OSW), which is run by the CLI, describes the application of a series of measures to a seed OSM and then to an IDF; lastly, post-simulation measures generate reports. In our BESTEST-GSR GitHub repository we have a sample "Bring your own IDF" OSW, intended as a pattern for users who want to work with the OpenStudio framework in some way but don't want to work directly with the OpenStudio Model (OSM) format. This example workflow still has a seed OSM file, but the contents of that model are replaced by an EnergyPlus measure that loads in a user-specified IDF file.
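As a rough sketch of what such a "bring your own IDF" OSW could look like (the file names and measure name below are hypothetical placeholders, not the actual BESTEST-GSR contents), you could generate the workflow file programmatically and then run it with the CLI, e.g. `openstudio run -w workflow.osw`:

```python
import json

# Hypothetical minimal OSW: a seed OSM whose contents will be replaced
# by an EnergyPlus measure that injects a user-specified IDF.
osw = {
    "seed_file": "empty_seed.osm",      # placeholder seed model
    "weather_file": "in.epw",           # placeholder weather file
    "steps": [
        {
            "measure_dir_name": "inject_idf",          # hypothetical measure
            "arguments": {"idf_path": "user_model.idf"}
        }
    ]
}

# Write the workflow file that the OpenStudio CLI would consume.
with open("workflow.osw", "w") as f:
    json.dump(osw, f, indent=2)
```

The real example in BESTEST-GSR will differ in its measure names and arguments; this only illustrates the general shape of an OSW file.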

For reference, if you do decide to work with the OSM format and the OpenStudio Model API, you are still, in the end, producing IDF files, but using the OpenStudio API as a modeling tool. You can decide with your client what is best for your use case.

In the "Bring your own IDF" example above, the external IDF file is brought in where measures D-F are shown in the workflow diagram. [workflow diagram not shown]


Comments

Thank you David for your insights, this already helps a lot! Could you point me to this workflow? It sounds like something our customer needs :)

A few words on what we're trying to achieve: we've been approached and asked whether we could optimize the cloud workloads and AWS bills for a startup that is running huge energy modeling jobs to build better buildings.

Kevin (2019-02-25 11:21:34 -0500)

@Kevin, I updated the answer with additional information including a "Bring your own IDF" example workflow.

David Goldwasser (2019-02-25 12:24:52 -0500)

@David Goldwasser in the "Bring your own IDF" example, how would you expect simulation speed on the cloud to compare to a typical local machine? Does the cloud workflow have more advanced parallel simulation opportunities than a local machine? Say, for an upload of 1,000 IDFs.

JustinLueker (2020-10-28 14:07:04 -0500)

@JustinLueker, when you use OpenStudio Server, which supports cloud computing, we parallelize the analysis across different cores. We don't split an individual EnergyPlus run across different cores. Speed depends on the specs of your computer or cloud resources, but as a rough guess, if you have for example 100 cores and 1,000 simulations, it will take 10x the length of one run cycle. This isn't as clear-cut if some simulations run much slower or faster than others, like a hospital vs. an office.
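The arithmetic above can be sketched as a back-of-the-envelope estimate (a rough planning number only; it ignores scheduling overhead and the variation between fast and slow models mentioned above):

```python
import math

def estimate_wall_time(n_sims, n_cores, avg_run_minutes):
    """Rough wall-clock estimate: simulations run in ceil(n/cores)
    sequential 'waves', each lasting roughly one average run."""
    waves = math.ceil(n_sims / n_cores)
    return waves * avg_run_minutes

# 1,000 simulations on 100 cores -> 10 waves, i.e. 10x one run cycle.
print(estimate_wall_time(1000, 100, 12.0))  # 120.0 minutes
```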

David Goldwasser (2020-10-28 14:35:09 -0500)

answered 2019-02-26 12:37:57 -0500

I normally compute locally, either on my work laptop or via RDP on a local modeling PC.

Some projects I work on have sensitive information and either aren't allowed to be computed in the cloud, or you have to be very careful if you do. For this scenario, it would be nice to have an easy way for our IT to stand up a cloud environment we can use to batch models securely.

For projects that we are allowed to compute in the cloud, it would be nice to have a site where we could upload IDF/EPW, OSW, etc. and have them run at the best possible price. It would also be nice to have an option to run the files in the shortest possible time (for when there is a deadline, or if you are trying to get quick feedback in a design charrette or something).
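For local batching, a minimal sketch of the idea is to fan simulations out across a pool of workers, one EnergyPlus process per IDF (this assumes the `energyplus` CLI is on the PATH; the file names in the example commands are hypothetical):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_one(cmd):
    """Run a single simulation command; return its exit code."""
    return subprocess.run(cmd).returncode

def run_batch(commands, max_workers=4):
    """Run simulation commands in parallel, at most max_workers at a time."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_one, commands))

# Example usage (hypothetical paths), with the EnergyPlus CLI's
# -w (weather file) and -d (output directory) options:
# cmds = [["energyplus", "-w", "weather.epw", "-d", f"out_{i}", idf]
#         for i, idf in enumerate(["model_a.idf", "model_b.idf"])]
# exit_codes = run_batch(cmds, max_workers=2)
```

Threads suffice here because the heavy work happens in the child processes; `max_workers` would typically match the number of physical cores.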



Stats

Seen: 677 times

Last updated: Feb 26 '19