Question-and-Answer Resource for the Building Energy Modeling Community

Radiance Measure Fail - too many points

Hello everyone,

I'm running daylight calculations using OS 2.3 and the Radiance Daylight Measure from the BCL.

When I run the simulation, the workflow stops immediately after the Radiance measure reports "completed", and the E+ annual simulation is skipped. If I run the simulation without the Radiance measure, the annual calculations execute without a problem.

Every time I run the Radiance measure I get this message in the stdout file:

ERROR: Too many calculation points in model (13024). Consider reducing the number or resolution of illuminance maps in this model.

Does anyone know if this could be remediated by using a more powerful computer? If not, does anyone know what would be the maximum allowable number of calculation points? Is there any other workaround?

I'd greatly appreciate any help.



The reason this failsafe was put in is that it's very easy to fill an entire building with calculation points, and in many cases the Radiance measure as written will generate too much data to be reasonably handled, even on a "more powerful computer". Stress testing this measure way back when, we found we could easily blow up even the most robust VMs available on Amazon's EC2 service. Coupled with the energy modeling industry's expectation of running thousands of models or more, we chose to hamstring the measure in this way.

If you want to disable this, you could make a copy of the measure, and modify the block starting with this line:

if 1000 < rfluxmtxDim.to_i && rfluxmtxDim.to_i < 2999

In the current version of the measure on the BCL, this block begins at line 770. Either increase the allowable number of points for the warning and the failure (and be ready to wait!), or remove all the conditionals and let line 776 stand alone.
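The guard described above can be sketched as follows. This is a simplified, hypothetical stand-in for the measure's actual block, with the warning and failure thresholds pulled out as constants so they are easy to raise; the constant names and the `check_point_count` helper are assumptions for illustration, not names from the measure itself.

```ruby
# Hypothetical sketch of the point-count failsafe in the Radiance measure.
# WARN_LIMIT and FAIL_LIMIT are illustrative names; raise FAIL_LIMIT
# (e.g. to 15_000) at your own risk -- memory use grows with point count.
WARN_LIMIT = 1000
FAIL_LIMIT = 3000

def check_point_count(rfluxmtx_dim)
  points = rfluxmtx_dim.to_i
  if points > WARN_LIMIT && points < FAIL_LIMIT
    :warn   # measure would log a warning but continue
  elsif points >= FAIL_LIMIT
    :fail   # measure aborts: "Too many calculation points in model"
  else
    :ok
  end
end
```

With a model like the one in the question (13024 points), `check_point_count("13024")` falls in the `:fail` branch, which is why the workflow halts before the annual run.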


Thanks a lot, sir!

I changed the code as you suggested, and my memory usage blew up.

My complete model has 111 illuminance maps (about 30k points). I ran a few tests, and my computer can handle up to around 5k points (8 GB RAM, Core i5 CPU @ 2.5 GHz).

I think I could make six copies of my model, selectively delete illuminance maps from each, and run the calculations in batches... however, before I start this time-consuming process, do you know whether a virtual machine on AWS could be an option for a model this size?
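The batching idea above could be sketched as a simple greedy partition: group the maps so each batch stays under the point budget the local machine can handle (~5k here), then each batch becomes one copy of the model with the other maps deleted. The `batch_maps` helper and the map data are hypothetical, purely to illustrate the split; a real workflow would read the maps from the OSM.

```ruby
# Hypothetical helper: greedily pack illuminance maps into batches whose
# total point count stays under a given budget. Each returned batch is a
# list of maps that could run together in one copy of the model.
def batch_maps(maps, budget)
  batches = [[]]
  count = 0
  maps.each do |map|
    # Start a new batch when adding this map would exceed the budget
    # (unless the current batch is still empty).
    if count + map[:points] > budget && !batches.last.empty?
      batches << []
      count = 0
    end
    batches.last << map
    count += map[:points]
  end
  batches
end

# Illustrative data: six maps of 2000 points each, 5k-point budget.
maps = Array.new(6) { |i| { name: "map#{i}", points: 2000 } }
batches = batch_maps(maps, 5000)
```

With 111 real maps totaling ~30k points and a 5k budget, this kind of split would land at roughly six batches, matching the six-copies plan.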

( 2017-10-24 09:01:32 -0600 )

5k points, eh? Sounds about right. I was in the same boat, and just to be conservative I dialed the limit back to 3k points. Good for you for taking the initiative and modding the code to try and get your work done. As I mentioned, even the large AWS options (both memory and storage can be overtaxed by this Radiance measure) will eventually falter with datasets this large. The expectation that brute force can handle any model is simply a bad idea; I've tried it time and again, unfortunately. =(

( 2017-10-24 09:50:57 -0600 )

Your solution is one approach. The central idea is to model specific areas and let them represent all the similar areas of your building. I've mentioned elsewhere that the OpenStudio model needs to be extended to support this notion natively; in the meantime, yes, you can model specific areas and apply the resulting schedules to other similar areas.

( 2017-10-24 09:53:34 -0600 )

Thanks a lot, I really appreciate the help.

Have a great day!

( 2017-10-24 10:42:48 -0600 )