My usual starting point for this kind of workload is nproc - 2, i.e. the number of threads minus 2; in your case that would be 14.

In Python, that's multiprocessing.cpu_count() - 2, generally speaking. Do double-check it on your machine, though: cpu_count() reports the number of online processors, which can be lower than what's physically installed depending on your hardware and power-management settings.
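Something like this is what I mean (a minimal sketch, plain standard library; the os.sched_getaffinity check is a Linux-only extra I'm throwing in to catch the case where fewer CPUs are actually usable by the process than cpu_count() reports):

```python
import multiprocessing
import os

# Reserve 2 threads for the orchestrating Python process and the OS.
RESERVED = 2

logical_cpus = multiprocessing.cpu_count()

# On Linux, the scheduler affinity mask can expose fewer CPUs than
# cpu_count() reports (cgroups, taskset, power management, ...).
if hasattr(os, "sched_getaffinity"):
    logical_cpus = min(logical_cpus, len(os.sched_getaffinity(0)))

# Never go below 1 worker, even on a small machine.
n_workers = max(logical_cpus - RESERVED, 1)
print(f"Using {n_workers} parallel EnergyPlus runs")
```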

I usually reserve 2 threads for the orchestration side (and to keep the system responsive while I do small things on the side like browsing): after all, if Python is going to be orchestrating your runs and potentially doing some post-processing, it needs spare resources to do that.
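To make that concrete, here's roughly the shape of it: a ProcessPoolExecutor capped at the worker count from above, with each worker shelling out to the EnergyPlus CLI. The run_one helper, the paths, and the exact command-line flags (-w for the weather file, -d for the output directory) are my assumptions for illustration, so adapt them to however you actually launch EnergyPlus:

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor, as_completed
from pathlib import Path

def run_one(idf: Path, weather: Path, out_dir: Path) -> int:
    """Run a single EnergyPlus simulation in its own output directory.

    The command line below is an assumption -- adjust it to match your
    EnergyPlus install and the flags you normally use.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    result = subprocess.run(
        ["energyplus", "-w", str(weather), "-d", str(out_dir), str(idf)],
        capture_output=True,
        text=True,
    )
    return result.returncode

def run_all(idfs, weather, out_root: Path, n_workers: int):
    """Run every IDF in `idfs` (an iterable of Paths), n_workers at a time."""
    # The pool never holds more than n_workers EnergyPlus processes at once,
    # so the reserved cores stay free for Python and the OS.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        futures = {
            pool.submit(run_one, idf, weather, out_root / idf.stem): idf
            for idf in idfs
        }
        for fut in as_completed(futures):
            print(f"{futures[fut].name}: return code {fut.result()}")
```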

If your Python code is going to do heavy pre- and/or post-processing, then I would reserve even more resources for the non-EnergyPlus work.

As usual, your mileage will vary greatly, so try it out for yourself on a subset of your analysis if it's large. Benchmark, and adjust accordingly.
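For the benchmarking part, timing a small representative slice under a few candidate worker counts is usually all it takes; run_all here refers to the hypothetical sketch above, and the slice size is arbitrary:

```python
import time
from pathlib import Path

def benchmark(idfs, weather, n_workers):
    """Time a small slice of the study under a few candidate worker counts."""
    subset = list(idfs)[:20]  # arbitrary slice -- just keep it representative
    for candidate in (max(1, n_workers - 2), n_workers, n_workers + 2):
        start = time.perf_counter()
        run_all(subset, weather, Path("bench_out") / f"{candidate}_workers", candidate)
        elapsed = time.perf_counter() - start
        print(f"{candidate} workers: {elapsed:.1f} s for {len(subset)} runs")
```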
