@Vishak, yes, you can do this. For any optimization/calibration there are a few steps:

  1. Adding runner.registerValue to the measure: Any OpenStudio measure can add runner.registerValue calls, although this is most commonly done in reporting measures that report values generated from simulation results. NREL-published reporting measures already contain these calls (see the measure sketch after this list).
  2. Adding outputs to the analysis: In the OpenStudio Analysis you can set up specific runner.registerValue entries as outputs. This is typically done using the Outputs tab of PAT in algorithmic mode. Some runner.registerValue names are hard-coded; if you don't see the one you need, you can add one manually and type the name, but it will only work if it matches a runner.registerValue generated by the measure.
  3. Setting up objective functions: To use outputs for any kind of optimization, you need to set up one or more outputs as objective functions. In the Outputs tab you can set the "Objective Function" column to true for specific outputs, add a target value (often, but not always, 0), and, if there is more than one objective function, also set the weighting factors between the objective functions (see the output entry sketch after this list).
  4. Make sure you have an algorithm type selected that supports optimization, such as RGENOUD or NSGA2.
  5. Lastly, make sure you have set up one or more arguments in the measures as variables that the algorithm can alter to minimize the objective functions.
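
Here is a minimal sketch of what step 1 looks like inside a reporting measure's run method, assuming the current OpenStudio::Measure Ruby API. The class name and the registered value name are made up for illustration; your own measures will register whatever values matter for your calibration.

```ruby
# Hypothetical reporting measure; only the registerValue pattern matters here.
class ExampleReportingMeasure < OpenStudio::Measure::ReportingMeasure
  def name
    return 'Example Reporting Measure'
  end

  def run(runner, user_arguments)
    super(runner, user_arguments)

    # Grab the EnergyPlus SQL results from the last run.
    sql = runner.lastEnergyPlusSqlFile
    return false if sql.empty?
    sql = sql.get

    # Pull a value out of the simulation results (total site energy in GJ here).
    total_site_energy = sql.totalSiteEnergy
    if total_site_energy.is_initialized
      # registerValue(name, value, units) is what exposes the number to the
      # analysis as a potential output/objective function.
      runner.registerValue('total_site_energy', total_site_energy.get, 'GJ')
    end

    true
  end
end

ExampleReportingMeasure.new.registerWithApplication
```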

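And here is a rough sketch of the kind of output entry (steps 2 and 3) that ends up in the analysis JSON once PAT flags an output as an objective function. I'm writing it as a Ruby hash for readability; the field names follow my reading of the OpenStudio Analysis (OSA) spec, so check an OSA file exported from PAT for the authoritative list, and note that the 'measure_name.value_name' used here is a placeholder.

```ruby
# Hypothetical output entry; field names assumed from the OSA spec.
output_entry = {
  name: 'example_reporting_measure.total_site_energy', # must match the registerValue name
  display_name: 'Total Site Energy',
  units: 'GJ',
  visualize: true,
  objective_function: true,        # the "Objective Function" column in PAT
  objective_function_target: 0.0,  # value the algorithm tries to drive the output toward
  objective_function_index: 0,     # position among multiple objective functions
  objective_function_group: 1      # grouping index when there are multiple objectives
}
```

If the name here does not match a runner.registerValue emitted by one of the measures, the output simply never gets a value, which is the failure mode described in step 2.
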
I know steps 4 and 5 are kind of obvious, but I just wanted to list them as something you can do to pre-flight check the analysis before kicking it off.

For many users, objective functions may come from a single reporting measure. For example, with our time series calibration measure both CVRMSE and NMBE might be objective functions, but there is nothing stopping you from running multiple reporting measures, each adding its own objective functions. In your case I would not add two copies of the time series calibration measure; it was intended to look at the load profile throughout the day at hourly or simulation-timestep intervals. Your second reporting measure should be the standard or enhanced Calibration Report. That measure uses the tabular EnergyPlus results, which already include monthly data.
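
For reference, these are the two metrics being minimized, in back-of-the-envelope form. This is just a sketch of the standard definitions; the published calibration measures may use the ASHRAE Guideline 14 style (n - p) denominators and a particular sign convention for NMBE rather than the plain n used here.

```ruby
# Simplified CVRMSE and NMBE, both expressed as percentages.
def nmbe(measured, simulated)
  n = measured.size
  mean = measured.sum / n.to_f
  100.0 * measured.zip(simulated).sum { |m, s| m - s } / (n * mean)
end

def cvrmse(measured, simulated)
  n = measured.size
  mean = measured.sum / n.to_f
  rmse = Math.sqrt(measured.zip(simulated).sum { |m, s| (m - s)**2 } / n.to_f)
  100.0 * rmse / mean
end
```

With a target of 0 on each, the algorithm is effectively pushing both metrics toward perfect agreement with the measured data.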

I will say that while it is very easy to have a good monthly CVRMSE and a bad hourly or timestep CVRMSE, the reverse is less common; it would be hard to have a good time series CVRMSE and a bad monthly CVRMSE. So if you have annual data for both, I don't know how much the monthly objective function adds. But a common occurrence might be having time series data for 3 weeks and monthly data for the full year; in that case, having both sets of objective functions would be critical.