Question-and-Answer Resource for the Building Energy Modeling Community

100% Relative Humidity and Condensation during Summer in a Residential Model?

Hi everyone, I am using OS for a research project (graduate level, not a homework assignment or class project) looking at thermostat setbacks in single-family detached residential homes. The issue I am currently running into: with fixed infiltration rates of 0.35 or 0.5 ACHnat (to match the real homes we are representing), and with my setback and occupancy schedules, the model reports large portions of time (2000+ hours) where zone relative humidity is above 80% RH, and over 200 hours where it is over 98%.

I am running a custom weather file that repeats a "design week" for half the year in heating mode and half the year in cooling mode, to test numerous control schemes over the course of the year in one model and compress the number of simulations I need to run. This is only an issue during the cooling season, where the zone temperature trips the AC on for only ~100 hours over the 6-month period. I have the opposite issue (extremely low RH) in the winter, but the heat is active a significant amount of the time, so I can understand how constantly infiltrating dry, cold air and then heating it would lower the RH in the zone. My concern is that on the summer side, 200+ hours of "rain" inside is completely unrealistic, so I may not be modeling it properly. Since it happens every week, the issue appears to be independent of my various setback control schemes.
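To sanity-check the winter side, here is a quick psychrometric sketch (assuming the Magnus saturation-pressure approximation and that heating adds or removes no moisture, so the humidity ratio stays constant while the air warms up):

```python
import math

def p_sat(t_c):
    """Saturation vapor pressure [Pa], Magnus-style approximation."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c, rh, p_atm=101325.0):
    """Humidity ratio [kg water / kg dry air] at t_c [C] and rh [0-1]."""
    p_v = rh * p_sat(t_c)
    return 0.622 * p_v / (p_atm - p_v)

def rh_after_heating(t_out_c, rh_out, t_in_c, p_atm=101325.0):
    """RH after heating outdoor air to t_in_c with no moisture added or removed
    (humidity ratio held constant, only temperature changes)."""
    w = humidity_ratio(t_out_c, rh_out)
    p_v = p_atm * w / (0.622 + w)  # recover vapor pressure from humidity ratio
    return p_v / p_sat(t_in_c)

# 0 C outdoor air at 80% RH, heated to 20 C (68 F), lands near 20% RH indoors:
print(round(rh_after_heating(0.0, 0.80, 20.0), 3))
```

This matches the dry winters I am seeing: even fairly humid cold air ends up quite dry once heated to the setpoint.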

I understand I can force dehumidification with a ZoneControl:Humidistat, an associated schedule, and a dehumidifier, but I want to stay true to a traditional existing residential HVAC system, which does not typically have such a mode, and I am still concerned there is something else in my model that needs checking. As a thought exercise I did integrate a dehumidifier and run it in two models. One kept all of my original thermostat, occupancy, and load schedules; the second (which I'll refer to as the 68 Model) kept the occupancy and load schedules but changed the thermostat setpoint to a constant 68F for all cooling times. Both dehumidifiers were set to maintain RH <= 70%. While the first model's dehumidifier only ran ~150 hours longer than the 68 Model's (1068 hours vs. 916), it removed 7.8 times the water by mass and consumed 7.8 times the electricity. That makes sense, since in the 68 Model the cooling coil was on for a significantly larger percentage of the cooling season and so reduced the amount of supplemental dehumidification needed. However, given that supplemental dehumidification was still needed even in the 68 Model, where in a typical home it wouldn't be, I am still concerned that I am not properly modeling either my vapor/latent loads or my infiltration (it could of course be something else too!).

I am not running any outdoor air in the HVAC system, and the reason my minimum infiltration rate is 0.35 ACHnat is that it is the lowest infiltration rate ASHRAE 90.2/62.2 allows before requiring outdoor air to be brought into the home, which again is not a feature present on most existing residential HVAC systems. 0.5 ACHnat represents an older, draftier house, and 0.35 represents a more recent but still ASHRAE-compliant design.

I know that a constant infiltration rate is probably a large part of the issue. Based on my output data, the infiltration rate is actually slightly higher in the winter (0.4) and lower in the summer (0.33), and it varies on a minute-by-minute basis around those centered values during each season (driven by wind and outdoor temperature/pressure, perhaps?). Still, I do not have a way to account for times when the house will be positively pressurized by HVAC operation (or negatively by kitchen or bathroom exhaust?) and how this may or may not be contributing to my issue. Since my main concern is the summer, and the HVAC turns on so infrequently to meet my testing setpoints, I am skeptical that occasional, short-lived positive pressure will make much of a difference, since the fan cycles on and off with the cooling or heating coil/burner rather than running constantly.
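For reference, my understanding is that EnergyPlus scales the design infiltration flow with temperature-difference and wind terms, which would explain the seasonal drift I'm seeing even with a "fixed" rate. A sketch of that modifier (the A-D coefficients below are the legacy BLAST defaults, used here only as an assumption; my model's actual object fields may differ):

```python
def infiltration_flow(i_design, t_zone, t_out, wind_speed,
                      a=0.606, b=0.03636, c=0.1177, d=0.0, f_schedule=1.0):
    """Infiltration flow per the EnergyPlus ZoneInfiltration:DesignFlowRate form:
    design flow x schedule x [A + B*|Tzone - Tout| + C*WindSpeed + D*WindSpeed^2].
    Coefficients default to the legacy BLAST values (assumption)."""
    return i_design * f_schedule * (
        a + b * abs(t_zone - t_out) + c * wind_speed + d * wind_speed ** 2)

# Same 0.35 design rate; winter (large delta-T, windier) vs. summer (small
# delta-T, calmer) gives a higher effective winter rate, as in my outputs:
winter = infiltration_flow(0.35, t_zone=20.0, t_out=-5.0, wind_speed=4.0)
summer = infiltration_flow(0.35, t_zone=24.0, t_out=30.0, wind_speed=2.0)
print(winter > summer)
```

With A = 1 and B = C = D = 0 the rate would be truly constant, so nonzero B/C terms seem like the likeliest source of the 0.4 vs. 0.33 seasonal split.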

Based on what I have described, has anyone seen anything similar, or does anyone have initial suggestions? Is there an OS output variable I can use to determine how much vapor is coming into the space through infiltration (I'm thinking that, because of partial pressures, simply multiplying infiltration mass flow by the outdoor humidity ratio is not the answer...?) and how much is being generated by my various internal loads? Happy to clarify anything stated above in an effort to get to a solution. Thank you for your thoughts and patience! Zach
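For context, this is the rough check I was picturing, a back-of-envelope sketch only (assuming a well-mixed zone, a nominal air density of 1.2 kg/m^3, and that the net exchange scales with the indoor/outdoor humidity-ratio difference rather than the outdoor value alone):

```python
def infiltration_vapor_rate(ach, zone_volume_m3, w_outdoor, w_zone,
                            rho_air=1.2):
    """Net water vapor flow into the zone [kg/s] from infiltration:
    mass flow of exchanged air times the humidity-ratio difference
    (w values in kg water per kg dry air). Positive = moisture added."""
    vol_flow = ach * zone_volume_m3 / 3600.0   # air change rate -> m^3/s
    m_dot_air = rho_air * vol_flow             # approx. kg dry air / s
    return m_dot_air * (w_outdoor - w_zone)

# 0.35 ACH, 400 m^3 house, muggy outdoor air (w = 0.016) vs. a zone at
# w = 0.010 works out to roughly a liter of water per hour:
rate = infiltration_vapor_rate(0.35, 400.0, 0.016, 0.010)
print(round(rate * 3600, 3), "kg water per hour")
```

If that order of magnitude is right, infiltration moisture alone could plausibly swamp a zone whose cooling coil only runs ~100 hours a season, which is why I'd like to compare it against the model's reported latent gains.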
