100% Relative Humidity and Condensation during Summer in a Residential Model?
Hi everyone, I am using OS for a research project (graduate level, not a homework assignment or class project) looking at thermostat setbacks in single-family detached residential homes. The issue I am currently running into is that with fixed infiltration rates of 0.35 or 0.5 ACHnat (to match the real homes we are reflecting), my model is telling me that, given my setbacks and occupancy schedules, there are large portions of time (2,000+ hours) where my humidity levels are above 80% RH, and over 200 hours where they are over 98%, based on zone relative humidity.
I am running a custom weather file that repeats a "design week" for half the year in heating mode and half the year in cooling mode, so I can test numerous control schemes over the course of the year in one model and compress the number of simulations I need to do. This is only an issue during the cooling season, where the zone temperature trips the AC on for only ~100 hours over the 6-month period. I have the opposite issue (extremely low RH) in the winter, but the heat is active a significant amount of the time, so I can understand how heating dry, cold, constantly infiltrating air would lower the RH in the zone. My concern is that on the summer side, having 200+ hours of "rain" inside is completely unrealistic, and therefore I may not be modeling it properly. Since it happens every week, the issue appears to be independent of my various setback control schemes.
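For what it's worth, the winter/summer asymmetry is consistent with basic psychrometrics. Here is a quick sanity check using a Magnus-type saturation pressure approximation; the outdoor and zone conditions (0 C / 80% RH winter, 28 C / 70% RH summer, zone floating at 24 C) are just illustrative numbers I picked, not values from my weather file:

```python
import math

def p_sat(t_c):
    """Saturation vapor pressure [Pa], Magnus-type approximation."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def zone_rh(t_out, rh_out, t_zone):
    """Zone RH if the zone air carries the outdoor moisture content
    (vapor pressure roughly constant as temperature changes)."""
    p_v = rh_out * p_sat(t_out)
    return p_v / p_sat(t_zone)

# Winter: 0 C / 80% RH infiltration air heated to a 20 C zone
print(round(zone_rh(0.0, 0.80, 20.0) * 100))   # ~21% RH

# Summer: 28 C / 70% RH infiltration air, zone floating at 24 C with no coil runtime
print(round(zone_rh(28.0, 0.70, 24.0) * 100))  # ~89% RH
```

So when the coil almost never runs, a floating zone a few degrees below the outdoor dry bulb sits near 90% RH from the outdoor moisture alone, before any internal latent gains are added.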
I understand I can force dehumidification with a ZoneControl:Humidistat, an associated schedule, and a dehumidifier, but I want to stay true to a traditional existing residential HVAC system, which does not typically possess such a mode, and I am still concerned there is something else in my model that needs checking. As a thought exercise I did integrate a dehumidifier and run it in two models. One kept all of my original thermostat, occupancy, and load schedules; the second (which I'll refer to as the 68 Model) kept the occupancy and load schedules but changed the thermostat setpoint to a constant 68F for all cooling times. Both dehumidifiers were set to maintain RH <= 70%. While the first model's dehumidifier only ran ~150 hours longer than the 68 Model's (1,068 hours vs. 916), it removed 7.8 times the water by mass and consumed 7.8 times the electricity. That makes sense, since in the 68 Model the cooling coil was on for a significantly larger percentage of the cooling season, so it was able to reduce the amount of supplemental dehumidification needed. However, given that supplemental dehumidification was still needed even in the 68 Model, where in a typical home it wouldn't be, I am still concerned that I am not properly modeling either my vapor/latent loads or my infiltration (it could of course be something else too!).
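To put a rough number on the infiltration latent load for comparison against the dehumidifier's removed-water totals, here is a back-of-the-envelope moisture balance. The house volume and the outdoor/zone moisture conditions are assumed illustrative values, not my actual model inputs:

```python
import math

P_ATM = 101_325.0  # Pa, standard atmospheric pressure

def p_sat(t_c):
    """Saturation vapor pressure [Pa], Magnus-type approximation."""
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c, rh):
    """Humidity ratio [kg water / kg dry air]."""
    p_v = rh * p_sat(t_c)
    return 0.622 * p_v / (P_ATM - p_v)

volume_m3 = 400.0  # ASSUMED conditioned volume
ach = 0.35         # fixed infiltration rate from my model
rho_air = 1.2      # kg/m3, approximate air density

w_out = humidity_ratio(28.0, 0.70)   # assumed summer outdoor condition
w_zone = humidity_ratio(24.0, 0.60)  # assumed zone target condition

# Moisture carried in by infiltration beyond what the zone would hold at target
kg_water_per_hr = volume_m3 * ach * rho_air * (w_out - w_zone)
print(f"{kg_water_per_hr:.2f} kg water/hour")  # ~0.9 kg/h
```

Roughly a kilogram of water per hour from infiltration alone under these assumed conditions, so over a cooling season the totals add up quickly if the coil rarely runs; comparing this kind of estimate against the reported dehumidifier water mass is one way to check whether the latent loads are plausible.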
I am not running any outdoor air ...