Why does the internal temperature of the model start from a value close to the setpoint temperature, rather than from room temperature as it would in a real-world scenario? [closed]
I have a question that has been troubling me, and I would like some clarification on it.
Imagine a real-world freezer that has been switched off up to t = 0, so its internal temperature is around room temperature. At t > 0 the freezer starts running, the internal temperature is pulled down to the desired setpoint, and it then stabilizes through on/off cycling of the compressor.
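For illustration only, here is a minimal Python sketch of that expected real-world behaviour: a lumped thermal model with an on/off compressor thermostat, starting at room temperature, pulling down to the setpoint and then cycling. This is not an EnergyPlus model, and every parameter value below is an assumed, illustrative number.

```python
# Toy freezer pull-down model (illustrative only, not EnergyPlus):
# the space starts at room temperature, is pulled down to the setpoint,
# then cycles as a simple on/off thermostat controls the compressor.

room_temp = 20.0      # degC, ambient and initial internal temperature (assumed)
setpoint = -18.0      # degC, desired freezer temperature (assumed)
deadband = 1.0        # degC, on/off hysteresis around the setpoint (assumed)
ua_loss = 0.01        # 1/min, lumped envelope heat-gain coefficient (assumed)
cooling_rate = 0.5    # degC/min, pull-down rate while the compressor runs (assumed)
dt = 1.0              # min, time step
t_end = 24 * 60       # min, simulate one day

temp = room_temp
compressor_on = True  # the freezer is switched on at t = 0, as described above

for step in range(int(t_end / dt)):
    # Thermostat with hysteresis: off below setpoint - deadband,
    # back on above setpoint + deadband.
    if compressor_on and temp <= setpoint - deadband:
        compressor_on = False
    elif not compressor_on and temp >= setpoint + deadband:
        compressor_on = True

    # Lumped energy balance: heat gain from ambient, minus cooling when running.
    dT = ua_loss * (room_temp - temp) * dt
    if compressor_on:
        dT -= cooling_rate * dt
    temp += dT

    if step % 60 == 0:  # report hourly
        state = "ON " if compressor_on else "OFF"
        print(f"t = {step * dt:6.0f} min  T = {temp:6.2f} degC  compressor {state}")
```

With these assumed values the pull-down from 20 degC to the setpoint takes a couple of hours, after which the temperature oscillates within the deadband; that is the transient I expected to see in the model.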
This is not what happens in EnergyPlus. I ran a freezer model and the temperature started from a value close to the desired setpoint, as if the refrigeration system were already running.
Any explanation as to why EnergyPlus is doing this?
For which E+ application/model: walk-in freezer? refrigerated warehouse? As suggested previously by Aaron Boranian here, I don't think E+ outputs "freezer" temperatures at runtime. How did you manage to track these temperatures? My understanding is that E+ product refrigeration models assume roughly continuous operation. Sure, there are restocking schedule inputs, but that is not quite what you're describing.
I was talking about a refrigerated warehouse, but after thinking about it, I concluded that in the real case the refrigeration system in the warehouse is already running before the products are added to it. So it is normal that the temperature starts from the desired setpoint.
OK. Thanks for clarifying. You could summarize your observations as an answer to your own question.