Whether or not you have a diesel generator in your model doesn't matter from a conceptual standpoint.

Ultimately you'll have to decide when it runs, which is the same decision as saying when electricity is more or less expensive in your case.

Work from what you know, for example when the power outages usually occur. In the summer they likely happen during the day, when cooling needs are highest and transmission lines are stressed and overheated.

Go only as granular as your information allows. A monthly or seasonal time-of-day tariff might be more than enough.

For example, you could assume that during the winter there's an average 2-hour power outage that can occur anywhere between 10 am and 4 pm. Assume a flat (uniform) distribution across that window, or, if you think outages cluster around a given hour, a curved distribution, and assign that to your tariff for January.
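
To show what turning that assumption into numbers could look like, here is a minimal Python sketch, not tied to any particular simulation tool. It spreads an expected outage duration over a window of hours, either uniformly or with a simple triangular shape standing in for the "curved" distribution, then blends a grid price with a generator price into an expected hourly cost. The helper name, the prices, and the blending approach are all assumptions for illustration only.

```python
# Sketch (assumptions): build an hourly outage-probability profile for one month,
# which you could then translate into period definitions for a time-of-use tariff.
# Prices and function names are hypothetical placeholders.

def hourly_outage_probability(window_start, window_end, expected_hours, shape="uniform"):
    """Return a 24-element list: probability that a given hour of the day is in an outage.

    window_start/window_end: hours bounding when outages can occur (e.g. 10, 16).
    expected_hours: average outage duration per day within that window (e.g. 2).
    shape: "uniform" spreads the risk evenly; "triangular" peaks at the window center.
    """
    hours = list(range(window_start, window_end))
    if shape == "uniform":
        weights = [1.0] * len(hours)
    else:  # simple triangular ("curved") shape peaking at the window midpoint
        mid = (window_start + window_end - 1) / 2.0
        weights = [1.0 - abs(h - mid) / (len(hours) / 2.0) for h in hours]
    total = sum(weights)
    # Scale so the hourly probabilities sum to the expected outage hours per day.
    probs = [0.0] * 24
    for h, w in zip(hours, weights):
        probs[h] = expected_hours * w / total
    return probs

# Winter example from above: ~2 outage hours somewhere between 10 am and 4 pm.
january = hourly_outage_probability(10, 16, expected_hours=2, shape="uniform")

# Blend grid and generator prices into an expected $/kWh per hour
# (0.12 and 0.35 are made-up numbers).
grid_price, generator_price = 0.12, 0.35
january_price = [grid_price * (1 - p) + generator_price * p for p in january]
```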

In the summer you might postulate an average of 4 hours of power outage between 12 pm and 4 pm. Assume, for example, that it's centered on 2 pm (when the cooling peak is likely to happen) with a curved distribution, and assign that to your summer months, and so on.
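
The summer case would reuse the same hypothetical helper. One caveat: when the expected outage length equals the window length, some hourly probabilities come out above 1 and should be capped at 1 (i.e. a certain outage) before pricing.

```python
# Summer example from above: ~4 outage hours in the 12 pm - 4 pm window,
# with the "triangular" shape peaking near 2 pm.
july = [min(p, 1.0) for p in
        hourly_outage_probability(12, 16, expected_hours=4, shape="triangular")]
july_price = [grid_price * (1 - p) + generator_price * p for p in july]
```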

One very important thing: the utility company in the area where your project is located might actually keep track of power outage events. It'd be much better to build the above tariffs from historical events than to guesstimate.
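
If the utility can share an outage log, a small script can turn it into hourly outage frequencies per month and replace the guessed distributions entirely. The CSV layout assumed here (an ISO start timestamp plus a duration in hours per event) is hypothetical; adapt it to whatever format the utility actually provides.

```python
# Sketch (assumptions): count how often each hour of the day was affected by an
# outage, per month, from a hypothetical outage log with columns "start" and
# "duration_hours".
import csv
from collections import defaultdict
from datetime import datetime, timedelta

def hourly_outage_frequency(csv_path, years_covered):
    """Return {month: 24-element list of the fraction of days each hour was out}."""
    counts = defaultdict(lambda: [0] * 24)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            start = datetime.fromisoformat(row["start"])
            hours_out = int(float(row["duration_hours"]))
            for i in range(hours_out):
                t = start + timedelta(hours=i)
                counts[t.month][t.hour] += 1
    days_per_month = 30.4 * years_covered  # rough average; refine if needed
    return {m: [c / days_per_month for c in hrs] for m, hrs in counts.items()}
```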