Question-and-Answer Resource for the Building Energy Modeling Community
Output hourly HDR for uncontrolled windows

asked 2017-02-25 16:08:52 -0500 by Determinant

updated 2017-08-07 13:21:02 -0500

I'd like to do glare analysis (evalglare) using HDR images generated by OS/Radiance. The first step is getting the images generated; below is the approach I'm proposing.

For each valid hour (I intend to eventually filter for the hours I care about):

somehow convert OS/Radiance's annual-sky.mtx to a single hour, say hour.sky, then:

oconv materials/materials.rad model.rad skies/hour.sky > octrees/images.oct
#glare_sensor.vth is a file specifying a fisheye view from the glare sensor
rpict -av .3 .3 .3 -ab 1 -vf glare_sensor.vth octrees/images.oct > temp.hdr
evalglare -vta -vv 180 -vh 180 temp.hdr

But, how do I go from annual-sky.mtx to hour.sky?


2 Answers


answered 2017-03-23 14:30:12 -0500 by Determinant

I got this to work via separate Python code that:

  1. Creates a library of skies with gendaylit for all the hours in the weather files I'm looking at.
  2. Makes an octree using oconv with the appropriate sky plus the model.rad and materials.rad generated by OS/Radiance, but only if the sun is within view of the window (by azimuth).
  3. Makes a fisheye HDR image with rpict using that octree and the parameters from the glare-sensor files generated by OS/Radiance.
  4. Outputs the glare metrics using evalglare.
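The four steps above can be sketched as a per-hour driver. This is only a sketch, not the poster's actual code: the directory layout, view-file name, and site coordinates are placeholder assumptions, and gendaylit's -o/-m options expect degrees *west* of the prime meridian.

```python
import subprocess

def hour_commands(month, day, hour, dni, dhi,
                  lat=41.98, lon=87.92, mer=90.0, tag="hour"):
    """Build the Radiance command pipeline for one timestep.

    dni/dhi are direct-normal and diffuse-horizontal irradiance (W/m2)
    read from the EPW for this hour; lat/lon/mer are placeholder site
    values (Radiance wants longitude and standard meridian in degrees west).
    """
    sky = f"skies/{tag}.sky"
    octree = f"octrees/{tag}.oct"
    hdr = f"images/{tag}.hdr"
    return [
        # 1. point-in-time Perez sky for this hour
        f"gendaylit {month} {day} {hour} -a {lat} -o {lon} -m {mer} "
        f"-W {dni} {dhi} > {sky}",
        # 2. octree from the OS/Radiance model plus that sky
        f"oconv materials/materials.rad model.rad {sky} > {octree}",
        # 3. fisheye rendering from the glare-sensor view file
        f"rpict -ab 0 -vf glare_sensor.vth {octree} > {hdr}",
        # 4. glare metrics
        f"evalglare -vta -vv 180 -vh 180 {hdr}",
    ]

def run_hour(*args, **kwargs):
    for cmd in hour_commands(*args, **kwargs):
        subprocess.run(cmd, shell=True, check=True)
```

The azimuth filter from step 2 would simply skip calling run_hour for hours when the sun is behind the window plane.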

Comments

+1 for python.

AmirRoth ( 2017-03-23 14:39:30 -0500 )

Nice work! How about the rendering parameters? Definitely have a look at the paper I linked to below for ideas on optimal rendering parameters, to balance accuracy and time. You can generally pull way back on the number of ambient bounces for these glare images, with caveats.

rpg777 ( 2017-03-23 15:33:14 -0500 )

Yeah, that's the paper I was thinking of when I considered taking this approach. -ab 0 is what I used, and it sped things up considerably, as did rendering only when the sun was within the azimuth view range of the window (given that the paper alluded to DGPs being sufficient for diffuse conditions).
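The azimuth filter mentioned here can be a simple wrap-around angle test. A sketch only: the 90-degree half-field and the compass-degree convention are assumptions to adjust to your window's outward normal.

```python
def sun_in_view(sun_az, window_az, half_fov=90.0):
    """True if the solar azimuth falls within +/- half_fov degrees of the
    window's outward-normal azimuth (all angles in compass degrees, 0-360).
    """
    # signed angular difference mapped into (-180, 180]
    diff = (sun_az - window_az + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_fov
```

The modulo trick handles windows facing near north, where the sun's azimuth wraps past 360.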

Determinant ( 2017-03-23 20:46:08 -0500 )

answered 2017-02-27 16:36:16 -0500

This is awesome and ambitious, and not without its problems and challenges(!)

First of all, the dctimestep command in your example suggests repurposing the three-phase machinery that's in the OpenStudio Radiance measure; this method will produce images that are far too coarse in resolution to be valid for glare evaluation (last I checked, dctimestep still only works with the Tregenza sky discretization (145 + 1)). You will also need a command to generate the daylight coefficients, and they need to be luminance coefficients (as opposed to the illuminance coefficients we generate for the illuminance maps).
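For context on the resolution point: Tregenza is the MF:1 case of the Reinhart sky subdivision, which has 144·m² + 1 patches (that patch-count formula is standard; the judgment that it is too coarse for glare is the answer's point, not something the formula proves).

```python
def reinhart_patches(mf):
    """Number of patches in a Reinhart MF:m sky subdivision:
    144 * m^2 sky patches plus one ground patch; MF:1 is Tregenza."""
    return 144 * mf * mf + 1

# MF:1 (Tregenza) -> 145 patches; even MF:4 -> 2305 patches.
# Each patch still spans several degrees of sky, far coarser than the
# per-pixel solar-disc position that evalglare needs to see.
```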

Also be aware that blindly globbing all the views could result in a ton of images to generate, and this takes enormous amounts of time and requires a lot of disk space to store, and memory to merge. We ran into enough problems just trying to manage all the matrices for all the illuminance maps, and these are usually on the order of 100-2000 rows, and a single 200px square image is 40K rows.
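To put a back-of-the-envelope number on the storage problem (assuming 8,760 hourly timesteps and three single-precision RGB floats per matrix element, both assumptions for illustration):

```python
rows = 200 * 200        # pixels in a 200 px square image = matrix rows
hours = 8760            # hourly timesteps in a year
bytes_per = 3 * 4       # RGB channels, 4-byte floats
total = rows * hours * bytes_per
print(f"{total / 1e9:.1f} GB per view, uncompressed")
```

That is roughly 4 GB of raw float data for a single small view; multiply by views, windows, and any higher resolution and it grows quickly, before even counting the intermediate daylight-coefficient images.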

A better approach might be to generate regular (single point-in-time) images, with a continuous sky model and an actual source for the sun. Which brings up the question: where are these views you're rendering? Are they the viewpoints on the window that are automatically placed by the Radiance forward translator, or are they user-defined "glare sensor" points? If the latter, your rendering parameters will need to be a tad higher (read: take longer). The views should also be defined as angular fisheye views if your intent is to pass the renderings to evalglare.
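An angular fisheye view file of the kind evalglare expects might look like the following (a sketch; the viewpoint position and direction are placeholders for your actual glare sensor's location and facing):

```
rvu -vta -vp 10 5 1.2 -vd 0 1 0 -vu 0 0 1 -vh 180 -vv 180
```

Here -vta selects the angular fisheye projection, -vp/-vd/-vu give eye position, view direction, and up vector, and the 180-degree -vh/-vv cover the full hemisphere in front of the observer.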

There really are a lot of things to consider before rendering a bunch of images. I recommend reading Jan Wienold's work on this, starting with this paper, and considering his proposed "Enhanced Simplified DGP" methodology, which uses the vertical eye illuminance we already calculate with OpenStudio but augments this information with a simplified image that could be generated relatively easily.


Comments


First clarification: I was only going to do this for uncontrolled windows, which, on another thread, I think you mentioned was single-phase. Also, I'm only planning on doing one view, a single glare sensor. I would for sure consider enhanced DGP, but I'm taking a first stab at just getting images produced. Does this change any of your answer above? If I can ask a possibly ignorant question: my main problem was that the Radiance documentation on dctimestep didn't mention where the view for the image is input (is it in the ...%....hdr file?)

Determinant ( 2017-02-27 17:02:34 -0500 )

@Determinant I'm following your conversations with Rob here. Since OpenStudio is designed to use Radiance as a support for E+, most of what you want to do needs a lot of customization. You can do most of this stuff with Honeybee[+] if you're willing to use Rhino and Grasshopper. Just another option to consider.

Mostapha Roudsari ( 2017-02-27 21:38:14 -0500 )

Just a minor correction -- dctimestep handles Reinhart skies, which are variable resolution. It can also handle non-Klems matrices, but not as XML files. That would have to be given as a matrix file with header, created with rfluxmtx or rmtxop. Which brings me to your first dctimestep command for generating images -- you seem to have an extra view matrix there in the second position. I don't think this belongs. The HDR image sequence should serve as your view matrix in this case, as you are just using it in daylight coefficient mode.

GregWard ( 2017-02-28 10:30:30 -0500 )

Thanks, everyone! Yeah Greg, I was ASSuming these were controlled windows and therefore three-phase; I knew dctimestep could deal with Reinhart resolutions, but didn't know about this non-Klems transmission matrix option! So, that's cool. Thanks for the clarification on the controlled/uncontrolled issue, @Determinant, but I still think Mo has a point; for straight-up glare evaluation Honeybee[+] may be an easier way to go. If you want all the building energy stuff too, though, you may want to stay the course with OpenStudio.

rpg777 ( 2017-02-28 10:43:41 -0500 )

OK, I've been reading up to understand where you're all coming from, but there are still gaps for me. Let's start with an example command I think is in the ballpark to make a sequence of images: dctimestep -o tstep%04d.hdr dcomp%03d.hdr annual-sky.mtx. Here tstep%04d.hdr is the output, dcomp%03d.hdr is the view matrix, and annual-sky.mtx holds the skies. I think annual-sky.mtx will have already been made by this point in the RadianceMeasure, but how do I make dcomp%03d.hdr?
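For completeness, daylight-coefficient image sequences of the dcomp%03d.hdr sort are typically produced by piping vwrays into rcontrib. A sketch only, not a confirmed recipe for this workflow: the resolution, view-file name, octree path, sky modifier name ("sky_glow"), and MF setting are all assumptions that must match how annual-sky.mtx was generated.

```python
# Build the vwrays | rcontrib pipeline that renders one HDR image per
# Reinhart sky patch; each image is a daylight-coefficient "column".
res = 200                       # image resolution (assumed)
view = "glare_sensor.vth"       # fisheye view file (assumed name)
cmd = (
    f"vwrays -ff -x {res} -y {res} -vf {view} | "
    f"rcontrib -ffc $(vwrays -vf {view} -x {res} -y {res} -d) "
    f"-fo -o dcomp%03d.hdr "
    f"-e MF:1 -f reinhart.cal -b rbin -bn Nrbins "
    f"-m sky_glow octrees/model.oct"
)
print(cmd)   # run with subprocess.run(cmd, shell=True) once verified
```

The second vwrays call with -d just reports the image dimensions so rcontrib knows the pixel layout; -b/-bn with reinhart.cal bins each ray by sky patch.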

Determinant ( 2017-03-09 21:13:02 -0500 )
