Question-and-Answer Resource for the Building Energy Modeling Community

internal server error after successful runs

asked 2019-01-27 19:21:25 -0500

Ruth Urban's avatar

I have an OpenStudio Parametric Analysis Tool run that appears to run each measure combination successfully. However, when I hit "View Analysis" so I can download the results.csv file, I get the following error: "Executor error during find command: OperationFailed: Sort operation used more than the maximum 33554432 bytes of RAM. Add an index, or specify a smaller limit. (96)" Any clues?

Alternatively, is there a better way to match the very long string of characters identifying a run in the local results folder with the design alternative I've assigned? So far I've been pulling the character-string/design-alternative pairings off the results.csv file.



1 Answer


answered 2019-01-28 12:03:24 -0500

@Ruth Urban are you running the analysis locally (http://localhost:8080/) or on the cloud? I think I have seen the error message you described when trying to view the server results and download results.csv on a large local run. I have not seen this error on a cloud analysis.

If your goal is to view a specific datapoint, you can map the design alternative name to the Datapoint ID by expanding the datapoint at the bottom of the Run tab in PAT, as shown below. [screenshot: expanded datapoint in the PAT Run tab]

If you want a summary CSV across all datapoints, and you can't get that in PAT because of a server issue, I can share a Ruby script which will make a results.csv file for any collection of datapoints. It operates by parsing the out.osw file for each datapoint to extract runner.registerValue objects.
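The script mentioned above could be sketched roughly as follows. This is a hypothetical minimal version, not the actual script: it assumes each datapoint directory holds an out.osw whose "steps" entries carry a "result" => "step_values" array of name/value pairs (the output form the OpenStudio workflow gem writes for runner.registerValue calls); the function and variable names here are illustrative only.

```ruby
require 'json'
require 'csv'

# Collect registerValue results from a set of PAT datapoint directories
# into a single CSV. Assumes out.osw contains:
#   { "steps" => [ { "result" => { "step_values" => [ {"name"=>..., "value"=>...}, ... ] } }, ... ] }
def build_results_csv(datapoint_dirs, out_path)
  rows = []
  datapoint_dirs.each do |dir|
    osw_path = File.join(dir, 'out.osw')
    next unless File.exist?(osw_path)

    osw = JSON.parse(File.read(osw_path))
    # Use the directory name as the datapoint identifier.
    row = { 'datapoint' => File.basename(dir) }
    (osw['steps'] || []).each do |step|
      values = step.dig('result', 'step_values') || []
      values.each { |v| row[v['name']] = v['value'] }
    end
    rows << row
  end

  # Union of all registered value names across datapoints becomes the header.
  headers = rows.flat_map(&:keys).uniq
  CSV.open(out_path, 'w') do |csv|
    csv << headers
    rows.each { |row| csv << headers.map { |h| row[h] } }
  end
  out_path
end
```

Because it only reads the out.osw files on disk, a script like this sidesteps the server's sort-memory limit entirely.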



Thanks David! I was running locally, and this morning, running on AWS, I didn't get that error. However, I also shortened the design alternative names, so I'm not sure which change removed the error. I will continue to run in the cloud; yesterday's run was done locally only because I wasn't in any particular hurry.

Your ruby script seems like a very useful thing. I hadn't realized that the out.osw file had the datapoint ID in it--good to know for when I screw up saving the results.csv file.

Ruth Urban's avatar Ruth Urban  ( 2019-01-28 12:24:46 -0500 )edit


