Question-and-Answer Resource for the Building Energy Modeling Community

How can I automate downloads of weather data from EnergyPlus with Python?

asked 2016-05-24 12:23:22 -0500 by scbarrett410

updated 2017-05-29 17:39:21 -0500

I am looking to script downloads from energyplus.net. I want to automate the download of an EPW weather file by location, a process you can perform manually on the energyplus.net site under the Weather tab. I don't need EnergyPlus's simulation capabilities; I just want the raw data. I am still new to scripting, but I assume that I need the site's API. I have already done this with mesonet.agron.iastate.edu, but energyplus.net has more of the data I need.

Any help would be greatly appreciated!


5 Answers


answered 2016-12-15 16:04:42 -0500 by thoran

answered 2016-05-25 11:11:06 -0500

I may be speaking out of turn here, but the weather files for US locations are also on the Building Component Library, which does have an API...


Comments


I like the search facility here. Will have to take a look at the API.

Jamie Bull (2016-05-25 11:15:48 -0500)

answered 2016-05-25 03:28:53 -0500

updated 2016-05-26 05:26:02 -0500

As @TomB says, the site UX is ~~horrible~~ isn't ideal for automated downloading. (Edit: I'm talking specifically about searching for a weather file, not the site in general.) I made a similar tool in Excel a few years ago which may be of use.

Here's a link to the list of paths I ended up building, along with a few characteristics. This seemed an easier approach than building the URLs on the fly. It's possible that the available sites have changed in the intervening time; I haven't checked.

Here's a link to an unlocked version of the Excel tool showing the workings.

And here's a blog post describing the tool.

In terms of the Python you'll need, this snippet I wrote while fetching the Prometheus weather files (link) ought to help.

import os
import zipfile

import requests


WEATHER_DIR = os.path.join(os.pardir, 'weather_files')

def fetch_and_extract(url):
    """Download and extract a zipfile to a weather directory.

    Parameters
    --------------
    url : str
        URL of the zipfile.

    """
    filename = os.path.join(WEATHER_DIR, url.split('/')[-1])
    if os.path.isfile(filename):  # we already have this file
        return
    # fetch the zipfile
    data = requests.get(url).content
    with open(filename, 'wb') as f:
        print("Saving {}".format(filename))
        f.write(data)
    # extract the zipfile
    with zipfile.ZipFile(filename) as zf:
        zf.extractall(WEATHER_DIR)
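Once extracted, the EPW filenames themselves carry location metadata, since they follow the standard naming convention (country code, region code, then station and dataset, underscore-separated, e.g. USA_IL_Chicago-OHare.Intl.AP.725300_TMY3.epw). A small sketch of pulling that apart (the helper name is mine, not part of the snippet above):

```python
import os


def epw_location(path):
    """Parse country and region codes out of a standard EPW filename,
    e.g. 'USA_IL_Chicago-OHare.Intl.AP.725300_TMY3.epw'.
    """
    name = os.path.splitext(os.path.basename(path))[0]
    parts = name.split('_')
    return {'country': parts[0],
            'region': parts[1] if len(parts) > 2 else None,
            'station': '_'.join(parts[2:]) or None}


print(epw_location('USA_IL_Chicago-OHare.Intl.AP.725300_TMY3.epw'))
# → {'country': 'USA', 'region': 'IL', 'station': 'Chicago-OHare.Intl.AP.725300_TMY3'}
```

Handy if you want to index a directory of downloaded files by state or country without re-parsing the EPW headers.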

Comments


@Jamie Bull, if the site UX is "horrible", you can make suggestions for improvement at https://energyplus.uservoice.com/.

__AmirRoth__ (2016-05-25 08:06:09 -0500)

Thanks! The paths list was exactly what I needed - how did you get those?

scbarrett410 (2016-05-25 09:26:23 -0500)

I don't quite remember but I imagine I looked a few up and reverse-engineered the paths.

Jamie Bull (2016-05-25 09:58:27 -0500)

@__AmirRoth__ "This website is exclusively for ideas and suggestions for the EnergyPlus simulation engine.". Is that the right place for this suggestion?

Jamie Bull (2016-05-25 09:59:42 -0500)

@__AmirRoth__ we did the same type of hacking of the HTML paths. Tedious... Would love a web API for this to connect to all of our scripts!

jmcneill (2016-05-25 10:37:42 -0500)

answered 2016-05-24 17:21:33 -0500 by TomB

Looking at the structure of the data on the weather pages at https://energyplus.net/weather, I can see why you would want to automate the download of all weather files. You don't need the site's API: you can write a web-scraping script to crawl the links and download the files. It's not a difficult task. Scraping can be inappropriate, though, so it's good form to contact the site maintainers and check whether they have a problem with it. Another good resource is White Box Technologies: http://weather.whiteboxtechnologies.c...
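The link-crawling part of such a scraper needs nothing beyond the standard library. A minimal sketch: subclass html.parser.HTMLParser to collect hrefs that point at weather archives (the class name and the sample HTML below are mine; in practice you would feed in the body of a requests.get() against the real weather pages, whose markup may differ):

```python
from html.parser import HTMLParser


class WeatherLinkParser(HTMLParser):
    """Collect href values that point at .epw or .zip files."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != 'a':
            return
        href = dict(attrs).get('href', '')
        if href.endswith(('.epw', '.zip')):
            self.links.append(href)


# Stand-in HTML; substitute the fetched page body here.
sample = ('<a href="/weather-download/some_region/station/all.zip">zip</a>'
          '<a href="/weather-location/some_region">page</a>')
parser = WeatherLinkParser()
parser.feed(sample)
print(parser.links)  # → ['/weather-download/some_region/station/all.zip']
```

Feed each fetched page through the parser, queue the non-archive links for further crawling, and hand the archive links to a downloader like the snippet in the answer above.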


Comments

I use white box tech for my weather files.

sashadf1 (2021-07-14 15:26:38 -0500)

answered 2016-05-25 10:26:49 -0500 by Chienman

Yeah, I'd just scrape it. You can write a recursive routine that looks for the "btn-group-vertical" class and then iteratively loops through its children, terminating when it sees a "Download All" InnerHtml on the anchor tag.

Personally, I like the UX of the weather site for this reason. It is pretty easy to write this iterative routine because of the consistency of the buttons and anchor tags.

Sorry, I don't have any Python code for this, but I've done it in C# and JavaScript on a few different occasions now. jQuery makes this pretty easy.
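A rough Python translation of the routine described above, again using only the standard library's html.parser (the class name and the sample HTML are mine; the real page markup may differ in detail):

```python
from html.parser import HTMLParser


class ButtonGroupParser(HTMLParser):
    """Collect anchor hrefs inside 'btn-group-vertical' divs,
    stopping once a 'Download All' link is seen."""

    def __init__(self):
        super().__init__()
        self.depth = 0          # div nesting depth inside a button group
        self.in_anchor = False
        self.current_href = None
        self.done = False
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == 'div' and self.depth:
            self.depth += 1
        elif tag == 'div' and 'btn-group-vertical' in attrs.get('class', ''):
            self.depth = 1
        elif tag == 'a' and self.depth and not self.done:
            self.in_anchor = True
            self.current_href = attrs.get('href')

    def handle_data(self, data):
        # the terminating condition from the answer above
        if self.in_anchor and data.strip() == 'Download All':
            self.done = True

    def handle_endtag(self, tag):
        if tag == 'div' and self.depth:
            self.depth -= 1
        elif tag == 'a' and self.in_anchor:
            self.in_anchor = False
            if not self.done and self.current_href:
                self.links.append(self.current_href)
            self.current_href = None


# Stand-in HTML mimicking the described button-group structure.
sample = ('<div class="btn-group-vertical">'
          '<a href="/f1.epw">F1</a>'
          '<a href="/f2.epw">F2</a>'
          '<a href="/all.zip">Download All</a>'
          '</div>')
parser = ButtonGroupParser()
parser.feed(sample)
print(parser.links)  # → ['/f1.epw', '/f2.epw']
```

The depth counter handles nested divs inside the button group; the done flag implements the "terminate at Download All" rule from the answer.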



Stats


Seen: 2,140 times

Last updated: Jul 13 '21