Question-and-Answer Resource for the Building Energy Modeling Community

How can I automate downloads of weather data from EnergyPlus with Python?

asked 2016-05-24 12:23:22 -0600 by scbarrett410, updated 2017-05-29 17:39:21 -0600

I am looking to script downloads from the EnergyPlus site – I want to automate the download of an epw weather file by location, a process you can perform manually on the site under the Weather tab. I don't need to use EnergyPlus's simulation capabilities; I just want the raw data. I am still new to scripting, but I assume I need the site's API. I have already done this with another site, but EnergyPlus has more of the data I need.

Any help would be greatly appreciated!


5 Answers


answered 2016-12-15 16:04:42 -0600 by thoran

answered 2016-05-25 11:11:06 -0600

I may be speaking out of turn here, but the weather files for US locations are also on the Building Component Library, which does have an API...




I like the search facility here. Will have to take a look at the API.

Jamie Bull (2016-05-25 11:15:48 -0600)

answered 2016-05-25 03:28:53 -0600

updated 2016-05-26 05:26:02 -0600

As @TomB says, the site UX ~~is horrible~~ isn't ideal for automated downloading. (Edit: I'm talking specifically about searching for a weather file, not the site in general.) I made a similar tool in Excel a few years ago which may be of use.

Here's a link to the list of paths I ended up building, along with a few characteristics. This seemed an easier approach than building the URLs on the fly. It's possible that the sites available have changed in the intervening time; I haven't checked.

Here's a link to an unlocked version of the Excel tool showing the workings.

And here's a blog post describing the tool.

In terms of the Python you'll need, this snippet I wrote while fetching the Prometheus weather files (link) ought to help.

import os
import zipfile

import requests

WEATHER_DIR = os.path.join(os.pardir, 'weather_files')

def fetch_and_extract(url):
    """Download and extract a zipfile to a weather directory.

    url : str
        URL of the zipfile.
    """
    filename = os.path.join(WEATHER_DIR, url.split('/')[-1])
    if os.path.isfile(filename):  # we already have this file
        return
    # fetch the zipfile
    data = requests.get(url).content
    # save it to disk
    with open(filename, 'wb') as f:
        print("Saving {}".format(filename))
        f.write(data)
    # extract the zipfile
    with zipfile.ZipFile(filename) as zf:
        zf.extractall(WEATHER_DIR)
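To drive fetch_and_extract from a list of relative paths, one option is to join each path onto the site's base URL with the standard library. A minimal sketch, assuming a relative-path format; the base URL and path entries below are placeholders, not the real EnergyPlus locations:

```python
from urllib.parse import urljoin

# Placeholder base URL and relative paths -- substitute the real entries
# from the paths list linked above.
BASE_URL = 'https://example.org/weather-files/'
relative_paths = [
    'region1/CityA.zip',
    'region2/CityB.zip',
]

# build one full download URL per weather file
urls = [urljoin(BASE_URL, path) for path in relative_paths]
print(urls[0])  # -> https://example.org/weather-files/region1/CityA.zip
```

Each resulting URL could then be passed to fetch_and_extract(url) in a loop.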



@Jamie Bull, if the site UX is "horrible", you can make suggestions for improvement at

__AmirRoth__ (2016-05-25 08:06:09 -0600)

Thanks! The paths list was exactly what I needed. How did you get those?

scbarrett410 (2016-05-25 09:26:23 -0600)

I don't quite remember, but I imagine I looked a few up and reverse-engineered the paths.

Jamie Bull (2016-05-25 09:58:27 -0600)

@__AmirRoth__ "This website is exclusively for ideas and suggestions for the EnergyPlus simulation engine.". Is that the right place for this suggestion?

Jamie Bull (2016-05-25 09:59:42 -0600)

@__AmirRoth__ we did the same kind of HTML-path hacking. Tedious... Would love a web API for this that all of our scripts could connect to!

jmcneill (2016-05-25 10:37:42 -0600)

answered 2016-05-24 17:21:33 -0600 by TomB

Looking at the structure of the data on the weather pages at the site, I can see why you would want to automate the download of all the weather files. You don't need the site's API. You can write a web-scraping script to crawl the links and download the files; it's not a difficult task. Scraping can be inappropriate, so it's good form to contact the site maintainers and check whether they have a problem with it. Another good resource is White Box Technologies: http://weather.whiteboxtechnologies.c...
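As a sketch of the link-crawling step, the standard library's html.parser can pull out the zipfile links from a page; the HTML below is an invented placeholder, not the real weather-page markup:

```python
from html.parser import HTMLParser


class ZipLinkParser(HTMLParser):
    """Collect href values of <a> tags that point at .zip archives."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value and value.endswith('.zip'):
                    self.links.append(value)


# placeholder markup standing in for a fetched weather page
sample_html = """
<div>
  <a href="/weather/region1/CityA.zip">CityA</a>
  <a href="/about">About</a>
  <a href="/weather/region1/CityB.zip">CityB</a>
</div>
"""

parser = ZipLinkParser()
parser.feed(sample_html)
print(parser.links)  # -> ['/weather/region1/CityA.zip', '/weather/region1/CityB.zip']
```

In a real crawler the sample_html would come from requests.get(page_url).text, and each collected link would be downloaded in turn.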



I use white box tech for my weather files.

sashadf1 (2021-07-14 15:26:38 -0600)

answered 2016-05-25 10:26:49 -0600 by Chienman

Yeah, I'd just scrape it. You can write a recursive routine that just looks for the "btn-group-vertical" class and then iteratively loops through its children, terminating when it sees a "download all" InnerHtml on the anchor tag.

Personally, I like the UX for the weather site for this reason. It is pretty easy to write this iterative routine because of the consistency of the buttons and anchor tags.

Sorry, I don't have any Python code for this, but I've done it in C# and JavaScript on a few different occasions now. jQuery makes this pretty easy.
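A rough Python equivalent of the routine described above might look like this; only the "btn-group-vertical" class and the "download all" terminator come from the description, while the markup and names below are invented for illustration:

```python
from html.parser import HTMLParser


class ButtonGroupParser(HTMLParser):
    """Collect anchor hrefs inside a "btn-group-vertical" element,
    terminating at the anchor whose text reads "Download All"."""

    def __init__(self):
        super().__init__()
        self.group_depth = 0      # <div> nesting level inside the button group
        self.links = []
        self.done = False
        self.current_href = None  # href of the anchor we are currently inside

    def handle_starttag(self, tag, attrs):
        if self.done:
            return
        attrs = dict(attrs)
        if tag == 'div':
            if self.group_depth:
                self.group_depth += 1
            elif 'btn-group-vertical' in (attrs.get('class') or ''):
                self.group_depth = 1
        elif tag == 'a' and self.group_depth:
            self.current_href = attrs.get('href')

    def handle_data(self, data):
        text = data.strip()
        if self.current_href is None or not text:
            return
        if text.lower() == 'download all':  # terminate at the "download all" anchor
            self.done = True
        else:
            self.links.append(self.current_href)

    def handle_endtag(self, tag):
        if tag == 'div' and self.group_depth:
            self.group_depth -= 1
        elif tag == 'a':
            self.current_href = None


# invented markup standing in for the weather page's button group
sample = """
<div class="btn-group-vertical">
  <a href="/weather/CityA.zip">CityA</a>
  <a href="/weather/CityB.zip">CityB</a>
  <a href="/weather/all.zip">Download All</a>
</div>
"""

parser = ButtonGroupParser()
parser.feed(sample)
print(parser.links)  # -> ['/weather/CityA.zip', '/weather/CityB.zip']
```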




Asked: 2016-05-24 12:23:22 -0600

Seen: 1,543 times

Last updated: Jul 13 '21