As @TomB says, the site UX isn't ideal for automated downloading. (Edit: I'm talking specifically about searching for a weather file, not the site in general.) I made a similar tool in Excel a few years ago which may be of use.

Here's a link to the list of paths I ended up building, along with a few characteristics. This seemed an easier approach than building the URLs on the fly. It's possible that the sites available have changed in the intervening time; I haven't checked.
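
If it helps, here's a rough sketch of how you might load that list back out of a CSV export and turn each relative path into a full URL. The file layout, the 'path' column name, and the base URL are just assumptions for illustration; adjust them to match however you save the list.

import csv

BASE_URL = 'https://example.org/weather'  # placeholder -- substitute the real site root

def load_weather_urls(csv_path):
    """Read relative weather-file paths from a CSV export of the list and build full URLs."""
    urls = []
    with open(csv_path, newline='') as f:
        for row in csv.DictReader(f):
            # assumes a header column named 'path' holding each relative path
            urls.append('{}/{}'.format(BASE_URL, row['path'].lstrip('/')))
    return urls

Each URL can then be handed to the fetch_and_extract function shown below.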

Here's a link to an unlocked version of the Excel tool showing the workings.

And here's a blog post describing the tool.

In terms of the Python you'll need, this snippet I wrote while fetching the Prometheus weather files (link) ought to help.

import os
import zipfile

import requests


WEATHER_DIR = os.path.join(os.pardir, 'weather_files')

def fetch_and_extract(url):
    """Download and extract a zipfile to a weather directory.

    Parameters
    ----------
    url : str
        URL of the zipfile.

    """
    filename = os.path.join(WEATHER_DIR, url.split('/')[-1])
    if os.path.isfile(filename):  # we already have this file
        return
    # fetch the zipfile
    data = requests.get(url).content
    with open(filename, 'wb') as f:
        print "Saving {}".format(filename)
        f.write(data)
    # extract the zipfile
    with zipfile.ZipFile(filename) as zf:
        zf.extractall(WEATHER_DIR)
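
As a usage sketch (the URLs below are placeholders, not real paths), you could create the weather directory once and then loop over whatever list of URLs you've built:

import os

# assumes WEATHER_DIR and fetch_and_extract from the snippet above
urls = [
    'https://example.org/weather/site_one.zip',  # placeholder
    'https://example.org/weather/site_two.zip',  # placeholder
]

if not os.path.isdir(WEATHER_DIR):
    os.makedirs(WEATHER_DIR)  # the snippet assumes this directory already exists

for url in urls:
    fetch_and_extract(url)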