Wetterdienst - Open weather data for humans

(Image: temperature time series of Hohenpeissenberg, Germany)

"Three things are (almost) infinite: the universe, human stupidity and the temperature time series of Hohenpeissenberg I got with the help of wetterdienst; and I'm not sure about the universe." - Albert Einstein


Introduction

Welcome to Wetterdienst, your friendly weather service library for Python.

We are a group of like-minded people trying to make access to weather data in Python feel like a warm summer breeze, similar to other projects like rdwd for the R language, which originally drew our interest in this project. Our long-term goal is to provide access to multiple weather services as well as related agencies, e.g. for river measurements.

With wetterdienst we try to use modern Python technologies throughout: the library is based on pandas across the board, uses Poetry for package administration and GitHub Actions for all things CI. Our users are an important part of the development, as we do not currently use the data we provide ourselves and only implement what we think is best. Contributions and feedback, whether data-related or library-related, are therefore very welcome! Just hand in a PR or an issue if you think we should include a new feature or data source.

Acknowledgements

We want to acknowledge all environmental agencies which provide their data openly and free of charge, first and foremost for the sake of endless research possibilities.

We want to acknowledge JetBrains and their open source team for providing us with licenses for PyCharm Professional, which we are using for development.

We want to acknowledge all contributors for being part of the improvements to this library that make it better and better every day.

Coverage

DWD (German Weather Service / Deutscher Wetterdienst / Germany)
  • Historical Weather Observations
    • Historical (last ~300 years), recent (500 days to yesterday), now (yesterday up to last hour)
    • Every minute to yearly resolution
    • Time series of stations in Germany
  • Mosmix - statistically optimized scalar forecasts extracted from weather models (see the sketch after this section)
    • Point forecast
    • 5400 stations worldwide
    • Both MOSMIX-L and MOSMIX-S are supported
    • Up to 115 parameters
  • Radar
    • 16 locations in Germany
    • All of Composite, Radolan, Radvor, Sites and Radolan_CDC
    • Radolan: calibrated radar precipitation
    • Radvor: radar precipitation forecast
ECCC (Environnement et Changement Climatique Canada, Environment and Climate Change Canada, Canada)
  • Historical Weather Observations
    • Historical (last ~180 years)
    • Hourly, daily, monthly, (annual) resolution
    • Time series of stations in Canada

For better insight into which data is currently available and under which license it is published, take a look at the data section.
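
A rough sketch of requesting Mosmix point forecasts, using the same resolver as the library example further below. The "mosmix" kind, the mosmix_type argument and the parameter short names ("ttt" for air temperature, "ff" for wind speed) are assumptions based on recent releases and may need adjusting to your installed version:

from wetterdienst import Wetterdienst

API = Wetterdienst("dwd", "mosmix")  # assumption: "mosmix" is accepted as the kind
request = API(
    parameter=["ttt", "ff"],  # placeholder MOSMIX parameter short names
    mosmix_type="large",      # MOSMIX-L; use "small" for MOSMIX-S
).filter_by_station_id(station_id=["01001"])  # example MOSMIX station id

forecasts = request.values.all().df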

Features

  • API(s) for stations (metadata) and values
  • Get station(s) nearby a selected location (see the sketch after this list)
  • Define your request by arguments such as parameter, period, resolution, start date, end date
  • Command line interface
  • Web-API via FastAPI
  • Run SQL queries on the results
  • Export results to databases and other data sinks
  • Public Docker image
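
A minimal sketch of the nearest-station lookup, reusing the DWD observations request from the Example section below. The filter_by_rank name and its latlon tuple signature are assumptions based on recent releases (older versions take latitude and longitude separately), so adjust to your installed version:

from wetterdienst import Wetterdienst

API = Wetterdienst("dwd", "observation")
request = API(
    parameter=["climate_summary"],
    resolution="daily",
    period="recent",
)

# Assumption: filter_by_rank() keeps the n stations closest to the given (lat, lon) point.
stations = request.filter_by_rank(latlon=(50.11, 8.68), rank=3)  # around Frankfurt am Main
print(stations.df)

values = stations.values.all().df  # values for those nearby stations only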

Setup

wetterdienst can be used either by installing it natively on your workstation or by running it within a Docker container.

Native

Via PyPI (standard):

pip install wetterdienst

Via GitHub (most recent):

pip install git+https://github.com/earthobservations/wetterdienst

There are some extras available for wetterdienst. Use them like:

pip install wetterdienst[http,sql]
  • docs: Install the Sphinx documentation generator.
  • ipython: Install the IPython stack.
  • export: Install openpyxl for Excel export and pyarrow for writing files in Feather and Parquet format (see the sketch after this list).
  • http: Install HTTP API prerequisites.
  • sql: Install DuckDB for querying data using SQL.
  • duckdb: Install support for DuckDB.
  • influxdb: Install support for InfluxDB.
  • cratedb: Install support for CrateDB.
  • mysql: Install support for MySQL.
  • postgresql: Install support for PostgreSQL.
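
As an illustration of the export extra, any values DataFrame obtained from wetterdienst can be written to Parquet or Feather through the pyarrow engine that this extra pulls in. A minimal sketch with a placeholder DataFrame standing in for request.values.all().df:

import pandas as pd

# Placeholder standing in for a wetterdienst values DataFrame.
values_df = pd.DataFrame(
    {"station_id": ["01048"], "date": ["2020-01-01"], "parameter": ["temperature_air_max_200"], "value": [278.15]}
)

# pyarrow (installed via the "export" extra) provides the Parquet and Feather writers for pandas.
values_df.to_parquet("values.parquet")
values_df.to_feather("values.feather")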

In order to check the installation, invoke:

wetterdienst --help

Docker

Docker images for each stable release will get pushed to GitHub Container Registry.

There are images in two variants, wetterdienst-standard and wetterdienst-full.

wetterdienst-standard will contain a minimum set of 3rd-party packages, while wetterdienst-full will try to serve a full environment by also including packages like GDAL and wradlib.

Pull the Docker image:

docker pull ghcr.io/earthobservations/wetterdienst-standard
Library

Use the latest stable version of wetterdienst:

$ docker run -ti ghcr.io/earthobservations/wetterdienst-standard
Python 3.8.5 (default, Sep 10 2020, 16:58:22)
[GCC 8.3.0] on linux
>>> import wetterdienst
>>> wetterdienst.__version__
Command line script

The wetterdienst command is also available:

# Make an alias to use it conveniently from your shell.
alias wetterdienst='docker run -ti ghcr.io/earthobservations/wetterdienst-standard wetterdienst'

wetterdienst --version
wetterdienst --help

Example

Acquisition of historical data for specific stations using wetterdienst as a library:

>>> from wetterdienst import Wetterdienst
>>> API = Wetterdienst("dwd", "observation")
>>> request = API(
...    parameter=["climate_summary"],
...    resolution="daily",
...    start_date="1990-01-01",  # Timezone: UTC
...    end_date="2020-01-01",  # Timezone: UTC
...    tidy=True,  # default, tidy data
...    humanize=True,  # default, humanized parameters
... ).filter_by_station_id(station_id=(1048, 4411))
>>> stations = request.df  # station list
>>> values = request.values.all().df  # values
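
Building on this example, the "Run SQL queries on the results" feature can also be exercised directly on the returned DataFrame with DuckDB (installed via the sql extra). A hedged sketch; the parameter name is illustrative and the threshold assumes values are reported in SI units (303.15 K is roughly 30 °C):

>>> import duckdb  # provided by the "sql" extra
>>> values_df = values  # DuckDB can query pandas DataFrames by their local variable name
>>> hot_days = duckdb.query(
...     "SELECT * FROM values_df "
...     "WHERE parameter = 'temperature_air_max_200' AND value > 303.15 "
...     "ORDER BY value DESC"
... ).df()
>>> hot_days.head()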

Retrieving stations and values for defined parameters using the wetterdienst command-line client:

# Get list of all stations for daily climate summary data in JSON format
wetterdienst dwd observations stations --parameter=kl --resolution=daily --period=recent

# Get daily climate summary data for specific stations
wetterdienst dwd observations values --station=1048,4411 --parameter=kl --resolution=daily --period=recent

Further examples (code samples) can be found in the examples folder.

Documentation

We strongly recommend reading the full documentation, which will be updated continuously as we make progress with this library:

https://wetterdienst.readthedocs.io/

For the whole functionality, check out the Wetterdienst API section of our documentation. To stay up to date with the development, take a look at the changelog. Also, don't miss out on our examples.

Data license

Licenses of the available data can be found in our documentation in the data license section. Licenses and usage requirements may differ, so please check them before including the data in your project to make sure you comply with the respective terms.

Contribution

There are different ways in which you can contribute to this library:

  • by handing in a PR which describes the feature/issue that was solved including tests for newly added features
  • by using our library and reporting bugs to us either by mail or by creating a new Issue
  • by letting us know, either via an issue or a discussion, which function or data source we should include in this library, describing possible solutions or acquisition methods/endpoints/APIs

Development

  1. Clone the library and install the environment.

    This setup procedure outlines how to install the library along with the minimum dependencies required to run the whole test suite. If, for some reason, you are not able to install all the packages, just leave out some of the "extras" dependency tags.

git clone https://github.com/earthobservations/wetterdienst
cd wetterdienst

# Install package in editable mode.
pip install --editable=.[http,sql,export,ui]

# Alternatively, when using Poetry.
poetry install --extras=http --extras=sql --extras=export --extras=ui
  2. For running the whole test suite, you will need to have Firefox and geckodriver installed on your machine. Install them like:

    # macOS
    brew install --cask firefox
    brew install geckodriver
    
    # Other OS
    # You can also get installers and/or release archives for Linux, macOS
    # and Windows at
    #
    # - https://www.mozilla.org/en-US/firefox/new/
    # - https://github.com/mozilla/geckodriver/releases
    

    If this does not work for some reason and you would like to skip ui-related tests on your machine, please invoke the test suite with:

    poe test -m "not ui"
    
  3. Edit the source code, add corresponding tests and documentation for your changes. While editing, you might want to continuously run the test suite by invoking:

    poe test
    

    In order to run only specific tests, invoke:

    # Run tests by module name or function name.
    poe test -k test_cli
    
    # Run tests by tags.
    poe test -m "not (remote or slow)"
    
  4. Before committing your changes, please also run these steps so that the patch adheres to the coding standards used here.

poe format  # black code formatting
poe lint    # lint checking
poe export  # export of requirements (for Github Dependency Graph)
  5. Push your changes and submit them as a pull request.

    Thank you in advance!

Note

If you need to extend the list of package dependencies, invoke:

# Add package to runtime dependencies.
poetry add new-package

# Add package to development dependencies.
poetry add --dev new-package

Known Issues

On macOS ARM64 (Apple Silicon) you need to install pandas, numpy and scipy as follows before continuing with the regular setup:

pip install pandas --no-use-pep517
pip install numpy --no-use-pep517
pip install --no-binary :all: --no-use-pep517 scipy

Further libraries are affected and have to be installed in a similar manner:

# SQL related
brew install postgresql
brew link openssl  # and export the environment variables as instructed
pip install psycopg2-binary --no-use-pep517

Furthermore, as h5py is currently bound to versions of numpy that conflict with the ARM64-ready libraries, h5py itself as well as wradlib are not available for users on that architecture.

Important Links

Wetterdienst API

Changelog

Comments
  • Aligning radar data upstream timestamps to straight/floored interval marks

    Aligning radar data upstream timestamps to straight/floored interval marks

    Introduction

    Within #190, we are trying to get hold of all possibilities to acquire radar data from the DWD data repository. We found different anomalies there. The most prominent one is the 5-minute mark alignment problem, which revolves around the possibility of addressing specific files by timestamp. So, I would like to spawn a discussion about that within a separate issue (here).

    cc @meteoDaniel, @kmuehlbauer

    opened by amotl 33
  • Improve radar data acquisition

    Improve radar data acquisition

    Dear @meteoDaniel,

    this implements #138 in order to also acquire radar data from /weather/radar besides /climate_environment/CDC/grids_germany/{5_minutes,hourly,daily}/radolan. It is based upon your #140.

    After the refactorings coming from #188 and #189, I took the chance to rework your contribution coming from #140 based on these changes. It might save you quite some amount of work when picking this up again.

    With kind regards, Andreas.

    feature 
    opened by amotl 23
  • Improve speed of Poetry on CI

    Improve speed of Poetry on CI

    Hi there,

    @gutzbenj tried to improve speed on GHA CI through #328 and I later chimed in through #331.

    We observed different problems, specifically when caching whole Python virtualenvs using GHA's actions/cache and are discussing it with @sondrelg at https://github.com/snok/install-poetry/issues/18.

    However, the recommendation is actually not to do it at all, see https://github.com/actions/cache/issues/175. Otherwise, this will probably bring in more problems than it solves. We should really trust @webknjaz and @pradyunsg here:

    Usually only pip's wheel cache should be cached in GHA. Caching the whole interpreter installation is going to create problems every time they upgrade it, see https://github.com/actions/cache/issues/175#issuecomment-636704469.

    Please don't cache site-packages or entire interpreter trees, see https://github.com/actions/cache/issues/175#issuecomment-636799040.

    The reason why it takes twice the time to install packages from the download cache into the environment on Windows is that Windows is just slow when handling thousands of small files, as @maphew suggested at https://github.com/actions/cache/issues/175#issuecomment-637297220.

    With kind regards, Andreas.

    opened by amotl 22
  • Slightly struggling to use the CLI

    Slightly struggling to use the CLI

    Hi Benjamin,

    we are currently working on finding out about differences in measurement reportings between Berlin-Tempelhof and Berlin-Buch. So, we tried to exercise some command line invocations, as inspired by the documentation, mostly wetterdienst --help. This is merely a report about our discoveries.

    DWD » 10min » humidity

    Using the command

    wetterdienst values --provider=dwd --kind=observation --period=recent --resolution=10min --station=00400,00433 --parameter=humidity
    

    raises an exception:

    10min could not be parsed from DwdObservationResolution.
    Traceback (most recent call last):
      File "/Users/amo/dev/earthobservations/wetterdienst/wetterdienst/util/enumeration.py", line 41, in parse_enumeration_from_template
        enum_parsed = intermediate[enum_name.upper()]
      File "/usr/local/Cellar/[email protected]/3.9.10/Frameworks/Python.framework/Versions/3.9/lib/python3.9/enum.py", line 432, in __getitem__
        return cls._member_map_[name]
    KeyError: '10MIN'
    
    During handling of the above exception, another exception occurred:
    

    This report is better:

    wetterdienst values --provider=dwd --kind=observation --period=recent --resolution=hourly --station=00400,00433 --parameter=kl
    
    The combination of kl, hourly, recent is invalid.
    No data available for given constraints
    No data available for given constraints
    

    DWD » Parameter discovery

    Corrupt JSON output?

    Works: wetterdienst about coverage --provider=dwd --kind=observation
    Fails: wetterdienst about coverage --provider=dwd --kind=observation | jq

    parse error: Invalid numeric literal at line 1, column 12
    

    Discover available parameters

    Trying to find out which parameters the 10min resolution offers was also not successful:

    wetterdienst about coverage --provider=dwd --kind=observation --resolution=10min
    Error: No such option: --resolution
    

    Thoughts and questions

    • Maybe we can improve the convenience on those topics a bit more?
    • How would I find out which labels I can use on the --parameter option when aiming to acquire data from the 10min resolution product?

    With kind regards, Andreas.

    opened by amotl 21
  • Acquire data from MOSMIX_L/all_stations dataset

    Acquire data from MOSMIX_L/all_stations dataset

    Hello, this is my first feature request, so please let me know if I did something wrong.

    All of this is regarding the provider dwd.

    As the title already states: I want to grab all MOSMIX forecasts from DWD. The DWD API offers 'single_stations' and 'all_stations'. http://opendata.dwd.de/weather/local_forecasts/mos/MOSMIX_L/ I want to reduce the number of queries, as I expect a lot of overhead this way. Is there a way to force wetterdienst to use the 'all_stations' way?

    In wetterdienst.provider.dwd.constans.py I have found the 'all_stations' file path stored in the variable 'DWD_MOSMIX_L_PATH'. This variable is never used in later code, so I assume wetterdienst currently loads everything from single stations.

    A basic idea for achieving this could be: load the 'all_stations' file, split it into single stations and save those into the cache. Later queries will then find the cached versions. The KMLReader could be adapted for this purpose.

    My list of questions:

    • Is this a feature of actual interest?
    • Is this feature already implemented and I just didn't find it?
    • If I forked this feature, would it be accepted?

    Why do I want to grab the forecasts from all stations? I require historical predictions for ML purposes.

    opened by TheNeedForSleep 18
  • InfluxDB authentication and InfluxDB 2.x compatibility

    InfluxDB authentication and InfluxDB 2.x compatibility

    Hi all, I was wondering if it is possible to use the data export with an authentication-enabled InfluxDB installation (see here). Your documentation here doesn't mention that scenario. If it isn't possible yet, would it be a high effort to implement it?

    opened by jscmidt 18
  • Long runtime and wrong number of observations

    Long runtime and wrong number of observations

    Describe the bug I experience very long execution times for the following script and get a wrong number of observations:

    import numpy as np

    from wetterdienst.dwd import observations as obs
    
    sites = obs.DWDObservationSites(
        parameter_set=obs.DWDObservationParameterSet.TEMPERATURE_AIR,
        resolution=obs.DWDObservationResolution.HOURLY,
        period=obs.DWDObservationPeriod.RECENT,
        start_date="2020-06-09 12:00:00",
        end_date="2020-06-09 12:00:00",
    )
    df = sites.all()
    ids, lat, lon = map(np.array, [df.STATION_ID, df.LAT, df.LON])
    observations = obs.DWDObservationData(
        station_ids=ids,
        parameters=obs.DWDObservationParameter.HOURLY.TEMPERATURE_AIR_200,
        resolution=obs.DWDObservationResolution.HOURLY,
        start_date="2020-06-09 12:00:00",
        end_date="2020-06-09 12:00:00",
    )
    temp = np.array(observations.collect_safe().VALUE, dtype=float)
    head = "id, lat, lon, temp"
    np.savetxt("temp_obs.txt", np.array([ids, lat, lon, temp]).T, header=head)
    

    After ~15 min I get an error, because there are 496 stations but only 494 temp values, so numpy can't save this to a text file.

    Expected behavior When truncating the station array, it works fine:

    ids, lat, lon = ids[:10], lat[:10], lon[:10]
    

    Desktop (please complete the following information):

    • OS: Linux
    • Python-Version 3.6

    Am I doing something wrong?

    Cheers, Sebastian

    opened by MuellerSeb 18
  • Update radar tests to use wradlib

    Update radar tests to use wradlib

    Hi there,

    @neumann-nico was so nice and improved the radar tests by using wradlib by @kmuehlbauer. Thanks a stack!

    I am now merely pulling this patch into a PR in order to keep it as a note because a) the patch still looks like a work in progress and b) we will have to figure out whether we should include wradlib (and then also GDAL!?) into the list of dependencies to run the tests on. While the latter might not be too troublesome on Linux or macOS, it might be a totally different story on Windows.

    With kind regards, Andreas.

    opened by amotl 17
  • Get rid of dbmfile-based dogpile cache

    Get rid of dbmfile-based dogpile cache

    Describe the bug The dogpile cache based on dbmfile is brittle.

    To reproduce Run Wetterdienst from different Python environments and see accessing the shared cache break more often than not, see #217, #232, #233, #242 and #244. Also, #236 seems to be related as well.

    Expected behavior Wetterdienst should work in all circumstances, even when switching between different Python environments.

    Additional context While the documentation about pickle [1] promises that

    The pickle serialization format is guaranteed to be backwards compatible across Python releases provided a compatible pickle protocol is chosen and pickling and unpickling code deals with Python 2 to Python 3 type differences if your data is crossing that unique breaking change language boundary.

    it apparently still has problems in our context. While I currently don't have a clue why, I figure it might be coming from marshalling/unmarshalling data frames from different versions of Pandas. We can either investigate this further or use a different means of data storage and/or serialization protocol for the dogpile cache.

    [1] https://docs.python.org/3/library/pickle.html

    bug help wanted 
    opened by amotl 17
  • MetaFileNotFound with documentation example

    MetaFileNotFound with documentation example

    Hey guys, after not having used wetterdienst for a while, as usual I'm struggling to adapt the code that used to work to the latest version. However, I cannot even run the example from the docs, e.g.

    from wetterdienst.provider.dwd.observation import (
        DwdObservationDataset,
        DwdObservationRequest,
        DwdObservationResolution,
    )

    request = DwdObservationRequest(
        parameter=[DwdObservationDataset.CLIMATE_SUMMARY],
        resolution=DwdObservationResolution.DAILY,
        start_date="1990-01-01",
        end_date="2020-01-01",
    ).filter_by_station_id(station_id=[3, 1048])
    

    MetaFileNotFound: No meta file was found amongst the files at https://opendata.dwd.de/climate_environment/CDC/observations_germany/climate/daily/kl/historical/.

    Is this a temporary error or am I doing something wrong?

    opened by guidocioni 14
  • Run radar tests including wradlib on GHA

    Run radar tests including wradlib on GHA

    Is your feature request related to a problem?

    Just an idea to go full-on with the radar tests in order to also test the radar programs within the example folder on behalf of #240. The task here is to install GDAL on a Windows environment.

    Describe the solution you'd like

    Use qgis_deploy_install_upgrade_ltr.ps1 by @Guts within a GitHub Action as .github/bin/osgeo4w_setup.ps1 when running on windows-latest in order to install the latest osgeo4w-setup-x86_64.exe.

    When using the command line option --packages, it might be possible to install pkg-gdal-python only?

    • https://trac.osgeo.org/osgeo4w/wiki/CommandLine
    • https://trac.osgeo.org/osgeo4w/wiki/PackageListing
    • https://trac.osgeo.org/osgeo4w/wiki/pkg-gdal-python

    Describe alternatives you've considered

    @kmuehlbauer used Anaconda and is now transitioning to Mamba for installing the requirements of wradlib on CI.

    I am creating the issue here in order to investigate whether this will be possible without using either of those package managers, just going down the "native" path.

    opened by amotl 14
  • Add implementation for ZAMG observations

    Add implementation for ZAMG observations

    A stab at acquiring data from ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in Austria.

    • https://www.zamg.ac.at/
    • https://data.hub.zamg.ac.at/
    opened by gutzbenj 1
  • Is this the "best" way of getting the latest observations for all stations?

    Is this the "best" way of getting the latest observations for all stations?

    Hiya, I just went back to Wetterdienst today as I wanted to get the latest measured values of air temperature for all stations in the network.

    I came up with the following snippet

    import pandas as pd

    from wetterdienst import Period, Resolution
    from wetterdienst.provider.dwd.observation import DwdObservationDataset, DwdObservationRequest

    stations = DwdObservationRequest(
        parameter=DwdObservationDataset.TEMPERATURE_AIR,
        resolution=Resolution.MINUTE_10,
        period=Period.NOW,
        start_date=pd.to_datetime("now", utc=True) - pd.to_timedelta("30 min"),
        end_date=pd.to_datetime("now", utc=True)
    )
    
    stations = stations.all().df
    ids = stations.station_id.values
    
    
    observations = DwdObservationRequest(
        parameter=DwdObservationDataset.TEMPERATURE_AIR,
        resolution=Resolution.MINUTE_10,
        period=Period.NOW
    )
    
    df = observations.filter_by_station_id(station_id=ids).values.all().df
    
    df = df.merge(stations[['station_id','height','latitude','longitude','name','state']], left_on='station_id', right_on='station_id')
    df = df.groupby("station_id").apply(lambda x: x.sort_values(by='date').tail(1)).drop(columns=['station_id']).reset_index()
    

    Which gives as expected

    | station_id | level_1 | dataset | date | quality | pressure_air_site | temperature_air_mean_200 | temperature_air_mean_005 | humidity | temperature_dew_point_mean_200 | height | latitude | longitude | name | state |
    | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- |
    | 00044 | 41 | temperature_air | 2022-12-19 06:50:00+00:00 | 2.0 | NaN | 2.5 | 1.9 | 96.9 | 2.1 | 44.0 | 52.9336 | 8.2370 | Großenkneten | Niedersachsen |

    This, however, seems a little bit too complicated to only get the latest measured values. Is there a better and more concise way? :)

    P.S. For some reason the table of parameters in https://github.com/earthobservations/wetterdienst/edit/main/docs/data/coverage/dwd/observation.rst does not render

    opened by guidocioni 3
  • MetaFileNotFound

    MetaFileNotFound

    Hey!

    I got the same error as described in https://github.com/earthobservations/wetterdienst/issues/678

    Describe the bug

    Traceback (most recent call last):
      File "/opt/airflow/dags/dwd_kl_daily.py", line 67, in <module>
        r1 = DwdObservationRequest(
      File "/home/airflow/.local/lib/python3.9/site-packages/wetterdienst/core/scalar/request.py", line 624, in all
        df = self._all().copy().reset_index(drop=True)
      File "/home/airflow/.local/lib/python3.9/site-packages/wetterdienst/provider/dwd/observation/api.py", line 561, in _all
        df = create_meta_index_for_climate_observations(dataset, self.resolution, period)
      File "/home/airflow/.local/lib/python3.9/site-packages/wetterdienst/provider/dwd/observation/metaindex.py", line 84, in create_meta_index_for_climate_observations
        meta_index = _create_meta_index_for_climate_observations(dataset, resolution, period)
      File "/home/airflow/.local/lib/python3.9/site-packages/wetterdienst/provider/dwd/observation/metaindex.py", line 142, in _create_meta_index_for_climate_observations
        meta_file = _find_meta_file(files_server, url, ["beschreibung", "txt"])
      File "/home/airflow/.local/lib/python3.9/site-packages/wetterdienst/provider/dwd/observation/metaindex.py", line 170, in _find_meta_file
        **raise MetaFileNotFound(f"No meta file was found amongst the files at {url}.")**
    wetterdienst.exceptions.MetaFileNotFound: No meta file was found amongst the files at https://opendata.dwd.de/climate_environment/CDC/observations_germany/climate/daily/kl/recent/.
    

    To Reproduce Nothing special. Just a simple request which works locally on my computer.

    from os import environ
    environ['WD_CACHE_DISABLE'] = 'True'
    from wetterdienst import Settings
    from wetterdienst.provider.dwd.observation import DwdObservationRequest
    
    Settings.cache_disable = True
    r1 = DwdObservationRequest(
                      parameter=['climate_summary'],
                      resolution='daily',
                      period='recent'
    ).all()
    

    Desktop (please complete the following information):

    • OS: apache/airflow:2.5.0-python3.9
    • Python-Version 3.9

    Additional context The script works perfectly fine on my local computer, but crashes with the above-mentioned error on a server instance within an Apache Airflow Docker container. I already switched off the cache to avoid any issues. But wetterdienst.info() refers to a location at /home/airflow/.cache/wetterdienst which doesn't exist as a folder (and this wasn't solved by creating the wetterdienst folder). The Airflow log refers to wetterdienst.util.fsspec_monkeypatch - INFO - Dircache located at /root/.cache/wetterdienst which doesn't exist as a folder either (and this also wasn't solved by creating the wetterdienst folder).

    It seems that fsspec tries to resolve a cache directory for parsing the metadata file from the URL but receives an empty list of files, which leads to the error; it doesn't even try to request the content of the URL. The dircache at /root/.cache/ seems misleading, as it shouldn't be running as root. So my best guess is some authorization issue in a Linux-based context related to the fsspec_monkeypatch cache.

    I'll give it a further try tomorrow, debug the issue and share my results. I am thankful for any hints. Initially I tried to search for an environment variable to override the fsspec cache.

    Regards, Marcel

    opened by scherbinek 8
  • CI: Collection of flukes

    CI: Collection of flukes

    Hi there,

    within this issue, we are collecting some observations of flaky behavior on CI. It is meant to get the big picture, so that we can improve the robustness of the test suite gradually, by identifying the bad spots. To be able to do that, it is important to diligently record all observations here.

    Most of the errors will be about concurrent file access going south, where specific tests are not appropriately marked with cflake, and the parallel testing based on pytest-xdist will hit concurrency issues.

    With kind regards, Andreas.

    • See also: https://github.com/caronc/apprise/issues/692
    opened by amotl 9
  • KeyError: 'station_id' in NoaaGhcnRequest

    KeyError: 'station_id' in NoaaGhcnRequest

    Describe the bug When getting data for specific stations around Amsterdam using NoaaGhcnRequest, I get a KeyError 'station_id' even though the stations_object finds stations. When reproducing the code from issue 741 (https://github.com/earthobservations/wetterdienst/issues/741) I don't seem to have a problem.

    To Reproduce

    from wetterdienst.provider.noaa.ghcn.api import NoaaGhcnRequest, NoaaGhcnParameter
    import datetime as dt
    import pandas as pd
    
    stations_object = NoaaGhcnRequest(
        parameter=NoaaGhcnParameter.DAILY.TEMPERATURE_AIR_MIN_200,
        start_date=dt.datetime(2010, 1, 1),
        end_date=dt.datetime(2022, 1, 1)
    ).filter_by_station_id('NLE00101920')
    
    print(stations_object)
    def get_data_from_stations_request(
        stations_object: NoaaGhcnRequest,
    ) -> pd.DataFrame:
        """
        Takes a stations request object and process queries
    
        Args:
            stations_object: DwdObservationRequest object that holds all required information
            for downloading opendata dwd data
    
        Returns:
            DataFrame with content from DwdObservationRequest
    
        """
        observation_data = []
    
        for result in stations_object.values.query():
            observation_data.append(result.df)
    
        return pd.concat(observation_data)
    
    df = get_data_from_stations_request(stations_object)
    print(df)
    

    Expected behavior Gets the temperature data

    Screenshots: (image attached in the original issue)

    Desktop (please complete the following information):

    • OS: Windows
    • Python-Version 3.8
    opened by nhcb 15
  • Improve runtime/bootstrap efficiency on `import wetterdienst`

    Improve runtime/bootstrap efficiency on `import wetterdienst`

    Hi there,

    I discovered this problem a time ago already, but finally wanted to take the chance to report about it.

    With kind regards, Andreas.

    Report

    Currently, it takes about 4 seconds to run import wetterdienst. I think this detail should be improved sooner rather than later, because it will get worse with a growing number of adapters and submodules.

    time python -c 'import wetterdienst'
    
    real	0m3.111s
    user	0m3.962s
    sys	0m0.773s
    

    Analysis

    The reason is that, due to the current module / API structure, we are loading the whole module and all submodules into memory at import time - right?

    https://github.com/earthobservations/wetterdienst/blob/ce9eadaf8d8c649875709143c08bd798ebdd503f/wetterdienst/api.py#L20-L39

    Proposal

    Without knowing too many details yet, my proposal would be to make the ApiEndpoints module registry evaluate its members at runtime instead, as sketched below. Implementing it is not too difficult; however, we may use a minimal kind of plugin system to decouple those modules just a little bit more.
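
    A minimal sketch of such a runtime-evaluated registry, with hypothetical dotted paths; the real ApiEndpoints integration would of course look different:

    from importlib import import_module

    # Hypothetical registry mapping (provider, network) to dotted paths instead of imported classes.
    API_ENDPOINTS = {
        ("dwd", "observation"): "wetterdienst.provider.dwd.observation.DwdObservationRequest",
        ("eccc", "observation"): "wetterdienst.provider.eccc.observation.EcccObservationRequest",
    }

    def resolve(provider: str, network: str):
        """Import the request class lazily, only when it is actually asked for."""
        dotted_path = API_ENDPOINTS[(provider.lower(), network.lower())]
        module_name, _, class_name = dotted_path.rpartition(".")
        return getattr(import_module(module_name), class_name)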

    Thoughts

    Would it break any sort of editor/autocompletion support when bringing in such a change? What do you think about it?

    opened by amotl 3
Releases(v0.51.0)
  • v0.51.0(Jan 1, 2023)

    • Update wetterdienst explorer with clickable stations and slightly changed layout
    • Improve radar tests and certain dict comparisons
    • Fix problem with numeric column names in method gain_of_value_pairs
    Source code(tar.gz)
    Source code(zip)
  • v0.50.0(Dec 3, 2022)

    • Interpolation/Summary: The queried point can now be an existing station lying on the border of the polygon it is checked against
    • Geo: Change function signatures to use latlon tuple instead of latitude and longitude
    • Geo: Enable querying station id instead of latlon within interpolate and summarize
    • Geo: Allow using values of nearby stations instead of interpolated values
    • Fix timezone related problems when creating full date range
    • UI: Add interpolate/summarize methods as subspaces
    Source code(tar.gz)
    Source code(zip)
  • v0.49.0(Nov 28, 2022)

    What's Changed

    • CI: Fix testing on Python 3.11 by @amotl in https://github.com/earthobservations/wetterdienst/pull/787
    • Fix bug with dropping duplicates of acquired data by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/789
    • Add NWS observation api by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/781
    • Modernize Poetry configuration and project dependencies by @amotl in https://github.com/earthobservations/wetterdienst/pull/788
    • NWS API: Adjust User-Agent header by @amotl in https://github.com/earthobservations/wetterdienst/pull/792
    • Improve ruff configuration by @amotl in https://github.com/earthobservations/wetterdienst/pull/795
    • Add Eaufrance Hubeau API by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/657
    • Dependencies: Stop exporting requirements.txt files by @amotl in https://github.com/earthobservations/wetterdienst/pull/794
    • Fix NOAA GHCN data access issues with timezones and empty data by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/798
    • Fix Eaufrance Hubeau dynamic docs header by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/799
    • Remove unnecessary setup.py by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/800
    • Create dependency-review.yml by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/801
    • Bump version to 0.49.0 by @gutzbenj in https://github.com/earthobservations/wetterdienst/pull/807

    Full Changelog: https://github.com/earthobservations/wetterdienst/compare/v0.48.0...v0.49.0

    Source code(tar.gz)
    Source code(zip)
  • v0.48.0(Nov 12, 2022)

    • Fix DWD Observation urban_pressure dataset access (again)
    • Add example to dump DWD climate summary observations in zarr with help of xarray
    Source code(tar.gz)
    Source code(zip)
  • v0.47.1(Oct 23, 2022)

  • v0.47.0(Oct 15, 2022)

  • v0.46.0(Oct 14, 2022)

  • v0.45.2(Oct 11, 2022)

  • v0.45.1(Oct 10, 2022)

  • v0.45.0(Sep 22, 2022)

    • Add interpolation of multiple weather stations for a given lat/lon point (currently only works for DWDObservationRequest)
    • Fix access of DWD Observation climate_urban datasets
    Source code(tar.gz)
    Source code(zip)
  • v0.44.0(Sep 18, 2022)

    • Slightly adapt the conversion function to satisfy linter
    • Fix parameter names:
      • we now use consistently INDEX instead of INDICATOR
      • index and form got mixed up with certain parameters, where actually index was measured/given but not the form
      • global radiation was mistakenly named radiation_short_wave_direct at certain points, now it is named correctly
    • Adjust Docker images to fix build problems, now use python 3.10 as base
    • Adjust NOAA sources to AWS as NCEI sources currently are not available
    • Make explorer work again for all services by setting up Period enum classes instead of single Period instances for the period base
    Source code(tar.gz)
    Source code(zip)
  • v0.43.0(Sep 5, 2022)

    • Use lxml.iterparse to reduce memory consumption when parsing DWD Mosmix files
    • Fix Settings object instantiation
    • Change logging level for Settings.cache_disable to INFO
    • Add DWD Observation climate_urban datasets
    Source code(tar.gz)
    Source code(zip)
  • v0.42.1(Aug 25, 2022)

  • v0.42.0(Aug 22, 2022)

  • v0.41.1(Aug 4, 2022)

  • v0.41.0(Jul 24, 2022)

  • v0.40.0(Jul 12, 2022)

  • v0.39.0(Jun 26, 2022)

  • v0.38.0(Jun 9, 2022)

    • Add DWD Observation 5 minute precipitation dataset
    • Add test to compare actually provided DWD observation datasets with the ones we made available with wetterdienst
    • Fix one particular dataset which was not correctly included in our DWD observations resolution-dataset-mapping
    Source code(tar.gz)
    Source code(zip)
  • v0.37.0(Jun 6, 2022)

  • v0.36.0(May 31, 2022)

  • v0.35.0(May 29, 2022)

  • v0.34.0(May 22, 2022)

  • v0.33.0(May 15, 2022)

    • Fix acquisition of DWD weather phenomena data
    • Set default encoding when reading data from DWD with pandas to 'latin1'
    • Fix typo in EcccObservationResolution
    Source code(tar.gz)
    Source code(zip)
  • v0.32.4(May 14, 2022)

  • v0.32.3(May 11, 2022)

  • v0.32.2(May 9, 2022)

  • v0.32.1(May 8, 2022)

  • v0.32.0(Apr 24, 2022)

  • v0.31.1(Apr 3, 2022)
