SatExtractor

Build, deploy and extract satellite public constellations with one command line.
Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Contributing
  5. License

About The Project

  • tldr: SatExtractor gets all revisits in a date range from a given geojson region, from any public satellite constellation, and stores them in a cloud-friendly format.

The large amount of image data makes it difficult to create datasets to train models quickly and reliably. Existing methods for extracting satellite images take a long time to process and have user quotas that restrict access.

Therefore, we created SatExtractor, an open source extraction tool that performs worldwide dataset extractions using serverless providers such as Google Cloud Platform or AWS, based on a common existing standard: STAC.

The tool scales horizontally as needed, extracting revisits and storing them in Zarr format so they can be easily used by deep learning models.

It is fully configurable using Hydra.
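For illustration, each pipeline step is resolved from the Hydra config at runtime. The sketch below shows that pattern in a minimal form; it is not the project's actual cli.py, and the config keys used here (e.g. a stac node with a _target_ entry) are assumptions based on the config files mentioned in this README.

    import hydra
    from omegaconf import DictConfig

    @hydra.main(config_path="conf", config_name="config")
    def main(cfg: DictConfig) -> None:
        # Each step is described in the config by a callable target, so swapping
        # providers or implementations is just a config change (keys assumed here).
        item_collection = hydra.utils.call(cfg.stac)
        print(item_collection)

    if __name__ == "__main__":
        main()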

(back to top)

Getting Started

SatExtractor needs a cloud provider to work. Before you start using it, you'll need to create and configure a cloud provider account.

We provide the implementation to work with Google Cloud, but SatExtractor is implemented to be easily extensible to other providers.

Structure

The package is structured in a modular and configurable way: it is essentially a pipeline of 6 steps, each one a separate module.

  • Builder: contains the logic to build the container that will run the extraction.

    more info: SatExtractor is based on a Docker container. The Dockerfile in the root dir is used to build the core package, and a reference to the specific provider extraction logic should be explicitly added to it (see the gcp example in the providers/gcp directory).

    This is done by setting the ENV PROVIDER variable to point to the provider directory. In the default Dockerfile it is set to gcp: ENV PROVIDER providers/gcp.

  • Stac: converts a public constellation to the STAC standard.
    more info: If the original constellation is not already in the STAC standard, it should be converted. To do so, you have to implement the constellation-specific STAC converter. Sentinel 2 and Landsat 7/8 examples can be found in src/satextractor/stac. The function that is actually called to perform the conversion to the STAC standard is set in the stac Hydra config file (conf/stac/gcp.yaml).
  • Tiler: Creates tiles of the given region to perform the extraction.
    more info: The Tiler splits the region into UTM tiles using the SentinelHub splitter. There will be one extraction task per tile. The tiler config can be found in conf/tiler/utm.yaml, where the size of the tiles can be specified. Take into account that these tiles are not the actual patches that are later stored in your cloud provider; they are just the unit from which the (smaller) patches will be extracted.
  • Scheduler: Decides how those tiles are going to be scheduled by creating extraction tasks.

    more info: The Scheduler takes the resulting tiles from the Tiler and creates the actual patches (also called tiles) to be extracted.

    For example, if the Tiler split the region into 10000x10000 tiles, the scheduler can be set to extract from each tile smaller patches of, say, 1000x1000. The scheduler also calculates the intersection between the patches and the constellation STAC assets. At the end, you'll have an object called ExtractionTask with the information to extract one revisit, one band and one tile split into multiple patches (see the ExtractionTask sketch below). This ExtractionTask will be sent to the cloud provider to perform the actual extraction.

    The scheduler config can be found in conf/scheduler/utm.yaml.

  • Preparer: Prepares the files in the cloud storage.

    more info: The Preparer creates the cloud file structure: the Zarr groups and arrays needed to later store the extracted patches (see the Zarr sketch below).

    The gcp preparer config can be found in conf/preparer/gcp.yaml.

  • Deployer: Deploys the extraction tasks created by the scheduler to perform the extraction.
    more info: The Deployer sends one message per ExtractionTask to the cloud provider to perform the actual extraction. It works by publishing messages to a PubSub queue that the extractor is subscribed to. When a new message (ExtractionTask) arrives, it is automatically run on the cloud with autoscaling. The gcp deployer config can be found in conf/deployer/gcp.yaml.
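The following sketch illustrates the tile/patch split and the kind of information an ExtractionTask carries. The field names and the split logic are simplified assumptions for illustration, not the scheduler's actual code.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class ExtractionTask:
        # Simplified stand-in for the real ExtractionTask: one revisit, one band,
        # one tile, split into several smaller patches.
        tile_id: str
        revisit: str
        band: str
        patches: List[Tuple[int, int]]  # patch offsets within the tile

    def split_tile(tile_size: int = 10_000, patch_size: int = 1_000):
        """Yield the top-left offset of every patch inside one tile."""
        for y in range(0, tile_size, patch_size):
            for x in range(0, tile_size, patch_size):
                yield (x, y)

    # A 10000x10000 tile with 1000x1000 patches gives 10 * 10 = 100 patches.
    task = ExtractionTask(
        tile_id="UTM_33N_0001",  # hypothetical tile id
        revisit="2020-06-01",
        band="B04",
        patches=list(split_tile()),
    )
    print(len(task.patches))  # 100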

All the steps are optional; the user decides which ones to run in the main config file.
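As an illustration of what the Preparer sets up, the sketch below creates a Zarr group with one array per band. The bucket path, group layout, array dimensions and dtype are assumptions, not the project's real schema.

    import fsspec
    import zarr

    # Hypothetical storage location; the real layout is created by the Preparer.
    store = fsspec.get_mapper("gs://my-bucket/my-dataset.zarr")
    root = zarr.open_group(store, mode="a")

    tile = root.require_group("sentinel-2/UTM_33N_0001")  # assumed group path
    tile.require_dataset(
        "B04",
        shape=(12, 1000, 1000),   # (revisits, height, width) -- assumed dims
        chunks=(1, 1000, 1000),
        dtype="uint16",
    )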

Prerequisites

To run SatExtractor we recommend using a virtual env, and a cloud provider user should already have been created.

Installation

  1. Clone the repo
    git clone https://github.com/FrontierDevelopmentLab/sat-extractor
  2. Install python packages
    pip install .

(back to top)

Usage

🔴 🔴 🔴

- WARNING!!!!:
Running SatExtractor will use your billable cloud provider services. 
We strongly recommend testing it with a small region to see if everything is working ok. 
Be sure you are running all your cloud provider services in the same region to avoid extra costs.

🔴 🔴 🔴

Once a cloud provider user is set and the package is installed, you'll need to grab the geojson region you want (you can get it from the super-cool tool geojson.io) and change the config files.

  1. Save the region as .geojson and store it in the outputs folder (you can change your output dir in the config.yaml)
  2. Open the config.yaml and you'll see something like this:

(screenshot of the default config.yaml)

The important thing here is to set the dataset_name, define the start_date and end_date for your revisits, choose your constellations, and select the tasks to be run (you'll probably want to run the build task only once and then comment it out).

Important: the token.json contains the credentials needed to access your cloud provider. In this example case it contains the gcp credentials. You'll need to provide it.

  3. Open the cloud/ .yaml and add your account info there, as in the default provided file. (optional) You can choose different configurations by changing the module configs: builder, stac, tiler, scheduler, preparer, etc. There you can change things like patch_size and chunk_size.

  4. Run python src/satextractor/cli.py and enjoy!
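As a small worked example of step 1, the sketch below writes a tiny test region into the outputs folder before running the CLI. The polygon, file name and folder are placeholders; adjust them to your own region and to whatever output dir you set in config.yaml.

    import json
    from pathlib import Path

    # A tiny illustrative polygon (replace it with your own region from geojson.io).
    region = {
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "properties": {},
            "geometry": {
                "type": "Polygon",
                "coordinates": [[
                    [-0.12, 51.50], [-0.10, 51.50], [-0.10, 51.52],
                    [-0.12, 51.52], [-0.12, 51.50],
                ]],
            },
        }],
    }

    out_dir = Path("outputs")  # default output dir; change it in config.yaml
    out_dir.mkdir(exist_ok=True)
    (out_dir / "my_region.geojson").write_text(json.dumps(region))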

(back to top)

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the BSD 2-Clause License. See LICENSE.txt for more information.

(back to top)

Acknowledgments

This work is the result of the 2021 ESA Frontier Development Lab World Food Embeddings team. We are grateful to all organisers, mentors and sponsors for providing us this opportunity. We thank Google Cloud for providing computing and storage resources to complete this work.

Comments
  • Dockerfile path not found

    Dockerfile path not found

    in gcp builder the dockerfile path is set to: dockerfile_path = Path(__file__).parents[3] It returns '/home/fran/miniconda3/envs/sat-extractor/lib/python3.9'.

    Should be changed to pass the path as a parameter.

    bug 
    opened by frandorr 3
  • Loosen version locking

    Loosen version locking

    I think it might be necessary to loosen the dependency version locking in setup.py a bit. Quite difficult to e.g. pip install prefect[google] sat-extractor because of the conflicts.

    opened by carderne 3
  • DLQ, Pyarrow, backoff

    DLQ, Pyarrow, backoff

    Without pyarrow installed, I got the following error:

    2021-11-05 16:10:20.389 | INFO     | __main__:stac:33 - using satextractor.stac.gcp_region_to_item_collection stac creator.
    Error executing job with overrides: []
    Traceback (most recent call last):
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "sat-extractor/src/satextractor/cli.py", line 185, in main
        stac(cfg)
      File "sat-extractor/src/satextractor/cli.py", line 44, in stac
        item_collection = hydra.utils.call(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 180, in instantiate
        return instantiate_node(config, *args, recursive=_recursive_, convert=_convert_)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 249, in instantiate_node
        return _call_target(_target_, *args, **kwargs)
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 64, in _call_target
        raise type(e)(
      File "venv/lib/python3.9/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
        return _target_(*args, **kwargs)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 47, in gcp_region_to_item_collection
        df = get_sentinel_2_assets_df(client, region, start_date, end_date)
      File "venv/lib/python3.9/site-packages/satextractor/stac/stac.py", line 168, in get_sentinel_2_assets_df
        dfs.append(query_job.to_dataframe())
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/job/query.py", line 1644, in to_dataframe
        return query_result.to_dataframe(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1938, in to_dataframe
        record_batch = self.to_arrow(
      File "venv/lib/python3.9/site-packages/google/cloud/bigquery/table.py", line 1713, in to_arrow
        raise ValueError(_NO_PYARROW_ERROR)
    ValueError: Error instantiating 'satextractor.stac.stac.gcp_region_to_item_collection' : The pyarrow library is not installed, please install pyarrow to use the to_arrow() function.
    
    
    opened by carderne 3
  • Fix lazy scheduler

    Fix lazy scheduler

    This is much smaller than it looks, basically just:

    1. Add back support for passing item_collection as an already-loaded ItemCollection.
    2. Rename item to it on L132 to avoid shadowing item from the main loop.
    opened by carderne 2
  • satextractor.schedler causing circular import

    satextractor.schedler causing circular import

    The first line of this imports gcp_scheduler: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/__init__.py#L1-L2

    Which itself imports from the same __init__.py again, causing a circular loop: https://github.com/FrontierDevelopmentLab/sat-extractor/blob/d50713143c12c62b56927b526826fedfb52e77b4/src/satextractor/scheduler/gcp_scheduler.py#L8

    opened by carderne 2
  • Get DLQ working

    Get DLQ working

    @frandorr @Lkruitwagen

    From here, the service account for PubSub looks like this:

    PUBSUB_SERVICE_ACCOUNT="service-${project-number}@gcp-sa-pubsub.iam.gserviceaccount.com"
    

    i.e. instead of using the service account from the token.json.

    From CLI:

    PROJ_NUMBER=$(gcloud projects list \
    --filter="$(gcloud config get-value project)" \
    --format="value(PROJECT_NUMBER)")
    
    PUBSUB_SERVICE_ACCOUNT="s[email protected]"
    

    And then bind the account as already done:

    gcloud pubsub topics add-iam-policy-binding "$DLQ_TOPIC" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.publisher
    
    gcloud pubsub subscriptions add-iam-policy-binding "$MAIN_SUBSCRIPTION" \
      --member="serviceAccount:$PUBSUB_SERVICE_ACCOUNT" \
      --role=roles/pubsub.subscriber
    
    opened by carderne 2
  • GCP dead-letter-queue not properly being created

    GCP dead-letter-queue not properly being created

    When the pubsub Cloud Run subscription is created, it also creates a dlq, but it doesn't assign the correct roles and permissions and doesn't create the topic for the dlq.

    As the dlq doesn't exist, the extraction task messages that fail will loop forever in the main queue, restarting the Cloud Run service until the messages are manually purged.

    We should add these permissions and create the dlq topic automatically to avoid an infinite loop.

    bug 
    opened by frandorr 2
  • mask and percentiles

    mask and percentiles

    These don't seem to be used for anything?

    https://github.com/FrontierDevelopmentLab/sat-extractor/blob/7a0821360ffdab8403563ca651b5bd43ecca3dc4/src/satextractor/preparer/preparer.py#L35-L51

    opened by carderne 1
  • Succeeds but get error

    Succeeds but get error

    @frandorr just sharing this here from last week

    [2021-10-29 13:17:48,746][grpc._plugin_wrapping][ERROR] - AuthMetadataPluginCallback "<google.auth.transport.grpc.AuthMetadataPlugin object at 0x7fc4d86812b0>" raised exception!
    Traceback (most recent call last):
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/grpc/_plugin_wrapping.py", line 89, in __call__
        self._metadata_plugin(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 101, in __call__
        callback(self._get_authorization_headers(context), None)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/transport/grpc.py", line 87, in _get_authorization_headers
        self._credentials.before_request(
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 134, in before_request
        self.apply(headers)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/credentials.py", line 110, in apply
        _helpers.from_bytes(token or self.token)
      File "/home/chris/.virtualenvs/ox/lib/python3.9/site-packages/google/auth/_helpers.py", line 129, in from_bytes
        raise ValueError("{0!r} could not be converted to unicode".format(value))
    ValueError: None could not be converted to unicode
    
    opened by carderne 1
  • Only works with pip install -e .

    Only works with pip install -e .

    This:

    cd sat-extractor
    pyenv global 3.9.7
    mkvirtualenv test
    pip install .
    python ./src/satextractor/cli.py
    

    Fails with:

    ImportError: Encountered error: `No module named 'satextractor.builder'` when
    loading module 'satextractor.builder.gcp_builder.build_gcp'
    

    However, installing with pip install -e . (what I did initially, which is why I didn't notice this) works fine.

    Maybe because when using a non-editable install, cli.py gets confused about whether it should be looking for modules in its own directory or in the somewhere-else/site-packages/satextractor directory...

    opened by carderne 1
  • Feat/append data

    Feat/append data

    • add overwrite to main config
    • create a hash string for the config spec (for unique date-range & constellation combination)
    • if extraction archive already exists, resize any existing data and masks for the union of the new timeseries. Overwrite timeseries with union prior to extraction.
    opened by Lkruitwagen 0
  • For consideration: Bring main extract function into package

    For consideration: Bring main extract function into package

    @Lkruitwagen @frandorr

    Moving all the extract_patches logic into the main package under satextractor.extractor, leaving only the HTTP and BQ stuff in the Flask app.

    Seems cleaner, and gives the option to do the extractions in-process if wanted! Could make it easier to try out sat-extractor on a local machine with a local output directory, no need to spin up PubSub, CloudRun etc.

    opened by carderne 0
  • Store bands info

    Store bands info

    Current implementation doesn't store the bands for each constellation. It would be nice to have that info stored. Some ideas:

    • Store a simple metadata json at constellation level (easiest)
    • At the end of each extraction create a STAC catalog that contains metadata info (maybe better but would take longer to implement)
    • Store at array level the info, something like xarray coordinates.
    enhancement 
    opened by frandorr 0
Releases(v0.3.3)
  • v0.3.3(Dec 7, 2021)

    What's Changed

    • fix COG reader by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/27
    • Remove download for jp2. Now using same function for geotiff and jp2 with rio. Remove gdal by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/28

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.2...v0.3.3

    Source code(tar.gz)
    Source code(zip)
  • v0.3.2(Dec 7, 2021)

    What's Changed

    • Remove COG downloader and use same as jp2k because it was buggy by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/26

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.1...v0.3.2

    Source code(tar.gz)
    Source code(zip)
  • v0.3.1(Dec 6, 2021)

    What's Changed

    • Fix Dockerfile path regression by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/24
    • Fix rescaling bug by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/25

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.3.0...v0.3.1

    Source code(tar.gz)
    Source code(zip)
  • v0.3.0(Dec 2, 2021)

    What's Changed

    • Make tile IDs globally unique by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/17
    • Improve Band common_names and Tile properties by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/21
    • Add task id to vsimem to avoid multiple tasks using the same in-memory file by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/22

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.1...v0.3.0

    Source code(tar.gz)
    Source code(zip)
  • v0.1.1(Nov 18, 2021)

    What's Changed

    • Feat/typos+readme by @Lkruitwagen in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7
    • README clarifications, add pyarrow, specify platform in gcloud run by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/8
    • DLQ, Pyarrow, backoff by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/11
    • Fix constellations bug. by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/12
    • Compatible deps; refactor build_gcp; explicit Dockerfile; ItemCollection as object by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/14
    • Deployer returns job_id so callers can track monitor tables by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/15
    • Fix jp2 lossy compression bug

    New Contributors

    • @Lkruitwagen made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/7

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/compare/v0.1.0...v0.1.1

    Source code(tar.gz)
    Source code(zip)
  • v0.1.0(Oct 28, 2021)

    What's Changed

    • Small README and config improvements by @carderne in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • small README modif by @rramosp in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • Add init file fixes #4 by @frandorr in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    New Contributors

    • @carderne made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/2
    • @rramosp made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/3
    • @frandorr made their first contribution in https://github.com/FrontierDevelopmentLab/sat-extractor/pull/5

    Full Changelog: https://github.com/FrontierDevelopmentLab/sat-extractor/commits/v0.1.0

    Source code(tar.gz)
    Source code(zip)
Owner
Frontier Development Lab
For more projects, see: https://gitlab.com/frontierdevelopmentlab