
Commit

Merge branch 'develop' into dependabot/github_actions/ASFHyP3/actions-0.11.0
jtherrmann authored Jun 19, 2024
2 parents f5f2f6b + a99eda5 commit 449ef59
Showing 12 changed files with 770 additions and 27 deletions.
13 changes: 9 additions & 4 deletions .github/dependabot.yml
Original file line number Diff line number Diff line change
@@ -1,8 +1,13 @@
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for all configuration options:
# https://docs.github.com/en/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file

version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
- package-ecosystem: github-actions
directory: /
schedule:
interval: "daily"
interval: weekly
labels:
- "bumpless"
- bumpless
21 changes: 1 addition & 20 deletions .github/workflows/distribute.yml
Original file line number Diff line number Diff line change
Expand Up @@ -29,26 +29,7 @@ jobs:
python -m build
- name: upload to PyPI.org
uses: pypa/gh-action-pypi-publish@v1.8.11
uses: pypa/gh-action-pypi-publish@v1.8.14
with:
user: __token__
password: ${{ secrets.TOOLS_PYPI_PAK }}

verify-distribution:
runs-on: ubuntu-latest
needs:
- call-version-info-workflow
- distribute
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v4

- uses: mamba-org/setup-micromamba@v1
with:
environment-file: environment.yml

- name: Ensure asf_tools v${{ needs.call-version-info-workflow.outputs.version }} is pip installable
run: |
python -m pip install asf_tools==${{ needs.call-version-info-workflow.outputs.version_tag }}
21 changes: 20 additions & 1 deletion CHANGELOG.md
Original file line number Diff line number Diff line change
Expand Up @@ -7,9 +7,28 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).


## [0.7.2]

### Fixed
- Changed relative link to the watermasking readme in the repo readme to the full URL, so that the link is valid when readme content is mirrored in hyp3-docs

## [0.7.1]

### Added
- A description of the `asf_tools.watermasking` sub-package has been added to the [`asf_tools` README](src/asf_tools/README.md)
- Installation instructions for `osmium-tool` have been added to the [`asf_tools.watermasking` README](src/asf_tools/watermasking/README.md)

### Fixed
- `osmium-tool` dependency handling. Because `osmium-tool` is not distributed on PyPI and thus is not installed when `pip` installing `asf_tools`, `asf_tools` will now raise an `ImportError` when `osmium-tool` is missing that provides installation instructions. Note: `osmium-tool` is distributed on conda-forge and will be included when conda installing `asf_tools`.
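
As a hedged illustration of the behavior described above (a hypothetical sketch, not the actual `asf_tools` implementation), such a dependency check might look like:

```python
import shutil


def require_osmium_tool():
    """Raise ImportError with install instructions if osmium-tool is missing.

    Hypothetical sketch of the check described in the changelog entry above;
    the real asf_tools code may differ.
    """
    # osmium-tool provides the 'osmium' executable; check the PATH for it.
    if shutil.which('osmium') is None:
        raise ImportError(
            'osmium-tool is required but was not found on the PATH. '
            'It is not distributed on PyPI; install it with '
            "'conda install -c conda-forge osmium-tool' "
            'or see https://osmcode.org/osmium-tool/.'
        )
```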

## [0.7.0]

### Added
* Scripts and entrypoints for generating our global watermasking dataset added to `watermasking`.

## [0.6.0]

## Added
### Added
* You can choose whether the `ts` (threat score; default) or `fmi` (Fowlkes-Mallows index) minimization metric is used for the flood mapping iterative estimator:
* the `flood_map` console script entrypoint now accepts a `--minimization-metric` argument
* the `asf_tools.hydrosar.flood_map.make_flood_map` function now accepts a `minimization_metric` keyword argument
Expand Down
3 changes: 3 additions & 0 deletions environment.yml
Original file line number Diff line number Diff line change
Expand Up @@ -23,7 +23,10 @@ dependencies:
- boto3
- fiona
- gdal>=3.7
- geopandas
- numpy
- osmium-tool
- pyogrio
- pysheds>=0.3
- rasterio
- scikit-fuzzy
Expand Down
9 changes: 7 additions & 2 deletions pyproject.toml
Original file line number Diff line number Diff line change
Expand Up @@ -25,14 +25,16 @@ dependencies = [
"astropy",
"fiona",
"gdal>=3.3",
"geopandas",
"numpy",
# "osmium-tool", # C++ CLI tool available via conda-forge or 'https://osmcode.org/osmium-tool/', used by `asf_tools.watermasking.generate_osm_tiles`.
"pyogrio",
"pysheds>=0.3",
"rasterio",
"scikit-fuzzy",
"scikit-image",
"scipy",
"shapely",
"tqdm",
"shapely"
]
dynamic = ["version"]

Expand All @@ -41,6 +43,9 @@ make_composite = "asf_tools.composite:main"
water_map = "asf_tools.hydrosar.water_map:main"
calculate_hand = "asf_tools.hydrosar.hand.calculate:main"
flood_map = "asf_tools.hydrosar.flood_map:main"
generate_osm_dataset = "asf_tools.watermasking.generate_osm_tiles:main"
generate_worldcover_dataset = "asf_tools.watermasking.generate_worldcover_tiles:main"
fill_missing_tiles = "asf_tools.watermasking.fill_missing_tiles:main"

[project.entry-points.hyp3]
water_map = "asf_tools.hydrosar.water_map:hyp3"
Expand Down
7 changes: 7 additions & 0 deletions src/asf_tools/README.md
Original file line number Diff line number Diff line change
Expand Up @@ -161,3 +161,10 @@ flood_map --help
```

For details on the algorithm see the `asf_tools.flood_map.make_flood_map` docstring.

### Water Mask Dataset Generation

The `asf_tools.watermasking` sub-package allows you to create a watermasking dataset
over an arbitrary ROI using OpenStreetMap and ESA WorldCover data.
Note: this tooling requires `osmium-tool`. See the [watermasking subpackage readme](https://github.com/ASFHyP3/asf-tools/blob/main/src/asf_tools/watermasking/README.md)
for more information on setup and usage.
19 changes: 19 additions & 0 deletions src/asf_tools/watermasking/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,19 @@
These scripts create a global (or regional) water mask dataset based on OpenStreetMap data, optionally augmented with ESA WorldCover data.

To replicate our OSM water mask dataset, follow these steps:

1. Install `osmium-tool` from conda-forge or https://osmcode.org/osmium-tool/.
2. Download the "Latest Weekly Planet PBF File" from https://planet.openstreetmap.org/.
3. Download the WGS84 water polygons shapefile from https://osmdata.openstreetmap.de/data/water-polygons.html.
4. Unzip the files; you should end up with something like `planet.osm.pbf` or `planet.pbf` and `water_polygons.shp` (plus its support files).
5. Run ```generate_osm_dataset --planet-file-path [path-to-planet.pbf] --ocean-polygons-path [path-to-water-polygons.shp] --lat-begin -85 --lat-end 85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
6. Run ```fill_missing_tiles --fill-value 0 --lat-begin -90 --lat-end -85 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
7. Run ```fill_missing_tiles --fill-value 1 --lat-begin 85 --lat-end 90 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```
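
The bounds above tile the globe in 5-degree steps. A quick sanity check of how many tiles each command produces (assuming the half-open `range(begin, end, step)` semantics used by `fill_missing_tiles.py` in this commit):

```python
def tile_count(lat_begin, lat_end, lon_begin, lon_end, tile_height=5, tile_width=5):
    # Each (lat, lon) pair in the half-open grid yields one tile.
    rows = len(range(lat_begin, lat_end, tile_height))
    cols = len(range(lon_begin, lon_end, tile_width))
    return rows * cols


print(tile_count(-85, 85, -180, 180))   # step 5: 34 latitude rows x 72 longitude columns
print(tile_count(-90, -85, -180, 180))  # step 6: a single row of southern fill tiles
print(tile_count(85, 90, -180, 180))    # step 7: a single row of northern fill tiles
```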

For the WorldCover water mask dataset, follow these steps:

1. Download the portions of the dataset covering your areas of interest from https://worldcover2020.esa.int/downloader.
2. Extract the contents into a single folder. Note: if you download multiple portions of the dataset, extract them all into the same folder.
3. Run ```generate_worldcover_dataset --worldcover-tiles-dir [path-to-worldcover-data] --lat-begin 55 --lat-end 80 --lon-begin -180 --lon-end 180 --tile-width 5 --tile-height 5```

Note that we only use WorldCover data over Alaska, Canada, and Russia for our dataset.
Empty file.
75 changes: 75 additions & 0 deletions src/asf_tools/watermasking/fill_missing_tiles.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,75 @@
import argparse
import os
import subprocess

import numpy as np
from osgeo import gdal, osr

from asf_tools.watermasking.utils import lat_lon_to_tile_string


gdal.UseExceptions()


def main():

parser = argparse.ArgumentParser(
prog='fill_missing_tiles.py',
description='Script for creating filled tifs in areas with missing tiles.'
)

parser.add_argument('--fill-value', help='The value to fill the data array with.', default=0)
parser.add_argument('--lat-begin', help='The minimum latitude of the dataset in EPSG:4326.', default=-85)
parser.add_argument('--lat-end', help='The maximum latitude of the dataset in EPSG:4326.', default=85)
parser.add_argument('--lon-begin', help='The minimum longitude of the dataset in EPSG:4326.', default=-180)
parser.add_argument('--lon-end', help='The maximum longitude of the dataset in EPSG:4326.', default=180)
parser.add_argument('--tile-width', help='The desired width of the tile in degrees.', default=5)
parser.add_argument('--tile-height', help='The desired height of the tile in degrees.', default=5)

args = parser.parse_args()

fill_value = int(args.fill_value)
lat_begin = int(args.lat_begin)
lat_end = int(args.lat_end)
lon_begin = int(args.lon_begin)
lon_end = int(args.lon_end)
tile_width = int(args.tile_width)
tile_height = int(args.tile_height)

lat_range = range(lat_begin, lat_end, tile_height)
lon_range = range(lon_begin, lon_end, tile_width)

for lat in lat_range:
for lon in lon_range:

tile = lat_lon_to_tile_string(lat, lon, is_worldcover=False, postfix='')
tile_tif = 'tiles/' + tile + '.tif'
tile_cog = 'tiles/cogs/' + tile + '.tif'

print(f'Processing: {tile}')

xmin, ymin = lon, lat
pixel_size_x = 0.00009009009
pixel_size_y = 0.00009009009

# All images in the dataset should be this size.
data = np.empty((55500, 55500))
data.fill(fill_value)

driver = gdal.GetDriverByName('GTiff')
dst_ds = driver.Create(tile_tif, xsize=data.shape[0], ysize=data.shape[1], bands=1, eType=gdal.GDT_Byte)
dst_ds.SetGeoTransform([xmin, pixel_size_x, 0, ymin, 0, pixel_size_y])
srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)
dst_ds.SetProjection(srs.ExportToWkt())
dst_band = dst_ds.GetRasterBand(1)
dst_band.WriteArray(data)
del dst_ds

command = f'gdal_translate -of COG -co NUM_THREADS=all_cpus {tile_tif} {tile_cog}'.split(' ')
subprocess.run(command)
os.remove(tile_tif)


if __name__ == '__main__':
main()
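
The hard-coded array size and pixel size in the script above are mutually consistent: 55500 pixels spanning a 5-degree tile gives the 0.00009009009-degree pixel used in the geotransform. A quick check:

```python
tile_degrees = 5
pixels_per_tile = 55500  # the fixed array size used in fill_missing_tiles.py

# Degrees per pixel along each axis of a 5-degree tile.
pixel_size = tile_degrees / pixels_per_tile
print(pixel_size)  # approximately 9.009009e-05 degrees, matching the script's constant
```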
