Release v0.12.0 #234

Merged 22 commits on Sep 22, 2023
5fe49cb
Add `create-jira-issue.yml`
jtherrmann Jun 29, 2023
f322b07
Merge pull request #225 from ASFHyP3/create-jira-issue
jtherrmann Jul 6, 2023
8a25e76
Update v.description for S1 data
jhkennedy Aug 24, 2023
ad8298d
state s1 metadata changes
jhkennedy Aug 24, 2023
95a5e3c
Update CHANGELOG.md
jhkennedy Aug 24, 2023
693d4fe
Merge pull request #227 from ASFHyP3/s1-metadata
jhkennedy Aug 25, 2023
12b4e30
fix typo
jhkennedy Sep 1, 2023
eaa4f45
fix typo
jhkennedy Sep 1, 2023
1be442f
Merge pull request #228 from ASFHyP3/jhkennedy-patch-1
jhkennedy Sep 1, 2023
60e3f83
First pass at cropping granules to valid data
jhkennedy Sep 7, 2023
c10044c
drop radar only vars for optical products
jhkennedy Sep 7, 2023
740f044
del not pop since I don't care removed items
jhkennedy Sep 7, 2023
4f078ad
a little cleanup/refactoring
jhkennedy Sep 8, 2023
bf2c3d7
Add h5netcdf for performance reasons
jhkennedy Sep 8, 2023
f562e6a
Update README for cropping
jhkennedy Sep 8, 2023
cd2bf93
Merge pull request #229 from ASFHyP3/crop-final-product
jhkennedy Sep 8, 2023
d949420
use dropna instead of drop=True
AndrewPlayer3 Sep 15, 2023
2c343c8
update changelog and add diff
AndrewPlayer3 Sep 15, 2023
dab90fb
Haven't release yet and don't need to state changes to (not in the v…
jhkennedy Sep 19, 2023
39a01c4
fix attribute typo
jhkennedy Sep 19, 2023
d9462bb
Merge pull request #233 from ASFHyP3/crop-speedup-changes
jhkennedy Sep 20, 2023
967aa6f
Merge pull request #232 from ASFHyP3/crop_speedup
jhkennedy Sep 21, 2023
15 changes: 15 additions & 0 deletions .github/workflows/create-jira-issue.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,15 @@
name: Create Jira issue

on:
  issues:
    types: [labeled]

jobs:
  call-create-jira-issue-workflow:
    uses: ASFHyP3/actions/.github/workflows/[email protected]
    secrets:
      JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
      JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
      JIRA_API_TOKEN: ${{ secrets.JIRA_API_TOKEN }}
      JIRA_PROJECT: ${{ secrets.JIRA_PROJECT }}
      JIRA_FIELDS: ${{ secrets.JIRA_FIELDS }}
13 changes: 13 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,19 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).


## [0.12.0]

### Added
* [`hyp3_autorift.crop`](hyp3_autorift/crop.py) provides a `crop_netcdf_product` function to crop HyP3 AutoRIFT products
to the extent of valid `v` data

### Changed
* HyP3 AutoRIFT products generated with the main workflow will be cropped to the extent of the valid `v` data

### Fixed
* Patch [227](hyp3_autorift/vend/CHANGES-227.diff) was applied to align the S1 granules velocity description with the
optical products

## [0.11.1]

### Fixed
3 changes: 3 additions & 0 deletions environment.yml
@@ -23,6 +23,7 @@ dependencies:
- wheel
# For running
- gdal>=3
- h5netcdf
- hyp3lib=1.7.0
- isce2=2.6.1.dev7
- autorift=1.5.0
@@ -31,5 +32,7 @@ dependencies:
- matplotlib-base
- netCDF4
- numpy<1.24 # https://github.com/isce-framework/isce2/pull/639
- pyproj
- requests
- scipy
- xarray
129 changes: 129 additions & 0 deletions hyp3_autorift/crop.py
@@ -0,0 +1,129 @@
# MIT License
#
# Copyright (c) 2020 NASA Jet Propulsion Laboratory
# Modifications (c) Copyright 2023 Alaska Satellite Facility
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

"""Crop HyP3 AutoRIFT products to their valid data range, inplace

This module is based on the ITS_LIVE production script for cropping V2 products
after they have been generated and has been heavily refactored for use in this HyP3 plugin:

The original script:
https://github.com/nasa-jpl/its_live_production/blob/957e9aba627be2abafcc9601712a7f9c4dd87849/src/tools/crop_v2_granules.py
"""

from pathlib import Path

import numpy as np
import pyproj
import xarray as xr


ENCODING_TEMPLATE = {
    'interp_mask': {'_FillValue': 0.0, 'dtype': 'ubyte', "zlib": True, "complevel": 2, "shuffle": True},
    'chip_size_height': {'_FillValue': 0.0, 'dtype': 'ushort', "zlib": True, "complevel": 2, "shuffle": True},
    'chip_size_width': {'_FillValue': 0.0, 'dtype': 'ushort', "zlib": True, "complevel": 2, "shuffle": True},
    'M11': {'_FillValue': -32767, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'M12': {'_FillValue': -32767, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'v': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'vx': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'vy': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'v_error': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'va': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'vr': {'_FillValue': -32767.0, 'dtype': 'short', "zlib": True, "complevel": 2, "shuffle": True},
    'x': {'_FillValue': None},
    'y': {'_FillValue': None}
}


def crop_netcdf_product(netcdf_file: Path) -> Path:
    """Crop a HyP3 AutoRIFT product to the extent of its valid `v` data.

    Args:
        netcdf_file: Path to the HyP3 AutoRIFT product netCDF file.

    Returns:
        Path to the cropped netCDF file.
    """
    with xr.open_dataset(netcdf_file) as ds:
        # this will drop X/Y coordinates, so drop non-None values just to get X/Y extents
        xy_ds = ds.where(ds.v.notnull()).dropna(dim='x', how='all').dropna(dim='y', how='all')

        x_values = xy_ds.x.values
        grid_x_min, grid_x_max = x_values.min(), x_values.max()

        y_values = xy_ds.y.values
        grid_y_min, grid_y_max = y_values.min(), y_values.max()

        # Based on X/Y extents, mask original dataset
        mask_lon = (ds.x >= grid_x_min) & (ds.x <= grid_x_max)
        mask_lat = (ds.y >= grid_y_min) & (ds.y <= grid_y_max)
        mask = (mask_lon & mask_lat)

        cropped_ds = ds.where(mask).dropna(dim='x', how='all').dropna(dim='y', how='all')
        cropped_ds = cropped_ds.load()

        # Reset data for mapping and img_pair_info data variables, as ds.where() extends
        # the data of all data variables to the dimensions of the "mask"
        cropped_ds['mapping'] = ds['mapping']
        cropped_ds['img_pair_info'] = ds['img_pair_info']

        # Compute centroid longitude/latitude
        center_x = (grid_x_min + grid_x_max) / 2
        center_y = (grid_y_min + grid_y_max) / 2

        # Convert to lon/lat coordinates
        projection = ds['mapping'].attrs['spatial_epsg']
        to_lon_lat_transformer = pyproj.Transformer.from_crs(
            f"EPSG:{projection}",
            'EPSG:4326',
            always_xy=True
        )

        # Update centroid information for the granule
        center_lon_lat = to_lon_lat_transformer.transform(center_x, center_y)

        cropped_ds['img_pair_info'].attrs['latitude'] = round(center_lon_lat[1], 2)
        cropped_ds['img_pair_info'].attrs['longitude'] = round(center_lon_lat[0], 2)

        # Update mapping.GeoTransform
        x_cell = x_values[1] - x_values[0]
        y_cell = y_values[1] - y_values[0]

        # It was decided to keep all values in GeoTransform center-based
        cropped_ds['mapping'].attrs['GeoTransform'] = f"{x_values[0]} {x_cell} 0 {y_values[0]} 0 {y_cell}"

        # Compute chunking like AutoRIFT does:
        # https://github.com/ASFHyP3/hyp3-autorift/blob/develop/hyp3_autorift/vend/netcdf_output.py#L410-L411
        dims = cropped_ds.dims
        chunk_lines = np.min([np.ceil(8192 / dims['y']) * 128, dims['y']])
        two_dim_chunks_settings = (chunk_lines, dims['x'])

        encoding = ENCODING_TEMPLATE.copy()
        if not netcdf_file.name.startswith('S1'):
            for radar_variable in ['M11', 'M12', 'va', 'vr']:
                del encoding[radar_variable]

        for _, attributes in encoding.items():
            if attributes['_FillValue'] is not None:
                attributes['chunksizes'] = two_dim_chunks_settings

        cropped_file = netcdf_file.with_stem(f'{netcdf_file.stem}_cropped')
        cropped_ds.to_netcdf(cropped_file, engine='h5netcdf', encoding=encoding)

        return cropped_file
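The bounding-box crop above can be illustrated with a minimal NumPy sketch (a standalone illustration of the idea, not the module's API): keep only the rows and columns that contain at least one valid value.

```python
import numpy as np

# 4x4 grid of velocities where only the center 2x2 block holds valid data,
# standing in for a product's `v` variable
v = np.full((4, 4), np.nan)
v[1:3, 1:3] = 1.0

# Rows/columns with at least one valid value define the crop extent,
# mirroring the chained dropna(dim=..., how='all') calls above
valid_rows = ~np.all(np.isnan(v), axis=1)
valid_cols = ~np.all(np.isnan(v), axis=0)
cropped = v[np.ix_(valid_rows, valid_cols)]
print(cropped.shape)  # (2, 2)
```

The xarray version does the same thing per dimension, but also preserves coordinates and attributes.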
13 changes: 9 additions & 4 deletions hyp3_autorift/process.py
@@ -26,6 +26,7 @@
from osgeo import gdal

from hyp3_autorift import geometry, image, io
from hyp3_autorift.crop import crop_netcdf_product

log = logging.getLogger(__name__)

@@ -471,18 +472,22 @@ def process(reference: str, secondary: str, parameter_file: str = DEFAULT_PARAME
if netcdf_file is None:
raise Exception('Processing failed! Output netCDF file not found')

netcdf_file = Path(netcdf_file)
cropped_file = crop_netcdf_product(netcdf_file)
netcdf_file.unlink()

if naming_scheme == 'ITS_LIVE_PROD':
product_file = Path(netcdf_file)
product_file = netcdf_file
elif naming_scheme == 'ASF':
product_name = get_product_name(
reference, secondary, orbit_files=(reference_state_vec, secondary_state_vec),
pixel_spacing=parameter_info['xsize'],
)
product_file = Path(f'{product_name}.nc')
shutil.move(netcdf_file, str(product_file))
else:
product_file = Path(netcdf_file.replace('.nc', '_IL_ASF_OD.nc'))
shutil.move(netcdf_file, str(product_file))
product_file = netcdf_file.with_stem(f'{netcdf_file.stem}_IL_ASF_OD')

shutil.move(cropped_file, str(product_file))

with Dataset(product_file) as nc:
velocity = nc.variables['v']
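The renaming in the `process.py` diff above relies on `pathlib.Path.with_stem` (available since Python 3.9), which replaces the filename's stem while keeping its suffix. A small sketch with a hypothetical product name:

```python
from pathlib import Path

netcdf_file = Path('S1_example_product.nc')  # hypothetical product filename

# Derive sibling filenames by editing only the stem, as the workflow does
# for the cropped product and the '_IL_ASF_OD' naming scheme
cropped_file = netcdf_file.with_stem(f'{netcdf_file.stem}_cropped')
od_file = netcdf_file.with_stem(f'{netcdf_file.stem}_IL_ASF_OD')
print(cropped_file)  # S1_example_product_cropped.nc
print(od_file)       # S1_example_product_IL_ASF_OD.nc
```

This avoids the string surgery of the old `netcdf_file.replace('.nc', '_IL_ASF_OD.nc')` approach.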
2 changes: 1 addition & 1 deletion hyp3_autorift/vend/CHANGES-189.diff
@@ -295,7 +295,7 @@ diff --git netcdf_output.py netcdf_output.py
var.setncattr('stable_shift_flag_description', 'flag for applying velocity bias correction: 0 = no correction; '
'1 = correction from overlapping stable surface mask '
- '(stationary or slow-flowing surfaces with velocity < 15 m/yr)'
+ '(stationary or slow-flowing surfaces with velocity < 15 meter/yearr)'
+ '(stationary or slow-flowing surfaces with velocity < 15 meter/year)'
'(top priority); 2 = correction from slowest 25% of overlapping '
'velocities (second priority)')

15 changes: 15 additions & 0 deletions hyp3_autorift/vend/CHANGES-227.diff
@@ -0,0 +1,15 @@
diff --git netcdf_output.py netcdf_output.py
--- netcdf_output.py
+++ netcdf_output.py
@@ -639,10 +639,7 @@ def netCDF_packaging(VX, VY, DX, DY, INTERPMASK, CHIPSIZEX, CHIPSIZEY, SSM, SSM1
var = nc_outfile.createVariable('v', np.dtype('int16'), ('y', 'x'), fill_value=NoDataValue,
zlib=True, complevel=2, shuffle=True, chunksizes=ChunkSize)
var.setncattr('standard_name', 'land_ice_surface_velocity')
- if pair_type == 'radar':
- var.setncattr('description', 'velocity magnitude from radar range and azimuth measurements')
- else:
- var.setncattr('description', 'velocity magnitude')
+ var.setncattr('description', 'velocity magnitude')
var.setncattr('units', 'meter/year')
var.setncattr('grid_mapping', mapping_var_name)

4 changes: 4 additions & 0 deletions hyp3_autorift/vend/README.md
@@ -66,3 +66,7 @@ We've replaced it with `hyp3_autorift.io.get_topsinsar_config`.
from Sentinel-1 pairs that were created using HyP3 autoRIFT versions < 0.9.0, which was released November 2, 2022
9. The changes listed in `CHANGES-223.diff` were applied in [ASFHyP3/hyp3-autorift#223](https://github.com/ASFHyP3/hyp3-autorift/pull/223)
were applied to the S1 correction workflow so that the scene's polarization was set correctly
10. The changes listed in `CHANGES-227.diff` were applied in [ASFHyP3/hyp3-autorift#227](https://github.com/ASFHyP3/hyp3-autorift/pull/227)
to align the S1 granules' velocity description with the optical products. These changes have been
[proposed upstream](https://github.com/nasa-jpl/autoRIFT/pull/87) and should be applied in the next
`nasa-jpl/autoRIFT` release.
7 changes: 2 additions & 5 deletions hyp3_autorift/vend/netcdf_output.py
@@ -639,10 +639,7 @@ def netCDF_packaging(VX, VY, DX, DY, INTERPMASK, CHIPSIZEX, CHIPSIZEY, SSM, SSM1
var = nc_outfile.createVariable('v', np.dtype('int16'), ('y', 'x'), fill_value=NoDataValue,
zlib=True, complevel=2, shuffle=True, chunksizes=ChunkSize)
var.setncattr('standard_name', 'land_ice_surface_velocity')
if pair_type == 'radar':
var.setncattr('description', 'velocity magnitude from radar range and azimuth measurements')
else:
var.setncattr('description', 'velocity magnitude')
var.setncattr('description', 'velocity magnitude')
var.setncattr('units', 'meter/year')
var.setncattr('grid_mapping', mapping_var_name)

@@ -809,7 +806,7 @@ def netCDF_packaging(VX, VY, DX, DY, INTERPMASK, CHIPSIZEX, CHIPSIZEY, SSM, SSM1
var.setncattr('stable_shift_flag', stable_shift_applied)
var.setncattr('stable_shift_flag_description', 'flag for applying velocity bias correction: 0 = no correction; '
'1 = correction from overlapping stable surface mask '
'(stationary or slow-flowing surfaces with velocity < 15 meter/yearr)'
'(stationary or slow-flowing surfaces with velocity < 15 meter/year)'
'(top priority); 2 = correction from slowest 25% of overlapping '
'velocities (second priority)')

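The chunk-size rule that `crop.py` borrows from `netcdf_output.py` (chunk 2-D variables as `(chunk_lines, n_x)`, where `chunk_lines` shrinks as the grid gets taller but never exceeds the row count) can be checked numerically. A sketch, with the rule wrapped in a hypothetical helper for illustration:

```python
import numpy as np

def chunk_settings(n_y: int, n_x: int) -> tuple:
    # ceil(8192 / n_y) * 128 rows per chunk, capped at the total row count,
    # matching the computation in crop.py and vend/netcdf_output.py
    chunk_lines = int(np.min([np.ceil(8192 / n_y) * 128, n_y]))
    return (chunk_lines, n_x)

print(chunk_settings(10000, 7000))  # (128, 7000) -- tall grid, 128-row chunks
print(chunk_settings(100, 250))     # (100, 250)  -- small grid, one chunk of all rows
```

Large grids therefore get modest 128-row chunks, while grids shorter than the computed chunk height are written as a single chunk.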
3 changes: 3 additions & 0 deletions setup.py
@@ -37,12 +37,15 @@
'boto3',
'botocore',
'gdal',
'h5netcdf',
'hyp3lib==1.7.0',
'matplotlib',
'netCDF4',
'numpy',
'pyproj',
'requests',
'scipy',
'xarray',
],

extras_require={