Commit

Merge branch 'main' into main

ZachHoppinen authored May 24, 2022
2 parents 8c2d697 + c22b69f commit 99874e6
Showing 7 changed files with 37 additions and 8 deletions.
17 changes: 16 additions & 1 deletion .gitignore
@@ -10,4 +10,19 @@ test.ipynb
uavsar_pytools.egg-info/
build/*
dist/*
uavsar_pytools.egg-info/
uavsar_pytools.egg-info/

# Distribution / packaging
bin/
build/
develop-eggs/
dist/
eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg
2 changes: 1 addition & 1 deletion build/lib/uavsar_pytools/__init__.py
@@ -5,4 +5,4 @@


# Version of the package
__version__ = "0.4.8"
__version__ = "0.4.9"
2 changes: 1 addition & 1 deletion build/lib/uavsar_pytools/download/download.py
@@ -37,7 +37,7 @@ def stream_download(url, output_f):
pbar.update(len(ch))
else:
if r.status_code == 401:
log.warning(f'HTTP CODE 401. DOWNLOADING REQUIRES A NETRC FILE AND SIGNED UAVSAR END USER AGREEMENT!ß See ReadMe for instructions.')
log.warning(f'HTTP CODE 401. DOWNLOADING REQUIRES A NETRC FILE AND SIGNED UAVSAR END USER AGREEMENT! See ReadMe for instructions.')
elif r.status_code == 404:
log.warning(f'HTTP CODE 404. Url not found. Currently trying {url}.')
else:
2 changes: 1 addition & 1 deletion setup.py
@@ -18,7 +18,7 @@
EMAIL = '[email protected]'
AUTHOR = 'Zach Keskinen and Jack Tarricone'
REQUIRES_PYTHON = '>=3.7.0'
VERSION = '0.4.8'
VERSION = '0.4.9'

# What packages are required for this module to be executed?
REQUIRED = [
17 changes: 15 additions & 2 deletions uavsar_pytools.egg-info/PKG-INFO
@@ -1,6 +1,6 @@
Metadata-Version: 2.1
Name: uavsar-pytools
Version: 0.4.8
Version: 0.4.9
Summary: Tools to download and convert ground projected UAVSAR images.
Home-page: https://github.com/SnowEx/uavsar_pytools
Author: Zach Keskinen and Jack Tarricone
@@ -35,7 +35,7 @@ pip install uavsar_pytools

You will need a [.netrc file](https://www.gnu.org/software/inetutils/manual/html_node/The-_002enetrc-file.html) in your home directory. This is a special file that stores usernames and passwords for programs to access. If you are already registered at either the Alaska Satellite Facility or the Jet Propulsion Laboratory, skip step 1. Otherwise:

1. If you need a username and password register at [link](https://search.asf.alaska.edu/). Please ensure you have signed the end user agreement for Uavsar.
1. If you need a username and password, register at [this link](https://search.asf.alaska.edu/). Please ensure you have signed the end user agreement for UAVSAR. You may need to attempt to download a UAVSAR image from Vertex to prompt the end user agreement.

2. In a python terminal or notebook enter:
```python
@@ -73,6 +73,19 @@ scene.images[0]['array'] # get the first image numpy array for analysis

For quick checks, there is also a convenience method `scene.show(i = 1)` that lets you quickly visualize the first image or, by iterating over i = 2, 3, 4, etc., all the images in the zip file. This method is only available after converting the binary images to arrays with `scene.url_to_tiffs()`.
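A minimal sketch of that workflow is below; the zip url and working directory are placeholders, and the exact `UavsarScene` keyword names are assumed from this README rather than checked against the source:

```python
from uavsar_pytools import UavsarScene

# placeholder url and working directory for illustration only
zip_url = 'https://datapool.asf.alaska.edu/path/to/some_uavsar_grd.zip'
scene = UavsarScene(url = zip_url, work_dir = '~/Documents/scene_ex/')

scene.url_to_tiffs()                   # download the zip and convert the binary images to geotiffs
scene.show(i = 1)                      # quick look at the first image; use i = 2, 3, ... for the others
first_arr = scene.images[0]['array']   # numpy array of the first image for analysis
```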

## Downloading whole collections

Uavsar_pytools can now take a collection name and a start and end date and then find, download, and process an entire collection of UAVSAR images. Collection names can be found in the [campaign list](https://api.daac.asf.alaska.edu/services/utils/mission_list). Once you know your campaign name and date range, give the package a working directory along with the name and dates and it will do the rest. For example, to download all UAVSAR images of Grand Mesa, Colorado from November 2019 to April 2020 and save them to your documents folder, you would use:

```python
from uavsar_pytools import UavsarCollection
collection = UavsarCollection(collection = 'Grand Mesa, CO', work_dir = '~/Documents/collection_ex/', dates = ('2019-11-01','2020-04-01'))
# to keep binary files use `clean = False`, to download incidence angles with each image use `inc = True`, for only certain pols use `pols = ['VV','HV']`
collection.collection_to_tiffs()
```

Each image pair found will be placed in its own directory, named with its Alaska Satellite Facility-derived scene name. Unlike UavsarScene, this functionality automatically deletes the downloaded binary and zip files after converting them to tiffs in order to save space.

## Finding URLs for your images

The Jupyter notebook tutorial provided in the notebooks folder will walk you through generating a bounding box for your area of interest and finding urls through the [asf_search api](https://github.com/asfadmin/Discovery-asf_search). However, you can also use the [Vertex website](https://search.asf.alaska.edu/). After drawing a box and selecting UAVSAR from the platform selection pane (circled in red below) you will get a list of search results. Click on the ground projected image you want to download and right-click on the download link (circled in orange below). Select ```copy link``` and you will have copied the relevant zip url.
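For reference, a rough sketch of the asf_search route; the bounding box below is a made-up example, and the tutorial notebook covers the full workflow:

```python
import asf_search as asf

# made-up WKT bounding box roughly over Grand Mesa, CO; replace with your own area of interest
aoi = 'POLYGON((-108.3 38.9,-107.8 38.9,-107.8 39.1,-108.3 39.1,-108.3 38.9))'

results = asf.geo_search(platform = asf.PLATFORM.UAVSAR, intersectsWith = aoi)
urls = [r.properties['url'] for r in results]  # zip urls that can be passed to UavsarScene
print(urls[:5])
```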
1 change: 1 addition & 0 deletions uavsar_pytools/__init__.py
@@ -5,4 +5,5 @@


# Version of the package

__version__ = "0.2.0"
4 changes: 2 additions & 2 deletions uavsar_pytools/uavsar_collection.py
@@ -51,8 +51,8 @@ def __init__(self, collection, work_dir, overwrite = False, clean = True, debug
self.dates = dates
if dates:
# define search parameters for sierra flight line
self.start_date = dates[0]
self.end_date = dates[1]
self.start_date = pd.to_datetime(dates[0])
self.end_date = pd.to_datetime(dates[1])

def find_urls(self):
# search for data
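For context on the change above: `pd.to_datetime` parses the ISO date strings passed in the `dates` tuple into pandas Timestamps, which can then be compared and formatted reliably when searching. A tiny standalone illustration, with example dates only:

```python
import pandas as pd

# example dates only; this mirrors what UavsarCollection now does with the dates tuple
start, end = pd.to_datetime('2019-11-01'), pd.to_datetime('2020-04-01')
print(start < pd.to_datetime('2020-01-15') < end)  # True: Timestamps compare chronologically
```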
