Pabitra Dash

USU


Resources
Resource
Composite Resource Type Design
Created: April 22, 2016, 10:17 p.m.
Authors: Tian Gan

ABSTRACT:

This resource includes the basic design ideas for the composite resource type, for discussion.

Resource

ABSTRACT:

The HydroDS tasks that must be executed to generate the complete set of UEB model inputs for an example watershed are given in the Python file "HydroDS_UEB_Setup". This file calls functions from the other file, "hydrods_python_client", which contains declarations for the data service functions available from HydroDS.

To run the workflow for a different watershed in the Western US, modify the coordinates of the watershed boundary, the outlet location, the start and end times of the model period, and the spatial reference (projection) information in the form of an EPSG code (http://spatialreference.org/ref/epsg/). The commands in the workflow script can also be called interactively from any Python command line, or from a user application that incorporates the Python client library.

For watersheds outside of the Western US, but in the CONUS, you need to upload your own DEM. The services are currently limited to the US.
You need to have a HydroDS account to use these services.
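As an illustration of the kind of edits described above, the inputs for a new watershed might be gathered in a parameter block like the following. The variable names and coordinate values here are illustrative only, not the actual names used in HydroDS_UEB_Setup:

```python
# Hypothetical inputs you would edit in the workflow script for a new
# Western-US watershed; names and values are illustrative only.
watershed_config = {
    # Watershed bounding box in geographic coordinates (degrees)
    "left": -111.97, "top": 42.11, "right": -111.35, "bottom": 41.66,
    # Outlet location (longitude, latitude)
    "outlet_x": -111.78, "outlet_y": 41.74,
    # Start and end of the model simulation period
    "start_time": "2009/10/01", "end_time": "2010/06/30",
    # Spatial reference as an EPSG code (here NAD83 / UTM zone 12N)
    "epsg_code": 26912,
}
```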

These scripts are for the following paper
Gichamo, T. Z., N. S. Sazib, D. G. Tarboton and P. Dash, (2020), "HydroDS: Data Services in Support of Physically Based, Distributed Hydrological Models," Environmental Modelling & Software, https://doi.org/10.1016/j.envsoft.2020.104623.

Resource
mf2k
Created: June 30, 2023, 4:49 p.m.
Authors: Jeff Sadler

ABSTRACT:

Testing timeout error with file list endpoint.

Resource
Example Aggregation Metadata in Schema Org Format
Created: March 27, 2024, 2:39 a.m.
Authors: Dash, Pabitra · Jamy

ABSTRACT:

This netCDF data is the simulation output from the Utah Energy Balance (UEB) model. It includes the simulated snow water equivalent for the period October 2009 to June 2010 for the TWDEF site in Utah.

Resource
Ayman's Updated Notebooks
Created: Sept. 12, 2024, 2:18 p.m.
Authors: Dash, Pabitra

ABSTRACT:

This resource contains updated versions (code reorganization and cleanup) of Ayman's notebooks from the following three resources:

- AORC Notebook https://www.hydroshare.org/resource/72ea9726187e43d7b50a624f2acf591f/
- NWM retrospective data retrieval https://www.hydroshare.org/resource/6ca065138d764339baf3514ba2f2d72f/
- HydroFabric subsetter and retrieval https://www.hydroshare.org/resource/631be88704ef4167b12e9ad9d2529ba9/

Resource
Scripts to Create Conda Environments in JupyterHub
Created: May 30, 2025, 1:58 p.m.
Authors: Dash, Pabitra

ABSTRACT:

This resource contains the following bash scripts, which can be used in any of the JupyterHub instances accessible via the "Open with" functionality of HydroShare to create a conda environment for running AORC-related notebooks. To run any of the listed bash scripts in JupyterHub, first make the file executable using the JupyterHub terminal. Here is an example:

chmod +x setup_aorc_conda_env_cuahsi_jh.sh

Then the above script can be executed from the command line as follows:

./setup_aorc_conda_env_cuahsi_jh.sh

Running the script creates a new conda environment called 'aorc' and registers it as a new Jupyter kernel named 'Python [conda env:aorc]'. For any of the AORC-related notebooks to use this kernel, you have to first shut down all kernels and then use the 'Change Kernel' option to select the new kernel.

- setup_aorc_conda_env_cuahsi_jh.sh (Use this script in CUAHSI JupyterHub to create a new conda environment called 'aorc'. The corresponding new kernel is named 'Python [conda env:aorc]'.)

- setup_aorc_conda_env_cybergis_jh.sh (Use this script in CyberGIS Jupyter for Water to create a new conda environment called 'aorc'. The corresponding new kernel is named 'Python 3 (AORC)'.)

- environment.yml (This file lists the python modules needed to run the AORC notebooks, similar to a requirements.txt file used with pip install. The setup scripts above use this yml file to install the modules listed in it.)

- delete_aorc_conda_env_cuahsi_jh.sh (Run this script to delete the conda env 'aorc' in CUAHSI JupyterHub.)

- delete_aorc_conda_env_cybergis_jh.sh (Run this script to delete the conda env 'aorc' in CyberGIS Jupyter for Water.)

- setup_aorc_conda_env_2i2c_jh.sh (Use this script in 2i2c JupyterHub to create a new conda environment called 'aorc'. The corresponding new kernel is named 'Python [conda env:.conda-aorc]'.)

- environment-2i2c.yml (This file is used by the 2i2c setup script above. It lists the few python modules that are installed using conda.)

- requirements-2i2c.txt (This file is used by the 2i2c setup script to pip install most of the python modules needed for the AORC notebooks. pip is used for the majority of the modules on 2i2c because installing them with conda fails with a 'no space left on device' error.)

- delete_aorc_conda_env_2i2c_jh.sh (Run this script to delete the conda env 'aorc' in 2i2c JupyterHub.)

- sample_commands_to_create_conda_env_jh.md (This file contains example commands for creating a new conda environment without using a bash script.)
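For reference, a conda environment file like the environment.yml files above generally has the following shape. The module list here is illustrative, not the resource's actual file:

```yaml
name: aorc
channels:
  - conda-forge
dependencies:
  - python=3.11     # pin the interpreter version
  - xarray          # example module; the real file lists the AORC deps
  - netcdf4
  - pip
  - pip:
      - rioxarray   # modules installed with pip instead of conda
```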

How to use the new kernel created by the script?
If you just created the new aorc conda environment, shut down all kernels, open the notebook that needs to use the aorc environment, and change the notebook's kernel to the aorc kernel (see above for the exact kernel name).

How to update the aorc conda environment after it has been created?
Here, updating means installing additional python modules or upgrading any module already installed in the aorc conda environment to a newer version. The following commands apply to the CUAHSI and CyberGIS JupyterHub instances.

- Update the environment.yml file by adding new modules or changing the version of any module listed in it.

- In the JupyterHub instance, open a terminal and run the following two commands:

- conda activate aorc
- mamba env update -f environment.yml

Update aorc environment in 2i2c:
- conda activate aorc

If you have updated the environment-2i2c.yml file, run
- conda env update -f environment-2i2c.yml

If you have updated the requirements-2i2c.txt file, run
- pip install -r requirements-2i2c.txt

If you just want to install a few new modules or upgrade existing modules without updating the environment or requirements file, run (after activating the aorc env):

conda install <module-1-name> <module-2-name>
OR
pip install <module-1-name> <module-2-name>

NOTE: When the environment needs to change, it is better to update the environment or requirements file, as this helps reproducibility.

Resource

ABSTRACT:

This resource contains draft Jupyter Notebooks with example code snippets showing how to access HydroShare resource files through HydroShare S3 buckets. The user_account.py utility reads the user's cached HydroShare account information in any of the JupyterHub instances that HydroShare has access to. The example notebooks use this utility so that you don't have to enter your HydroShare account information in order to access HydroShare buckets.

Here are the 3 notebooks in this resource:

- hydroshare_s3_bucket_access_examples.ipynb:

The above notebook has examples showing how to upload and download resource files to and from the resource bucket. It also contains examples of how to list the files and folders of a resource in a bucket.

- python-modules-direct-read-from-bucket/hs_bucket_access_gdal_example.ipynb:

The above notebook has examples of reading rasters and shapefiles directly from a bucket using GDAL, without downloading the files from the bucket to local disk.

- python-modules-direct-read-from-bucket/hs_bucket_access_non_gdal_example.ipynb:

The above notebook has examples of using h5netcdf and xarray to read a netCDF file directly from a bucket. It also contains examples of using rioxarray to read a raster file and pandas to read a CSV file from HydroShare buckets.
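As a minimal sketch of this direct-read pattern: the endpoint URL and bucket layout below are assumptions for illustration, not HydroShare's actual values; in the notebooks, the real values come from the user_account.py utility.

```python
import pandas as pd

# Hypothetical endpoint; the notebooks obtain the real one via user_account.py.
S3_ENDPOINT = "https://s3.example.org"

def resource_file_key(bucket: str, resource_id: str, filename: str) -> str:
    """Build the object key for a file inside a resource.

    The '<resource_id>/data/contents/' layout mirrors how HydroShare
    resources organize their files; treat it as an assumption here.
    """
    return f"{bucket}/{resource_id}/data/contents/{filename}"

def read_csv_from_bucket(key: str, endpoint: str = S3_ENDPOINT) -> pd.DataFrame:
    # pandas hands the s3:// URL to fsspec/s3fs, which streams the file
    # straight from the bucket without a local download.
    return pd.read_csv(
        f"s3://{key}",
        storage_options={"client_kwargs": {"endpoint_url": endpoint}},
    )
```

The same storage_options pattern works for xarray and rioxarray, which also accept fsspec-style file objects.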
