{"id":17846769,"url":"https://github.com/hyriver/pygeohydro","last_synced_at":"2026-04-02T02:19:07.401Z","repository":{"id":38020819,"uuid":"237573928","full_name":"hyriver/pygeohydro","owner":"hyriver","description":"A part of HyRiver software stack for accessing hydrology data through web services","archived":false,"fork":false,"pushed_at":"2025-08-12T12:21:15.000Z","size":171987,"stargazers_count":87,"open_issues_count":4,"forks_count":22,"subscribers_count":3,"default_branch":"main","last_synced_at":"2026-03-16T11:14:29.023Z","etag":null,"topics":["climate-data","data-visualization","hydrologic-database","hydrology","python","usgs","watershed","webservices"],"latest_commit_sha":null,"homepage":"https://docs.hyriver.io","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/hyriver.png","metadata":{"files":{"readme":"README.rst","changelog":"HISTORY.rst","contributing":"CONTRIBUTING.rst","funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.rst","threat_model":null,"audit":null,"citation":"CITATION.cff","codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":"AUTHORS.rst","dei":null,"publiccode":null,"codemeta":null,"zenodo":null},"funding":{"github":["cheginit"]}},"created_at":"2020-02-01T06:37:12.000Z","updated_at":"2026-02-27T20:26:46.000Z","dependencies_parsed_at":"2024-03-09T00:33:06.643Z","dependency_job_id":"456a13f1-66f8-4539-a93f-801c2aec7a89","html_url":"https://github.com/hyriver/pygeohydro","commit_stats":{"total_commits":2608,"total_committers":11,"mean_commits":237.0909090909091,"dds":0.5195552147239264,"last_synced_commit":"74704dd9254caa9ed4074d71a0597f86d0c58dc1"},"previous_names":["cheginit/hydrodata","cheginit/pygeohydro"],"tags_count":58,"template":false,"template_full_name":null,"purl":"pkg:github/hyriver/pygeohydro","re
pository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/hyriver%2Fpygeohydro","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/hyriver%2Fpygeohydro/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/hyriver%2Fpygeohydro/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/hyriver%2Fpygeohydro/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/hyriver","download_url":"https://codeload.github.com/hyriver/pygeohydro/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/hyriver%2Fpygeohydro/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31294527,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-02T01:43:37.129Z","status":"online","status_checked_at":"2026-04-02T02:00:08.535Z","response_time":89,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["climate-data","data-visualization","hydrologic-database","hydrology","python","usgs","watershed","webservices"],"created_at":"2024-10-27T22:00:23.486Z","updated_at":"2026-04-02T02:19:07.357Z","avatar_url":"https://github.com/hyriver.png","language":"Python","readme":".. image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/pygeohydro_logo.png\n    :target: https://github.com/hyriver/HyRiver\n\n|\n\n.. 
image:: https://joss.theoj.org/papers/b0df2f6192f0a18b9e622a3edff52e77/status.svg\n    :target: https://joss.theoj.org/papers/b0df2f6192f0a18b9e622a3edff52e77\n    :alt: JOSS\n\n|\n\n.. |pygeohydro| image:: https://github.com/hyriver/pygeohydro/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pygeohydro/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pygeoogc| image:: https://github.com/hyriver/pygeoogc/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pygeoogc/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pygeoutils| image:: https://github.com/hyriver/pygeoutils/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pygeoutils/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pynhd| image:: https://github.com/hyriver/pynhd/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pynhd/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |py3dep| image:: https://github.com/hyriver/py3dep/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/py3dep/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pydaymet| image:: https://github.com/hyriver/pydaymet/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pydaymet/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pygridmet| image:: https://github.com/hyriver/pygridmet/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pygridmet/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |pynldas2| image:: https://github.com/hyriver/pynldas2/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/pynldas2/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. 
|async| image:: https://github.com/hyriver/async-retriever/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/async-retriever/actions/workflows/test.yml\n    :alt: Github Actions\n\n.. |signatures| image:: https://github.com/hyriver/hydrosignatures/actions/workflows/test.yml/badge.svg\n    :target: https://github.com/hyriver/hydrosignatures/actions/workflows/test.yml\n    :alt: Github Actions\n\n================ ====================================================================\nPackage          Description\n================ ====================================================================\nPyNHD_           Navigate and subset NHDPlus (MR and HR) using web services\nPy3DEP_          Access topographic data through National Map's 3DEP web service\nPyGeoHydro_      Access NWIS, NID, WQP, eHydro, NLCD, CAMELS, and SSEBop databases\nPyDaymet_        Access daily, monthly, and annual climate data via Daymet\nPyGridMET_       Access daily climate data via GridMET\nPyNLDAS2_        Access hourly NLDAS-2 data via web services\nHydroSignatures_ A collection of tools for computing hydrological signatures\nAsyncRetriever_  High-level API for asynchronous requests with persistent caching\nPyGeoOGC_        Send queries to any ArcGIS RESTful-, WMS-, and WFS-based services\nPyGeoUtils_      Utilities for manipulating geospatial, (Geo)JSON, and (Geo)TIFF data\n================ ====================================================================\n\n.. _PyGeoHydro: https://github.com/hyriver/pygeohydro\n.. _AsyncRetriever: https://github.com/hyriver/async-retriever\n.. _PyGeoOGC: https://github.com/hyriver/pygeoogc\n.. _PyGeoUtils: https://github.com/hyriver/pygeoutils\n.. _PyNHD: https://github.com/hyriver/pynhd\n.. _Py3DEP: https://github.com/hyriver/py3dep\n.. _PyDaymet: https://github.com/hyriver/pydaymet\n.. _PyGridMET: https://github.com/hyriver/pygridmet\n.. _PyNLDAS2: https://github.com/hyriver/pynldas2\n.. 
_HydroSignatures: https://github.com/hyriver/hydrosignatures\n\nPyGeoHydro: Retrieve Geospatial Hydrology Data\n----------------------------------------------\n\n.. image:: https://img.shields.io/pypi/v/pygeohydro.svg\n    :target: https://pypi.python.org/pypi/pygeohydro\n    :alt: PyPi\n\n.. image:: https://img.shields.io/conda/vn/conda-forge/pygeohydro.svg\n    :target: https://anaconda.org/conda-forge/pygeohydro\n    :alt: Conda Version\n\n.. image:: https://codecov.io/gh/hyriver/pygeohydro/graph/badge.svg\n    :target: https://codecov.io/gh/hyriver/pygeohydro\n    :alt: CodeCov\n\n.. image:: https://img.shields.io/pypi/pyversions/pygeohydro.svg\n    :target: https://pypi.python.org/pypi/pygeohydro\n    :alt: Python Versions\n\n.. image:: https://static.pepy.tech/badge/pygeohydro\n    :target: https://pepy.tech/project/pygeohydro\n    :alt: Downloads\n\n|\n\n.. image:: https://www.codefactor.io/repository/github/hyriver/pygeohydro/badge/main\n    :target: https://www.codefactor.io/repository/github/hyriver/pygeohydro/overview/main\n    :alt: CodeFactor\n\n.. image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json\n    :target: https://github.com/astral-sh/ruff\n    :alt: Ruff\n\n.. image:: https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit\u0026logoColor=white\n    :target: https://github.com/pre-commit/pre-commit\n    :alt: pre-commit\n\n.. image:: https://mybinder.org/badge_logo.svg\n    :target: https://mybinder.org/v2/gh/hyriver/HyRiver-examples/main?urlpath=lab/tree/notebooks\n    :alt: Binder\n\n|\n\nFeatures\n--------\n\nPyGeoHydro (formerly named `hydrodata \u003chttps://pypi.org/project/hydrodata\u003e`__) is part of the\n`HyRiver \u003chttps://github.com/hyriver/HyRiver\u003e`__ software stack, which\nis designed to aid in hydroclimate analysis through web services. This package provides\naccess to some public web services that offer geospatial hydrology data. 
It has three\nmain modules: ``pygeohydro``, ``plot``, and ``helpers``.\n\nPyGeoHydro supports the following datasets:\n\n* `gNATSGO \u003chttps://planetarycomputer.microsoft.com/dataset/gnatsgo-rasters\u003e`__ for\n  US soil properties.\n* `SoilGrids \u003chttps://www.isric.org/explore/soilgrids/faq-soilgrids#What_do_the_filename_codes_mean\u003e`__\n  for seamless global soil properties.\n* `Derived Soil Properties \u003chttps://www.sciencebase.gov/catalog/item/5fd7c19cd34e30b9123cb51f\u003e`__\n  for soil porosity, available water capacity, and field capacity across the US.\n* `NWIS \u003chttps://nwis.waterdata.usgs.gov/nwis\u003e`__ for daily mean streamflow observations\n  (returned as a ``pandas.DataFrame`` or ``xarray.Dataset`` with station attributes).\n* `SensorThings API \u003chttps://labs.waterdata.usgs.gov/api-docs/about-sensorthings-api/index.html\u003e`__\n  for accessing real-time data from USGS sensors.\n* `CAMELS \u003chttps://ral.ucar.edu/solutions/products/camels\u003e`__ for accessing streamflow\n  observations (1980-2014) and basin-level attributes of 671 stations within CONUS.\n* `Water Quality Portal \u003chttps://www.waterqualitydata.us/\u003e`__ for accessing current and\n  historical water quality data from more than 1.5 million sites across the US.\n* `NID \u003chttps://nid.sec.usace.army.mil\u003e`__ for accessing the National Inventory of Dams\n  web service.\n* `HCDN 2009 \u003chttps://www2.usgs.gov/science/cite-view.php?cite=2932\u003e`__ for identifying sites\n  where human activity has minimal effect on the natural flow of the watercourse.\n* `NLCD 2021 \u003chttps://www.mrlc.gov/\u003e`__ for land cover/land use, imperviousness\n  descriptor, and canopy data. 
You can get data using both geometries and coordinates.\n* `WBD \u003chttps://hydro.nationalmap.gov/arcgis/rest/services/wbd/MapServer/\u003e`__ for accessing\n  Hydrologic Unit (HU) polygon boundaries within the US (all HUC levels).\n* `SSEBop \u003chttps://earlywarning.usgs.gov/ssebop/modis/daily\u003e`__ for daily actual\n  evapotranspiration, for both single pixel and gridded data.\n* `Irrigation Withdrawals \u003chttps://doi.org/10.5066/P9FDLY8P\u003e`__ for estimated\n  monthly water use for irrigation by 12-digit hydrologic unit in the CONUS for 2015.\n* `STN \u003chttps://stn.wim.usgs.gov/STNWeb/#/\u003e`__ for accessing the USGS Short-Term Network (STN).\n* `eHydro \u003chttps://navigation.usace.army.mil/Survey/Hydro\u003e`__ for accessing USACE\n  Hydrographic Surveys, which include topobathymetry data.\n* `NFHL \u003chttps://hazards.fema.gov/femaportal/wps/portal/NFHLWMS\u003e`__ for accessing\n  FEMA's National Flood Hazard Layer (NFHL) data.\n\nIt also includes several other functions:\n\n* ``interactive_map``: Interactive map for exploring NWIS stations within a bounding box.\n* ``cover_statistics``: Categorical statistics of land use/land cover data.\n* ``overland_roughness``: Estimate overland roughness from land use/land cover data.\n* ``streamflow_fillna``: Fill missing daily streamflow values with day-of-year averages.\n  Streamflow observations must span at least 10 years.\n\nThe ``plot`` module includes three main functions:\n\n* ``signatures``: Hydrologic signature graphs.\n* ``cover_legends``: Official NLCD land cover legends for plotting a land cover dataset.\n* ``descriptor_legends``: Color map and legends for plotting an imperviousness descriptor dataset.\n\nThe ``helpers`` module includes:\n\n* ``nlcd_helper``: A lookup table of roughness coefficients for each land cover and imperviousness\n  descriptor type, which is useful for overland flow routing among other applications.\n* ``nwis_error``: A dataframe for finding information about NWIS 
requests' errors.\n\nYou can find some example notebooks `here \u003chttps://github.com/hyriver/HyRiver-examples\u003e`__.\n\nMoreover, under the hood, PyGeoHydro uses the\n`PyGeoOGC \u003chttps://github.com/hyriver/pygeoogc\u003e`__ and\n`AsyncRetriever \u003chttps://github.com/hyriver/async-retriever\u003e`__ packages\nfor making requests in parallel and storing responses in chunks. This improves the\nreliability and speed of data retrieval significantly.\n\nYou can control the request/response caching behavior and verbosity of the package\nby setting the following environment variables:\n\n* ``HYRIVER_CACHE_NAME``: Path to the caching SQLite database for asynchronous HTTP\n  requests. It defaults to ``./cache/aiohttp_cache.sqlite``.\n* ``HYRIVER_CACHE_NAME_HTTP``: Path to the caching SQLite database for HTTP requests.\n  It defaults to ``./cache/http_cache.sqlite``.\n* ``HYRIVER_CACHE_EXPIRE``: Expiration time for cached requests in seconds. It defaults to\n  one week.\n* ``HYRIVER_CACHE_DISABLE``: Disable reading/writing from/to the cache. The default is false.\n* ``HYRIVER_SSL_CERT``: Path to an SSL certificate file.\n\nFor example, in your code, before making any requests, you can do:\n\n.. code-block:: python\n\n    import os\n\n    os.environ[\"HYRIVER_CACHE_NAME\"] = \"path/to/aiohttp_cache.sqlite\"\n    os.environ[\"HYRIVER_CACHE_NAME_HTTP\"] = \"path/to/http_cache.sqlite\"\n    os.environ[\"HYRIVER_CACHE_EXPIRE\"] = \"3600\"\n    os.environ[\"HYRIVER_CACHE_DISABLE\"] = \"true\"\n    os.environ[\"HYRIVER_SSL_CERT\"] = \"path/to/cert.pem\"\n\nYou can also try using PyGeoHydro without installing\nit on your system by clicking on the Binder badge. 
A Jupyter Lab\ninstance with the HyRiver stack pre-installed will be launched in your web browser, and you\ncan start coding!\n\nMoreover, requests for additional functionalities can be submitted via the\n`issue tracker \u003chttps://github.com/hyriver/pygeohydro/issues\u003e`__.\n\nCitation\n--------\nIf you use any of the HyRiver packages in your research, we appreciate citations:\n\n.. code-block:: bibtex\n\n    @article{Chegini_2021,\n        author = {Chegini, Taher and Li, Hong-Yi and Leung, L. Ruby},\n        doi = {10.21105/joss.03175},\n        journal = {Journal of Open Source Software},\n        month = {10},\n        number = {66},\n        pages = {1--3},\n        title = {{HyRiver: Hydroclimate Data Retriever}},\n        volume = {6},\n        year = {2021}\n    }\n\nInstallation\n------------\n\nYou can install PyGeoHydro using ``pip`` after installing ``libgdal`` on your system\n(for example, on Ubuntu run ``sudo apt install libgdal-dev``). PyGeoHydro also has an optional\ndependency, ``requests-cache``, for persistent caching. We highly recommend installing\nthis package, as it can significantly speed up queries. You don't have to change\nanything in your code, since, under the hood, PyGeoHydro looks for ``requests-cache`` and,\nif available, automatically uses persistent caching:\n\n.. code-block:: console\n\n    $ pip install pygeohydro\n\nAlternatively, PyGeoHydro can be installed from the ``conda-forge`` repository\nusing `Conda \u003chttps://docs.conda.io/en/latest/\u003e`__:\n\n.. code-block:: console\n\n    $ conda install -c conda-forge pygeohydro\n\nQuick start\n-----------\nWe can obtain river topobathymetry data using the ``EHydro`` class. We can subset\nthe dataset using a geometry, a bounding box, feature IDs, or an SQL query:\n\n.. 
code-block:: python\n\n    from pygeohydro import EHydro\n\n    ehydro = EHydro(\"points\")\n    topobathy = ehydro.bygeom((-122.53, 45.57, -122.52, 45.59))\n\nWe can explore the available NWIS stations within a bounding box using the ``interactive_map``\nfunction. It returns an interactive map, and clicking on a station shows some of its most\nimportant properties.\n\n.. code-block:: python\n\n    import pygeohydro as gh\n\n    bbox = (-69.5, 45, -69, 45.5)\n    gh.interactive_map(bbox)\n\n.. image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/interactive_map.png\n    :target: https://github.com/hyriver/HyRiver-examples/blob/main/notebooks/nwis.ipynb\n    :alt: Interactive Map\n\nWe can select all the stations within this bounding box that have daily mean streamflow data from\n``2000-01-01`` to ``2010-12-31``:\n\n.. code-block:: python\n\n    from pygeohydro import NWIS\n\n    nwis = NWIS()\n    query = {\n        \"bBox\": \",\".join(f\"{b:.06f}\" for b in bbox),\n        \"hasDataTypeCd\": \"dv\",\n        \"outputDataTypeCd\": \"dv\",\n    }\n    info_box = nwis.get_info(query)\n    dates = (\"2000-01-01\", \"2010-12-31\")\n    stations = info_box[\n        (info_box.begin_date \u003c= dates[0]) \u0026 (info_box.end_date \u003e= dates[1])\n    ].site_no.tolist()\n\nThen, we can get the daily streamflow data in mm/day (by default, the values are in cms)\nand plot them:\n\n.. code-block:: python\n\n    from pygeohydro import plot\n\n    qobs = nwis.get_streamflow(stations, dates, mmd=True)\n    plot.signatures(qobs)\n\nBy default, ``get_streamflow`` returns a ``pandas.DataFrame`` whose ``attrs`` property\ncontains metadata for all the stations. You can access it like so: ``qobs.attrs``.\nMoreover, we can get the same data as an ``xarray.Dataset`` as follows:\n\n.. 
code-block:: python\n\n    qobs_ds = nwis.get_streamflow(stations, dates, to_xarray=True)\n\nThis ``xarray.Dataset`` has two dimensions: ``time`` and ``station_id``. It has\n10 variables; ``discharge`` has both dimensions, while the other variables, which are\nstation attributes, are one-dimensional.\n\nWe can also get instantaneous streamflow data using ``get_streamflow``. This method assumes\nthat the input dates are in the UTC time zone and returns the data in UTC as well.\n\n.. code-block:: python\n\n    date = (\"2005-01-01 12:00\", \"2005-01-12 15:00\")\n    qobs = nwis.get_streamflow(\"01646500\", date, freq=\"iv\")\n\nWe can query USGS stations of type \"stream\" in Arizona using the SensorThings API\nas follows:\n\n.. code-block:: python\n\n    sensor = gh.SensorThings()\n    odata = {\n        \"filter\": \"properties/monitoringLocationType eq 'Stream' and properties/stateFIPS eq 'US:04'\",\n    }\n    df = sensor.query_byodata(odata)\n\nIrrigation withdrawals data can be obtained as follows:\n\n.. code-block:: python\n\n    irr = gh.irrigation_withdrawals()\n\nWe can get the CAMELS dataset as a ``geopandas.GeoDataFrame`` that includes geometry and\nbasin-level attributes of 671 natural watersheds within CONUS, and their streamflow\nobservations between 1980 and 2014 as an ``xarray.Dataset``, like so:\n\n.. code-block:: python\n\n    attrs, qobs = gh.get_camels()\n\nThe ``WaterQuality`` class has a number of convenience methods to retrieve data from the\nweb service. Since there are many parameter combinations that can be\nused to retrieve data, a general method is also provided to retrieve data from\nany of the valid endpoints. You can use ``get_json`` to retrieve station info\nas a ``geopandas.GeoDataFrame`` or ``get_csv`` to retrieve station data as a\n``pandas.DataFrame``. You can construct a dictionary of the parameters and pass\nit to one of these functions. 
For more information on the parameters, please\nconsult the `Water Quality Data documentation \u003chttps://www.waterqualitydata.us/webservices_documentation\u003e`__.\nFor example, let's find all the stations within a bounding box that have Caffeine data:\n\n.. code-block:: python\n\n    from pygeohydro import WaterQuality\n\n    bbox = (-92.8, 44.2, -88.9, 46.0)\n    kwds = {\"characteristicName\": \"Caffeine\"}\n    wq = WaterQuality()\n    stations = wq.station_bybbox(bbox, kwds)\n\nOr the same criterion but within a 30-mile radius of a point:\n\n.. code-block:: python\n\n    stations = wq.station_bydistance(-92.8, 44.2, 30, kwds)\n\nThen, we can get the data for all these stations like this:\n\n.. code-block:: python\n\n    sids = stations.MonitoringLocationIdentifier.tolist()\n    caff = wq.data_bystation(sids, kwds)\n\n.. image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/water_quality.png\n    :target: https://github.com/hyriver/HyRiver-examples/blob/main/notebooks/water_quality.ipynb\n    :alt: Water Quality\n\nMoreover, we can get land use/land cover data using the ``nlcd_bygeom`` or ``nlcd_bycoords`` functions,\npercentages of land cover types using ``cover_statistics``, and overland roughness using\n``overland_roughness``. The ``nlcd_bycoords`` function returns a ``geopandas.GeoDataFrame``\nwith the NLCD layers as columns and input coordinates as the ``geometry`` column. Moreover,\nthe ``nlcd_bygeom`` function accepts either a single geometry or a ``geopandas.GeoDataFrame``\nas the input.\n\n.. code-block:: python\n\n    from pynhd import NLDI\n\n    basins = NLDI().get_basins([\"01031450\", \"01318500\", \"01031510\"])\n    lulc = gh.nlcd_bygeom(basins, 100, years={\"cover\": [2016, 2019]})\n    stats = gh.cover_statistics(lulc[\"01318500\"].cover_2016)\n    roughness = gh.overland_roughness(lulc[\"01318500\"].cover_2019)\n\n.. 
image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/lulc.png\n    :target: https://github.com/hyriver/HyRiver-examples/blob/main/notebooks/nlcd.ipynb\n    :alt: Land Use/Land Cover\n\nNext, let's use ``ssebopeta_bygeom`` to get actual ET data for a basin. Note that there's a\n``ssebopeta_bycoords`` function that returns an actual ET time series for a single coordinate.\n\n.. code-block:: python\n\n    geometry = NLDI().get_basins(\"01315500\").geometry[0]\n    eta = gh.ssebopeta_bygeom(geometry, dates=(\"2005-10-01\", \"2005-10-05\"))\n\n.. image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/eta.png\n    :target: https://github.com/hyriver/HyRiver-examples/blob/main/notebooks/ssebop.ipynb\n    :alt: Actual ET\n\nAdditionally, we can pull US dam data using the ``NID`` class. Let's get dams within a\nbounding box that have a maximum storage larger than 200 acre-feet.\n\n.. code-block:: python\n\n    from pygeohydro import NID\n\n    nid = NID()\n    dams = nid.get_bygeom((-65.77, 43.07, -69.31, 45.45), 4326)\n    dams = nid.inventory_byid(dams.id.to_list())\n    dams = dams[dams.maxStorage \u003e 200]\n\nWe can also get all dams within CONUS with a maximum storage larger than 2500 acre-feet:\n\n.. code-block:: python\n\n    conus_geom = gh.get_us_states(\"contiguous\")\n\n    dam_list = nid.get_byfilter([{\"maxStorage\": [\"[2500 +inf]\"]}])\n    dams = nid.inventory_byid(dam_list[0].id.to_list(), stage_nid=True)\n\n    conus_dams = dams[dams.stateKey.isin(conus_geom.STUSPS)].reset_index(drop=True)\n\n.. image:: https://raw.githubusercontent.com/hyriver/HyRiver-examples/main/notebooks/_static/dams.png\n    :target: https://github.com/hyriver/HyRiver-examples/blob/main/notebooks/nid.ipynb\n    :alt: Dams\n\n\nThe ``WBD`` class allows us to get Hydrologic Unit (HU) polygon boundaries. Let's\nget the two Hudson HUC4s:\n\n.. 
code-block:: python\n\n    from pygeohydro import WBD\n\n    wbd = WBD(\"huc4\")\n    hudson = wbd.byids(\"huc4\", [\"0202\", \"0203\"])\n\n\nThe ``NFHL`` class allows us to retrieve FEMA's National Flood Hazard Layer (NFHL) data.\nLet's get the cross-section data for a small region in Vermont:\n\n.. code-block:: python\n\n    from pygeohydro import NFHL\n\n    nfhl = NFHL(\"NFHL\", \"cross-sections\")\n    gdf_xs = nfhl.bygeom((-73.42, 43.28, -72.9, 43.52), geo_crs=4269)\n\n\nContributing\n------------\n\nContributions are very welcome. Please read the\n`CONTRIBUTING.rst \u003chttps://github.com/hyriver/pygeoogc/blob/main/CONTRIBUTING.rst\u003e`__\nfile for instructions.\n\nCredits\n-------\n\nThis package was created based on the `audreyr/cookiecutter-pypackage`__ project template.\n\n__ https://github.com/audreyr/cookiecutter-pypackage\n","funding_links":["https://github.com/sponsors/cheginit"],"categories":["Software"],"sub_categories":["Data Mining"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhyriver%2Fpygeohydro","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fhyriver%2Fpygeohydro","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhyriver%2Fpygeohydro/lists"}