NDBC API
A Python API for retrieving meteorological and oceanographic data from the National Data Buoy Center.
https://github.com/cdjellen/ndbc-api
Category: Hydrosphere
Sub Category: Ocean and Hydrology Data Access
Keywords
api climate noaa ocean python
Last synced: about 2 hours ago
Repository metadata
A Python API for retrieving meteorological and oceanographic data from the National Data Buoy Center (NDBC).
- Host: GitHub
- URL: https://github.com/cdjellen/ndbc-api
- Owner: CDJellen
- License: mit
- Created: 2022-05-25T16:05:23.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2025-06-09T15:50:40.000Z (6 days ago)
- Last Synced: 2025-06-09T16:04:01.677Z (6 days ago)
- Topics: api, climate, noaa, ocean, python
- Language: Jupyter Notebook
- Homepage:
- Size: 25.3 MB
- Stars: 15
- Watchers: 1
- Forks: 4
- Open Issues: 0
- Releases: 11
- Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
README.md
The National Oceanic and Atmospheric Administration's National Data Buoy Center maintains marine monitoring and observation stations around the world^1. These stations report atmospheric, oceanographic, and other meteorological data to the NDBC at regular intervals. Measurements are made available over HTTP through the NDBC's data service.
The ndbc-api is a Python library that makes this data more widely accessible.
The ndbc-api is primarily built to parse whitespace-delimited oceanographic and atmospheric data distributed as text files for available time ranges, on a station-by-station basis^2. Measurements are typically distributed as UTF-8 encoded, fixed-period text files for each station. More information on the measurements and methodology is available on the NDBC website^3.
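For context only, the snippet below is a minimal sketch of what parsing such a whitespace-delimited text file looks like with pandas; the sample rows and column names are hypothetical placeholders rather than real NDBC output, and the ndbc-api performs this parsing (and more) internally.
import io
import pandas as pd
# hypothetical two-line header (names, then units) followed by placeholder rows,
# loosely mimicking the fixed-period text files described above
sample = io.StringIO(
    "#YY MM DD hh mm WDIR WSPD\n"
    "#yr mo dy hr mn degT m/s\n"
    "2022 09 15 00 00 180 5.0\n"
    "2022 09 15 00 10 175 4.5\n"
)
# skip the units line and split on whitespace
df = pd.read_csv(sample, sep=r"\s+", skiprows=[1])
print(df.head())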
Please see the included example notebook for a more detailed walkthrough of the API's capabilities.
Installation
The ndbc-api can be installed via pip:
pip install ndbc-api
Conda users can install the ndbc-api via the conda-forge channel:
conda install -c conda-forge ndbc-api
Finally, to install the ndbc-api from source, clone the repository and run the following command:
python setup.py install
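As a quick, optional sanity check (not part of the documented installation steps), you can confirm that the package imports and report the installed version:
# confirm the installation; importlib.metadata reads the version of the
# installed ndbc-api distribution
from importlib.metadata import version
import ndbc_api
print(version("ndbc-api"))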
Requirements
The ndbc-api has been tested on Python 3.6, 3.7, 3.8, 3.9, and 3.10. Python 2 support is not currently planned, but could be implemented based on the needs of the atmospheric research community.
The API uses synchronous HTTP requests to compile data matching the user-supplied parameters. The ndbc-api package depends on:
- requests>=2.10.0
- pandas
- bs4
- html5lib>=1.1
- xarray
- scipy
Development
If you would like to contribute to the growth and maintenance of the ndbc-api, please feel free to open a PR with tests covering your changes. The tests leverage pytest and depend on the above requirements, as well as:
- coveralls
- httpretty
- pytest
- pytest-cov
- pyyaml
- pyarrow
Breaking changes will be considered, especially in the current alpha state of the package on PyPI. As the API further matures, breaking changes will only be considered with new major versions (e.g. N.0.0).
Example
The ndbc-api exposes public methods through the NdbcApi class.
from ndbc_api import NdbcApi
api = NdbcApi()
The NdbcApi provides a unified access point for NDBC data. All methods for obtaining data, metadata, and locating stations are available using the api object. The get_data method is the primary method for accessing NDBC data and is used to retrieve measurements from a given station over a specified time range. This method can request data from the NDBC HTTP Data Service or the THREDDS data service, and can return the data as a pandas.DataFrame, xarray.Dataset, or Python dict object.
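Putting this together, a minimal end-to-end sketch using only the methods documented below might look like the following (the coordinates and time range are arbitrary examples):
from ndbc_api import NdbcApi
api = NdbcApi()
# find the closest station to a point of interest
station_id = api.nearest_station(lat='38.88N', lon='76.43W')
# retrieve standard meteorological measurements for that station as a DataFrame
df = api.get_data(
    station_id=station_id,
    mode='stdmet',
    start_time='2022-01-01',
    end_time='2022-02-01',
)
print(df.head())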
Data made available by the NDBC falls into two broad categories.
- Station metadata
- Station measurements
The api supports a range of public methods for accessing data from the above categories.
Station metadata
The api has six key public methods for accessing NDBC metadata.
- The stations method, which returns all NDBC stations.
- The nearest_station method, which returns the station ID of the nearest station.
- The radial_search method, which returns the station IDs of all stations within a given radius of a specified location.
- The station method, which returns station metadata for a given station ID.
- The available_realtime method, which returns hyperlinks and measurement names for realtime measurements captured by a given station.
- The available_historical method, which returns hyperlinks and measurement names for historical measurements captured by a given station.
stations
# get all stations and some metadata as a Pandas DataFrame
stations_df = api.stations()
# parse the response as a dictionary
stations_dict = api.stations(as_df=False)
nearest_station
# specify desired latitude and longitude
lat = '38.88N'
lon = '76.43W'
# find the station ID of the nearest NDBC station
nearest = api.nearest_station(lat=lat, lon=lon)
print(nearest)
'tplm2'
radial_search
# specify desired latitude, longitude, radius, and units
lat = '38.88N'
lon = '76.43W'
radius = 100
units = 'km'
# find the station IDs of all NDBC stations within the radius
nearby_stations_df = api.radial_search(lat=lat, lon=lon, radius=radius, units=units)
station
# get station metadata
tplm2_meta = api.station(station_id='tplm2')
# parse the response as a Pandas DataFrame
tplm2_df = api.station(station_id='tplm2', as_df=True)
available_realtime
# get all available realtime measurements, periods, and hyperlinks
tplm2_realtime = api.available_realtime(station_id='tplm2')
# parse the response as a Pandas DataFrame
tplm2_realtime_df = api.available_realtime(station_id='tplm2', as_df=True)
available_historical
# get all available historical measurements, periods, and hyperlinks
tplm2_historical = api.available_historical(station_id='tplm2')
# parse the response as a Pandas DataFrame
tplm2_historical_df = api.available_historical(station_id='tplm2', as_df=True)
Station measurements
The api has two public methods for accessing supported NDBC station measurements.
- The get_modes method, which returns a list of supported modes, corresponding to the data formats provided by the NDBC data service. For example, the adcp mode represents "Acoustic Doppler Current Profiler" measurements, providing information about ocean currents at different depths, while cwind represents "Continuous winds" data, offering high-frequency wind speed and direction measurements.
Note that not all stations provide the same set of measurements. The available_realtime and available_historical methods can be called on a station-by-station basis to ensure a station has the desired data available before building and executing requests with get_data; see the sketch after this list.
- The get_data method, which returns measurements of a given type for a given station.
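For example, a short sketch of that availability check might look like the following; the way the available_realtime response is inspected here is an assumption made for illustration, so adapt it to the structure you actually receive:
from ndbc_api import NdbcApi
api = NdbcApi()
station_id = 'tplm2'
# list the realtime measurements reported by the station
available = api.available_realtime(station_id=station_id, as_df=True)
# rough heuristic: only request continuous winds if the station appears to offer them
if 'cwind' in available.to_string().lower():
    cwind_df = api.get_data(
        station_id=station_id,
        mode='cwind',
        start_time='2022-01-01',
        end_time='2022-02-01',
    )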
get_modes
# get the list of supported meteorological measurement modes
modes = api.get_modes()
print(modes)
[
'adcp',
'cwind',
'ocean',
'spec',
'stdmet',
'supl',
'swden',
'swdir',
'swdir2',
'swr1',
'swr2'
]
The mode values above map directly to the identifiers used by the NDBC. Descriptions for each mode are presented below:
- adcp: Acoustic Doppler Current Profiler measurements, providing information about ocean currents at different depths.
- cwind: Continuous winds data, offering high-frequency wind speed and direction measurements.
- ocean: Oceanographic data, including water temperature, salinity, and wave measurements.
- spec: Spectral wave data, providing detailed information about wave energy and direction.
- stdmet: Standard meteorological data, including air temperature, pressure, wind speed, and visibility.
- supl: Supplemental measurements, which can vary depending on the specific buoy and its sensors.
- swden: Spectral wave density data, providing information about the distribution of wave energy across different frequencies.
- swdir: Spectral wave direction data, indicating the primary direction of wave energy.
- swdir2: Secondary spectral wave direction data, capturing additional wave direction information.
- swr1: First-order spectral wave data, providing basic wave height and period information.
- swr2: Second-order spectral wave data, offering more detailed wave measurements.
get_data
# get all continuous wind (`cwind`) measurements for station tplm2
cwind_df = api.get_data(
station_id='tplm2',
mode='cwind',
start_time='2020-01-01',
end_time='2022-09-15',
)
# return data as a dictionary
cwind_dict = api.get_data(
station_id='tplm2',
mode='cwind',
start_time='2020-01-01',
end_time='2022-09-15',
as_df=False
)
# get only the wind speed measurements
wspd_df = api.get_data(
station_id='tplm2',
mode='cwind',
start_time='2020-01-01',
end_time='2022-09-15',
as_df=True,
cols=['WSPD']
)
# get all standard meteorological (`stdmet`) measurements for stations tplm2 and apam2
stdmet_df = api.get_data(
station_ids=['tplm2', 'apam2'],
mode='stdmet',
start_time='2022-01-01',
end_time='2023-01-01',
)
# get all (available) continuous wind and standard meteorological measurements for stations tplm2 and apam2
# for station apam2, this data is unavailable and will log an error, but the rest of the results are unaffected
stdmet_df = api.get_data(
station_ids=['tplm2', 'apam2'],
modes=['stdmet', 'cwind'],
start_time='2022-01-01',
end_time='2023-01-01',
)
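The returned pandas.DataFrame can then be post-processed with standard pandas tooling. The sketch below is not part of the ndbc-api itself; it assumes the wind-speed frame from the cols=['WSPD'] example above is indexed by timestamp:
import pandas as pd
# compute a monthly mean wind speed from the `wspd_df` frame requested above;
# coerce the index to datetimes in case it is not already parsed as such
wspd_df.index = pd.to_datetime(wspd_df.index)
monthly_wspd = wspd_df['WSPD'].resample('MS').mean()
print(monthly_wspd.head())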
More Information
Please see the included example notebook for a more detailed walkthrough of the API's capabilities.
Questions
If you have questions regarding the library, please post them in the GitHub discussion forum.
Owner metadata
- Name: Chris Jellen
- Login: CDJellen
- Email:
- Kind: user
- Description: Cloud software engineering at Microsoft. Cross-team data solutions focused on observability, reliability, strategic planning, and automation.
- Website: cdjellen.com
- Location: Seattle, WA
- Twitter:
- Company: @Microsoft
- Icon url: https://avatars.githubusercontent.com/u/64814231?u=c278c4c6fa8a93417c54df64b65d75f2372780c9&v=4
- Repositories: 5
- Last synced at: 2023-05-05T07:45:47.738Z
- Profile URL: https://github.com/CDJellen
GitHub Events
Total
- Create event: 15
- Issues event: 9
- Release event: 8
- Watch event: 4
- Delete event: 5
- Issue comment event: 11
- Push event: 37
- Pull request review comment event: 2
- Pull request review event: 4
- Pull request event: 23
- Fork event: 3
Last Year
- Create event: 15
- Issues event: 9
- Release event: 8
- Watch event: 4
- Delete event: 5
- Issue comment event: 11
- Push event: 37
- Pull request review comment event: 2
- Pull request review event: 4
- Pull request event: 23
- Fork event: 3
Committers metadata
Last synced: 3 days ago
Total Commits: 190
Total Committers: 5
Avg Commits per committer: 38.0
Development Distribution Score (DDS): 0.026
Commits in past year: 73
Committers in past year: 4
Avg Commits per committer in past year: 18.25
Development Distribution Score (DDS) in past year: 0.055
Name | Email | Commits
---|---|---
cdjellen | c****n@g****m | 185
Rachel Wegener | 3****2 | 2
abdu558 | 6****8 | 1
Weidav | 6****v | 1
Austin Raney | a****y@p****m | 1
Committer domains:
Issue and Pull Request metadata
Last synced: 1 day ago
Total issues: 12
Total pull requests: 50
Average time to close issues: 6 days
Average time to close pull requests: about 4 hours
Total issue authors: 9
Total pull request authors: 5
Average comments per issue: 3.67
Average comments per pull request: 0.1
Merged pull request: 49
Bot issues: 0
Bot pull requests: 0
Past year issues: 9
Past year pull requests: 21
Past year average time to close issues: 4 days
Past year average time to close pull requests: about 2 hours
Past year issue authors: 6
Past year pull request authors: 4
Past year average comments per issue: 3.0
Past year average comments per pull request: 0.1
Past year merged pull request: 20
Past year bot issues: 0
Past year bot pull requests: 0
Top Issue Authors
- ks905383 (4)
- Weidav (1)
- anthony-meza (1)
- supermanzer (1)
- abdu558 (1)
- Thomasdht (1)
- rwegener2 (1)
- thex1le (1)
- dominicdill (1)
Top Pull Request Authors
- CDJellen (46)
- Weidav (1)
- aaraney (1)
- abdu558 (1)
- rwegener2 (1)
Top Issue Labels
- bug (6)
- enhancement (6)
- documentation (1)
Top Pull Request Labels
- enhancement (22)
- documentation (18)
- bug (6)
- help wanted (1)
Package metadata
- Total packages: 1
- Total downloads:
  - pypi: 532 last-month
- Total dependent packages: 0
- Total dependent repositories: 1
- Total versions: 23
- Total maintainers: 1
pypi.org: ndbc-api
A Python API for the National Data Buoy Center.
- Homepage:
- Documentation: https://ndbc-api.readthedocs.io/
- Licenses: MIT
- Latest release: 0.0.2 (published almost 3 years ago)
- Last Synced: 2025-06-15T01:31:21.546Z (1 day ago)
- Versions: 23
- Dependent Packages: 0
- Dependent Repositories: 1
- Downloads: 532 Last month
- Rankings:
- Dependent packages count: 10.119%
- Dependent repos count: 21.545%
- Average: 23.861%
- Stargazers count: 25.064%
- Forks count: 29.791%
- Downloads: 32.786%
- Maintainers (1)
Dependencies
- bs4 *
- html5lib >=1.1
- pandas >=1.3.5
- requests >=2.10.0
- pyarrow * development
- pytest * development
- pyyaml * development
- bs4 *
- html5lib *
- pandas *
- requests *
- actions/checkout v2 composite
- actions/setup-python v2 composite
- actions/checkout v2 composite
- actions/setup-python v2 composite
- beautifulsoup4 4.12.3
- certifi 2024.7.4
- charset-normalizer 3.3.2
- colorama 0.4.6
- coverage 7.5.4
- coveralls 4.0.1
- docopt 0.6.2
- exceptiongroup 1.2.1
- html5lib 1.1
- httpretty 1.1.4
- idna 3.7
- importlib-metadata 8.0.0
- iniconfig 2.0.0
- numpy 2.0.0
- numpy 1.24.4
- packaging 24.1
- pandas 2.0.3
- platformdirs 4.2.2
- pluggy 1.5.0
- pyarrow 16.1.0
- pytest 8.2.2
- pytest-cov 5.0.0
- python-dateutil 2.9.0.post0
- pytz 2024.1
- pyyaml 6.0.1
- requests 2.32.3
- setuptools 70.2.0
- six 1.16.0
- soupsieve 2.5
- tomli 2.0.1
- tzdata 2024.1
- urllib3 2.2.2
- webencodings 0.5.1
- yapf 0.40.2
- zipp 3.19.2
- coveralls >=4.0.1 develop
- httpretty >=0.9.1 develop
- pyarrow ~16 develop
- pytest >=7.1.2 develop
- pytest-cov >=3.0.0 develop
- pyyaml >=6.0 develop
- setuptools >=61.0 develop
- yapf >=0.30 develop
- beautifulsoup4 ~4
- html5lib ^1.1
- pandas >=1.5.3
- python >=3.8,<3.13
- requests >=2.10.0
Score: 10.597883952496506