Is your feature request related to a problem? Please describe.
Some datasets are not updated on a strict schedule (specifically "reanalysis-era5-single-levels-monthly-means"). I can find no way to determine what, if anything, has changed since the last time I checked, so I have to re-request the data regularly until it differs (and detecting that a change has actually occurred can be tricky and convoluted).
Describe the solution you'd like
It would be extremely useful if there were some programmatic way to check the "last modified" time of a dataset. Even a single modification time for the dataset as a whole would reduce the amount of bandwidth being consumed.
Ideally, it would also be possible to query the "last modification time" as a time series matched to the record dimension of the dataset, so that we can find out which time ranges are newer than what we currently have. That would also help with automatically going back and reprocessing revised analyses.
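To illustrate the client-side workflow this would enable, here is a minimal sketch. The `fetch_last_modified` callable is hypothetical and stands in for whatever API the catalogue might expose; the point is only that a cheap timestamp comparison would replace a full re-download.

```python
from datetime import datetime, timezone

def needs_refresh(last_checked, fetch_last_modified):
    """Return True when the dataset reports a modification time newer
    than our last successful download.

    `fetch_last_modified` is a stand-in for the requested feature: a
    call that asks the catalogue for the dataset's "last modified"
    timestamp and returns a timezone-aware datetime. No such call
    exists today; this only shows how a client would use one.
    """
    return fetch_last_modified() > last_checked

# Hypothetical usage: only issue the expensive retrieve() when the
# timestamp has moved forward.
#
# if needs_refresh(my_last_download_time, fetch_last_modified):
#     client.retrieve("reanalysis-era5-single-levels-monthly-means", ...)
```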
Describe alternatives you've considered
No response
Additional context
No response
Organisation
Atmospheric and Environmental Research, Inc.