NOTICE: This version of the NSF Unidata web site (archive.unidata.ucar.edu) is no longer being updated.
Current content can be found at unidata.ucar.edu.
To learn about what's going on, see About the Archive Site.
Hello again,

I may have a source for the data you're looking for. We maintain an AWS S3 bucket for a NEXRAD Level 3 archive, which goes back to sometime in 2020. You can view the bucket details here: https://registry.opendata.aws/noaa-nexrad/

The catch is that the Browse Bucket link currently doesn't work, so it's not easy to browse the bucket from your browser (define irony). However, with a bit of Python code we've been able to verify that there is data here which is NOT available from NCEI, including the DTA product from LTX during the times you're looking for. I'll paste some sample code below, and let us know if you need further assistance working with the S3 bucket. I will add that fixing that Browse Bucket link is on my to-do list, though I do not have an ETA on that, as I'm at AMS all next week.

Here is the Python code example:

#- START -#
from datetime import datetime, timedelta
from functools import cached_property
from pathlib import Path
import shutil

import boto3
import botocore
from botocore.client import Config

# Anonymous (unsigned) access -- no AWS credentials are needed for this public bucket
s3 = boto3.resource('s3', config=Config(signature_version=botocore.UNSIGNED,
                                        user_agent_extra='Resource'))


class Product:
    """Wraps a single Level 3 product object stored in the bucket."""

    def __init__(self, obj):
        self.name = obj.key
        self._obj = obj

    @cached_property
    def file(self):
        return self._obj.get()['Body']

    def download(self, path=None):
        # Default to the object key as the filename in the current directory;
        # if given a directory, save under the key name inside it.
        if path is None:
            path = Path() / self.name
        elif (path := Path(path)).is_dir():
            path = path / self.name
        else:
            path = Path(path)
        with open(path, 'wb') as outfile:
            shutil.copyfileobj(self.file, outfile)


def day_iterator(start, end):
    """Yield one datetime per day from start up to (not including) end."""
    while start < end:
        yield start
        start = start + timedelta(days=1)


def build_prefix(site, prod_id, dt):
    """Build the key prefix for one site/product/day, e.g. 'LTX_DTA_2022_09_29_'."""
    return f'{site}_{prod_id}_{dt:%Y_%m_%d}_'


def dt_from_key(key):
    """Parse the timestamp from a key like 'LTX_DTA_2022_09_29_10_15_30'."""
    return datetime.strptime(key.split('_', maxsplit=2)[-1], '%Y_%m_%d_%H_%M_%S')


def get_range(site, prod_id, start, end):
    """Yield Products for a site/product whose timestamps fall in [start, end)."""
    bucket = s3.Bucket('unidata-nexrad-level3')
    for dt in day_iterator(start, end):
        for obj in bucket.objects.filter(Prefix=build_prefix(site, prod_id, dt)):
            if start <= dt_from_key(obj.key) < end:
                yield Product(obj)


start = datetime(2022, 9, 29, 10)
end = datetime(2022, 9, 29, 15)
for prod in get_range('LTX', 'DTA', start, end):
    print(prod.name)
#- END -#

I hope this helps you find what you're looking for, and again, let us know if you need any other assistance.

Best,

-Mike

> Yes, I see the same issue with the Google data, which can also be accessed
> directly through the NOAA Weather & Climate Toolkit. The gaps NCEI shows
> are present nationwide, so I believe it's logistical, and I hope the majority of
> the seemingly missing data can be recovered by NCEI, which they are looking
> into.
>
> Thank you, Mike, for taking the time to look into other options! Greatly
> appreciated!

Ticket Details
===================
Ticket ID: EBO-191519
Department: Support THREDDS
Priority: High
Status: Closed
===================

NOTE: All email exchanges with Unidata User Support are recorded in the Unidata inquiry tracking system and then made publicly available through the web. If you do not want to have your interactions made available in this way, you must let us know in each email you send to us.