Potentially use GDAL to generate a multilayer cloud-optimized GeoTIFF (COG) per year (UTC), with one layer per overpass over the prediction region.
Overpass time is tile dependent (with ~5 min variation across adjacent tiles), so we may need to pick a reference tile.
It is unclear what other metadata should be passed through.
It is sometimes helpful to have the original (raw) data for contrast with the predicted values (pred). Is this mainly useful for our evaluation papers, or should we also export both values for others, perhaps as separate (optional) files?
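The multilayer COG idea above could be sketched in R with the terra package, which writes through GDAL's COG driver. This is a minimal sketch, not the project's actual export code: the file pattern `pred_2020-*.tif` and the output name are hypothetical placeholders for per-overpass prediction rasters.

```r
# Hypothetical sketch, assuming per-overpass prediction GeoTIFFs for one year
library(terra)

# stack one layer per overpass (file pattern is a placeholder)
r <- rast(list.files(pattern = "^pred_2020-.*\\.tif$"))

# write a multilayer cloud-optimized GeoTIFF via GDAL's COG driver
writeRaster(r, "pred_2020_cog.tif", filetype = "COG",
            gdal = c("COMPRESS=DEFLATE"))
```

Layer names (e.g. overpass timestamps in UTC) could be set on `r` with `names(r) <- ...` before writing, which would carry at least that piece of metadata into the output.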
Perhaps we could consider parquet as an efficient target store for the prediction output (as big tabular data), still within the targets workflow and more language-agnostic than fst; targets appears to leverage arrow::read_parquet() and arrow::write_parquet() for this format.
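A minimal sketch of what this might look like in a targets pipeline, assuming the arrow package is installed; `build_predictions()` is a hypothetical stand-in for whatever function produces the prediction data frame:

```r
# Hypothetical _targets.R fragment; build_predictions() is a placeholder
library(targets)

list(
  # format = "parquet" stores the target via arrow::write_parquet()
  # and reads it back with arrow::read_parquet()
  tar_target(predictions, build_predictions(), format = "parquet")
)
```

Because parquet has first-class readers in Python, Julia, and most big-data tooling, downstream users would not need R (or fst) to consume the prediction output.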