Chris Toney ctoney


https://github.com/openlandmap/GEDTM30?tab=readme-ov-file

gdalinfo /vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif

Driver: GTiff/GeoTIFF
Files: /vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif
Size is 1440010, 600010
Coordinate System is:
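
A minimal sketch of the same check done from R with gdalraster (the dataset name is copied from the gdalinfo call above):

library(gdalraster)
dsn <- "/vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif"
ds <- new(GDALRaster, dsn, read_only = TRUE)
ds$dim()    ## xsize, ysize, nbands (xsize/ysize match the 1440010, 600010 above)
ds$bbox()   ## extent in EPSG:4326
ds$close()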
## cell-centre x coordinate for 1-based column indices, given grid dimension
## c(ncol, nrow) and bbox c(xmin, ymin, xmax, ymax)
x_from_col <- function(dimension, bbox, col) {
  col[col < 1] <- NA
  col[col > dimension[1L]] <- NA
  xres <- diff(bbox[c(1, 3)]) / dimension[1]
  bbox[1] - xres/2 + col * xres
}
## cell-centre y coordinate for 1-based row indices counted down from the top
## (body completed here by symmetry with x_from_col)
y_from_row <- function(dimension, bbox, row) {
  row[row < 1] <- NA
  row[row > dimension[2]] <- NA
  yres <- diff(bbox[c(2, 4)]) / dimension[2]
  bbox[4] + yres/2 - row * yres
}
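
For a quick sanity check, a usage sketch on a hypothetical 10 x 5 grid spanning x 0..10 and y 0..5:

x_from_col(c(10, 5), c(0, 0, 10, 5), 1:3)  ## 0.5 1.5 2.5, cell centres of the first three columns
y_from_row(c(10, 5), c(0, 0, 10, 5), 1)    ## 4.5, centre of the top row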

Extract REMA (Reference Elevation Model of Antarctica) values at points

library(xml2)
library(gdalraster)
## overview VRT of the REMA 2 m DEM mosaic
dsn <- "/vsicurl/https://raw.githubusercontent.com/mdsumner/rema-ovr/main/REMA-2m_dem_ovr.vrt"
## read the VRT XML itself (drop the /vsicurl/ prefix to get the plain URL)
url <- gsub("/vsicurl/", "", dsn)
xml <- read_xml(url)
dst <- xml |> xml_find_all(".//DstRect")
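
In a GDAL VRT, each DstRect element carries xOff/yOff/xSize/ySize attributes recording where that source tile lands in the mosaic grid. A sketch of pulling those into a data frame (the object and column names here are just illustrative):

off <- data.frame(
  xoff  = as.numeric(xml_attr(dst, "xOff")),
  yoff  = as.numeric(xml_attr(dst, "yOff")),
  xsize = as.numeric(xml_attr(dst, "xSize")),
  ysize = as.numeric(xml_attr(dst, "ySize"))
)
head(off)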
## https://developmentseed.org/obstore/latest/examples/fastapi/

# Example large Parquet file hosted in AWS open data
#store = S3Store("ookla-open-data", region="us-west-2", skip_signature=True)
#path = "parquet/performance/type=fixed/year=2024/quarter=1/2024-01-01_performance_fixed_tiles.parquet"


Sys.setenv("AWS_REGION" = "us-west-2")
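
A hedged sketch of reading the same object from R through GDAL's /vsis3/ handler, assuming a GDAL build with the Parquet driver, unsigned access to the bucket, and gdalraster's ogrinfo() wrapper (needs a recent GDAL):

library(gdalraster)
Sys.setenv("AWS_NO_SIGN_REQUEST" = "YES")  ## GDAL analogue of skip_signature=True
dsn <- "/vsis3/ookla-open-data/parquet/performance/type=fixed/year=2024/quarter=1/2024-01-01_performance_fixed_tiles.parquet"
ogrinfo(dsn)  ## layer summary, streamed rather than downloading the whole file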

This site hosts a lot of Zarr datasets that are ARCO (analysis-ready, cloud-optimized).

https://catalog.leap.columbia.edu/

There is example code showing how to open these in xarray. We can do the same in R by loading the GDAL API via the Python osgeo.gdal package.

The new version of reticulate makes this easy to do.

Following on from https://bsky.app/profile/jerry-shannon.bsky.social/post/3lcisatdppk2h:

gdalinfo /vsigs/gcp-public-data-arco-era5/ar/full_37-1h-0p25deg-chunk-1.zarr-v3
ERROR 15: No valid GCS credentials found. For authenticated requests, you need to set GS_SECRET_ACCESS_KEY, GS_ACCESS_KEY_ID, 
GS_OAUTH2_REFRESH_TOKEN, GOOGLE_APPLICATION_CREDENTIALS, or other configuration options, or create a /home/ubuntu/.boto file. Consult
https://gdal.org/en/stable/user/virtual_file_systems.html#vsigs-google-cloud-storage-files for more details. For unauthenticated requests
on public resources, set the GS_NO_SIGN_REQUEST configuration option to YES.
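
Per that hint, a sketch of the unauthenticated open from R through reticulate and the Python osgeo.gdal bindings (assumes a Python GDAL install with the Zarr driver):

library(reticulate)
gdal <- import("osgeo.gdal")
gdal$UseExceptions()
gdal$SetConfigOption("GS_NO_SIGN_REQUEST", "YES")
ds <- gdal$Open("/vsigs/gcp-public-data-arco-era5/ar/full_37-1h-0p25deg-chunk-1.zarr-v3")
cat(gdal$Info(ds))  ## same report gdalinfo would print, including any subdatasets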

List files and stream a (non-spatial) table from within a tarball on CRAN.

(We need dev gdalraster for the full recursive /vsitar/ directory-listing capability, but reading from remote files or archives is available in many GDAL versions and in the GDAL-backed packages already on CRAN.)

cransrc <- "https://cran.r-project.org/src/contrib" 
library(gdalraster)  ## for listing dirs recursively we need gh:USDAForestService/gdalraster for now
#> GDAL 3.10.0dev-449d5f09b7, released 2024/08/26, GEOS 3.12.2, PROJ 9.4.1

## list all R package .tar.gz source archives
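
A sketch of that listing step with gdalraster's VSI helpers, assuming the CRAN web server exposes a directory index that /vsicurl/ can parse:

files <- vsi_read_dir(paste0("/vsicurl/", cransrc))
tarballs <- grep("\\.tar\\.gz$", files, value = TRUE)
head(tarballs)

## chain /vsitar/ over /vsicurl/ to list the contents of one source tarball
## without downloading the whole archive
one <- paste0("/vsitar//vsicurl/", cransrc, "/", tarballs[1])
vsi_read_dir(one)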
JakobMiksch / README.md
PostGIS simple Docker Compose file

Run a basic PostGIS database with Docker:

docker compose up

or directly, for example with the postgis/postgis image and default credentials:

docker run -d \
  --name postgis \
  -e POSTGRES_PASSWORD=postgres \
  -p 5432:5432 \
  postgis/postgis
library(gdalraster)
library(ximage)  ## remotes::install_github("hypertidy/ximage")
## USGS seamless 1 arc-second DEM, published as a single national VRT on S3
dsn <- "/vsicurl/https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/1/TIFF/USGS_Seamless_DEM_1.vrt"

## we want a grid over a local area (extent in metres of the target projection)
ex <- c(-2, 2, -1, 1) * 85000
## -te sets the target extent; -ts 1024 0 requests 1024 columns with rows chosen to preserve aspect
args <- c("-te", ex[1], ex[3], ex[2], ex[4], "-ts", 1024, 0)
crs <- "+proj=laea +lon_0=-112.6 +lat_0=36.3"
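
A hedged continuation: warp the local area to a temporary GeoTIFF with gdalraster::warp(), then read it back and plot with ximage (the matrix fill assumes the usual row-major scanline order):

tf <- tempfile(fileext = ".tif")
warp(dsn, tf, t_srs = crs, cl_arg = args)
ds <- new(GDALRaster, tf)
m <- matrix(read_ds(ds), nrow = ds$dim()[2], byrow = TRUE)
ximage(m, extent = ex)
ds$close()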