R/ee_download.R
ee_table_to_gcs.Rd
Creates a task to export a FeatureCollection to Google Cloud Storage. This function is a wrapper around ee$batch$Export$table$toCloudStorage(...).
ee_table_to_gcs(
  collection,
  description = "myExportTableTask",
  bucket = NULL,
  fileNamePrefix = NULL,
  timePrefix = TRUE,
  fileFormat = NULL,
  selectors = NULL
)
collection: The feature collection to be exported.
description: User-friendly name of the task.
bucket: The name of a Cloud Storage bucket for the export.
fileNamePrefix: Cloud Storage object name prefix for the export. Defaults to the name of the task.
timePrefix: Whether to prefix the current date and time to the exported file names.
fileFormat: The output format: "CSV" (default), "GeoJSON", "KML", "KMZ", "SHP", or "TFRecord".
selectors: The properties to include in the output, as a list of strings or a comma-separated string. By default, all properties are included.
An unstarted Task that exports the table to Google Cloud Storage.
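For instance, a minimal sketch of a CSV export that keeps a single property and then starts the task; the bucket name "my-bucket" and the collection fc are placeholders, and an rgee session initialized with ee_Initialize(gcs = TRUE) is assumed:

# Assumes ee_Initialize(gcs = TRUE) has already run, "my-bucket" is a writable
# Cloud Storage bucket (placeholder name), and `fc` is an ee$FeatureCollection.
task <- ee_table_to_gcs(
  collection = fc,
  description = "export_fc_csv",
  bucket = "my-bucket",          # placeholder bucket name
  fileNamePrefix = "fc_export",
  fileFormat = "CSV",
  selectors = c("name")          # keep only the "name" property
)
task$start()        # the task is returned unstarted; start it explicitly
ee_monitoring(task) # optionally poll until the export finishes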
Other vector export task creators: ee_table_to_asset(), ee_table_to_drive()
if (FALSE) { # \dontrun{
library(rgee)
library(stars)
library(sf)

ee_users()
ee_Initialize(gcs = TRUE)

# Define study area (local -> earth engine)
# Communal Reserve Amarakaeri - Peru
rlist <- list(xmin = -71.13, xmax = -70.95, ymin = -12.89, ymax = -12.73)
ROI <- c(rlist$xmin, rlist$ymin,
         rlist$xmax, rlist$ymin,
         rlist$xmax, rlist$ymax,
         rlist$xmin, rlist$ymax,
         rlist$xmin, rlist$ymin)
ee_ROI <- matrix(ROI, ncol = 2, byrow = TRUE) %>%
  list() %>%
  st_polygon() %>%
  st_sfc() %>%
  st_set_crs(4326) %>%
  sf_as_ee()

# Wrap the geometry in a FeatureCollection with a "name" property
amk_fc <- ee$FeatureCollection(
  list(ee$Feature(ee_ROI, list(name = "Amarakaeri")))
)

# Create the export task (shapefile written to the "rgee_dev" bucket)
task_vector <- ee_table_to_gcs(
  collection = amk_fc,
  bucket = "rgee_dev",
  fileFormat = "SHP",
  fileNamePrefix = "geom_Amarakaeri"
)
task_vector$start()
ee_monitoring(task_vector) # optional

# Download the exported files from GCS and read the shapefile
amk_geom <- ee_gcs_to_local(task = task_vector)
plot(sf::read_sf(amk_geom[3]), border = "red", lwd = 10)
} # }