Convert an Earth Engine table into an sf object

Usage:

ee_as_sf(
  x,
  dsn,
  overwrite = TRUE,
  via = "getInfo",
  container = "rgee_backup",
  crs = NULL,
  maxFeatures = 5000,
  selectors = NULL,
  lazy = FALSE,
  public = TRUE,
  add_metadata = TRUE,
  timePrefix = TRUE,
  quiet = FALSE
)



Arguments:

x: Earth Engine table (ee$FeatureCollection) to be converted into an sf object.


dsn: Character. Output filename. In case dsn is missing, a shapefile is created in the tmp() directory.


overwrite: Logical. Delete data source dsn before attempting to write?


via: Character. Method to export the table. Three methods are implemented: "getInfo", "drive", and "gcs". See details.


container: Character. Name of the folder ('drive') or bucket ('gcs') to export into (ignored unless via is set to "drive" or "gcs").


crs: Integer or Character. Coordinate Reference System (CRS) for the EE table. If NULL, ee_as_sf takes the CRS of the first element.


maxFeatures: Numeric. The maximum allowed number of features to export (ignored if via is not set to "getInfo"). The task will fail if the exported region covers more features than specified in maxFeatures. Defaults to 5000.


selectors: The list of properties to include in the output, as a list/vector of strings or a comma-separated string. By default, all properties are included.


lazy: Logical. If TRUE, a future::sequential object is created to evaluate the task in the future. Ignored if via is set to "getInfo". See details.


public: Logical. If TRUE, a public link to the file is created. See details.


add_metadata: Logical. If TRUE, add metadata to the sf object. See details.


timePrefix: Logical. Add the current date and time (Sys.time()) as a prefix to exported filenames. This helps avoid exported files with the same name. Defaults to TRUE.


quiet: Logical. Suppress info messages.


Value:

An sf object.


Details:

ee_as_sf supports the download of ee$Geometry, ee$Feature, and ee$FeatureCollection through three options: "getInfo" (which makes a REST call to retrieve the data), "drive" (which uses Google Drive), and "gcs" (which uses Google Cloud Storage). The advantage of "getInfo" is a direct and faster download; however, it is limited to 5000 features per request, which makes it unsuitable for large FeatureCollections. The "drive" and "gcs" options are better suited to large FeatureCollections because they use an intermediate container. When via is set to "drive" or "gcs", ee_as_sf performs the following steps:

  • 1. A task is started (i.e., ee$batch$Task$start()) to move the EE Table from Earth Engine to the file storage system (Google Drive or Google Cloud Storage) specified in the argument via.

  • 2. If the argument lazy is TRUE, the task is not monitored. This is useful for launching several tasks simultaneously and retrieving their results later using ee_utils_future_value or future::value. At the end of this step, the EE Table is stored under the path specified by the argument dsn.

  • 3. Finally, if the argument add_metadata is TRUE, a list with the following elements is added to the sf object.

    • if via is "drive":

      • ee_id: Name of the Earth Engine task.

      • drive_name: Name of the Table in Google Drive.

      • drive_id: Id of the Table in Google Drive.

      • drive_download_link: Download link to the table.

    • if via is "gcs":

      • ee_id: Name of the Earth Engine task.

      • gcs_name: Name of the Table in Google Cloud Storage.

      • gcs_bucket: Name of the bucket.

      • gcs_fileFormat: Format of the table.

      • gcs_public_link: Download link to the table.

      • gcs_URI: gs:// link to the table.

    Run attr(sf, "metadata") to get the list.

To get more information about exporting data from Earth Engine, take a look at the Google Earth Engine Guide - Export data.
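The lazy workflow described in step 2 can be sketched as follows. This is an illustrative fragment, not part of the package examples: it assumes rgee has already been initialized with Drive access and that `subset` is an ee$FeatureCollection (as in the examples below).

```r
library(rgee)

# Launch a Drive export without blocking: the task is started but
# not monitored, so control returns immediately.
task_sf <- ee_as_sf(x = subset, via = "drive", lazy = TRUE)

# ... other tasks could be launched here ...

# Block until the export finishes and read the result as an sf object.
sf_result <- ee_utils_future_value(task_sf)

# Retrieve the export metadata attached in step 3.
attr(sf_result, "metadata")
```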


Examples:

if (FALSE) {
library(rgee)

ee_Initialize(drive = TRUE, gcs = TRUE)

# Region of interest
roi <- ee$Geometry$Polygon(list(
  c(-122.275, 37.891),
  c(-122.275, 37.868),
  c(-122.240, 37.868),
  c(-122.240, 37.891)
))

# TIGER: US Census Blocks Dataset
blocks <- ee$FeatureCollection("TIGER/2010/Blocks")
subset <- blocks$filterBounds(roi)
sf_subset <- ee_as_sf(x = subset)

# Create Random points in Earth Engine
region <- ee$Geometry$Rectangle(-119.224, 34.669, -99.536, 50.064)
ee_randomPoints <- ee$FeatureCollection$randomPoints(region, 100)

# Download via GetInfo
sf_randomPoints <- ee_as_sf(ee_randomPoints)

# Download via drive
sf_randomPoints_drive <- ee_as_sf(
  x = ee_randomPoints,
  via = 'drive'
)

# Download via GCS
sf_randomPoints_gcs <- ee_as_sf(
  x = subset,
  via = 'gcs',
  container = 'rgee_dev' # GCS bucket name
)
}