Convert an Earth Engine table into an sf object

ee_as_sf(
  x,
  dsn,
  overwrite = TRUE,
  via = "getInfo",
  container = "rgee_backup",
  crs = NULL,
  maxFeatures = 5000,
  selectors = NULL,
  lazy = FALSE,
  public = FALSE,
  add_metadata = TRUE,
  timePrefix = TRUE,
  quiet = FALSE
)



x: Earth Engine table (ee$FeatureCollection) to be converted into an sf object.


dsn: Character. Output filename. If dsn is missing, a shapefile is created in the temporary directory (tempdir()).


overwrite: Logical. Delete data source dsn before attempting to write?


via: Character. Method to export the table. Three methods are implemented: "getInfo", "drive", and "gcs". See details.


container: Character. Name of the folder ('drive') or bucket ('gcs') into which the table is exported (ignored unless via is "drive" or "gcs").


crs: Integer or Character. Coordinate Reference System (CRS) for the EE table. If NULL, ee_as_sf takes the CRS of the first element.


maxFeatures: Numeric. Maximum number of features allowed for export (ignored unless via is "getInfo"). The task will fail if the exported region contains more features than specified in maxFeatures. Default is 5000.


selectors: Properties to include in the output, as a list/vector of strings or a comma-separated string. By default, all properties are included.


lazy: Logical. If TRUE, a future::sequential object is created to evaluate the task in the future. Ignored if via is "getInfo". See details.


public: Logical. If TRUE, a public link to the file is created. See details.


add_metadata: Add metadata to the sf object. See details.


timePrefix: Logical. Add the current date and time (Sys.time()) as a prefix to exported files. This helps prevent exported files from having the same name. Default is TRUE.


quiet: Logical. Suppress info messages.


An sf object.


ee_as_sf supports the download of ee$Geometry, ee$Feature, and ee$FeatureCollection via three different options: "getInfo" (which makes a REST call to retrieve the data), "drive" (which uses Google Drive), and "gcs" (which uses Google Cloud Storage). The advantage of "getInfo" is a direct and faster download; however, it is limited to 5000 features per request, so it is not recommended for large FeatureCollections. The "drive" and "gcs" options are suitable for large FeatureCollections because they use an intermediate container. When via is set to "drive" or "gcs", ee_as_sf performs the following steps:

  • 1. A task is started (i.e., ee$batch$Task$start()) to move the EE Table from Earth Engine to the file storage system (Google Drive or Google Cloud Storage) specified in the via argument.

  • 2. If the argument lazy is TRUE, the task will not be monitored. This is useful for launching several tasks simultaneously and calling them later using ee_utils_future_value or future::value. At the end of this step, the EE Table is stored under the path specified by the argument dsn.

  • 3. Finally, if the argument add_metadata is TRUE, a list with the following elements is added to the sf object.

    • if via is "drive":

      • ee_id: Earth Engine task name.

      • drive_name: Google Drive table name.

      • drive_id: Google Drive table ID.

      • drive_download_link: Link to download the table.

    • if via is "gcs":

      • ee_id: Earth Engine task name.

      • gcs_name: Google Cloud Storage table name.

      • gcs_bucket: Bucket name.

      • gcs_fileFormat: Table format.

      • gcs_public_link: Link to download the table.

      • gcs_URI: gs:// link to the table.

    Run attr(sf, "metadata") to get the list.

To get more information about exporting data from Earth Engine, take a look at the Google Earth Engine Guide - Export data.
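The lazy workflow described in the steps above can be sketched as follows. This is an illustrative sketch, not part of the package's own examples: the dataset ID and filter are assumptions, and running it requires prior Earth Engine and Google Drive authentication via ee_Initialize(drive = TRUE).

```r
library(rgee)

# Assumed example collection: FAO GAUL country boundaries (level 0).
nz <- ee$FeatureCollection("FAO/GAUL/2015/level0")$
  filter(ee$Filter$eq("ADM0_NAME", "New Zealand"))

# Steps 1-2: start the export task without monitoring it (lazy = TRUE).
task <- ee_as_sf(x = nz, via = "drive", lazy = TRUE)

# ... other work can run here while Earth Engine processes the export ...

# Collect the finished sf object; this blocks until the task completes.
nz_sf <- ee_utils_future_value(task)

# Step 3: inspect the metadata attached when add_metadata = TRUE.
attr(nz_sf, "metadata")
```

Because the task is detached from the R session, several lazy exports can be started in a loop and collected afterwards with ee_utils_future_value on each returned object.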


if (FALSE) {

ee_Initialize(drive = TRUE, gcs = TRUE)

# Region of interest
roi <- ee$Geometry$Polygon(list(
  c(-122.275, 37.891),
  c(-122.275, 37.868),
  c(-122.240, 37.868),
  c(-122.240, 37.891)
))

# TIGER: US Census Blocks Dataset
blocks <- ee$FeatureCollection("TIGER/2010/Blocks")
subset <- blocks$filterBounds(roi)
sf_subset <- ee_as_sf(x = subset)

# Create Random points in Earth Engine
region <- ee$Geometry$Rectangle(-119.224, 34.669, -99.536, 50.064)
ee_randomPoints <- ee$FeatureCollection$randomPoints(region, 100)

# Download via GetInfo
sf_randomPoints <- ee_as_sf(ee_randomPoints)

# Download via drive
sf_randomPoints_drive <- ee_as_sf(
  x = ee_randomPoints,
  via = 'drive'
)

# Download via GCS
sf_randomPoints_gcs <- ee_as_sf(
  x = subset,
  via = 'gcs',
  container = 'rgee_dev' # GCS bucket name
)
}
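A further hedged sketch, showing the selectors and crs arguments together. The property names passed to selectors are assumptions about the TIGER/2010/Blocks schema and should be checked against the dataset's catalog page; running this requires Earth Engine authentication via ee_Initialize().

```r
library(rgee)

# Small area of interest: a 1 km buffer around a point in Berkeley, CA.
pt <- ee$Geometry$Point(-122.26, 37.87)
roi <- pt$buffer(1000)

blocks <- ee$FeatureCollection("TIGER/2010/Blocks")

# Keep only two properties (names assumed) and request WGS 84 output.
sf_small <- ee_as_sf(
  x = blocks$filterBounds(roi),
  selectors = c("statefp10", "blockce"),
  crs = 4326
)
```

Restricting properties with selectors shrinks the payload of a "getInfo" request, which can keep a borderline download under the 5000-feature/response-size limits without switching to "drive" or "gcs".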