Computing with large rasters in R: tiling, parallelization, optimization

Processing large spatial data in a programming environment such as R is not trivial: even on powerful computing infrastructure, it takes careful programming to process large spatial data efficiently. Two reasons why R has traditionally not been recommended for large data are: (a) R does not handle large objects well in working memory, and (b) for many existing functions, parallelization is not applied automatically but has to be added through extra programming. This tutorial demonstrates how to use R to read, process and create large spatial (raster) data sets. To obtain the data sets and R code used in the tutorial, please refer to the GitHub repository.
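The two techniques in the title, tiling and parallelization, can be illustrated with a minimal sketch using only base R and the bundled `parallel` package: split a large grid into tiles, process each tile independently on several cores, then reassemble the result. This is an illustrative toy on an in-memory matrix, not the tutorial's actual workflow; with real rasters you would read and write each tile from disk (e.g. with the `terra` or `rgdal`/GDAL tooling) so that no single object exceeds working memory.

```r
library(parallel)

# Stand-in for a large raster: a 400 x 400 grid of values
r <- matrix(runif(400 * 400), nrow = 400)

# Define 100 x 100 tiles as row/column start indices
tile_size <- 100
tiles <- expand.grid(
  row0 = seq(1, nrow(r), by = tile_size),
  col0 = seq(1, ncol(r), by = tile_size)
)

# Placeholder per-tile analysis: here simply doubling the values
process_tile <- function(i) {
  rows <- tiles$row0[i]:(tiles$row0[i] + tile_size - 1)
  cols <- tiles$col0[i]:(tiles$col0[i] + tile_size - 1)
  r[rows, cols] * 2
}

# Process tiles in parallel; mclapply forks, which is not available
# on Windows, so fall back to a single core there
n_cores <- if (.Platform$OS.type == "windows") 1L else 2L
out_tiles <- mclapply(seq_len(nrow(tiles)), process_tile, mc.cores = n_cores)

# Reassemble the tiles into the full-size result
out <- matrix(NA_real_, nrow(r), ncol(r))
for (i in seq_len(nrow(tiles))) {
  rows <- tiles$row0[i]:(tiles$row0[i] + tile_size - 1)
  cols <- tiles$col0[i]:(tiles$col0[i] + tile_size - 1)
  out[rows, cols] <- out_tiles[[i]]
}
```

Because each tile is computed independently, the same pattern scales to tiles stored as separate files on disk, which is how the memory limit in point (a) is avoided in practice.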

Tutorials: Mapping Potential Natural Vegetation

Documentation (slides):