The Rangeland Analysis Platform (RAP) is an interactive web application designed to assist in managing and monitoring America’s valuable rangelands. This free tool was developed for landowners, managers, and conservationists to quickly and easily access information that can guide land use decisions.
…
The RAP was created in collaboration with the USDA’s Natural Resources Conservation Service, the DOI’s Bureau of Land Management, and the University of Montana.
Notice two key things about RAP: (1) it is designed to be a “platform” and (2) its web address ends in .app. Those are clues as to why it’s tricky to get your hands on the data. Unlike the other data sources we’ve seen – for which I’ve merely encouraged you to download the data programmatically – you must download RAP data programmatically. Well, almost – let me explain.
RAP is meant to be used interactively and online – i.e., it is primarily a web app. RAP’s primary users are landowners and managers, some of whom don’t have formal training with GIS software. While the app is the primary interface, the data underlying the app are very useful for folks like us – academic ecologists. But because we are not RAP’s primary users, there is no dedicated web app for downloading portions of the data. However, they do make the data available for download. The main challenge with most RAP data is that each layer is stored as a single GeoTIFF that covers the entire western US. The % cover files (details below), for example, are ~130 GB each – far too large for most of us to download and process efficiently on a desktop or laptop computer. (The issue is not necessarily hard drive space, but rather RAM [“memory”] in most cases – although if you need 10 years of data, that will more than fill a 1 TB hard drive.)
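If you want to verify those file sizes – or check a layer’s dimensions, bands, and projection – you can ask GDAL to describe the remote file without downloading it. Here’s a sketch that assumes gdalinfo is installed alongside gdal_translate (adjust the path for your machine); we’ll see the "/vsicurl/" trick again below:

# Sketch: describe a remote RAP GeoTIFF without downloading it
# (path is an assumption; on Mac/Linux it may simply be "gdalinfo")
info_path <- "C:\\OSGeo4W64\\bin\\gdalinfo.exe"
# "/vsicurl/" tells GDAL to read the file over HTTP
remote_tif <- paste0("/vsicurl/",
                     "http://rangeland.ntsg.umt.edu/data/rap/",
                     "rap-vegetation-cover/v2/vegetation-cover-v2-2019.tif")
# Prints dimensions, bands, and projection to the console
shell(paste(info_path, remote_tif))  # use system() on Mac/Linux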
If you’re interested in using RAP, I do highly recommend you check out the app itself: https://rangelands.app/
So did I actually tell you what RAP is? Maybe it will be clearer to tell you what data sets RAP provides.
All of RAP’s data sets are accompanied by peer-reviewed literature describing them in detail. I highly recommend you read the relevant papers to understand a data set before using it in an analysis.
RAP uses Landsat satellite imagery, ground data, and machine learning algorithms (e.g., random forest, convolutional neural networks) to produce the following data sets:
New in 2020: this is the layer that Dr. Eric Thacker spoke highly of to our group in a recent lab meeting.
The Rangeland Analysis Platform’s vegetation biomass product provides annual and 16-day aboveground biomass from 1986 to present of: annual forbs and grasses, perennial forbs and grasses, and herbaceous (combination of annual and perennial forbs and grasses). Estimates represent accumulated new biomass throughout the year or 16-day period and do not include biomass accumulation in previous years.
The Rangeland Analysis Platform’s vegetation cover product provides annual percent cover estimates from 1984 to present of: annual forbs and grasses, perennial forbs and grasses, shrubs, trees, and bare ground.
The Rangeland Analysis Platform provides annual and 16-day rangeland carbon estimates from 1986 to present. These estimates represent net primary productivity (NPP; total net plant carbon) and are partitioned into the following functional groups: annual forb and grass, perennial forb and grass, shrub, and tree. NPP is the net increase (i.e., photosynthesis minus respiration) in total plant carbon, including above and below ground. NPP does not measure soil carbon.
In addition to rangeland carbon, the Rangeland Analysis Platform provides estimates of gross primary productivity (GPP) and net primary productivity (NPP) for the conterminous United States (CONUS).
RAP has data specific to the Great Plains, but we will not cover it here.
RAP also has data specific to the sagebrush biome. Many of us working in Utah might find this useful, but we won’t get into specifics. The approach we use to download the other GeoTIFFs will work here, too, so if you’re interested, definitely check it out.
In this example, we’ll download the percent cover layer, but the process is basically the same for any layer available as a GeoTIFF (see data descriptions). You can find those data here: http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/
We need to use gdal_translate to grab just the window of the data we need. Recall from the GDAL module that gdal_translate has a lot of potential arguments. For example, you might already know when you download the data that you are going to resample to a coarser (or finer) resolution. If so, you might want to do it now, rather than once the raster is sitting on your hard drive, taking up space (see the sketch below). But here, we’ll just use a handful of useful arguments.
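Here’s what that resample-at-download idea could look like. This is just a sketch of extra arguments you could tack onto the command we build below – the 0.01-degree pixel size is purely illustrative:

# Sketch only -- optional extra arguments for gdal_translate
# -tr sets the target pixel size (in the raster's CRS units;
#   degrees here, since the RAP layers use geographic coordinates)
# -r picks the resampling algorithm (e.g., "average" for % cover)
resample_parms <- paste("-tr 0.01 0.01",
                        "-r average")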
We’ll start by using paste() to build the shell command for gdal_translate. Then we will pass that string to the shell using shell() (on Windows) or system() (on Mac or Linux).
Important note! Remember how we discussed that Windows is the “weird” operating system that uses the backslash (\)? Well, the backslash means something else to other systems, and since R has Unix roots, R uses the forward slash (/). When R sees a backslash in a string, it reads it as a special character. For example, "\n" is the character for a new line. Anything that starts with \u is a code for a Unicode character. And any character that otherwise has a special meaning (like a quote) can be “escaped” with a backslash. Here’s an example you can run in your own R session:
newline <- "\n"          # new line
plusminus <- "\u00b1"    # Unicode plus-minus sign
lambda <- "\u03bb"       # Unicode lowercase lambda
Sigma <- "\u03a3"        # Unicode uppercase sigma
doublequote <- "\""      # escaped double quote
smiley <- "\u263a"       # Unicode smiley face

cat(paste(doublequote, lambda, plusminus, Sigma, doublequote,
          newline, newline, smiley, newline, newline,
          doublequote, lambda, plusminus, Sigma, doublequote))
All that is to say that backslash means something special. But we need the backslash in our shell code if we’re on Windows, so we need to “escape” it, by preceding it with a … backslash! So basically, you need to type double backslashes if you’re on Windows. If you’re on Mac or Linux, use the forward slash instead!
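To make that concrete, here are the same kinds of paths written for each operating system (the Mac/Linux path is illustrative; yours may differ):

# Windows: escape each backslash with another backslash
win_path <- "C:\\OSGeo4W64\\bin\\gdal_translate.exe"
# Mac/Linux: forward slashes need no escaping
nix_path <- "/usr/local/bin/gdal_translate"

# cat() shows the strings as the shell will see them
cat(win_path, "\n")
cat(nix_path, "\n")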
Okay, let’s compose our shell command.
# Years to download
y <- 2015:2019
# Bounding coordinates
xmin <- -111.75
xmax <- -111.25
ymin <- 41.5
ymax <- 42.0
# Path to gdal_translate (on my machine)
trans_path <- "C:\\OSGeo4W64\\bin\\gdal_translate.exe"
# Any optional parameters we want to pass, including window
parms <- paste0("-co compress=lzw ",
                "-co tiled=yes ",
                "-co BIGTIFF=YES ",
                "-projwin ", paste(xmin, ymax, xmax, ymin))
# Base URL for RAP % cover
# "/vsicurl" tells GDAL it's a remote file
base_url <- "/vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/"
# Input filenames
in_files <- paste0(base_url, "vegetation-cover-v2-", y, ".tif")
# Output filenames
out_files <- paste0("RAP_", y, ".tif")
# Now paste it all together
shell_command <- paste(trans_path,
                       parms,
                       in_files,
                       out_files)
# Print
print(shell_command)
## [1] "C:\\OSGeo4W64\\bin\\gdal_translate.exe -co compress=lzw -co tiled=yes -co BIGTIFF=YES -projwin -111.75 42 -111.25 41.5 /vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/vegetation-cover-v2-2015.tif RAP_2015.tif"
## [2] "C:\\OSGeo4W64\\bin\\gdal_translate.exe -co compress=lzw -co tiled=yes -co BIGTIFF=YES -projwin -111.75 42 -111.25 41.5 /vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/vegetation-cover-v2-2016.tif RAP_2016.tif"
## [3] "C:\\OSGeo4W64\\bin\\gdal_translate.exe -co compress=lzw -co tiled=yes -co BIGTIFF=YES -projwin -111.75 42 -111.25 41.5 /vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/vegetation-cover-v2-2017.tif RAP_2017.tif"
## [4] "C:\\OSGeo4W64\\bin\\gdal_translate.exe -co compress=lzw -co tiled=yes -co BIGTIFF=YES -projwin -111.75 42 -111.25 41.5 /vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/vegetation-cover-v2-2018.tif RAP_2018.tif"
## [5] "C:\\OSGeo4W64\\bin\\gdal_translate.exe -co compress=lzw -co tiled=yes -co BIGTIFF=YES -projwin -111.75 42 -111.25 41.5 /vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/vegetation-cover-v2-2019.tif RAP_2019.tif"
You can see we have one character string per year, for a total of 5 different strings we want to run. Each one will execute a command to download one year of data. All that’s left to do is pass each one to the system to run. One particularly easy way to do that is with lapply(), but you could write a for() loop or iterate however else you like.
On Windows:
lapply(shell_command, shell)
On Linux or Mac:
lapply(shell_command, system)
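If you’d rather not remember which function goes with which operating system, you can sketch a tiny wrapper that checks at run time (run_cmd is just a name I made up, not a built-in):

# Minimal sketch: dispatch to shell() on Windows, system() elsewhere
run_cmd <- function(cmd) {
  if (.Platform$OS.type == "windows") {
    shell(cmd)
  } else {
    system(cmd)
  }
}
lapply(shell_command, run_cmd)

# Sanity check: did all 5 files land in the working directory?
file.exists(out_files)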
The RAP GeoTIFFs for % cover have 6 bands, one per cover type – band 1 is annual forbs and grasses and band 6 is trees (see the data descriptions linked above for the full band order).
If you try to load them in R with raster(), it will quietly default to band = 1, and you will be looking at only annual forbs and grasses. Make sure you specify which band you want!
library(raster)

# Load specific bands by number
band1 <- raster("RAP_2015.tif", band = 1)
band6 <- raster("RAP_2015.tif", band = 6)

# Plot them side by side
par(mfrow = c(1, 2))
plot(band1, main = "Annual Forbs and Grasses")
plot(band6, main = "Trees")
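If you’d rather work with all 6 bands at once, the raster package also provides brick() (and stack()), which read every band and let you subset by layer index:

# Read all bands into a single RasterBrick
all_bands <- brick("RAP_2015.tif")
nlayers(all_bands)  # should be 6

# Double brackets pull out a single layer
plot(all_bands[[6]], main = "Trees")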
Sorry, there’s no easy way right now! Someone could certainly write a function that takes some coordinates, creates a bounding box, and uses paste() to create the GDAL command. But there could be issues for other users if their GDAL utilities are not in the location the function assumes. Work-arounds are certainly possible, but as far as I know, no such function currently exists.
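For what it’s worth, here’s a rough sketch of what such a function might look like. This is hypothetical code, not a tested package – note that the user supplies the location of gdal_translate themselves, which sidesteps the problem above:

# Hypothetical helper (a sketch, not a tested function)
rap_download <- function(year, xmin, xmax, ymin, ymax, gdal_path,
                         out_file = paste0("RAP_", year, ".tif")) {
  base_url <- "/vsicurl/http://rangeland.ntsg.umt.edu/data/rap/rap-vegetation-cover/v2/"
  in_file <- paste0(base_url, "vegetation-cover-v2-", year, ".tif")
  cmd <- paste(gdal_path,
               "-co compress=lzw -co tiled=yes -co BIGTIFF=YES",
               "-projwin", paste(xmin, ymax, xmax, ymin),
               in_file, out_file)
  # shell() on Windows, system() on Mac/Linux
  if (.Platform$OS.type == "windows") shell(cmd) else system(cmd)
}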
RAP data are generated using Google Earth Engine (GEE), and they are also made available for others to use through GEE.
I have just barely started dabbling with GEE, and it is definitely something I’m interested in learning more about. But I don’t yet have the experience to include GEE in this workshop.
I will point out that there is an R package, rgee, for interfacing with GEE (via Python). If you’re interested, check it out! I’d love to hear feedback on what you think.
RAP has several products that are particularly useful for understanding the space-use of ungulates in the western US. Biomass, percent cover, and carbon (NPP) are all available for download. To avoid dealing with huge files, use GDAL to extract the portion of the raster you need and download it to your local machine. If you have experience with Google Earth Engine, you can also access data from there.