How to combine state-level shapefiles from the US Census Bureau into a nationwide form

The Census Bureau does not provide a nationwide shapefile of public-use microdata areas (the smallest geography available in the American Community Survey). I have tried to combine the state files with several different methods, but even the one that de-duplicates identifiers breaks when it gets to California. Am I doing something wrong, or does this require a difficult workaround? The code below reproduces the problem up to the point where it breaks.

    library(taRifx.geo)
    library(maptools)

    td <- tempdir() ; tf <- tempfile()
    setInternet2( TRUE )

    # scrape the FTP directory listing for the state FIPS codes
    download.file( "ftp://ftp2.census.gov/geo/tiger/TIGER2014/PUMA/" , tf )
    al <- readLines( tf )
    tl <- al[ grep( "geo/tiger/TIGER2014/PUMA/tl_2014_" , al ) ]
    fp <- gsub( "(.*)geo/tiger/TIGER2014/PUMA/tl_2014_([0-9]*)_puma10\\.zip(.*)" , "\\2" , tl )

    # get rid of alaska
    fp <- fp[ fp != '02' ]

    af <- paste0( "ftp://ftp2.census.gov/geo/tiger/TIGER2014/PUMA/tl_2014_" , fp , "_puma10.zip" )

    d <- NULL
    for ( i in af ){
        try( file.remove( z ) , silent = TRUE )
        download.file( i , tf , mode = 'wb' )
        z <- unzip( tf , exdir = td )
        b <- readShapePoly( z[ grep( 'shp$' , z ) ] )
        if ( is.null( d ) ) d <- b else d <- taRifx.geo:::rbind.SpatialPolygonsDataFrame( d , b , fix.duplicated.IDs = TRUE )
    }

    # Error in `row.names<-.data.frame`(`*tmp*`, value = c("d.0", "d.1", "d.2", :
    #   duplicate 'row.names' are not allowed
    # In addition: Warning message:
    #   non-unique values when setting 'row.names': 'd.0', 'd.1', 'd.10', 'd.11', 'd.12', 'd.13', 'd.14', 'd.15', 'd.16', 'd.17', 'd.18', 'd.19', 'd.2', 'd.3', 'd.4', 'd.5', 'd.6', 'd.7', 'd.8', 'd.9'
2 answers

Here is another approach, including a shortcut for listing the FTP directory. As @Pop mentioned, the key is to ensure that the polygon identifiers are unique.

    library(RCurl)
    library(rgdal)

    # get the directory listing
    u <- 'ftp://ftp2.census.gov/geo/tiger/TIGER2014/PUMA/'
    f <- paste0(u, strsplit(getURL(u, ftp.use.epsv = FALSE, ftplistonly = TRUE), '\\s+')[[1]])

    # download and extract to tempdir/shps
    invisible(sapply(f, function(x) {
      path <- file.path(tempdir(), basename(x))
      download.file(x, destfile = path, mode = 'wb')
      unzip(path, exdir = file.path(tempdir(), 'shps'))
    }))

    # read in all shps, and prepend the shapefile name to the polygon IDs
    shps <- lapply(sub('\\.zip', '', basename(f)), function(x) {
      shp <- readOGR(file.path(tempdir(), 'shps'), x)
      shp <- spChFIDs(shp, paste0(x, '_', sapply(slot(shp, "polygons"), slot, "ID")))
      shp
    })

    # rbind to a single object
    shp <- do.call(rbind, as.list(shps))

    # plot (note: clipping to the contiguous states for display purposes)
    plot(shp, axes = TRUE, xlim = c(-130, -60), ylim = c(20, 50), las = 1)

    # write out to wd/USA.shp
    writeOGR(shp, '.', 'USA', 'ESRI Shapefile')

(figure: plot of the unified nationwide shapefile)


Your problem, as you probably suspected, is caused by duplicate polygon identifiers in your object d.

Indeed, the polygon identifiers in your shp files all start from "0", so every state repeats the same IDs. That is why you used fix.duplicated.IDs = TRUE to make them distinct.
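You can reproduce the collision in miniature without downloading anything. This is a minimal sketch (assuming only the sp package): two one-polygon objects that both use the ID "0", which is exactly the situation with two independently read state shapefiles.

```r
library(sp)

# build two one-polygon SpatialPolygons objects that both use the ID "0",
# mimicking two state shapefiles read independently
sq <- function(x0) Polygon(cbind(c(x0, x0 + 1, x0 + 1, x0, x0), c(0, 0, 1, 1, 0)))
a <- SpatialPolygons(list(Polygons(list(sq(0)), ID = "0")))
b <- SpatialPolygons(list(Polygons(list(sq(2)), ID = "0")))

# rbind() refuses to combine them while the IDs collide:
# rbind(a, b)   # -> error: non-unique polygon IDs

# after re-keying one object with spChFIDs(), the bind succeeds
b  <- spChFIDs(b, "1")
ab <- rbind(a, b)
sapply(slot(ab, "polygons"), slot, "ID")  # "0" "1"
```

The same collision, multiplied across fifty-odd state files, is what fix.duplicated.IDs = TRUE is supposed to paper over.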

This is strange, because taRifx.geo:::rbind.SpatialPolygonsDataFrame should have fixed it when fix.duplicated.IDs = TRUE is set. More precisely, that argument is passed on to sp::rbind.SpatialPolygons, which calls the "internal" function sp:::makeUniqueIDs, which finally uses base::make.unique.
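For reference, base::make.unique at the end of that chain simply appends numeric suffixes to repeated values, leaving the first occurrence untouched:

```r
# make.unique() deduplicates a character vector by appending ".1", ".2", ...
# to each repeated element after the first
ids <- c("0", "0", "0", "1")
make.unique(ids)
# "0"   "0.1" "0.2" "1"
```

So in principle the duplicated "0" IDs from each state file should come out unique; somewhere along the chain that guarantee is lost.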

I did not dig into where this chain goes wrong. As a workaround, I advise you to set the polygon identifiers yourself rather than relying on the fix.duplicated.IDs argument.

To fix this yourself, replace your for loop with the following code:

    d <- NULL
    count <- 0
    for ( i in af ){
        try( file.remove( z ) , silent = TRUE )
        download.file( i , tf , mode = 'wb' )
        z <- unzip( tf , exdir = td )
        b <- readShapePoly( z[ grep( 'shp$' , z ) ] )

        # give each polygon a globally unique ID before binding
        for ( j in 1:length( b@polygons ) ) b@polygons[[ j ]]@ID <- as.character( j + count )
        count <- count + length( b@polygons )

        if ( is.null( d ) ) d <- b else d <- taRifx.geo:::rbind.SpatialPolygonsDataFrame( d , b )
    }

The simple inner for loop over j just renumbers each polygon in b so that its identifier is unique across all states before b is appended to d.
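One pitfall worth noting: for a SpatialPolygonsDataFrame the row.names of the attribute data must stay in sync with the polygon IDs, and sp::spChFIDs() updates both at once. A minimal sketch (the tiny one-polygon object and the GEOID column are hypothetical stand-ins for one state's file; b and count play the same roles as in the loop above):

```r
library(sp)

# hypothetical stand-in for one state's shapefile: a single unit square
# with one attribute row, both keyed "0"
p <- Polygons( list( Polygon( cbind( c(0,1,1,0,0) , c(0,0,1,1,0) ) ) ) , ID = "0" )
b <- SpatialPolygonsDataFrame( SpatialPolygons( list( p ) ) ,
                               data.frame( GEOID = "x" , row.names = "0" ) )
count <- 57

# spChFIDs() re-keys the polygons AND the matching row.names of b@data,
# whereas assigning b@polygons[[j]]@ID directly leaves the data slot behind
b <- spChFIDs( b , as.character( seq_along( b@polygons ) + count ) )
count <- count + length( b@polygons )

sapply( slot( b , "polygons" ) , slot , "ID" )  # "58"
```

Replacing the inner j loop with this one spChFIDs() call keeps the polygons and the attribute table consistently keyed.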


Source: https://habr.com/ru/post/976710/
