I have ~100 XML publishing data files, each >10 GB, formatted as follows:
<?xml version="1.0" encoding="UTF-8"?>
<records xmlns="http://website">
<REC rid="this is a test">
<UID>ABCD123</UID>
<data_1>
<fullrecord_metadata>
<references count="3">
<reference>
<uid>ABCD2345</uid>
</reference>
<reference>
<uid>ABCD3456</uid>
</reference>
<reference>
<uid>ABCD4567</uid>
</reference>
</references>
</fullrecord_metadata>
</data_1>
</REC>
<REC rid="this is a test">
<UID>XYZ0987</UID>
<data_1>
<fullrecord_metadata>
<references count="N">
</references>
</fullrecord_metadata>
</data_1>
</REC>
</records>
The number of references varies for each unique record (indexed by UID), and may be zero.
Purpose: create one simple data.frame per XML file, as follows:
UID reference
ABCD123 ABCD2345
ABCD123 ABCD3456
ABCD123 ABCD4567
XYZ0987 NULL
Due to the file sizes and the need for an efficient loop over many files, I looked into xmlEventParse to limit memory usage. I can successfully extract the unique UID key for each REC and build the data.frame using the following code, adapted from previous questions:
branchFunction <- function() {
  store <- new.env()
  func <- function(x, ...) {
    # x is the branch node matched by name; grab its UID
    ns <- getNodeSet(x, path = "//UID")
    key <- xmlValue(ns[[1]])
    value <- xmlValue(ns[[1]])   # currently stores the UID as both key and value
    print(value)
    store[[key]] <- value
  }
  getStore <- function() { as.list(store) }
  list(UID = func, getStore = getStore)
}
myfunctions <- branchFunction()
xmlEventParse(
  file = "test.xml",
  handlers = NULL,
  branches = myfunctions
)
DF <- do.call(rbind.data.frame, myfunctions$getStore())
But I can't work out how to save the reference data as well, or how to handle the varying number of references per UID. Thanks for any suggestions!
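For reference, here is the direction I've been trying, as a minimal self-contained sketch (the tiny inline XML is a stand-in for my real files, and I've dropped the default namespace here; with it present, the XPath would presumably need a namespace mapping). It branches on REC instead of UID, collects every reference/uid under the record, and uses NA for records with zero references:

```r
library(XML)

# A tiny stand-in for one of the large files (assumption: structure as above,
# default namespace omitted so plain XPath works)
xml <- '<?xml version="1.0" encoding="UTF-8"?>
<records>
<REC><UID>ABCD123</UID><data_1><fullrecord_metadata>
<references count="2">
<reference><uid>ABCD2345</uid></reference>
<reference><uid>ABCD3456</uid></reference>
</references>
</fullrecord_metadata></data_1></REC>
<REC><UID>XYZ0987</UID><data_1><fullrecord_metadata>
<references count="0"></references>
</fullrecord_metadata></data_1></REC>
</records>'
writeLines(xml, "test_small.xml")

branchFunction <- function() {
  store <- new.env()
  func <- function(x, ...) {
    # x is the whole <REC> subtree delivered by the branch mechanism
    uid  <- xmlValue(getNodeSet(x, "//UID")[[1]])
    refs <- vapply(getNodeSet(x, "//reference/uid"), xmlValue, character(1))
    if (length(refs) == 0) refs <- NA_character_  # zero-reference records
    store[[uid]] <- refs                          # one vector of refs per UID
  }
  getStore <- function() { as.list(store) }
  list(REC = func, getStore = getStore)
}

myfunctions <- branchFunction()
xmlEventParse(file = "test_small.xml", handlers = list(), branches = myfunctions)

# Flatten the keyed list into the two-column data.frame
lst <- myfunctions$getStore()
DF <- data.frame(
  UID       = rep(names(lst), lengths(lst)),
  reference = unlist(lst, use.names = FALSE),
  stringsAsFactors = FALSE
)
```

On the sample above this yields three rows (two for ABCD123, one NA row for XYZ0987), which matches the target layout, but I'm unsure whether this scales correctly to the real files.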