I ran into a NullPointerException when trying something like this, because you cannot perform operations on an RDD from inside another RDD.
Spark does not support nested RDDs: performing an operation or creating a new RDD requires access to the SparkContext object, which is available only on the driver machine, not on the executors where the closure runs.
Therefore, if you want to work with nested RDDs, collect the parent RDD to the driver and then iterate over it as a local array (or a similar local collection), as sketched below.
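A minimal sketch of both the failing pattern and the collect-based workaround, assuming a local SparkSession; the `users` and `orders` RDDs are hypothetical examples, not from the original question:

```scala
import org.apache.spark.sql.SparkSession

object NestedRddSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("nested-rdd-sketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    val users  = sc.parallelize(Seq(1, 2, 3))                 // hypothetical "parent" RDD
    val orders = sc.parallelize(Seq((1, "a"), (2, "b")))      // hypothetical "child" RDD

    // This fails at runtime (NullPointerException): the closure executes on executors,
    // where the SparkContext backing `orders` is not available.
    // val bad = users.map(id => orders.filter(_._1 == id).count())

    // Workaround described above: collect the parent RDD to the driver and iterate
    // over the local array, launching each inner RDD operation from the driver.
    val counts = users.collect().map { id =>
      (id, orders.filter(_._1 == id).count())
    }
    counts.foreach(println)

    spark.stop()
  }
}
```

Note that each inner operation is a separate Spark job launched from the driver, so this is only practical when the parent RDD is small enough to collect.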
Note: the RDD class itself is serializable; the failure comes from the SparkContext not being available on the executors.