How to group common elements in an array?

I am trying to find a solution in Spark to group data that share a common element in an array.

key        value
[k1,k2]    v1
[k2]       v2
[k3,k2]    v3
[k4]       v4

If any element of the key array is shared with another row, all such rows should be assigned the same GroupID (i.e. grouped by the common element).

Result:

key        value   GroupID
[k1,k2]    v1      G1
[k2]       v2      G1
[k3,k2]    v3      G1
[k4]       v4      G2

Some suggestions have already been made to use Spark GraphX, but the learning curve to implement that for a single function seems too steep.

1 answer

You can use graphframes (there are releases for Spark 2.1, 2.2 and 2.3). Add the package via spark.jars.packages, replacing XXX with your Spark version and YYY with your Scala version:

spark.jars.packages  graphframes:graphframes:0.5.0-sparkXXX-s_YYY
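For example, as a sketch assuming spark-shell with Spark 2.1 and Scala 2.11 (check the graphframes releases for the coordinates matching your own versions):

spark-shell --packages graphframes:graphframes:0.5.0-spark2.1-s_2.11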

Recreate the data and explode the keys, so that every (key element, value) pair becomes an edge:

import org.apache.spark.sql.functions._
// in spark-shell, spark.implicits._ (needed for toDF and the $ syntax) is
// imported automatically; in a standalone application add: import spark.implicits._

val df = Seq(
  (Seq("k1", "k2"), "v1"), (Seq("k2"), "v2"),
  (Seq("k3", "k2"), "v3"), (Seq("k4"), "v4")
).toDF("key", "value")

val edges = df.select(
  explode($"key") as "src", $"value" as "dst")

Convert the edges to a GraphFrame:

import org.graphframes._

val gf = GraphFrame.fromEdges(edges)

Set a checkpoint directory, if one is not set already (connected components uses checkpointing):

import org.apache.spark.sql.SparkSession

val path: String = ???
val spark: SparkSession = ???
spark.sparkContext.setCheckpointDir(path)
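For example, in a spark-shell session this can be as simple as (the directory here is an arbitrary choice; any path the job can write to works):

spark.sparkContext.setCheckpointDir("/tmp/graphframes-checkpoint")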

Find the connected components:

val components = gf.connectedComponents.setAlgorithm("graphx").run
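The components frame has one row per vertex (both the k* keys and the v* values) together with an internal component id. The actual ids vary from run to run; the ones below just match the final output shown later:

components.show

// +---+------------+
// | id|   component|
// +---+------------+
// | k1|489626271744|
// | k2|489626271744|
// | k3|489626271744|
// | v1|489626271744|
// | v2|489626271744|
// | v3|489626271744|
// | k4|532575944704|
// | v4|532575944704|
// +---+------------+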

Keep only the value vertices, rename the columns, and join back to the input data:

val result = components
  .where($"id".startsWith("v"))
  .toDF("value", "group")
  .join(df, Seq("value"))

And check the result:

result.show

// +-----+------------+--------+
// |value|       group|     key|
// +-----+------------+--------+
// |   v3|489626271744|[k3, k2]|
// |   v2|489626271744|    [k2]|
// |   v4|532575944704|    [k4]|
// |   v1|489626271744|[k1, k2]|
// +-----+------------+--------+
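If you want the compact G1/G2 labels from the question rather than the raw component ids, one possible sketch uses dense_rank over the group column (GroupID is the question's column name; note that an unpartitioned window pulls all rows to a single partition, which is acceptable here but not for very large results):

import org.apache.spark.sql.expressions.Window

// dense_rank over the distinct component ids, prefixed with "G"
// (functions._ is already imported above)
val labeled = result.withColumn(
  "GroupID",
  concat(lit("G"), dense_rank().over(Window.orderBy($"group")).cast("string")))

labeled.select($"key", $"value", $"GroupID").show

// +--------+-----+-------+
// |     key|value|GroupID|
// +--------+-----+-------+
// |[k1, k2]|   v1|     G1|
// |    [k2]|   v2|     G1|
// |[k3, k2]|   v3|     G1|
// |    [k4]|   v4|     G2|
// +--------+-----+-------+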

Source: https://habr.com/ru/post/1676917/

