You will need some imports:
from pyspark.sql.functions import array, col, lit, sort_array, struct
With data as shown in the question:
df = sc.parallelize([
    ("name 1", 0, 3, 1, 2, 1, 6),
    ("name 2", 1, 7, 2, 9, 5, 3),
]).toDF(["contact"] + ["offer_{}".format(i) for i in range(1, 7)])
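If you'd rather skip the RDD detour, the same frame can be built directly (a minimal sketch assuming an active SparkSession bound to the name spark):

# Equivalent construction without going through an RDD
# (assumes a SparkSession named `spark` is available)
df = spark.createDataFrame(
    [("name 1", 0, 3, 1, 2, 1, 6),
     ("name 2", 1, 7, 2, 9, 5, 3)],
    ["contact"] + ["offer_{}".format(i) for i in range(1, 7)],
)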
you can combine the offer columns into an array of structs and sort it:
offers = sort_array(array(*[
    struct(col(c).alias("v"), lit(c).alias("k")) for c in df.columns[1:]
]), asc=False)
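Since structs compare field by field, placing the value v first makes sort_array order the pairs by value (descending here), with the column name carried along in k. To inspect the intermediate array before extracting anything (a quick check against the df defined above):

# Peek at the sorted (value, name) structs for each row
df.select(offers.alias("offers")).show(truncate=False)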
and select:
df.select(
    ["contact"] + [offers[i]["k"].alias("_{}".format(i)) for i in [0, 1, 2]])
which should give the following result:
+-------+-------+-------+-------+
|contact| _0| _1| _2|
+-------+-------+-------+-------+
| name 1|offer_6|offer_2|offer_4|
| name 2|offer_4|offer_2|offer_5|
+-------+-------+-------+-------+
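The same pattern extends to any top-n, and the values can be pulled out alongside the names (a sketch reusing the offers column defined above; n is just an illustrative variable, not part of the original answer):

# Illustrative generalization: top-n offer names together with their values
n = 3
df.select(
    ["contact"]
    + [offers[i]["k"].alias("name_{}".format(i)) for i in range(n)]
    + [offers[i]["v"].alias("value_{}".format(i)) for i in range(n)]
).show()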