Convert PySpark DataFrame column names to lowercase

I have a DataFrame in PySpark that has uppercase column names such as ID, COMPANY, etc.

I want the names of these columns to be id, company, and so on. Is there an easy way to convert all of them to lowercase (or uppercase, depending on the requirement)?

The column data types should remain the same.

How can we do this?

1 answer

Use the columns attribute of the DataFrame:

df = ...  # load your DataFrame here
for col_name in df.columns:
    df = df.withColumnRenamed(col_name, col_name.lower())

Or, as @zero323 suggested, rename all columns in a single pass:

df.toDF(*[c.lower() for c in df.columns])
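
For completeness, here is a minimal self-contained sketch (the SparkSession setup and the sample data are illustrative assumptions, not part of the original question) showing that the rename leaves the column data types untouched:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lowercase-columns").getOrCreate()

# Illustrative sample data with uppercase column names
df = spark.createDataFrame([(1, "Acme"), (2, "Globex")], ["ID", "COMPANY"])
print(df.dtypes)   # [('ID', 'bigint'), ('COMPANY', 'string')]

# toDF returns a new DataFrame with the given column names
df = df.toDF(*[c.lower() for c in df.columns])
print(df.dtypes)   # [('id', 'bigint'), ('company', 'string')] - types unchanged

Both approaches only rename columns and never touch the data, so the types are preserved. The toDF variant is convenient when renaming many columns, since it does everything in one call instead of a loop of withColumnRenamed calls.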
