pyspark: access a dataframe column whose name contains a dot

Posted 2020-02-06 10:33

Question:

A pyspark dataframe with a column name containing a dot (e.g. "id.orig_h") cannot be grouped by unless the column is first renamed with withColumnRenamed. Is there a workaround? Quoting the name in backticks ("`a.b`") doesn't seem to solve it.

Answer 1:

In my pyspark shell, the following snippets work:

from pyspark.sql.functions import col

# Backticks make Spark treat "id.orig_h" as a single column name
# rather than struct-field access.
myCol = col("`id.orig_h`")
result = df.groupBy(myCol).agg(...)

and

# The same backtick quoting works with bracket selection on the DataFrame.
myCol = df["`id.orig_h`"]
result = df.groupBy(myCol).agg(...)
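
For a fuller picture, here is a minimal self-contained sketch; the SparkSession setup, the sample rows, and the count aggregation are illustrative additions, not part of the original answer:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count

# Local session and toy data for demonstration only.
spark = SparkSession.builder.master("local[*]").getOrCreate()
df = spark.createDataFrame(
    [("10.0.0.1", 80), ("10.0.0.1", 443), ("10.0.0.2", 80)],
    ["id.orig_h", "port"],
)

# Backticks make Spark treat the whole string as one column name
# instead of parsing the dot as struct-field access.
result = df.groupBy(col("`id.orig_h`")).agg(count("*").alias("n"))
result.show()

Without the backticks, Spark interprets the dot as an attempt to access a field named orig_h inside a struct column named id, and the groupBy fails with an analysis error.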

I hope it helps.