I have the following code in PySpark to get the list of column names from the schema of a DataFrame, and it works fine, but how can I get a corresponding list of the data types?
columnNames = df.schema.names
For example, something like:
columnTypes = df.schema.types
Is there any way to get a separate list of the data types contained in a DataFrame schema?
Use df.dtypes, which returns the schema as a list of (column name, type string) pairs.
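For example (a minimal sketch; the local SparkSession and the toy DataFrame with its column names are assumptions for illustration):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[*]").getOrCreate()

    # A toy DataFrame; the column names are just for illustration.
    df = spark.createDataFrame([("a", 1)], ["name", "value"])

    # dtypes returns (column name, type string) pairs.
    print(df.dtypes)  # [('name', 'string'), ('value', 'bigint')]

    # Keep only the type strings if the names are not needed.
    column_types = [dtype for _, dtype in df.dtypes]
    print(column_types)  # ['string', 'bigint']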
Since the question title is not Python-specific, I'll add a Scala version here. It will result in an array of org.apache.spark.sql.types.DataType.
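A minimal sketch of such a Scala version (the SparkSession setup and the toy DataFrame are assumptions for illustration), mapping each StructField of the schema to its dataType:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types.DataType

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // A toy DataFrame; the column names are just for illustration.
    val df = Seq(("a", 1)).toDF("name", "value")

    // Each element of df.schema.fields is a StructField whose
    // dataType attribute is the actual DataType object.
    val columnTypes: Array[DataType] = df.schema.fields.map(_.dataType)
    // Array(StringType, IntegerType)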
Here's a suggestion:
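One way to do it in PySpark (a minimal sketch; the toy DataFrame is an assumption for illustration) is to pull the dataType out of each StructField in the schema:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[*]").getOrCreate()
    df = spark.createDataFrame([("a", 1)], ["name", "value"])

    # Each element of df.schema.fields is a StructField; its dataType
    # attribute holds the DataType object itself.
    column_types = [field.dataType for field in df.schema.fields]
    print(column_types)  # e.g. [StringType(), LongType()]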
Reference: pyspark.sql.types.StructType and pyspark.sql.types.StructField.