Is there a way to get the column names in their original order from a DataFrame in Spark SQL?

Posted 2019-09-20 10:21

Question:

I have a JSON file whose keys become my columns when it is loaded into Spark SQL. When I retrieve the column names, they come back in alphabetical order, but I want them in the order in which they appear in the file.

My input data is:

{"id":1,"name":"Judith","email":"jknight0@google.co.uk","city":"Évry","country":"France","ip":"199.63.123.157"}

Below is how I retrieve the column names and build a single string:

val dataframe = sqlContext.read.json("/virtual/home/587635/users.json")
val columns = dataframe.columns
// Start with the first column, then append ",<column> STRING" for each remaining column.
var query = columns.apply(0) + " STRING"
for (a <- 1 to (columns.length - 1)) {
  query = query + "," + columns.apply(a) + " STRING"
}
println(query)

This gives me the following output:

city STRING,country STRING,email STRING,id STRING,ip STRING,name STRING

But I want my output to be:

id STRING,name STRING,email STRING,city STRING,country STRING,ip STRING
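(As an aside, the concatenation itself, independent of the ordering issue, can be written more concisely with Scala's mkString; a minimal sketch of the equivalent logic, reusing the dataframe from the snippet above:)

// Append " STRING" to every column name and join with commas in one expression.
val query = dataframe.columns.map(c => s"$c STRING").mkString(",")
println(query)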

Answer 1:

The JSON data source infers the schema with the fields sorted alphabetically, so add a select with the columns in the order you want:

val dataframe = 
  sqlContext
    .read
    .json("/tmp/test.jsn")
    .select("id", "name", "email", "city", "country", "ip")

If you try this in the shell, you will see the columns in the correct order:

dataframe: org.apache.spark.sql.DataFrame = [id: bigint, name: string, email: string, city: string, country: string, ip: string]

Executing the rest of your script then produces the expected output:

id STRING,name STRING,email STRING,city STRING,country STRING,ip STRING
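For reference, here is a minimal end-to-end sketch that combines the ordered select with a mkString-based concatenation (assuming the file path from the question and that all six keys are present in the data):

val dataframe =
  sqlContext
    .read
    .json("/virtual/home/587635/users.json")
    .select("id", "name", "email", "city", "country", "ip")

// Build "<column> STRING" for each column, in the selected order, and join with commas.
val query = dataframe.columns.map(c => s"$c STRING").mkString(",")
println(query)  // id STRING,name STRING,email STRING,city STRING,country STRING,ip STRING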