RDD to JSON in Spark and Scala

Published 2019-08-06 05:32

Question:

I load a JSON file with Spark/Scala and save it in an RDD:

  val dataFile = "resources/tweet-json/hello.json"
  // Note: read.format("json").load actually returns a DataFrame, not a plain RDD
  lazy val rdd = SparkCommons.sqlContext.read.format("json").load(dataFile)

After querying the RDD, I want to generate a JSON output file again (which I will then send in response to an HTTP GET request). How can I convert this RDD to JSON?

[
{
    "label": [
        "fattacq_an_eser_facq",
        "eu_tot_doc",
        "fattacq_prot_facq",
        "id_sogg",
        "eu_tot_man"
    ],
    "values": [
        {
            "label": "Prima Fattura 2016",
            "values": [
                2016,
                956.48,
                691,
                44633,
                956.48
            ]
        },
        {
            "label": "Seconda Fattura 2016",
            "values": [
                2016,
                190,
                982,
                38127,
                190
            ]
        },
        {
            "label": "Terza Fattura 2016",
            "values": [
                2016,
                140.3,
                1088,
                59381,
                140.3
            ]
        },
        {
            "label": "Quarta Fattura 2016",
            "values": [
                2016,
                488,
                1091,
                59382,
                488
            ]
        },
        {
            "label": "Quinta Fattura 2016",
            "values": [
                2016,
                11365.95,
                1154,
                57526,
                11365.95
            ]
        },
        {
            "label": "Sesta Fattura 2016",
            "values": [
                2016,
                44440.01,
                1276,
                5555,
                44440.01
            ]
        }
    ]
  }
]

Answer 1:

You can simply use the write function to write out the JSON. Example:

dfTobeSaved.write.format("json").save("/root/data.json")
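Note that save writes a directory of part files (one per partition), not a single file. If you need the output in one part file, you can coalesce to a single partition first; a minimal sketch (the output path is only illustrative):

```scala
// Coalesce to one partition so Spark emits a single part file
// inside the target directory (the path is just an example).
dfTobeSaved
  .coalesce(1)
  .write
  .format("json")
  .save("/root/data-single.json")
```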

I think this should work fine!
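Since the goal is to send the JSON back over HTTP, writing to disk may not be necessary at all. A minimal sketch using toJSON, which turns each row of the DataFrame into a JSON string; this assumes the query result is small enough to collect to the driver, and the local SparkSession here just stands in for the question's SparkCommons.sqlContext:

```scala
import org.apache.spark.sql.SparkSession

// Local session for the sketch; in the question's code this role
// is played by SparkCommons.sqlContext.
val spark = SparkSession.builder()
  .appName("rdd-to-json")
  .master("local[*]")
  .getOrCreate()

val df = spark.read.format("json").load("resources/tweet-json/hello.json")

// toJSON yields a Dataset[String] with one JSON object per row;
// collect it and join the rows into a JSON array for the HTTP body.
val jsonBody: String = df.toJSON.collect().mkString("[", ",", "]")
```

The resulting string can be returned directly as the body of the GET response, with Content-Type set to application/json.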