Spark DataFrame is saved to MongoDB in the wrong format

Published 2019-09-12 10:29

Question:

I am using Spark-MongoDB and I am trying to save a DataFrame into MongoDB:

// Parse a JSON string into a DataFrame, then write it to MongoDB
val event = """{"Dev":[{"a":3},{"b":3}],"hr":[{"a":6}]}"""
val events = sc.parallelize(event :: Nil)
val df = sqlc.read.json(events)
val saveConfig = MongodbConfigBuilder(Map(
  Host -> List("localhost:27017"), Database -> "test", Collection -> "test",
  SamplingRatio -> 1.0, WriteConcern -> "normal",
  SplitSize -> 8, SplitKey -> "_id"))
df.saveToMongodb(saveConfig.build)

I'm expecting the data to be saved as the input string, but what is actually saved is:

{ "_id" : ObjectId("57cedf4bd244c56e8e783a45"), "Dev" : [ { "a" : NumberLong(3), "b" : null }, { "a" : null, "b" : NumberLong(3) } ], "hr" : [ { "a" : NumberLong(6) } ] }

I want to avoid those null values and duplicated fields. Any ideas?
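The nulls appear because Spark's JSON reader infers a single schema for all elements of the `Dev` array: since one element has field `a` and the other has field `b`, every element is padded to the union of both fields, with `null` for the missing one. A minimal plain-Scala sketch of this unification behavior (an illustration only, not Spark's actual implementation):

```scala
// Each array element is padded to the union of all observed field names,
// mirroring how Spark unifies the schema of heterogeneous JSON objects.
val elems = Seq(Map("a" -> 3), Map("b" -> 3))
val allKeys = elems.flatMap(_.keys).distinct.sorted
val unified = elems.map(e => allKeys.map(k => k -> e.get(k)).toMap)
// unified: Seq(Map(a -> Some(3), b -> None), Map(a -> None, b -> Some(3)))
```

Once the DataFrame carries that unified schema, the connector faithfully writes the padded rows, which is why the saved documents contain the extra `null` fields.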

Answer 1:

Have you tried defining event with backslash-escaped quotes, as below:

val event = "{\"Dev\":[{\"a\":3},{\"b\":3}],\"hr\":[{\"a\":6}]}"
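For what it's worth, the two literal styles can be checked directly: in Scala a triple-quoted string and its backslash-escaped equivalent evaluate to the same runtime string, so any difference in the saved documents would have to come from somewhere other than the escaping.

```scala
// Both literal forms produce the identical runtime string.
val tripleQuoted = """{"Dev":[{"a":3},{"b":3}],"hr":[{"a":6}]}"""
val escaped = "{\"Dev\":[{\"a\":3},{\"b\":3}],\"hr\":[{\"a\":6}]}"
println(tripleQuoted == escaped)  // true
```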