Resolving “Kryo serialization failed: Buffer overflow”

Posted 2020-07-11 10:05

I am trying to run Spark (Java) code and getting the error

org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 0, required: 27

Other posts have suggested raising the buffer to its maximum value. When I tried this with a max buffer value of 512MB, I got the error

java.lang.ClassNotFoundException: org.apache.spark.serializer.KryoSerializer.buffer.max', '512'

How can I solve this problem?

2 answers
We Are One
Answer 1 · 2020-07-11 10:22

Try using "spark.kryoserializer.buffer.max.mb", "512" instead of "spark.kryoserializer.buffer.max", "512MB". The legacy .mb key takes the size in megabytes with no unit suffix.
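A minimal sketch of setting that key on a SparkConf in Java (the app name here is a placeholder, and note that the .mb variant is deprecated in newer Spark releases in favor of spark.kryoserializer.buffer.max with a unit):

```java
import org.apache.spark.SparkConf;

public class KryoBufferConfig {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("KryoBufferExample")          // placeholder app name
            .set("spark.kryoserializer.buffer.max.mb", "512"); // megabytes, no unit

        // Confirm the value was registered on the conf
        System.out.println(conf.get("spark.kryoserializer.buffer.max.mb"));
    }
}
```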

劫难
Answer 2 · 2020-07-11 10:45

The property name is correct, spark.kryoserializer.buffer.max, but the value should include the unit, so in your case it is 512m.

Also, depending on where you set the configuration, you might have to write --conf spark.kryoserializer.buffer.max=512m, for instance with spark-submit or within the <spark-opts>...</spark-opts> element of an Oozie workflow action.
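A sketch of passing the setting on the command line (the class name and jar path are placeholders for your own application):

```shell
# Set the Kryo buffer maximum at submit time; the unit suffix is required
spark-submit \
  --class com.example.MyApp \
  --conf spark.kryoserializer.buffer.max=512m \
  my-app.jar
```

In an Oozie Spark action, the same flag goes inside the <spark-opts> element as a single string: <spark-opts>--conf spark.kryoserializer.buffer.max=512m</spark-opts>.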
