How to set up Mesos for running Spark standalone

Posted 2019-06-21 04:40

I want to test Spark programs on a Mac. Spark is running and my Spark Scala program compiles, but there is a native library (libmesos.so?) error at runtime:

Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1758)
    at java.lang.Runtime.loadLibrary0(Runtime.java:823)
    at java.lang.System.loadLibrary(System.java:1045)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:46)
    at spark.SparkContext.<init>(SparkContext.scala:170)
    at com.blazedb.scala.ccp.spark.LoadRDD$.main(LoadRDD.scala:14)

What setup is required on OS X, beyond the Spark server itself, to get Mesos working so that a Spark client program can run?

3 Answers
家丑人穷心不美 · 2019-06-21 04:50

You need to set the MESOS_NATIVE_LIBRARY environment variable, which points to the location of the Mesos native library. It is typically /usr/local/lib/libmesos.so.

# For Linux
$ export MESOS_NATIVE_LIBRARY='/usr/local/lib/libmesos.so'

# For OSX
$ export MESOS_NATIVE_LIBRARY='/usr/local/lib/libmesos.dylib'

I would also recommend adding that line to your .bashrc so you don't have to set it every time.
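
If you want to fail fast with a clearer message than the UnsatisfiedLinkError above, here is a minimal Scala sketch that checks the variable before the SparkContext is constructed (the object name is illustrative, not part of the original program):

    object MesosEnvCheck {
      def main(args: Array[String]): Unit = {
        // Verify MESOS_NATIVE_LIBRARY before Spark tries to load the library.
        sys.env.get("MESOS_NATIVE_LIBRARY") match {
          case Some(path) if new java.io.File(path).exists() =>
            println(s"Using Mesos native library at $path")
          case Some(path) =>
            sys.error(s"MESOS_NATIVE_LIBRARY points to $path, but no such file exists")
          case None =>
            sys.error("MESOS_NATIVE_LIBRARY is not set; export it before launching the driver")
        }
        // ... construct the SparkContext here once the check passes ...
      }
    }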

【Aperson】 · 2019-06-21 04:54

If you build Mesos from source, all generated libraries end up in the [MESOS_HOME]/src/.libs folder. You must delete the empty [MESOS_HOME]/.libs folder and create a symbolic link to [MESOS_HOME]/src/.libs.

The commands are:

  • rm -r [MESOS_HOME]/.libs
  • ln -s [MESOS_HOME]/src/.libs [MESOS_HOME]/.libs

This solved my problem: "g++: error: ./.libs/libmesos.so: No such file or directory"

走好不送 · 2019-06-21 05:03

If you want to use Spark with Mesos, there are instructions on the project website, including notes on how to find the path to the Mesos library on OS X.

As you've noticed, there are other deployment modes, including the local modes, that don't require Mesos to be installed.

Based on your stack trace, it looks like you might be using an older version of Spark. As of Spark 0.8.0, the packages have been moved into the org.apache.spark namespace, so you may need to consult the docs for earlier versions if you don't want to upgrade.
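
For reference, here is a minimal sketch of what such a driver might look like against a more recent Spark release; the object name, master URL, and RDD contents are placeholders, not taken from the question:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: assumes a Spark release new enough to have SparkConf,
    // where the classes live under org.apache.spark rather than the bare
    // "spark" package shown in the stack trace.
    object LoadRDDNewApi {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("LoadRDD")
          // Placeholder master URL; replace <mesos-master-host>, or use
          // "local[*]" to test on the Mac without Mesos at all.
          .setMaster("mesos://<mesos-master-host>:5050")
        val sc = new SparkContext(conf)
        val rdd = sc.parallelize(1 to 100)
        println("count = " + rdd.count())
        sc.stop()
      }
    }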
