First, I started the PySpark shell:
$SPARK_HOME/bin/pyspark
and ran this code:
sqlContext.load("jdbc", url="jdbc:mysql://IP:3306/test", driver="com.mysql.jdbc.Driver", dbtable="test.test_tb")
The error is the same when I pass only dbtable="test_db".
This error then occurs:
py4j.protocol.Py4JJavaError: An error occurred while calling o66.load. : java.lang.AssertionError: assertion failed: No schema defined, and no Parquet data file or summary file found under . at scala.Predef$.assert(Predef.scala:179) at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.org$apache$spark$sql$parquet$ParquetRelation2$MetadataCache$$readSchema(newParquet.scala:429) .....
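One possible cause worth checking: in Spark 1.3, the first positional parameter of SQLContext.load is path, not source, so passing "jdbc" positionally may be treated as a file path while the data source falls back to the default (Parquet). The sketch below is a plain-Python illustration of that argument binding; the load stub here only mirrors the assumed Spark 1.3 signature and is not the real Spark API:

```python
# Stub mirroring the ASSUMED Spark 1.3 SQLContext.load signature:
# load(path=None, source=None, schema=None, **options)
def load(path=None, source=None, schema=None, **options):
    # Return what each parameter received so the binding is visible.
    return {"path": path, "source": source, "options": options}

# Calling load("jdbc", ...) binds "jdbc" to `path`, leaving `source`
# unset -> Spark would fall back to the default source (Parquet) and
# look for Parquet data at the literal path "jdbc".
call = load("jdbc", url="jdbc:mysql://IP:3306/test",
            driver="com.mysql.jdbc.Driver", dbtable="test.test_tb")
print(call["path"])    # "jdbc" landed in `path`
print(call["source"])  # None: no source was set
```

If that is the cause, writing the argument as a keyword, sqlContext.load(source="jdbc", url=..., driver=..., dbtable=...), would avoid the Parquet fallback.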
Why did this error occur? I want to understand and fix this problem.
Thank you.