How to execute a .sql file in Spark using Python

Posted 2019-02-24 18:17

Question:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test").set("spark.driver.memory", "1g")
sc = SparkContext(conf = conf)

sqlContext = SQLContext(sc)

results = sqlContext.sql("/home/ubuntu/workload/queryXX.sql")

When I execute this script with python test.py, it gives me an error:

py4j.protocol.Py4JJavaError: An error occurred while calling o20.sql. : java.lang.RuntimeException: [1.1] failure: ``with'' expected but `/' found

/home/ubuntu/workload/queryXX.sql

at scala.sys.package$.error(package.scala:27)

I am very new to Spark and I need help here to move forward.

Answer 1:

SQLContext.sql expects a valid SQL query, not a path to a file. Try this:

# Read the SQL text from the file, then hand Spark the query string itself.
with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()
results = sqlContext.sql(query)
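
Note that sqlContext.sql executes a single statement at a time. If your .sql file contains several statements, a minimal sketch of one way to handle it, assuming statements are separated by semicolons and no semicolon appears inside a string literal or comment:

# Sketch: run each semicolon-separated statement in turn.
# Assumes no semicolons inside string literals or comments.
with open("/home/ubuntu/workload/queryXX.sql") as fr:
    statements = [s.strip() for s in fr.read().split(";") if s.strip()]

for stmt in statements:
    results = sqlContext.sql(stmt)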


Answer 2:

Running spark-sql --help will give you:

CLI options:
 -d,--define <key=value>          Variable substitution to apply to hive
                                  commands. e.g. -d A=B or --define A=B
    --database <databasename>     Specify the database to use
 -e <quoted-query-string>         SQL from command line
 -f <filename>                    SQL from files
 -H,--help                        Print help information
    --hiveconf <property=value>   Use value for given property
    --hivevar <key=value>         Variable substitution to apply to hive
                                  commands. e.g. --hivevar A=B
 -i <filename>                    Initialization SQL file
 -S,--silent                      Silent mode in interactive shell
 -v,--verbose                     Verbose mode (echo executed SQL to the
                                  console)

So you can execute your SQL script like this:

spark-sql -f <your-script>.sql
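
Since the question drives everything from Python, one option is to shell out to the spark-sql CLI. A minimal sketch, assuming spark-sql is on your PATH:

import subprocess

# Sketch: invoke the spark-sql CLI from Python; check=True raises
# CalledProcessError if spark-sql exits with a non-zero status.
subprocess.run(
    ["spark-sql", "-f", "/home/ubuntu/workload/queryXX.sql"],
    check=True,
)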



Answer 3:

I'm not sure whether this answers your question. But if you intend to run queries against an existing table, you can use:

spark-sql -i <filename-with-absolute-path>.sql

One more thing: if you have a PySpark script, you can run it with spark-submit.
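
For instance, a minimal sketch of the question's script, fixed as in Answer 1 and saved as test.py:

# test.py -- read the SQL text from a file and execute it
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("Test")
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

with open("/home/ubuntu/workload/queryXX.sql") as fr:
    query = fr.read()

results = sqlContext.sql(query)
results.show()

Then submit it with:

spark-submit --driver-memory 1g test.py

Passing --driver-memory on the command line is generally preferable to setting spark.driver.memory in SparkConf, because the driver JVM has already started by the time the conf is applied.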