With the experience from spark-shell behind us, these two scripts are much easier to read. The earlier spark-shell analysis can be used as a reference:
Spark-submit
```shell
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
fi

# disable randomized hash for string in Python 3.3+
export PYTHONHASHSEED=0

exec "${SPARK_HOME}"/bin/spark-class org.apache.spark.deploy.SparkSubmit "$@"
```
Just like spark-shell, it first checks whether SPARK_HOME is set; if not, it derives it from the script's own location (the parent directory of bin/). It then execs spark-class with org.apache.spark.deploy.SparkSubmit as the main class, forwarding all command-line arguments via "$@".
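The SPARK_HOME resolution above can be reproduced in isolation. The sketch below (paths and names are illustrative, not from the Spark source) builds a throwaway bin/ directory and applies the same `cd ..; pwd` expression the real script uses, with a placeholder path standing in for $0:

```shell
#!/usr/bin/env bash
set -e

# Hypothetical layout: a temp directory plays the role of the Spark install root
root="$(mktemp -d)"
mkdir -p "$root/bin"
script="$root/bin/spark-submit"   # stand-in for $0 in the real script

# Same resolution expression as spark-submit: go from bin/ up to the root
SPARK_HOME="$(cd "$(dirname "$script")"/..; pwd)"

echo "SPARK_HOME resolved to: $SPARK_HOME"
```

Because the expression runs `cd` in a subshell, the caller's working directory is untouched; `pwd` turns the relative `..` hop into an absolute path, which is why SPARK_HOME always ends up absolute regardless of how the script was invoked.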