
Run Spark Shell locally

If you want to run Spark against the local file system, here is a simple way:

HADOOP_CONF_DIR=. MASTER=local spark-shell
If you don't set HADOOP_CONF_DIR, Spark uses /etc/hadoop/conf, which may point to a cluster running in pseudo-distributed mode. When HADOOP_CONF_DIR points to a directory without any Hadoop configuration, the default file system is the local one. The same works for spark-submit.
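For example, once the shell starts with MASTER=local, the built-in SparkContext (sc) can read plain local paths directly. The file path below is just an illustration, not from the original post:

scala> val lines = sc.textFile("file:///tmp/sample.txt")  // read a file from the local FS
scala> lines.count()                                      // number of lines in the file

For spark-submit, you can pass the master on the command line instead (MyApp and my-app.jar are placeholders):

HADOOP_CONF_DIR=. spark-submit --master local --class MyApp my-app.jar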
