1. Download and extract the Spark package
Download Spark from the Apache archive at https://archive.apache.org/dist/spark/spark-1.4.0/ and copy (drag) it into the lala user's home directory on the master node.
Run the following command to extract it:
tar -zvxf spark-1.4.0-bin-hadoop2.4.tgz
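As an alternative to dragging the file over, the archive can also be downloaded directly on the master node with wget (the file name below is the Hadoop 2.4 build referenced in this guide); afterwards, listing the extracted directory is a quick way to confirm the unpacking succeeded:
wget https://archive.apache.org/dist/spark/spark-1.4.0/spark-1.4.0-bin-hadoop2.4.tgz
ls spark-1.4.0-bin-hadoop2.4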
2. Configure spark-env.sh
(1) Go into spark-1.4.0-bin-hadoop2.4/conf and copy spark-env.sh.template to spark-env.sh in the same directory:
cd spark-1.4.0-bin-hadoop2.4/conf
cp spark-env.sh.template spark-env.sh
(2) Open spark-env.sh:
gedit spark-env.sh
(3) Add the following lines to spark-env.sh:
export HADOOP_CONF_DIR=/home/lala/hadoop-2.5.2/
export JAVA_HOME=/usr/java/jdk1.7.0_71/
export SCALA_HOME=/usr/scala-2.10.4
export SPARK_MASTER_IP=192.168.149.132
export SPARK_MASTER_PORT=7077
export SPARK_MASTER_WEBUI_PORT=8080
export SPARK_WORKER_PORT=7078
export SPARK_WORKER_WEBUI_PORT=8081
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_MEMORY=2g
export SPARK_JAR=/home/lala/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar
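Note that HADOOP_CONF_DIR is normally expected to point at the directory containing Hadoop's core-site.xml and hdfs-site.xml (for a Hadoop 2.5.2 binary install that is usually hadoop-2.5.2/etc/hadoop), so adjust the value above if your configuration files live there rather than in the install root. Before moving on, it is also worth checking that the paths referenced above exist on the master node; a quick sanity check using the example paths (adjust to your own layout) looks like this:
ls /usr/java/jdk1.7.0_71/bin/java
ls /usr/scala-2.10.4/bin/scala
ls /home/lala/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar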
3. Configure spark-defaults.conf
(1) Go into spark-1.4.0-bin-hadoop2.4/conf and copy spark-defaults.conf.template to spark-defaults.conf in the same directory:
cd spark-1.4.0-bin-hadoop2.4/conf
cp spark-defaults.conf.template spark-defaults.conf
(2) Open spark-defaults.conf:
gedit spark-defaults.conf
(3) Add the following line to spark-defaults.conf, replacing the IP address with your own master node's address (it should match SPARK_MASTER_IP and SPARK_MASTER_PORT from spark-env.sh):
spark.master=spark://192.168.109.134:7077
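For reference, spark-defaults.conf also accepts the whitespace-separated form used in the template file, so the following line is equivalent:
spark.master   spark://192.168.109.134:7077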
4. Configure slaves
(1) Go into spark-1.4.0-bin-hadoop2.4/conf and copy slaves.template to slaves in the same directory:
cp slaves.template slaves
(2) Open slaves:
gedit slaves
(3) Add the hostnames or IP addresses of the worker (slave) nodes to slaves:
192.168.109.133
192.168.109.134
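When the cluster is started later, sbin/start-slaves.sh logs into each host listed in slaves over SSH to launch a worker, so passwordless SSH from the master to each of these addresses should already be in place (as it usually is for the Hadoop cluster). A quick check with the addresses used in this example:
ssh 192.168.109.133 hostname
ssh 192.168.109.134 hostname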
To be continued. 2021/10/1 20:19