1. Install JDK
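The original post gives no JDK commands. As a minimal sketch, assuming a yum-based host like sg206 and OpenJDK 1.6 as the package (adjust to the JDK actually used), installation and verification could look like this:
$ sudo yum install java-1.6.0-openjdk-devel
# verify the installation
$ java -version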
2. Install Scala 2.9.3
Spark 0.7.2 depends on Scala 2.9.3, so Scala 2.9.3 must be installed.
Download scala-2.9.3.tgz and save it to the home directory (it is already on sg206).
$ tar -zxf scala-2.9.3.tgz
$ sudo mv scala-2.9.3 /usr/lib
$ sudo vim /etc/profile
# add the following lines at the end
export SCALA_HOME=/usr/lib/scala-2.9.3
export PATH=$PATH:$SCALA_HOME/bin
# save and exit vim
# make the bash profile take effect immediately
$ source /etc/profile
# test
$ scala -version
3. Build Spark
cd /home
tar -zxf spark-0.7.3-sources.gz
cd spark-0.7.3
sbt/sbt package (requires a git environment: yum install git)
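Once the build finishes, a quick sanity check is to run one of the bundled examples in local mode. This is a sketch based on the scripts shipped with the 0.7.x release; the run script and the example class name are assumptions about that layout:
$ cd /home/spark-0.7.3
$ ./run spark.examples.SparkPi local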
4. Configuration files
spark-env.sh
############
export SCALA_HOME=/usr/lib/scala-2.9.3
export SPARK_MASTER_IP=172.16.48.202
export SPARK_WORKER_MEMORY=10G
############
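If conf/spark-env.sh does not exist yet, it can be created from the template that ships with the release (assuming the standard conf/ layout of Spark 0.7.x) and then edited to contain the lines above:
$ cd /home/spark-0.7.3
$ cp conf/spark-env.sh.template conf/spark-env.sh
$ vim conf/spark-env.sh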
slaves
Add the IP addresses of the slave nodes to the slaves configuration file; an example follows.
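For example, conf/slaves simply lists one worker host per line (the IPs below are placeholders, not the actual cluster addresses):
172.16.48.203
172.16.48.204
172.16.48.205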
5. Start and stop
bin/start-master.sh - Starts a master instance on the machine the script is executed on.
bin/start-slaves.sh - Starts a slave instance on each machine specified in the conf/slaves file.
bin/start-all.sh - Starts both a master and a number of slaves as described above.
bin/stop-master.sh - Stops the master that was started via the bin/start-master.sh script.
bin/stop-slaves.sh - Stops the slave instances that were started via bin/start-slaves.sh.
bin/stop-all.sh - Stops both the master and the slaves as described above.
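After bin/start-all.sh, the cluster can be exercised from the interactive shell. This is a sketch assuming the SPARK_MASTER_IP configured above and Spark's default standalone port 7077:
$ cd /home/spark-0.7.3
$ MASTER=spark://172.16.48.202:7077 ./spark-shell
The standalone master also serves a web UI (by default on port 8080 of the master host), which can be used to confirm that all workers listed in conf/slaves have registered.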