spark cluster

This Spark cluster series has three articles in total; the other two cover Spark installation and Spark configuration.

OS: CentOS7

Prepare

  1. ZooKeeper is already installed; see 《ZooKeeper multi-server install》
  2. The Hadoop cluster is already installed; see 《hadoop - cluster》

Stage I: Installation

  1. Install Spark; see 《spark安装 - CentOS》
  2. For the firewall settings, see 《firewall-soft list(port)》
  3. For the OpenJDK setup, see 《设置OpenJDK - CentOS》
  4. Set up the cluster configuration (spark-env.sh and the worker list)

See Reference [4] for the details; a minimal configuration sketch follows.
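A minimal spark-env.sh sketch for standalone HA mode, assuming the ZooKeeper ensemble runs on server1/server2/server3 on port 2181 and that the Java/Hadoop paths shown are placeholders to be adjusted for your cluster (Reference [4] has the full configuration):

# $SPARK_HOME/conf/spark-env.sh (all hostnames and paths below are placeholders)
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk      # match the OpenJDK install from Stage I
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop         # lets Spark find the HDFS/YARN configs

# Standalone HA: the masters register in ZooKeeper for leader election and failover
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=server1:2181,server2:2181,server3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"

# Worker hosts go in $SPARK_HOME/conf/slaves (conf/workers in Spark 3.x), one hostname per line.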

Stage II: Run

1. ZooKeeper (on each ZooKeeper node)

zkServer.sh start
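To confirm the ensemble is healthy before starting Hadoop and Spark, each node's role can be checked (one leader, the rest followers):

zkServer.sh status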

2. Hadoop

1) HDFS

see 《hadoop - cluster - 3.1 HDFS》

2) YARN

see 《hadoop - cluster - 3.2 YARN》
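As a quick reminder (the details are in the two articles above), assuming $HADOOP_HOME is set, start-dfs.sh is run on the NameNode and start-yarn.sh on the ResourceManager node:

$HADOOP_HOME/sbin/start-dfs.sh     # NameNode and DataNodes (plus JournalNodes/ZKFC in an HA setup)
$HADOOP_HOME/sbin/start-yarn.sh    # ResourceManager and NodeManagers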

3. Spark

On the primary master (the masters are server3 and server4):

$SPARK_HOME/sbin/start-all.sh

1) On the other (standby) master:

$SPARK_HOME/sbin/start-master.sh
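To check that both masters came up, jps can be run on each node; the elected master reports Status: ALIVE in its web UI and the other one Status: STANDBY:

jps    # expect a Master process on server3 and server4, and Worker processes on the worker nodes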

Web UI: http://192.168.42.109:8082/ and http://192.168.42.110:8082/

(or, depending on the configured web UI port: http://192.168.42.109:8081/ and http://192.168.42.110:8081/)
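To verify the cluster end to end, the bundled SparkPi example can be submitted either to the standalone HA masters or to YARN (Reference [1]). This is a sketch assuming the masters are server3 and server4 on the default port 7077; the examples jar name varies with the Spark version:

# Standalone HA: list both masters so the driver can fail over
$SPARK_HOME/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://server3:7077,server4:7077 \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100

# On YARN (requires HADOOP_CONF_DIR, as set in spark-env.sh)
$SPARK_HOME/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn --deploy-mode cluster \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100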

Stage III: Stop

$SPARK_HOME/sbin/stop-all.sh
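stop-all.sh stops the master and workers when run on the primary master; the standby master and the rest of the stack can be stopped in the reverse order of Stage II:

$SPARK_HOME/sbin/stop-master.sh    # on the standby master
$HADOOP_HOME/sbin/stop-yarn.sh     # on the ResourceManager node
$HADOOP_HOME/sbin/stop-dfs.sh      # on the NameNode
zkServer.sh stop                   # on each ZooKeeper node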

Reference

  1. Running Spark on YARN
  2. hadoop - cluster
  3. Spark学习之路 (二)Spark2.3 HA集群的分布式安装
  4. spark配置