08 Spark cluster construction
2022-08-01 16:35:00 【blue wind 9】
Foreword
Recently I've had a series of environment-setup requirements, so I'm recording the process here.
spark three nodes: 192.168.110.150, 192.168.110.151, 192.168.110.152
150 is master, 151 is slave01, 152 is slave02
Passwordless SSH (a trusted shell relationship) is configured between all three machines
spark version is spark-3.2.1-bin-hadoop2.7
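The configuration below refers to the nodes by hostname, so it is assumed that every machine maps the names in /etc/hosts and that root on master can reach the slaves without a password. A minimal sketch of that prerequisite (run the hosts part on every node, the key part on master; adjust to your network):

```shell
# Map the hostnames on every node (assumed mapping for this cluster)
cat >> /etc/hosts <<'EOF'
192.168.110.150 master
192.168.110.151 slave01
192.168.110.152 slave02
EOF

# On master: generate a key pair (no passphrase) and push it to both slaves
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
for host in slave01 slave02; do
  ssh-copy-id "root@$host"
done
```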
Spark cluster setup
spark three nodes: 192.168.110.150, 192.168.110.151, 192.168.110.152
1. Basic environment preparation
Install jdk on 192.168.110.150, 192.168.110.151, 192.168.110.152, and upload the spark installation package
The installation package is from the Apache Spark downloads page (Downloads | Apache Spark)
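Step 1 on each node can be sketched as follows (the paths match those used in the configs later; the JDK install itself is not shown, and the tarball name is assumed to match the download):

```shell
# Unpack the uploaded Spark tarball into the shared install location
mkdir -p /usr/local/ProgramFiles
tar -xzf spark-3.2.1-bin-hadoop2.7.tgz -C /usr/local/ProgramFiles

# Make the JDK visible in this shell (persist in /etc/profile or similar)
export JAVA_HOME=/usr/local/ProgramFiles/jdk1.8.0_291
export PATH="$JAVA_HOME/bin:$PATH"
```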
2. spark configuration adjustment
Copy the following three template configuration files, adjust them, and then scp them to slave01 and slave02
root@master:/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7# cp conf/spark-defaults.conf.template conf/spark-defaults.conf
root@master:/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7# cp conf/spark-env.sh.template conf/spark-env.sh
root@master:/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7# cp conf/workers.template conf/workers
Update workers
# A Spark Worker will be started on each of the machines listed below.
slave01
slave02
Update spark-defaults.conf
spark.master                     spark://master:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.driver.memory              1g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
Update spark-env.sh
export JAVA_HOME=/usr/local/ProgramFiles/jdk1.8.0_291
export HADOOP_HOME=/usr/local/ProgramFiles/hadoop-2.10.1
export HADOOP_CONF_DIR=/usr/local/ProgramFiles/hadoop-2.10.1/etc/hadoop
export SPARK_DIST_CLASSPATH=$(/usr/local/ProgramFiles/hadoop-2.10.1/bin/hadoop classpath)
export SPARK_MASTER_HOST=master
export SPARK_MASTER_PORT=7077
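With all three files edited, the scp step mentioned above might look like this (assuming the Spark install path is identical on every node):

```shell
# Push the edited config files from master to both workers
SPARK_HOME=/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7
for host in slave01 slave02; do
  for f in workers spark-defaults.conf spark-env.sh; do
    scp "$SPARK_HOME/conf/$f" "root@$host:$SPARK_HOME/conf/"
  done
done
```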
3. Start the cluster
On the machine hosting the master, execute start-all.sh
root@master:/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7# ./sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave01.out
slave02: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7/logs/spa...
root@master:/usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7#
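A quick way to confirm the daemons came up is jps on each node: the master should list a Master process and each slave a Worker. A sketch (run from master, relying on the passwordless SSH set up earlier):

```shell
# On master: expect a "Master" line in the output
jps

# On each slave: expect a "Worker" line
for host in slave01 slave02; do
  ssh "root@$host" jps
done
```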
Test cluster
Submit the SparkPi example with 1000 tasks via spark-submit (the master URL is picked up from spark-defaults.conf)
spark-submit --class org.apache.spark.examples.SparkPi /usr/local/ProgramFiles/spark-3.2.1-bin-hadoop2.7/examples/jars/spark-examples_2.12-3.2.1.jar 1000
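The argument (1000 here) is the number of partitions SparkPi samples over: each task throws random points at the unit square, and the fraction landing inside the circle estimates pi/4. The same idea, shrunk to a single local awk process (illustrative only, nothing Spark-specific):

```shell
# Monte Carlo pi estimate, mirroring what SparkPi computes on the cluster
pi=$(awk 'BEGIN {
  srand(42); n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = 2 * rand() - 1; y = 2 * rand() - 1   # random point in [-1,1]^2
    if (x * x + y * y <= 1) hits++           # inside the unit circle?
  }
  printf "%.3f", 4 * hits / n                # area ratio * 4 ~ pi
}')
echo "local pi estimate: $pi"
```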
A Java driver program can also submit Spark jobs to the cluster programmatically.
Cluster status and running applications can be monitored on the Spark master web UI (http://master:8080 by default).
End