Basic configuration and use of Spark
2022-07-06 17:39:00 【Bald Second Senior brother】
Contents
Configuration of Spark's three deployment modes, and the basics of using Spark
The three deployment modes of Spark
Local mode
Ways to set the master in local mode: local (a single thread by default), local[k] (k threads), local[*] (as many threads as the machine has CPU cores). The executing threads act as the workers.
Configuration:
local mode requires no changes to the configuration files; once Spark is installed, it can be used directly in local mode to compute and analyze data.
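For example, a minimal sketch of starting the Spark shell with each local master setting (paths are relative to the Spark installation directory):

bin/spark-shell --master local      # single thread (default)
bin/spark-shell --master local[2]   # exactly 2 threads
bin/spark-shell --master local[*]   # one thread per CPU core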
Standalone mode
Its mode of operation is very similar to Hadoop's ResourceManager.
The configuration files
Modify the worker list file; if it does not exist, create it by renaming the provided template file.
Content to modify:
add the hostnames of the cluster machines, the same idea as grouping hosts when setting up a Hadoop cluster.
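A minimal sketch, assuming a standard Spark layout where this file is conf/slaves (renamed from conf/slaves.template) and the worker hostnames are hypothetical:

# conf/slaves -- one worker hostname per line
hadoop102
hadoop103
hadoop104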
The environment file also needs to be modified; again, if it does not exist, rename the template file.
Content to configure:
this designates which machine runs the master and specifies the master's port number.
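A sketch assuming the file is conf/spark-env.sh (renamed from conf/spark-env.sh.template), with a hypothetical master host:

# conf/spark-env.sh
SPARK_MASTER_HOST=hadoop102   # the machine that runs the master
SPARK_MASTER_PORT=7077        # default standalone master port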
The next step is to distribute the configuration to every host listed in the slaves file.
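One way to do this, assuming hypothetical hostnames and an installation under /opt/module (an rsync-based sync script works equally well):

scp -r /opt/module/spark hadoop103:/opt/module/
scp -r /opt/module/spark hadoop104:/opt/module/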
This file modification specifies the path of the Java installation.
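A sketch assuming the file is sbin/spark-config.sh (a common place to export the variable so that daemons launched over ssh can find Java); the JDK path is hypothetical:

# sbin/spark-config.sh
export JAVA_HOME=/opt/module/jdk1.8.0_212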
History server
To add a history server, modify the configuration file and add the following content.
Note: uncomment the relevant lines.
As above, the environment configuration file also needs to be modified.
Create the log directory in HDFS, and make sure HDFS is started.
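A minimal sketch assuming the standard files conf/spark-defaults.conf and conf/spark-env.sh, with a hypothetical NameNode address and log directory:

# conf/spark-defaults.conf (uncomment / add)
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://hadoop102:9000/spark-history

# conf/spark-env.sh
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.fs.logDirectory=hdfs://hadoop102:9000/spark-history"

# start HDFS, then create the log directory
start-dfs.sh
hadoop fs -mkdir /spark-history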
HA (high availability)
Modify the configuration file.
Content to add:
specify the location of the ZooKeeper cluster (Spark HA relies on ZooKeeper, just as HDFS HA does), and at the same time comment out the earlier single-master settings
to avoid conflicts.
Then distribute the configuration file again.
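A sketch of the usual ZooKeeper-based recovery settings in conf/spark-env.sh (hostnames hypothetical); note that the standalone master settings are commented out:

# conf/spark-env.sh
#SPARK_MASTER_HOST=hadoop102   # commented out to avoid conflicts
#SPARK_MASTER_PORT=7077
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
-Dspark.deploy.zookeeper.url=hadoop102,hadoop103,hadoop104 \
-Dspark.deploy.zookeeper.dir=/spark"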
Yarn mode
Effect: you do not need to build a Spark cluster; YARN schedules the Spark jobs instead.
The configuration files:
1. Modify the YARN configuration file in Hadoop.
Content to modify:
this configuration turns off YARN's checks on jobs that use excessive memory; otherwise, when the computation's memory use exceeds a certain limit, the Spark containers are killed automatically.
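A sketch of the usual settings, assuming the file is Hadoop's etc/hadoop/yarn-site.xml:

<!-- disable the physical/virtual memory checks that kill containers -->
<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>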
2. Modify Spark's environment configuration file.
Content to modify: point Spark at the directory holding Hadoop's configuration.
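A sketch assuming conf/spark-env.sh and a hypothetical Hadoop installation path:

# conf/spark-env.sh
YARN_CONF_DIR=/opt/module/hadoop/etc/hadoop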
Note:
before switching to a different mode, comment out or delete the other modes' settings in the configuration files.
Running the official examples
Note that the master value differs for each mode, and a few other arguments differ as well.
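A sketch of submitting the bundled SparkPi example under each mode (the example jar's version and the hostnames are hypothetical; adjust them to your installation):

# local mode
bin/spark-submit --class org.apache.spark.examples.SparkPi --master local[*] examples/jars/spark-examples_2.12-3.0.0.jar 100

# standalone mode
bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://hadoop102:7077 examples/jars/spark-examples_2.12-3.0.0.jar 100

# Yarn mode
bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client examples/jars/spark-examples_2.12-3.0.0.jar 100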
Use of the API
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // WordCount development, local mode
    // Create the SparkConf object and set the deployment environment
    val config = new SparkConf().setMaster("local[*]").setAppName("WordCount")
    // Create the Spark context object
    val sc = new SparkContext(config)
    // Read the file line by line (local files use the file:// prefix)
    val lines = sc.textFile("file:///opt/module/ha/spark/in")
    // Split each line into individual words
    val words = lines.flatMap(_.split(" "))
    // Map each word to a (word, 1) pair
    val wordToOne = words.map((_, 1))
    // Group and aggregate by key
    val wordToSum = wordToOne.reduceByKey(_ + _)
    // Print the result (collect returns an Array, so print each element)
    wordToSum.collect().foreach(println)
  }
}
Summary:
Today I learned how to configure Spark's three modes and their configuration files, and how to run the official examples. Spark's configuration files are fairly simple; the inconvenient part is that the modes cannot coexist. The official submission commands are too complex to memorize easily, so I expect I will still need to consult the documentation later. Using Spark from code is also somewhat cumbersome, but much simpler than the command line.