Basic configuration and use of Spark
2022-07-06 17:39:00 【Bald Second Senior brother】
Contents
Overview: configuration of Spark's three deployment modes and the basic ways of using Spark.
Spark's three deployment modes
Local mode
Ways to set the master in local mode: local (a single thread by default), local[k] (a specified number of threads, k), local[*] (one thread per available CPU core). The threads doing the work play the role of the workers.
Configuration:
Local mode requires no changes to the configuration files; once Spark is installed it can be used directly to compute and analyze data.
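A minimal sketch of using local mode from the installation directory (the master strings are standard; adjust the path to your own installation):

bin/spark-shell --master local        # interactive shell with a single thread
bin/spark-shell --master local[2]     # exactly 2 worker threads
bin/spark-shell --master local[*]     # one thread per available CPU core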
Standalone mode
Its mode of operation is very similar to Hadoop's ResourceManager.
Configuration files:
Modify the worker-list file (the slaves file; if it has not been renamed yet, it is still the .template reference file shipped with Spark).
Content to modify: add the host names of the worker machines; as with Hadoop, this is what groups them into a cluster service (a sketch follows).
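A sketch of the worker list, using hypothetical host names (replace them with the hosts in your own cluster):

hadoop102
hadoop103
hadoop104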
The spark-env.sh file also needs to be modified (again, if it has not been renamed, start from the .template reference file).
Content to configure: this designates which machine the master runs on and specifies its port number.
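A sketch of the relevant lines, assuming a hypothetical master host hadoop102 and Spark's default master port 7077:

SPARK_MASTER_HOST=hadoop102
SPARK_MASTER_PORT=7077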
The next step is to distribute the configuration files to every host listed in the slaves file.
Also modify the file that specifies the path of the Java runtime (JAVA_HOME) used when the daemons start.
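A sketch of the Java setting, with a hypothetical JDK path (point it at your own installation):

export JAVA_HOME=/opt/module/jdk1.8.0_212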
History server
To add a history server, you need to modify the spark-defaults.conf configuration file and add the content shown in the sketch below.
Note: uncomment the relevant lines (remove the leading # signs).
As above, the spark-env.sh configuration file also needs to be modified.
Finally, create the log directory in HDFS and start HDFS.
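A combined sketch of the three steps, where the host name hadoop102, the HDFS port 9000 and the directory /spark-logs are assumptions to be replaced with your own values:

# spark-defaults.conf: enable event logging and point it at HDFS
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://hadoop102:9000/spark-logs

# spark-env.sh: tell the history server where to read the logs
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.fs.logDirectory=hdfs://hadoop102:9000/spark-logs -Dspark.history.retainedApplications=30"

# create the directory in HDFS (start HDFS first)
start-dfs.sh
hdfs dfs -mkdir -p /spark-logs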
HA (high availability)
Modify the configuration file (spark-env.sh) and add the following content: specify the location of the ZooKeeper ensemble, and at the same time comment out the fixed master host settings to avoid conflicts (see the sketch below).
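A sketch of the HA lines in spark-env.sh, with hadoop102/hadoop103/hadoop104 as hypothetical ZooKeeper hosts:

# comment out the fixed master settings to avoid conflicts
# SPARK_MASTER_HOST=hadoop102
# SPARK_MASTER_PORT=7077

export SPARK_DAEMON_JAVA_OPTS="
-Dspark.deploy.recoveryMode=ZOOKEEPER
-Dspark.deploy.zookeeper.url=hadoop102,hadoop103,hadoop104
-Dspark.deploy.zookeeper.dir=/spark"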
Then distribute the configuration files to the other hosts.
YARN mode
Benefit: you do not need to build a dedicated Spark cluster; applications are submitted to the existing Hadoop YARN cluster.
Configuration files:
1. Modify Hadoop's YARN configuration file (yarn-site.xml).
Content to modify: this configuration turns off YARN's checks for tasks that use too much memory; otherwise, when an application's memory use exceeds the allowed limit, Spark is shut down automatically.
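A sketch of the two YARN properties involved (these property names are standard YARN settings; add them inside the <configuration> element of yarn-site.xml):

<!-- do not kill containers for exceeding physical memory limits -->
<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>
<!-- do not kill containers for exceeding virtual memory limits -->
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>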
2. Modify Spark's environment configuration (typically spark-env.sh).
Content to modify: point Spark at the directory holding the Hadoop/YARN configuration so that jobs can be submitted to the cluster.
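A sketch of the usual line, assuming a hypothetical Hadoop installation path (Spark reads either YARN_CONF_DIR or HADOOP_CONF_DIR):

export YARN_CONF_DIR=/opt/module/hadoop/etc/hadoop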
Note
Before switching to a different mode, comment out or delete the settings of the other modes in the configuration files, since they cannot coexist.
Running the official example
Note that the --master value differs in each mode, and a few other options differ as well.
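A minimal sketch of submitting the bundled SparkPi example; the jar name depends on your Spark and Scala versions, and the --master value shown is for standalone mode (use local[*] or yarn in the other modes):

bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://hadoop102:7077 \
  ./examples/jars/spark-examples_2.12-3.0.0.jar \
  100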
Using the API
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // WordCount developed in local mode
    // Create the SparkConf object and set Spark's deployment environment
    val config = new SparkConf().setMaster("local[*]").setAppName("WordCount")
    // Create the Spark context object
    val sc = new SparkContext(config)
    // Read the file line by line (for a local path use file:///xxxxx)
    val lines = sc.textFile("file:///opt/module/ha/spark/in")
    // Split each line into individual words
    val words = lines.flatMap(_.split(" "))
    // Transform each word into a (word, 1) pair
    val wordToOne = words.map((_, 1))
    // Group and aggregate by key
    val wordToSum = wordToOne.reduceByKey(_ + _)
    // Collect the results to the driver and print each pair
    wordToSum.collect().foreach(println)
    // Stop the context
    sc.stop()
  }
}
Summary:
Today I learned how to configure Spark and how to run the official examples, covering Spark's three deployment modes and their configuration files. Spark's configuration files are fairly simple, but it is inconvenient that the different modes cannot coexist in them. The official submission command is also too complicated to remember easily, so I will probably still need to look up the documentation later. Using Spark through the Java API is likewise somewhat cumbersome, but much simpler than the command line.