Spark Practice 1: Build a Spark runtime environment in single-node local mode
2022-07-03 13:22:00 【Brother Xing plays with the clouds】
Preface:
Spark itself is written in Scala and runs on the JVM.
Java version: Java 6 or higher.
1 Download Spark
http://spark.apache.org/downloads.html
You can choose the version you need; my choice here is:
http://d3kbcqa49mib13.cloudfront.net/spark-1.1.0-bin-hadoop1.tgz
If you are a hard-working coder, you can also fetch the source code yourself: http://github.com/apache/spark.
Note: I am running in a Linux environment here. If you do not have a Linux machine, you can install one in a virtual machine!
2 Decompress & enter the directory
tar -zvxf spark-1.1.0-bin-hadoop1.tgz
cd spark-1.1.0-bin-hadoop1/
3 Start the shell
./bin/spark-shell
A lot of startup output will be printed; at the end the scala> prompt appears.
4 A first quick try
Execute the following statements in order:
val lines = sc.textFile("README.md")
lines.count()
lines.first()
val pythonLines = lines.filter(line => line.contains("Python"))
scala> pythonLines.first()
res0: String = ## Interactive Python Shell
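As a small extension of this example (a minimal sketch, not part of the original post; pythonLines is the RDD defined above), you can keep working with the filtered RDD in the same shell session:

// count the matching lines and bring them back to the driver for printing
pythonLines.count()
pythonLines.collect().foreach(println)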
--- Explanation: what is sc?
sc is the SparkContext object that the shell creates for you by default.
For example:
scala> sc
res13: org.apache.spark.SparkContext = org.apache.spark.SparkContext@...
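To get a feel for what sc is used for, here is a minimal sketch (my own illustration, not from the original post) that uses the SparkContext to turn a local Scala collection into an RDD and run a simple action; nums and total are just illustrative names:

// distribute a local collection as an RDD, then sum it up (here: in one local thread)
val nums = sc.parallelize(1 to 100)
val total = nums.reduce(_ + _)   // total == 5050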
Here we are only running locally; as a preview, here is a diagram of distributed computation:
5 A standalone program
Finally, I will close this section with an example.
To make it run smoothly, just follow the steps below:
-------------- The directory structure is as follows:
/usr/local/spark-1.1.0-bin-hadoop1/test$ find .
.
./src
./src/main
./src/main/scala
./src/main/scala/example.scala
./simple.sbt
Then simple.sbt is as follows:
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"

example.scala is as follows:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object example {
  def main(args: Array[String]) {
    // "local" is the master URL, "My App" is the application name
    val conf = new SparkConf().setMaster("local").setAppName("My App")
    val sc = new SparkContext(conf)
    // nothing to compute here; just stop the context cleanly
    sc.stop()
    //System.exit(0)
    //sys.exit()
    println("this system exit ok!!!")
  }
}
local: a cluster URL that tells Spark how to connect to a cluster; local means run in a single thread on the local machine without connecting to any cluster (see the sketch below for other local variants).
My App: the name of the application.
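As a side note (a hedged sketch of my own, not from the original article), the local master URL can also request more threads: local[2] runs with two worker threads and local[*] uses one thread per core. The object name LocalThreadsExample below is hypothetical:

import org.apache.spark.{SparkConf, SparkContext}

object LocalThreadsExample {
  def main(args: Array[String]): Unit = {
    // local[2]: run Spark on this machine with two worker threads instead of one
    val conf = new SparkConf().setMaster("local[2]").setAppName("My App")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).count())  // simple sanity check, prints 10
    sc.stop()
  }
}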
Then execute: sbt package
After it succeeds, execute:
./bin/spark-submit --class "example" ./target/scala-2.10/simple-project_2.10-1.0.jar
The output is as follows:
It shows that the run was indeed successful!
The end!