
Spark SQL Learning, Part 2

2022-07-05 02:26:00 Several storehouses of cabbage white

I'm busy every day, but I don't know what I've done

Small talk

         I've been feeling a bit worn out lately, with no energy to study, and my driving test appointment is still hanging over me. Sob.

On to the subject

         Last time we saw that DataFrame is the heart of Spark SQL. This article introduces two ways of operating Spark SQL.

         When using Spark SQL for data analysis, there are two options: the first is the DSL style, the second is the SQL style. Of course, you can also use Hive SQL.

         Using the DSL

         First, create a SparkSession object:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val sparkConf = new SparkConf().setMaster("local[*]").setAppName("Spark_Sql")
val sparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
import sparkSession.implicits._

         Then use the DataFrame API to analyze:

// Read a JSON file and get back a DataFrame object
val frame = sparkSession.read.json("date/people.json")
// Display the people information as a table
frame.show()

         Take a look at the style of the table.
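         As a concrete sketch, suppose date/people.json holds JSON Lines like the following (hypothetical sample data; the actual file isn't reproduced here):

{"name":"Michael","age":29,"gender":"male"}
{"name":"Andy","age":30,"gender":"male"}
{"name":"Justin","age":19,"gender":"female"}

         With that input, frame.show() would print roughly this (note that Spark orders inferred JSON columns alphabetically):

+---+------+-------+
|age|gender|   name|
+---+------+-------+
| 29|  male|Michael|
| 30|  male|   Andy|
| 19|female| Justin|
+---+------+-------+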

         By reading the JSON file we created a DataFrame object. DataFrame provides a flexible, powerful, and optimized API, with operators such as select, where, orderBy, groupBy, limit, and union. DataFrame wraps each component of a SQL SELECT statement in an API of the same name, so a SQL boy can get familiar with Spark SQL very quickly; there is no need to drop down to RDDs to do data analysis.
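         Of those operators, limit and union don't get their own demo below, so here is a quick sketch (assuming the same frame as above):

// Keep only the first two rows
frame.select($"name", $"age").limit(2).show()
// Stack the frame on top of itself; union requires matching schemas
frame.union(frame).show()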

         Here are some examples of using the DataFrame API.

  1. Print the DataFrame object's structure information (its schema) in tree format:
frame.printSchema()

         You can see that under the root node, the fields listed are the fields from the JSON file, the same fields that appear in the table output. Each field's type is marked as well.
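         With the hypothetical people.json sketched above, the output would look roughly like this:

root
 |-- age: long (nullable = true)
 |-- gender: string (nullable = true)
 |-- name: string (nullable = true)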

        2. In SQL, the most frequently written statement is SELECT. Most operations are queries; you could almost say all of them are. The DSL also has select: as mentioned above, the DataFrame API has a counterpart of the same name for each SQL keyword. Let's demonstrate with select.

         In SQL it's written as a query statement:

        select name from people

        In the DSL it looks like this:

        frame.select("name").show()

         Why call show? show displays the data as a table. If we don't call show, what do we get?

        println(frame.select("name"))

         It prints only a schema-like summary of the selected column, such as [name: string], the same type information that printSchema showed at the beginning.

         The select call shown above is just one form. Personally, I don't think that style is great; I prefer the following way:

        frame.select($"name")

The result is the same name column as before.

3. Combining DSL operators for analysis

         Find people whose age is greater than 25, along with next year's age and their gender, in ascending order of age:

frame.select($"name",$"age" +1,$"gender") .where($"age" > 25)
.orderBy(frame("age").asc) .show()

The result: only the rows with age greater than 25 remain, the age column is incremented by one, and the rows are sorted by age in ascending order.

         Of course, you can also group and aggregate. In SQL this is GROUP BY plus an aggregate function such as COUNT; in the DSL, you call groupBy first and then count directly.

val frame = sparkSession.read.json("date/people.json")
// Group and aggregate
frame.groupBy($"age").count().show()

 

         The examples above show how flexibly the DataFrame API can express the same operations as SQL. If you did this with RDD programming, a group-and-count would require a groupByKey followed by a map transformation.

         Not to mention that reading JSON as an RDD gives you an RDD[String], which you then have to convert into other RDD types yourself.
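         For comparison, here is a rough sketch of the same group-and-count in RDD terms. It follows the groupByKey-then-map steps described above; the Person case class is an assumption, and for brevity the RDD is derived from the DataFrame we already have (a hand-rolled pipeline would instead parse the JSON text lines itself):

case class Person(name: String, age: Long, gender: String)
// Hypothetical: an RDD[Person]; here derived from frame for convenience
val peopleRdd = frame.as[Person].rdd

val counts = peopleRdd
  .map(p => (p.age, p))                      // key each record by its age
  .groupByKey()                              // shuffle records with the same age together
  .map { case (age, ps) => (age, ps.size) }  // count each group
counts.collect().foreach(println)

         In real RDD code, reduceByKey over (age, 1) pairs would be cheaper than groupByKey, but the version above mirrors the steps just described.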

        Spark SQL, by contrast, can parse JSON directly and infer the structure information (the schema).

         If you'd rather not learn the DSL, that's fine; here's how to query with SQL.

Running SQL queries

        SparkSession provides a method for running SQL directly: a SQL statement can be passed as a string to the sql method, and the returned object is a DataFrame. But before you can do this, you must register the DataFrame as a temporary view; then you can query it.

val sparkConf = new SparkConf().setMaster("local[*]").setAppName("Spark_Sql")
val sparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
import sparkSession.implicits._
// Read the JSON file
val frame = sparkSession.read.json("date/people.json")
// Register the DataFrame as a temporary view
frame.createOrReplaceTempView("people")
// Call SparkSession's sql method to run a SQL query against the temporary view
sparkSession.sql("select age,count(*) from people group by age").show()

The result: the same age counts as the DSL groupBy version above.

         You have to register the temporary view before you can run SQL queries. But a temporary view has a problem: it is tied to this SparkSession, and once the session ends, the view is gone. That's why there are global temporary views.
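A quick sketch of that limitation (assuming the same frame as above):

frame.createOrReplaceTempView("people")
sparkSession.sql("select * from people").show()            // works: same session
// sparkSession.newSession().sql("select * from people")   // would fail: the new session can't see the view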

Global temporary views

         A global temporary view is scoped to the whole Spark application: it lives until the application terminates and is shared across all sessions. Let's demonstrate.

val sparkConf = new SparkConf().setMaster("local[*]").setAppName("Spark_Sql")
val sparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
import sparkSession.implicits._
// Read the JSON file
val frame = sparkSession.read.json("date/people.json")
// Register the DataFrame as a global temporary view
frame.createOrReplaceGlobalTempView("people")
// Query it
sparkSession.sql("select name,age from people").show()
// Create a new session and query again
sparkSession.newSession().sql("select name,age from people").show()

Careful: this fails.

Look at the error message: Spark throws an AnalysisException saying the table or view people cannot be found.

         It can't find the global view people. But we just created it, so why doesn't the table exist?

         Referencing a global temporary view requires the global_temp prefix. global_temp is a system-preserved database, and global temporary views live inside it.
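         To see where the view actually lives, you can list the tables of the global_temp database with the Catalog API (a quick sketch, run in the same application):

sparkSession.catalog.listTables("global_temp").show()

         The listing should show people with isTemporary = true under the global_temp database.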

         Now that we know the cause of the error, let's look at the correct code and results.

val sparkConf = new SparkConf().setMaster("local[*]").setAppName("Spark_Sql")
val sparkSession = SparkSession.builder().config(sparkConf).getOrCreate()
import sparkSession.implicits._
// Read the JSON file
val frame = sparkSession.read.json("date/people.json")
// Register the DataFrame as a global temporary view
frame.createOrReplaceGlobalTempView("people")
// Query through the global_temp database
sparkSession.sql("select name,age from global_temp.people").show()
// Query again from a new session
sparkSession.newSession().sql("select name,age from global_temp.people").show()

No error this time, and both queries return results.

 

Summary

         Tomorrow's update will cover the relationships and conversions among RDD, DataFrame, and Dataset.

         I didn't write much today, quite little really; tomorrow I'll spend an afternoon writing a long one.

         My driving test subject 4 application is still being processed today.

Today is also a day to miss her


Copyright notice
This article was created by [Several storehouses of cabbage white]. Please include the original link when reposting. Thanks.
https://yzsam.com/2022/02/202202140922593215.html