Flink: From Introduction to Zhenxiang (6. Implementing UDF Functions for Finer-Grained Control Flow)
2020-11-08 12:06:00, by osc_15vyay19
Flink provides a variety of built-in data transformation operators, but real business processing often involves custom data structures, rules, and so on, so you need to write your own business code. This is where Flink's function classes (Function Class) come in.

Flink exposes all of its UDF interfaces (implemented as interfaces or abstract classes), for example MapFunction, FilterFunction, ProcessFunction, and so on.

A small example: filter the stream down to records whose id starts with "sensor3".

As before, create a new Scala object UdfTest1 under com.mafei.apitest.

The rest of the code is the same as in earlier posts: read the file and do some simple processing. A custom function class MyFilterFunction is added here; to use it, just plug it into the pipeline with the .filter method.
package com.mafei.apitest

import org.apache.flink.api.common.functions.{FilterFunction, ReduceFunction, RichFilterFunction}
import org.apache.flink.streaming.api.scala.{StreamExecutionEnvironment, createTypeInformation}

// Sample class for sensor data
case class SensorReadingTest1(id: String, timestamp: Long, temperature: Double)

object UdfTest1 {
  def main(args: Array[String]): Unit = {
    // Create the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    val inputStream = env.readTextFile("/opt/java2020_study/maven/flink1/src/main/resources/sensor.txt")
    // inputStream.print()

    // First convert each line to the sample class type
    val dataStream = inputStream
      .map(data => {
        val arr = data.split(",") // split each line on "," to get the fields
        // toLong / toDouble are needed because split returns strings
        SensorReadingTest1(arr(0), arr(1).toLong, arr(2).toDouble)
      // }).filter(new MyFilterFunction)
      // }).filter(_.id.startsWith("sensor1")) // for simple logic, an anonymous function works just as well
      // }).filter(new RichFilterFunction[SensorReadingTest1] {
      //   override def filter(t: SensorReadingTest1): Boolean =
      //     t.id.startsWith("sensor3")
      // }) // anonymous class: same effect as the two variants above
      }).filter(new KeywordFilterFunction("sensor3")) // the keyword to filter on can also be passed in as a parameter

    dataStream.print()
    env.execute("udf test")
  }
}

// A custom function class for filtering: just implement the filter method of the interface
class MyFilterFunction extends FilterFunction[SensorReadingTest1] {
  override def filter(t: SensorReadingTest1): Boolean = t.id.startsWith("sensor3")
}

// Same as above, but with a constructor parameter
class KeywordFilterFunction(keyword: String) extends FilterFunction[SensorReadingTest1] {
  override def filter(t: SensorReadingTest1): Boolean =
    t.id.startsWith(keyword)
}
(Figure: code structure and run output.)
RichMap

Rich functions mainly add lifecycle and runtime-context handling on top of plain functions. The code below demonstrates the difference between MapperDemo and RichMapDemo and how they behave at runtime.
package com.mafei.apitest

import org.apache.flink.api.common.functions.{FilterFunction, MapFunction, RichMapFunction}
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.scala.{StreamExecutionEnvironment, createTypeInformation}

// Sample class for sensor data
case class SensorReadingTest2(id: String, timestamp: Long, temperature: Double)

object UdfTest2 {
  def main(args: Array[String]): Unit = {
    // Create the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    val inputStream = env.readTextFile("/opt/java2020_study/maven/flink1/src/main/resources/sensor.txt")
    // inputStream.print()

    // First convert each line to the sample class type
    val dataStream = inputStream
      .map(data => {
        val arr = data.split(",") // split each line on "," to get the fields
        // toLong / toDouble are needed because split returns strings
        SensorReadingTest2(arr(0), arr(1).toLong, arr(2).toDouble)
      }).map(new RichMapDemo())

    dataStream.print()
    env.execute("udf test")
  }
}

class MapperDemo extends MapFunction[SensorReadingTest2, String] {
  override def map(t: SensorReadingTest2): String = t.id + " Test to add some strings"
}

// Rich function: compared with the plain class above it additionally has open,
// close, and other lifecycle methods, useful for things like database connections
class RichMapDemo extends RichMapFunction[SensorReadingTest2, String] {
  // Initialization hook: called once when the task starts, similar to variables
  // loaded at class initialization; a good place to open a database connection
  override def open(parameters: Configuration): Unit = {
    println(" A database connection was made ..........")
    // The runtime context is also available here
    getRuntimeContext()
  }

  // Called once for every record
  override def map(in: SensorReadingTest2): String = in.id + " Test the rich function and add some strings"

  // Counterpart of open: called when the task stops; a good place to do things
  // like releasing the database connection
  override def close(): Unit = {
    print(" Closed database connection ......")
  }
}
Running output. Note that across the whole run, the database connection is opened and closed only once:
A database connection was made ..........
sensor1 Test the rich function and add some strings
sensor2 Test the rich function and add some strings
sensor3 Test the rich function and add some strings
sensor4 Test the rich function and add some strings
sensor4 Test the rich function and add some strings
sensor4 Test the rich function and add some strings
Closed database connection ......
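The single open/close pair in the output above is the whole point of rich functions: the runtime drives a fixed lifecycle around the per-record calls. A minimal plain-Scala sketch of that lifecycle, independent of Flink (the RichMapLike trait, LifecycleDemo, and all names are assumptions for illustration, not Flink API):

```scala
// Minimal sketch of the rich-function lifecycle, for illustration only
trait RichMapLike[I, O] {
  def open(): Unit = ()  // called once, before any record
  def map(in: I): O      // called once per record
  def close(): Unit = () // called once, after the last record
}

object LifecycleDemo extends App {
  var opens, closes = 0

  val mapper = new RichMapLike[String, String] {
    override def open(): Unit = { opens += 1; println("A database connection was made") }
    override def map(in: String): String = in + " enriched"
    override def close(): Unit = { closes += 1; println("Closed database connection") }
  }

  // How a runtime would drive the function: open once, map per element, close once
  val records = List("sensor1", "sensor2", "sensor3")
  mapper.open()
  val out = records.map(mapper.map)
  mapper.close()

  println(out)
  assert(opens == 1 && closes == 1 && out.size == records.size)
}
```

This is why expensive setup such as connection pools belongs in open rather than in map: map runs once per record, while open and close each run once per task instance.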