SparkSQL and FlinkSQL: table creation and JDBC connection notes
2022-07-28 08:34:00 【Lu Xinhang】
Start the Flink SQL client: bin/sql-client.sh
Creating tables
Flink: create a table
create table iceberg.xxx.xxx
(
    id STRING comment 'id',
    dt STRING comment 'partition field'
)
PARTITIONED BY (dt)
with (
    'write.format.default' = 'parquet',                    -- file storage format, default parquet
    'write.parquet.compression-codec' = 'gzip',            -- parquet compression codec
    'commit.manifest-merge.enabled' = 'true',              -- merge manifests automatically on write
    'history.expire.max-snapshot-age-ms' = '43200000',     -- snapshot retention in ms; default 5 days, here 12h
    'engine.hive.enabled' = 'true',                        -- allow querying the table from Hive
    'write.metadata.delete-after-commit.enabled' = 'true', -- delete the oldest metadata file after each commit
    'write.metadata.previous-versions-max' = '20',         -- max number of previous metadata files to keep
    'write.metadata.compression-codec' = 'gzip',           -- compress metadata files with gzip
    'location' = 'hdfs://ns1/lakehouse/schema_name/table_name' -- HDFS location
);
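Once the table exists it can be written to and read from the same Flink SQL session. A minimal sketch, reusing the placeholder table and column names above:

```sql
-- write a couple of rows into the partitioned Iceberg table
INSERT INTO iceberg.xxx.xxx VALUES ('1001', '2022-07-28'), ('1002', '2022-07-28');

-- read them back; the dt predicate prunes to a single partition
SELECT id, dt FROM iceberg.xxx.xxx WHERE dt = '2022-07-28';
```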
Spark: create a table
create table iceberg.xxx.xxxx
(
    id STRING comment 'id',
    dt STRING comment 'partition field'
)
using iceberg
partitioned by (dt)
location 'hdfs://xxx/lakehouse/schema_name/table_name'
tblproperties (
    'write.format.default' = 'parquet',                    -- file storage format, default parquet
    'write.parquet.compression-codec' = 'gzip',            -- parquet compression codec
    'commit.manifest-merge.enabled' = 'true',              -- merge manifests automatically on write
    'history.expire.max-snapshot-age-ms' = '43200000',     -- snapshot retention in ms; default 5 days, here 12h
    'engine.hive.enabled' = 'true',                        -- allow querying the table from Hive
    'write.metadata.delete-after-commit.enabled' = 'true', -- delete the oldest metadata file after each commit
    'write.metadata.previous-versions-max' = '20',         -- max number of previous metadata files to keep
    'write.metadata.compression-codec' = 'gzip'            -- compress metadata files with gzip
);
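Iceberg also exposes metadata tables in Spark, which is handy for checking that the snapshot-expiry and metadata-retention properties above are taking effect. A sketch, using the placeholder table name from the DDL:

```sql
-- list snapshots (committed_at, snapshot_id, operation, ...)
SELECT * FROM iceberg.xxx.xxxx.snapshots;

-- list the data files behind the current snapshot
SELECT file_path, record_count FROM iceberg.xxx.xxxx.files;
```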
Create a target table based on the source table
CREATE TABLE iceberg.schema_name.table_name
using iceberg
partitioned by (dt)
location 'hdfs://ns1/lakehouse/schema_name/table_name'
tblproperties (
'write.format.default' = 'parquet',
'write.parquet.compression-codec' = 'gzip',
'commit.manifest-merge.enabled' = 'true',
'engine.hive.enabled' = 'true',
'write.metadata.delete-after-commit.enabled' = 'true',
'write.metadata.previous-versions-max' = '20',
'write.metadata.compression-codec' = 'gzip'
) AS SELECT * from iceberg.schema_name.original_table_name where 1=2; -- where 1=2 copies the schema but no rows
Connecting to MySQL and SQL Server
Spark
-- mysql
CREATE TEMPORARY VIEW tb_order_group
USING org.apache.spark.sql.jdbc
OPTIONS (
url 'jdbc:mysql://xx.xxx.xx.xx:4909/db_name?serverTimezone=GMT%2B8&useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&zeroDateTimeBehavior=convertToNull',
dbtable 'xxx',
user 'xxx',
password 'xxx'
);
-- sqlserver
CREATE TEMPORARY VIEW gxywhz
USING org.apache.spark.sql.jdbc
OPTIONS (
url 'jdbc:sqlserver://192.168.1.xx:xxx;DatabaseName=xxx',
dbtable 'dbo.xxx',
user 'xxx',
password 'xxxx'
);
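The temporary views behave like ordinary tables for the rest of the session, so a full load from the JDBC source into an Iceberg table is a single statement. A sketch using the view defined above and a hypothetical target table:

```sql
-- copy everything from the MySQL-backed view into an Iceberg table
-- (iceberg.schema_name.table_name is a placeholder; it must already exist
--  with a compatible schema)
INSERT INTO iceberg.schema_name.table_name
SELECT * FROM tb_order_group;
```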
Flink
-- Create a mapped table via the JDBC connector
CREATE TABLE mysql_source
(
    id int comment 'id',
    primary key (id) NOT ENFORCED
) WITH (
'connector' = 'jdbc',
'url' = 'jdbc:mysql://192.168.xx.xx:3306/xxx',
'table-name' = 'tableName',
'driver' = 'com.mysql.jdbc.Driver',
'username' = 'root',
'password' = 'xxx'
);
CREATE TABLE sqlserver_source
(
id STRING comment 'id'
) WITH (
'connector' = 'jdbc',
'url' = 'jdbc:jtds:sqlserver://192.168.xx.xxx:10009;databaseName=xxx;useLOBs=false',
'table-name' = 'schema.tableName',
'driver' = 'net.sourceforge.jtds.jdbc.Driver',
'username' = 'xx',
'password' = 'xxxxx'
);
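With the mapped JDBC tables in place, a Flink batch job can sync them into Iceberg. A minimal sketch (the target table is assumed to already exist with a matching schema):

```sql
-- batch mode, so the JDBC scan finishes once the table has been read
SET 'execution.runtime-mode' = 'batch';

INSERT INTO iceberg.xxx.xxx
SELECT CAST(id AS STRING) AS id, '2022-07-28' AS dt
FROM mysql_source;
```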