Technology sharing | Introduction to Linkis parameters
2022-07-01 11:02:00 【WeChat open source】

Introduction: This article introduces the Linkis parameter system, covering Linkis server parameters, client parameters, and management-console parameters.
Linkis parameters fall into three main parts:
Linkis server parameters, mainly Linkis's own parameters and Spring parameters
Parameters submitted through client-side calls such as the Linkis SDK and the RESTful API
Linkis management-console parameters
├── conf                # configuration directory
│   ├── application-eureka.yml
│   ├── application-linkis.yml
│   ├── linkis-cg-engineconnmanager.properties
│   ├── linkis-cg-engineplugin.properties
│   ├── linkis-cg-entrance.properties
│   ├── linkis-cg-linkismanager.properties
│   ├── linkis-mg-gateway.properties
│   ├── linkis-ps-cs.properties
│   ├── linkis-ps-data-source-manager.properties
│   ├── linkis-ps-metadatamanager.properties
│   ├── linkis-ps-publicservice.properties
│   ├── linkis.properties
│   ├── log4j2-console.xml
│   ├── log4j2.xml
It is recommended to put common parameters in the main configuration file (linkis.properties) and service-specific parameters in each service's own configuration file.
Linkis services are Spring Boot applications. Spring-related parameters can be set in application-linkis.yml, and they can also be configured in the Linkis properties files. When configured in a Linkis properties file, the parameter must carry the spring. prefix. For example:
# Spring default
server.port=9102
# In a Linkis properties file, the spring. prefix is required
spring.server.port=9102
{
  "executionContent": {"code": "show tables", "runType": "sql"},
  "params": {                  // submit parameters
    "variable": {              // custom variables used in the code
      "k1": "v1"
    },
    "configuration": {
      "special": {             // special configuration parameters, e.g. log path, result-set path
        "k2": "v2"
      },
      "runtime": {             // runtime parameters, e.g. JDBC engine connection parameters, Presto engine data-source parameters
        "k3": "v3"
      },
      "startup": {             // startup parameters, e.g. EC memory, Spark engine parameters, Hive engine parameters.
                               // For example, spark.executor.memory:5G sets the Spark executor memory; the keyName
                               // of underlying Spark/Hive engine parameters matches the native parameter name
        "k4": "v4"
      }
    }
  },
  "labels": {                  // label parameters; support setting the engine version, user, and application
    "engineType": "spark-2.4.3",
    "userCreator": "hadoop-IDE"
  }
}
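To make the nesting of the submit body explicit, here is a minimal, self-contained Java sketch that assembles the same parameter structure with plain maps. The class name SubmitParams is illustrative only and is not part of the Linkis SDK; the map keys mirror the JSON above:

```java
import java.util.HashMap;
import java.util.Map;

public class SubmitParams {

    // Build a nested map mirroring the JSON submit body shown above.
    public static Map<String, Object> build() {
        Map<String, Object> request = new HashMap<>();

        // executionContent: the code to run and its run type
        Map<String, Object> executionContent = new HashMap<>();
        executionContent.put("code", "show tables");
        executionContent.put("runType", "sql");
        request.put("executionContent", executionContent);

        // params: variable / configuration.{special,runtime,startup}
        Map<String, Object> params = new HashMap<>();
        params.put("variable", Map.of("k1", "v1"));            // custom variables used in the code
        Map<String, Object> configuration = new HashMap<>();
        configuration.put("special", Map.of("k2", "v2"));      // special configuration, e.g. log/result-set paths
        configuration.put("runtime", Map.of("k3", "v3"));      // runtime parameters, e.g. JDBC connection settings
        configuration.put("startup",                           // startup parameters; keyName matches native engine parameters
                Map.of("spark.executor.memory", "5G"));
        params.put("configuration", configuration);
        request.put("params", params);

        // labels: engine version, user, and application
        Map<String, Object> labels = new HashMap<>();
        labels.put("engineType", "spark-2.4.3");
        labels.put("userCreator", "hadoop-IDE");
        request.put("labels", labels);

        return request;
    }

    public static void main(String[] args) {
        System.out.println(build());
    }
}
```

Serialized to JSON (with any JSON library of your choice), this map produces exactly the request body shown above.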
Note: the parameters are explained in the comments on each method call.

JobSubmitAction jobSubmitAction = JobSubmitAction.builder()
    .addExecuteCode(code)
    // startup parameters, e.g. EC memory, Spark engine parameters, Hive engine parameters.
    // For example, spark.executor.memory:5G sets the Spark executor memory; the keyName
    // of underlying Spark/Hive engine parameters matches the native parameter name
    .setStartupParams(startupMap)
    // runtime parameters, e.g. JDBC engine connection parameters, Presto engine data-source parameters
    .setRuntimeParams(runTimeMap)
    // custom variables used in the code
    .setVariableMap(varMap)
    // label parameters; support setting the engine version, user, and application
    .setLabels(labels)
    // submit user
    .setUser(user)
    // execute user
    .addExecuteUser(user)
    .build();
linkis-cli -runtimeMap key1=value -runtimeMap key2=value \
    -labelMap key1=value \
    -varMap key1=value \
    -startUpMap key1=value
Map<String, Object> labels = new HashMap<String, Object>();
labels.put(LabelKeyConstant.ENGINE_TYPE_KEY, "spark-2.4.3");       // specify the engine type and version
labels.put(LabelKeyConstant.USER_CREATOR_TYPE_KEY, user + "-IDE"); // specify the executing user and your APPName
labels.put(LabelKeyConstant.CODE_TYPE_KEY, "sql");                 // script type to run: spark supports sql, scala, py; Hive: hql; shell: sh; python: python; presto: psql
labels.put(LabelKeyConstant.JOB_RUNNING_TIMEOUT_KEY, "10");        // if the job has run for 10s without finishing, a kill is triggered automatically (unit: s)
labels.put(LabelKeyConstant.JOB_QUEUING_TIMEOUT_KEY, "10");        // if the job has queued for more than 10s without finishing, a kill is triggered automatically (unit: s)
labels.put(LabelKeyConstant.RETRY_TIMEOUT_KEY, "10000");           // wait time before retrying when the job fails due to insufficient resources (unit: ms); on queue-resource failure, 10 retries are initiated by default
labels.put(LabelKeyConstant.TENANT_KEY, "hduser02");               // tenant label; if set, the task is routed to a dedicated ECM machine
labels.put(LabelKeyConstant.EXECUTE_ONCE_KEY, "");                 // execute-once label; not recommended: the engine is not reused and exits after the task finishes; set it only for tasks with special requirements

Queue CPU usage upper limit [wds.linkis.rm.yarnqueue.cores.max]: currently only limits the total queue resources used by Spark-type tasks.
Queue memory usage upper limit [wds.linkis.rm.yarnqueue.memory.max].
Global memory usage limit of each engine [wds.linkis.rm.client.memory.max]: this parameter does not cap the total memory that can be used; it caps the total memory used by a specific engine of a Creator, e.g. restricting IDE-SPARK tasks to 10G of memory.
Global maximum number of cores of each engine [wds.linkis.rm.client.core.max]: this parameter does not cap the total CPU that can be used; it caps the total cores used by a specific engine of a Creator, e.g. restricting IDE-SPARK tasks to 10 cores.
Global maximum concurrency of each engine [wds.linkis.rm.instance]: this parameter has two meanings: it limits how many instances of a specific engine a Creator can start in total, and it limits how many tasks a Creator's specific engine can run at the same time.
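Putting the limits above together, the following is a sketch of how these entries might look in linkis.properties. The values are illustrative assumptions, not recommended defaults:

```properties
# Queue-level limits (cores.max currently applies to Spark-type tasks only)
wds.linkis.rm.yarnqueue.cores.max=150
wds.linkis.rm.yarnqueue.memory.max=300G

# Per-Creator, per-engine limits, e.g. capping IDE-SPARK tasks
wds.linkis.rm.client.memory.max=10G
wds.linkis.rm.client.core.max=10

# Max engine instances and concurrent tasks per Creator-engine
wds.linkis.rm.instance=10
```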


How to become a community contributor
1 ► Documentation contributions. Find gaps in the documentation, improve it, and participate in the community by continuously updating documents. Contributing through documentation lets developers learn how to submit a PR and truly take part in building the community. Reference guide: Nanny-level tutorial: how to become an Apache Linkis documentation contributor
2 ► Code contributions. We have collected the simple, easy-to-start tasks in the community, which are well suited for newcomers making their first code contribution. See the newcomer task list: https://github.com/apache/incubator-linkis/issues/1161
3 ► Content contributions: publish content related to WeDataSphere open-source components, including but not limited to installation and deployment tutorials, usage experience, case studies, etc., in any form. Please submit your contribution to the community assistant. For example:
Technical insights | Linkis practice: parsing the process of implementing a new engine
Community developer column | MariaCarrie: Linkis 1.0.2 installation and usage guide
4 ► Community Q&A: actively answer questions in the community, share technology, and help developers solve problems;
5 ► Others: actively participate in community activities, become a community volunteer, help promote the community, and provide effective suggestions for community development.

This article comes from the WeChat official account WeDataSphere (gh_273e85fce73b).