
Interpretation of the parameters of the computing middleware Apache Linkis

2022-07-05 14:46:00 Wechat open source

Introduction: This article explains the Linkis parameter system, covering Linkis server parameters, client parameters, and management console parameters.

1. Parameter classification

Linkis parameters fall into the following three categories:

  • Linkis server parameters, mainly Linkis's own parameters and Spring parameters
  • Parameters submitted through client-side calls such as the Linkis SDK and the RESTful interface
  • Linkis management console parameters

2. Linkis Server parameters

(1) Linkis's own parameters

Linkis's own parameters can be set in the configuration files. Setting them through environment variables and system properties is also supported, but using the configuration files is recommended.

The Linkis configuration files are laid out as follows:

conf                                          # configuration directory
├── application-eureka.yml
├── application-linkis.yml
├── linkis-cg-engineconnmanager.properties
├── linkis-cg-engineplugin.properties
├── linkis-cg-entrance.properties
├── linkis-cg-linkismanager.properties
├── linkis-mg-gateway.properties
├── linkis-ps-cs.properties
├── linkis-ps-data-source-manager.properties
├── linkis-ps-metadatamanager.properties
├── linkis-ps-publicservice.properties
├── linkis.properties
├── log4j2-console.xml
├── log4j2.xml

Each service loads two property files: the shared main configuration file linkis.properties, and its own service configuration file linkis-serviceName.properties. The service configuration file takes priority over the main configuration file. It is recommended to put common parameters in the main configuration file and service-specific parameters in the service configuration file.
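The override behavior can be illustrated with a small sketch (illustrative only, not Linkis source code; the key names below are hypothetical examples):

```python
# The effective configuration is the main file overlaid by the service file,
# so on a key conflict the service-specific value wins.
main_conf = {          # linkis.properties (shared by all services)
    "wds.linkis.log.path": "/tmp/linkis",   # hypothetical key
    "wds.linkis.server.port": "9001",       # hypothetical default
}
service_conf = {       # linkis-mg-gateway.properties (one service)
    "wds.linkis.server.port": "9102",       # overrides the shared value
}
effective = {**main_conf, **service_conf}   # service file has higher priority
print(effective["wds.linkis.server.port"])
```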

(2) Spring parameters

Linkis services are SpringBoot applications. Spring-related parameters can be set in application-linkis.yml, and they can also be configured in the linkis configuration files; in the latter case the key needs the spring. prefix. For example:

# spring port default
server.port=9102
# in the linkis configuration files, the spring prefix is required
spring.server.port=9102

3. Linkis Client parameters

Linkis client parameters are the parameters specified when a task is submitted, mainly through the submit interface.

(1) Setting parameters via the RESTful interface:

{
    "executionContent": {"code": "show tables", "runType": "sql"},
    "params": {                  // submit parameters
        "variable": {            // custom variables used in the code
            "k1": "v1"
        },
        "configuration": {
            "special": {         // special configuration parameters, e.g. log path, result-set path
                "k2": "v2"
            },
            "runtime": {         // runtime parameters, e.g. database connection parameters of the JDBC engine, data-source parameters of the Presto engine
                "k3": "v3"
            },
            "startup": {         // startup parameters, e.g. EC memory, Spark engine parameters, Hive engine parameters
                "k4": "v4"       // e.g. spark.executor.memory:5G sets the Spark executor memory; for underlying engines such as Spark and Hive, the keyName matches the native parameter
            }
        }
    },
    "labels": {                  // label parameters: engine version, user and application
        "engineType": "spark-2.4.3",
        "userCreator": "hadoop-IDE"
    }
}
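The same payload can be built programmatically before being POSTed to the Linkis entrance REST interface (a sketch; the endpoint and HTTP transport are omitted, and the startup key shown is the native Spark parameter mentioned above):

```python
import json

# Assemble the submit payload from the RESTful example above.
payload = {
    "executionContent": {"code": "show tables", "runType": "sql"},
    "params": {
        "variable": {"k1": "v1"},                        # custom code variables
        "configuration": {
            "startup": {"spark.executor.memory": "5G"},  # native Spark key name
        },
    },
    "labels": {"engineType": "spark-2.4.3", "userCreator": "hadoop-IDE"},
}
body = json.dumps(payload)  # JSON request body for the entrance service
```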

(2) Setting parameters via the SDK:

Note: the parameters are explained in the comments on each method.

JobSubmitAction jobSubmitAction = JobSubmitAction.builder()
        .addExecuteCode(code)
        // startup parameters, e.g. EC memory, Spark engine parameters, Hive engine
        // parameters; e.g. spark.executor.memory:5G sets the Spark executor memory,
        // and for underlying engines such as Spark and Hive the keyName matches the
        // native parameter
        .setStartupParams(startupMap)
        // runtime parameters, e.g. database connection parameters of the JDBC engine,
        // data-source parameters of the Presto engine
        .setRuntimeParams(runTimeMap)
        // custom variables used in the code
        .setVariableMap(varMap)
        // label parameters: engine version, user and application
        .setLabels(labels)
        // submit user
        .setUser(user)
        // execute user
        .addExecuteUser(user)
        .build();

(3) Setting parameters via linkis-cli:

linkis-cli -runtimeMap key1=value -runtimeMap key2=value \
           -labelMap key1=value \
           -varMap key1=value \
           -startUpMap key1=value
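As a concrete illustration, a typical linkis-cli submission might look like the following. The flag names come from the snippet above and the Linkis documentation; the paths, users, and values are placeholders for your deployment, and this sketch only assembles and prints the command rather than contacting a Linkis deployment:

```shell
# Build a sample linkis-cli command line (not executed against a server here).
CMD="sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql \
-code 'show tables' \
-startUpMap spark.executor.memory=4G \
-labelMap userCreator=hadoop-IDE"
echo "$CMD"
```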

 

Note: when parameters are submitted from the client, only engine-related parameters, label parameters, and the Yarn queue setting take effect. Other Linkis server parameters and resource-limit parameters, such as the task/engine concurrency parameter wds.linkis.rm.instances, cannot be set per task.

(4) Common label parameters:

Map<String, Object> labels = new HashMap<String, Object>();
// specify the engine type and version
labels.put(LabelKeyConstant.ENGINE_TYPE_KEY, "spark-2.4.3");
// specify the executing user and your APPName
labels.put(LabelKeyConstant.USER_CREATOR_TYPE_KEY, user + "-IDE");
// specify the script type: Spark supports sql, scala, py; Hive: hql; shell: sh; python: python; Presto: psql
labels.put(LabelKeyConstant.CODE_TYPE_KEY, "sql");
// automatically kill the job if it has not finished after running for 10s (unit: seconds)
labels.put(LabelKeyConstant.JOB_RUNNING_TIMEOUT_KEY, "10");
// automatically kill the job if it has queued for more than 10s (unit: seconds)
labels.put(LabelKeyConstant.JOB_QUEUING_TIMEOUT_KEY, "10");
// wait time before retrying when the job fails for lack of resources (unit: ms);
// if queuing fails because resources are insufficient, 10 retries are attempted by default
labels.put(LabelKeyConstant.RETRY_TIMEOUT_KEY, "10000");
// tenant label; tasks carrying a tenant parameter are routed to a dedicated ECM machine
labels.put(LabelKeyConstant.TENANT_KEY, "hduser02");
// execute-once label; not recommended: the engine is not reused and exits after the
// task finishes, so it only suits tasks with specialized configuration
labels.put(LabelKeyConstant.EXECUTE_ONCE_KEY, "");

4. Linkis Management console parameters

The Linkis management console provides a web interface for users to specify resource-limit parameters and default task parameters. Global configuration parameters:

 

These mainly include the global queue parameter [wds.linkis.rm.yarnqueue], the default Yarn queue for tasks, which can also be specified in the client's StartUPMap. They also include resource-limit parameters, which cannot be set per task and can only be adjusted in the management console.
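For example, a client-side startup map that overrides the default queue could look like this (a sketch; the key wds.linkis.rm.yarnqueue comes from the text above, while the queue name and memory value are hypothetical):

```python
# Client-side startup map overriding the default Yarn queue for one submission.
startup_map = {
    "wds.linkis.rm.yarnqueue": "bdp_queue_01",  # hypothetical queue name
    "spark.executor.memory": "4G",              # native Spark parameter
}
print(startup_map["wds.linkis.rm.yarnqueue"])
```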

  • Queue CPU usage limit [wds.linkis.rm.yarnqueue.cores.max]: at this stage, it only limits the total queue resource usage of Spark-type tasks.
  • Queue memory usage limit [wds.linkis.rm.yarnqueue.memory.max]
  • Global memory usage limit per engine [wds.linkis.rm.client.memory.max]: this parameter does not cap the total memory a single engine can use; it limits the total memory used by a specific engine type under one Creator, e.g. limiting IDE-SPARK tasks to at most 10G of memory in total.
  • Global maximum cores per engine [wds.linkis.rm.client.core.max]: likewise, this parameter does not cap the total CPU a single engine can use; it limits the total CPU cores used by a specific engine type under one Creator, e.g. limiting IDE-SPARK tasks to at most 10 cores in total.
  • Global maximum concurrency per engine [wds.linkis.rm.instance]: this parameter has two meanings: it limits how many engines of a specific type one Creator can start in total, and it limits how many tasks a specific engine of that Creator can run at the same time.
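To make the dual meaning of wds.linkis.rm.instance concrete, here is a hypothetical check a resource manager could apply (illustrative only, not Linkis source code):

```python
# The same limit caps both the number of started engines and the number of
# parallel tasks for one Creator + engine-type combination.
RM_INSTANCE_LIMIT = 10  # value of wds.linkis.rm.instance (example)

def can_start_engine(running_engines: int, limit: int = RM_INSTANCE_LIMIT) -> bool:
    """A new engine may start only while the Creator is under the limit."""
    return running_engines < limit

def can_run_task(running_tasks: int, limit: int = RM_INSTANCE_LIMIT) -> bool:
    """A task may run on an engine only while its task count is under the limit."""
    return running_tasks < limit

print(can_start_engine(9), can_start_engine(10))
```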

Engine configuration parameters :

 

These mainly specify the startup and runtime parameters of the engines. They can also be set on the client, and it is recommended to set personalized values there at submit time; the page only provides the default values.

5. Finally

The community is currently holding an essay contest with a top prize of a 1000-yuan JD gift card plus 500 yuan of community gifts. Click the image below for details.

 

— END —

How to be a community contributor

 

1 ► Official documentation contributions: find gaps in the documentation, improve it, and keep it up to date. Contributing through documentation lets developers learn how to submit a PR and genuinely take part in building the community. Reference guide: a step-by-step tutorial on becoming an Apache Linkis documentation contributor.

2 ► Code contributions: we have collected simple, easy-to-start community tasks that are well suited for newcomers making their first code contribution. See the beginner task list: https://github.com/apache/incubator-linkis/issues/1161

3 ► Content contributions: publish content related to WeDataSphere open source components, including but not limited to installation and deployment tutorials, usage experience, and case studies, in any form. Please submit your contribution to the community assistant.

4 ► Community Q&A: actively answer questions in the community, share technical knowledge, and help developers solve problems.

5 ► Other: actively participate in community activities, become a community volunteer, help promote the community, and offer constructive suggestions for its development.

Copyright notice: this article was created by [Wechat open source]. Please include the original link when reprinting: https://yzsam.com/2022/186/202207051443073484.html