YARN: required executor memory is above the max threshold (8192 MB) of this cluster!
2022-07-25 15:14:00 【The south wind knows what I mean】
Problem description:
When submitting a Spark task in Spark on YARN mode, setting a single executor's memory above 8 GB produces the error below. The cause is an unreasonable YARN configuration: by default, YARN caps a single container at 8 GB, which is too low for this workload, so the cluster settings were adjusted.
Key log output: the submission fails with "required executor memory is above the max threshold (8192 MB) of this cluster!"
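The check that produces this error happens on the client side before the task ever reaches the cluster: Spark adds an off-heap overhead to the requested executor memory (by default the larger of 10% of executor memory or 384 MB) and compares the total against YARN's maximum container allocation. A minimal sketch of that logic, assuming the default overhead factor (the function name is illustrative, not Spark's actual API):

```python
def check_executor_memory(executor_memory_mb: int, max_allocation_mb: int) -> int:
    """Mimic the client-side validation Spark performs against YARN.

    Raises if executor memory plus overhead exceeds the cluster's
    yarn.scheduler.maximum-allocation-mb; otherwise returns the total
    container size that would be requested.
    """
    # Default overhead: max(10% of executor memory, 384 MB).
    overhead_mb = max(int(executor_memory_mb * 0.10), 384)
    total_mb = executor_memory_mb + overhead_mb
    if total_mb > max_allocation_mb:
        raise ValueError(
            f"Required executor memory ({executor_memory_mb} MB) plus overhead "
            f"({overhead_mb} MB) is above the max threshold "
            f"({max_allocation_mb} MB) of this cluster!"
        )
    return total_mb
```

This explains why the failure appears even when the executor memory is set to exactly 8 GB: 8192 MB plus the 10% overhead already exceeds the 8192 MB threshold, so the limit must be raised on the YARN side.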
Solution :
Modify yarn-site.xml:
<property>
  <description>The minimum allocation for every container request at the RM,
  in MBs. Memory requests lower than this won't take effect,
  and the specified value will get allocated at minimum.</description>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <description>The maximum allocation for every container request at the RM,
  in MBs. Memory requests higher than this won't take effect,
  and will get capped to this value.</description>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>16384</value>
</property>
<property>
  <description>The maximum allocation for every container request at the RM,
  in terms of virtual CPU cores. Requests higher than this won't take
  effect, and will get capped to this value.</description>
  <name>yarn.scheduler.maximum-allocation-vcores</name>
  <value>16</value>
</property>
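Depending on the cluster, it may also be necessary to raise the per-NodeManager resource limit, since yarn.scheduler.maximum-allocation-mb cannot usefully exceed the memory a single NodeManager offers (the default is also 8192 MB). A sketch of the additional property, with an illustrative value matching the scheduler maximum above:

```xml
<property>
  <description>Amount of physical memory, in MB, that can be allocated
  for containers on each NodeManager.</description>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>16384</value>
</property>
```

Pick a value that leaves headroom for the operating system and other daemons on the node.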
After distributing the updated yarn-site.xml to all nodes and restarting YARN, tasks requesting more than 8 GB per executor submit successfully.
