JUC multithreading: creation and working principle of thread pool
I. What is a thread pool?
A thread pool exists to avoid two costs: creating a new thread every time a task arrives, and destroying that thread once the task completes. With a thread pool, an application creates a collection of threads at initialization and reuses them whenever new tasks need to run, instead of creating a fresh thread each time. When a task finishes, the thread returns to the pool and waits for its next assignment, so resources are reused.
1. The main advantages of a thread pool:
(1) Lower resource consumption: pooling reuses already-created threads, reducing the cost of thread creation and destruction.
(2) Faster response: when a task arrives, it can be executed immediately without waiting for a thread to be created.
(3) Better manageability: threads are a scarce resource. Creating them without limit not only consumes system resources but can also unbalance scheduling and reduce system stability; a thread pool allows threads to be allocated, tuned, and monitored uniformly.
(4) Extensibility: thread pools can be extended with additional functionality. For example, ScheduledThreadPoolExecutor supports delayed and periodic task execution.
II. Creating a thread pool
1. Creating a thread pool with Executors:
The Executors class in the JUC package provides static factory methods for quickly creating thread pools. The common ones are:
(1) newSingleThreadExecutor: creates a pool with a single worker thread that executes all tasks serially and is not shut down even when idle, so tasks are guaranteed to execute in submission order. If the sole thread terminates because of an exception, a new thread is created to replace it.
Typical use: scenarios that require tasks to execute in order, with no more than one active task-processing thread at any point in time.
(2) newFixedThreadPool: creates a pool with a fixed number of threads (corePoolSize == maximumPoolSize, using a LinkedBlockingQueue as the work queue). The pool starts with zero threads and creates one thread per submitted task until it reaches its fixed size; after that the size stays constant. If a thread terminates because of an exception, the pool adds a replacement.
Typical use: scenarios that need to cap the number of concurrent threads for resource-management reasons; suitable for heavily loaded servers.
(3) newCachedThreadPool: creates a cacheable pool whose maximum thread count is Integer.MAX_VALUE. Idle threads are cached temporarily and terminated if they receive no new task within 60 seconds.
Typical use: programs that execute many short-lived asynchronous tasks, or lightly loaded servers.
(4) newScheduledThreadPool: creates a pool that supports delayed or periodic task execution. A usage sketch of these four factory methods follows.
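As a minimal sketch (the pool sizes 4 and 2 below are arbitrary values chosen only for illustration), the four factory methods can be used like this:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ExecutorsDemo {
    public static void main(String[] args) {
        // Single worker thread: tasks run one at a time, in submission order.
        ExecutorService single = Executors.newSingleThreadExecutor();
        // Fixed pool of 4 threads; extra tasks wait in an unbounded LinkedBlockingQueue.
        ExecutorService fixed = Executors.newFixedThreadPool(4);
        // Cached pool: reuses idle threads, creates new ones on demand, reclaims threads idle for 60s.
        ExecutorService cached = Executors.newCachedThreadPool();
        // Scheduled pool: supports delayed and periodic execution.
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(2);

        single.submit(() -> System.out.println("single"));
        fixed.submit(() -> System.out.println("fixed"));
        cached.submit(() -> System.out.println("cached"));
        scheduled.schedule(() -> System.out.println("runs after 1s"), 1, TimeUnit.SECONDS);

        single.shutdown();
        fixed.shutdown();
        cached.shutdown();
        scheduled.shutdown();
    }
}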
2. ThreadPoolExecutor constructor parameters:
A thread pool created with Executors is, in essence, a ThreadPoolExecutor object constructed with different arguments. The constructor takes the following 7 parameters:
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler) {
    // body omitted ...
}
(1) corePoolSize: the number of core threads. When a task is submitted, the pool creates a new thread to execute it until the current thread count equals corePoolSize; once it does, further submitted tasks are placed into the blocking queue workQueue to wait for execution. If prestartAllCoreThreads() is called, the pool creates and starts all core threads ahead of time.
(2) maximumPoolSize: the maximum number of threads allowed in the pool; when workQueue is full, new threads may be created up to this limit.
(3) keepAliveTime: how long an idle thread is kept alive before being terminated.
(4) unit: the time unit of keepAliveTime.
(5) workQueue: the blocking queue that holds tasks waiting to be executed; tasks must implement the Runnable interface. The JDK provides the following blocking queues:
- ArrayBlockingQueue: a bounded blocking queue backed by an array; tasks are ordered FIFO.
- LinkedBlockingQueue: a blocking queue backed by a linked list; tasks are ordered FIFO, and throughput is usually higher than ArrayBlockingQueue.
- SynchronousQueue: a blocking queue that stores no elements; every insert must wait until another thread performs a corresponding remove, otherwise the insert blocks. Throughput is usually higher than LinkedBlockingQueue.
- PriorityBlockingQueue: an unbounded blocking queue with priority ordering.
- DelayQueue: an unbounded blocking queue backed by a priority queue; an element can only be taken once its delay has expired.
- LinkedTransferQueue: an unbounded blocking queue backed by a linked list; similar to SynchronousQueue, but it also provides non-blocking methods.
- LinkedBlockingDeque: a double-ended blocking queue backed by a linked list.
(6) threadFactory: the factory used to create threads; by default it creates non-daemon threads with normal priority (a sketch of a custom factory follows this list).
(7) handler: the policy applied when the thread pool rejects a task.
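For illustration only, a custom ThreadFactory might name its threads for easier troubleshooting while keeping the default non-daemon, normal-priority behavior described above; the class name NamedThreadFactory and the naming scheme are assumptions of this sketch, not part of the original text:

import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

// A minimal ThreadFactory sketch: gives each thread a readable name and keeps
// it non-daemon with normal priority, matching the defaults described above.
public class NamedThreadFactory implements ThreadFactory {
    private final String prefix;
    private final AtomicInteger counter = new AtomicInteger(1);

    public NamedThreadFactory(String prefix) {
        this.prefix = prefix;
    }

    @Override
    public Thread newThread(Runnable r) {
        Thread t = new Thread(r, prefix + "-" + counter.getAndIncrement());
        t.setDaemon(false);                  // non-daemon thread
        t.setPriority(Thread.NORM_PRIORITY); // normal priority
        return t;
    }
}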
3. Why thread pools should not be created with Executors:
The Alibaba Java development manual has a rule for concurrent programming: thread pools must not be created through Executors, but through ThreadPoolExecutor directly. Why? Mainly to avoid the risk of resource exhaustion, because the pools returned by Executors have these drawbacks:
(1) FixedThreadPool and SingleThreadPool use a blocking queue whose allowed length is Integer.MAX_VALUE, so a large number of requests can accumulate in the queue and lead to an OOM;
(2) CachedThreadPool allows up to Integer.MAX_VALUE threads to be created, so a large number of threads may be created and lead to an OOM.
So when you need a thread pool, it is better to configure a ThreadPoolExecutor yourself according to how the pool will actually be used, for example as in the sketch below.
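A minimal sketch of creating the pool directly; the sizes 4/8, the queue capacity 200, and the CallerRunsPolicy choice are illustrative assumptions, not values prescribed by the manual:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ManualPoolDemo {
    public static void main(String[] args) {
        // A bounded queue caps how many tasks can pile up, avoiding the OOM risk above.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                4,                                          // corePoolSize
                8,                                          // maximumPoolSize
                60L, TimeUnit.SECONDS,                      // keepAliveTime for idle non-core threads
                new ArrayBlockingQueue<>(200),              // bounded workQueue
                Executors.defaultThreadFactory(),           // or a custom factory such as the sketch above
                new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure when saturated

        pool.execute(() ->
                System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}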
III. Thread pool execution flow
Execution logic:
(1) When a client submits a task, the pool first checks whether the number of running threads is less than corePoolSize; if so, it creates a new core thread to run the task.
(2) If the number of running threads is greater than or equal to corePoolSize, the pool checks whether workQueue is full; if it is not, the task is placed into workQueue.
(3) If workQueue is full, the pool checks whether the current thread count has reached maximumPoolSize; if it is still below maximumPoolSize, a non-core thread is started to execute the task.
(4) If the thread count is greater than or equal to maximumPoolSize, the pool applies the configured rejection policy:
- ThreadPoolExecutor.AbortPolicy (default): throws a RejectedExecutionException;
- ThreadPoolExecutor.CallerRunsPolicy: runs the rejected task on the thread that called execute;
- ThreadPoolExecutor.DiscardOldestPolicy: discards the task that has waited longest in workQueue, then resubmits the rejected task;
- ThreadPoolExecutor.DiscardPolicy: silently discards the rejected task.
(5) When a thread finishes a task, it takes the next task from workQueue and executes it.
(6) When a thread has been idle longer than keepAliveTime, the pool checks whether the current thread count is greater than corePoolSize; if so, the idle thread is terminated. As a result, once all tasks have completed, the pool eventually shrinks back to corePoolSize. The sketch below walks through this flow with a deliberately small pool.
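The flow above can be observed with a deliberately tiny pool. In this sketch the values (1 core thread, 2 max threads, a queue of capacity 1, 1-second dummy tasks) are arbitrary choices for demonstration; three tasks should be accepted and the fourth rejected under the default AbortPolicy:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionDemo {
    public static void main(String[] args) {
        // Task 1 -> core thread, task 2 -> queue, task 3 -> non-core thread, task 4 -> rejected.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                new ThreadPoolExecutor.AbortPolicy()); // default: throw RejectedExecutionException

        Runnable slowTask = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        };

        for (int i = 1; i <= 4; i++) {
            try {
                pool.execute(slowTask);
                System.out.println("task " + i + " accepted");
            } catch (RejectedExecutionException e) {
                System.out.println("task " + i + " rejected");
            }
        }
        pool.shutdown();
    }
}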
IV. How to size a Java thread pool reasonably?
1. High concurrency, short task execution time: set the thread count roughly equal to the number of CPU cores, so every thread stays busy and thread context switching is reduced.
2. Low concurrency, long task execution time:
(1) IO-intensive: IO operations do not occupy the CPU and most threads are blocked, so configure more threads to let the CPU handle more work.
(2) CPU-intensive: set the thread count equal to the number of CPU cores to reduce context switching. A rough sizing sketch for both cases follows.
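A rough sizing sketch for the two cases; the wait-time/compute-time factor used for IO-intensive work is a common rule of thumb, and the 90ms/10ms figures are assumed values, not measurements from the article:

public class PoolSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // CPU-intensive: roughly one thread per core to minimize context switching.
        int cpuBoundThreads = cores;

        // IO-intensive: threads spend most of their time blocked, so scale up
        // by the ratio of time spent waiting to time spent computing.
        double waitTime = 90.0;    // assumed average time blocked on IO (ms)
        double computeTime = 10.0; // assumed average time on the CPU (ms)
        int ioBoundThreads = (int) (cores * (1 + waitTime / computeTime));

        System.out.println("cores=" + cores
                + ", cpuBound=" + cpuBoundThreads
                + ", ioBound=" + ioBoundThreads);
    }
}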
3. High concurrency, long task execution time:
The key for this kind of workload is not the thread pool but the overall architecture. First, check whether some of the data these tasks use can be cached; adding servers is the second step. Finally, the long execution time itself may need analysis, for example whether middleware can be used to split and decouple the tasks.
4. Bounded vs. unbounded queues:
In general, configure a bounded queue; use an unbounded queue only in scenarios where the task volume may grow explosively. When there are very many tasks, a non-blocking queue that relies on CAS operations instead of locks can achieve good throughput.