Implementing a Concurrent Non-Blocking Cache in Go
2022-06-13 00:13:00 【Ethan97】
Let's start with an example of function memoization:
// A Memo caches the results of calling a Func.
type Memo struct {
    f     Func
    cache map[string]result
}

// Func is the type of the function to memoize.
type Func func(key string) (interface{}, error)

type result struct {
    value interface{}
    err   error
}

func New(f Func) *Memo {
    return &Memo{f: f, cache: make(map[string]result)}
}

// NOTE: not concurrency-safe!
func (memo *Memo) Get(key string) (interface{}, error) {
    res, ok := memo.cache[key]
    if !ok {
        res.value, res.err = memo.f(key)
        memo.cache[key] = res
    }
    return res.value, res.err
}
The function f is a heavyweight computation that is expensive to call, so its results are cached in a map to speed up later calls. This is function memoization. Each call to Get looks the key up in memo; if no result is found, f is called to compute it and the result is recorded in the cache.
The Get method above updates the cache without any synchronization, so the whole Get function is not concurrency-safe.
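To see why this matters, here is a minimal usage sketch, assuming the Memo above lives in package main and using a hypothetical slowDouble function to stand in for the expensive computation. Running it with the race detector (go run -race .) should report a data race on the cache map, because several goroutines write it at the same time:

package main

import (
    "fmt"
    "sync"
    "time"
)

// slowDouble stands in for an expensive computation.
func slowDouble(key string) (interface{}, error) {
    time.Sleep(100 * time.Millisecond) // simulate heavy work
    return key + key, nil
}

func main() {
    m := New(slowDouble)
    var wg sync.WaitGroup
    for _, key := range []string{"a", "b", "a", "b", "a"} {
        wg.Add(1)
        go func(key string) {
            defer wg.Done()
            start := time.Now()
            value, err := m.Get(key)
            fmt.Printf("%s -> %v, err=%v (%s)\n", key, value, err, time.Since(start))
        }(key)
    }
    wg.Wait()
}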
Next, consider locking on every call to Get:
type Memo struct {
    f     Func
    mu    sync.Mutex // guards cache
    cache map[string]result
}

// Get is concurrency-safe.
func (memo *Memo) Get(key string) (value interface{}, err error) {
    memo.mu.Lock()
    res, ok := memo.cache[key]
    if !ok {
        res.value, res.err = memo.f(key)
        memo.cache[key] = res
    }
    memo.mu.Unlock()
    return res.value, res.err
}
Because every call acquires the mutex, Get once again serializes the operations we wanted to run in parallel. Consider the following improvement:
func (memo *Memo) Get(key string) (value interface{}, err error) {
    memo.mu.Lock()
    res, ok := memo.cache[key]
    memo.mu.Unlock()
    if !ok {
        res.value, res.err = memo.f(key)

        // Between the two critical sections, several goroutines
        // may race to compute f(key) and update the map.
        memo.mu.Lock()
        memo.cache[key] = res
        memo.mu.Unlock()
    }
    return res.value, res.err
}
This version acquires the lock twice: once to look up the cache, and a second time to update it when the lookup found nothing. In between, several goroutines may compute f(key) for the same key and do redundant work. Ideally we would like to avoid this extra work; that property is sometimes called duplicate suppression.
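One way to observe the duplicated work is to count how many times f actually runs when many goroutines ask for the same key at once. This is a sketch, assuming the two-critical-section Get above is in package main and using a hypothetical counting function literal; with this version the count is often greater than one:

package main

import (
    "fmt"
    "sync"
    "sync/atomic"
    "time"
)

func main() {
    var calls int64 // how many times f really ran
    f := func(key string) (interface{}, error) {
        atomic.AddInt64(&calls, 1)
        time.Sleep(50 * time.Millisecond) // simulate heavy work
        return "value of " + key, nil
    }

    m := New(f)
    var wg sync.WaitGroup
    for i := 0; i < 10; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            m.Get("same-key") // every goroutine asks for the same key
        }()
    }
    wg.Wait()

    // With duplicate suppression this would print 1; here it is usually larger.
    fmt.Println("f was called", atomic.LoadInt64(&calls), "times")
}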
In the fourth version of the cache, each entry gains a new channel, ready. Once the entry's result field has been set, the channel is closed; the waiting goroutines receive this broadcast and can then read the result from the entry.
// Func is the type of the function to memoize.
type Func func(string) (interface{}, error)

type result struct {
    value interface{}
    err   error
}

type entry struct {
    res   result
    ready chan struct{} // closed when res is ready
}

func New(f Func) *Memo {
    return &Memo{f: f, cache: make(map[string]*entry)}
}

type Memo struct {
    f     Func
    mu    sync.Mutex        // guards cache
    cache map[string]*entry // the cache now holds an *entry per key
}

func (memo *Memo) Get(key string) (value interface{}, err error) {
    memo.mu.Lock()
    e := memo.cache[key]
    if e == nil {
        // This is the first request for this key.
        // This goroutine becomes responsible for computing
        // the value and broadcasting the ready condition.
        e = &entry{ready: make(chan struct{})}
        memo.cache[key] = e
        memo.mu.Unlock()

        e.res.value, e.res.err = memo.f(key)

        close(e.ready) // broadcast ready condition
    } else {
        // This is a repeat request for this key.
        memo.mu.Unlock()

        <-e.ready // wait for ready condition
    }
    return e.res.value, e.res.err
}
When Get finds no record for the key in the cache memo, it constructs an entry and puts it into the cache, but at that moment the result for the key has not yet been computed. If another goroutine now calls Get for the same key, it reaches the <-e.ready statement and blocks waiting on the channel. Only when the computation finishes and the goroutine responsible for it closes the channel can the other goroutines continue and read the result.
- When a goroutine asks for a key that is not yet cached, it creates an entry, puts it into the cache, unlocks, and then calls f to compute the result. When the computation is done it fills in the entry and closes its ready channel;
- When a goroutine asks for a key that is already cached, it releases the lock immediately and waits for that entry's ready channel to be closed before reading the result. A standalone sketch of this broadcast mechanism follows below.
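The key mechanism here is that closing a channel acts as a broadcast: every goroutine blocked receiving from it is released at once. Here is a tiny standalone sketch of just that mechanism (hypothetical names, not part of the cache code):

package main

import (
    "fmt"
    "sync"
    "time"
)

func main() {
    ready := make(chan struct{})
    var wg sync.WaitGroup

    // Several waiters block on the same channel, like repeat callers of Get.
    for i := 0; i < 3; i++ {
        wg.Add(1)
        go func(id int) {
            defer wg.Done()
            <-ready // wait for the ready condition
            fmt.Printf("waiter %d woke up\n", id)
        }(i)
    }

    time.Sleep(100 * time.Millisecond) // pretend to compute f(key)
    close(ready)                       // broadcast: all waiters proceed
    wg.Wait()
}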
Another design
The approach above uses shared variables and locks. An alternative is communicating sequential processes: in the new design, the map variable is confined to a single monitor goroutine, and callers of Get instead send it a message.
// Func is the type of the function to memoize.
type Func func(key string) (interface{}, error)

// A result is the result of calling a Func.
type result struct {
    value interface{}
    err   error
}

type entry struct {
    res   result
    ready chan struct{} // closed when res is ready
}

// A request is a message requesting that the Func be applied to key.
type request struct {
    key      string
    response chan<- result // the client wants a single result
}

type Memo struct{ requests chan request }

// New returns a memoization of f. Clients must subsequently call Close.
func New(f Func) *Memo {
    memo := &Memo{requests: make(chan request)}
    go memo.server(f)
    return memo
}

func (memo *Memo) Get(key string) (interface{}, error) {
    response := make(chan result)
    memo.requests <- request{key, response}
    res := <-response
    return res.value, res.err
}

func (memo *Memo) Close() { close(memo.requests) }

func (memo *Memo) server(f Func) {
    cache := make(map[string]*entry)
    for req := range memo.requests {
        e := cache[req.key]
        if e == nil {
            // This is the first request for this key.
            e = &entry{ready: make(chan struct{})}
            cache[req.key] = e
            go e.call(f, req.key) // call f(key)
        }
        go e.deliver(req.response)
    }
}

func (e *entry) call(f Func, key string) {
    // Evaluate the function.
    e.res.value, e.res.err = f(key)
    // Broadcast the ready condition.
    close(e.ready)
}

func (e *entry) deliver(response chan<- result) {
    // Wait for the ready condition.
    <-e.ready
    // Send the result to the client.
    response <- e.res
}
The Get method creates a response channel, puts it into a request, sends the request to the monitor goroutine, and then immediately reads from its own response channel.
The monitor goroutine (the server method) keeps reading from the requests channel until the channel is closed. For each request it first consults the cache, and if the key is not found it creates and inserts a new entry:
The monitor goroutine creates an entry, puts it into the cache, and then calls go e.call(f, req.key) to start a goroutine that computes the result and closes the ready channel. At the same time it calls go e.deliver(req.response), which waits for the ready channel to be closed and then sends the result into the response channel;
If the monitor goroutine finds the key already in the cache, the entry it looked up either already has a closed ready channel or will have one once the computation finishes, so the go e.deliver(req.response) it starts can deliver the result into the response channel as soon as it is ready.
Comparing the two approaches: in the first, callers invoke Get directly and Get itself is responsible for the concurrency control; in the second, Get sends a request into a channel and then waits for the answer on its response channel, while the actual concurrency control is implemented by the monitor goroutine.
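As a minimal usage sketch of the monitor-goroutine version (assuming the Memo, New, Get, and Close shown above are in package main, with a hypothetical slow function standing in for the real workload): note that Close must be called so the monitor goroutine's range loop terminates.

package main

import (
    "fmt"
    "sync"
    "time"
)

func main() {
    slow := func(key string) (interface{}, error) {
        time.Sleep(100 * time.Millisecond) // simulate heavy work
        return "result for " + key, nil
    }

    m := New(slow)
    defer m.Close() // let the monitor goroutine exit

    var wg sync.WaitGroup
    for _, key := range []string{"a", "b", "a"} {
        wg.Add(1)
        go func(key string) {
            defer wg.Done()
            v, err := m.Get(key)
            fmt.Println(key, v, err)
        }(key)
    }
    wg.Wait()
}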