NeuralCF: Neural Collaborative Filtering Network
2022-07-29 06:46:00 【yc_ ZZ】
NeuralCF
1. Overview
NeuralCF combines collaborative filtering (CF) with deep learning.
Review of matrix factorization

Background of the model
The structure of the matrix factorization model is relatively simple: the "similarity" between a user and an item is obtained as the inner product of the user latent vector and the item latent vector, where similarity here means the predicted score. Because such an output layer cannot effectively fit the optimization objective, the model is prone to underfitting (i.e., it does not fit the training data well). The NeuralCF model was proposed on this basis.
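To make the matrix-factorization scoring concrete, here is a minimal sketch (all sizes and names are illustrative, not from the original post) of predicting a score as the inner product of user and item latent vectors:

```python
import torch

# Illustrative latent factors: 4 users, 5 items, latent dimension 3
torch.manual_seed(0)
user_factors = torch.randn(4, 3)   # P: user latent matrix
item_factors = torch.randn(5, 3)   # Q: item latent matrix

def mf_score(u: int, i: int) -> torch.Tensor:
    """Predicted score = inner product of the two latent vectors."""
    return torch.dot(user_factors[u], item_factors[i])

# The full predicted score matrix is simply P @ Q^T
ratings = user_factors @ item_factors.T   # shape (4, 5)
```

NeuralCF's criticism is precisely that this fixed inner product is the only interaction between the two vectors.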
2. Structure of the original NeuralCF model
NeuralCF replaces the simple inner-product operation of the matrix factorization model with "multilayer neural network + output layer".
Its advantages are:
1. The user vector and the item vector interact more fully, producing more valuable feature-combination information.
2. More nonlinearity is introduced, making the model more expressive.
3. Introduction of the hybrid model
GMF (Generalized Matrix Factorization)

Here the user latent vector is p_u = P^T v^U_u, where v^U_u is the one-hot feature vector of user u and P is the user embedding matrix; the item latent vector is constructed in the same way.
import torch.nn as nn

class GMF(nn.Module):
    def __init__(self, user_num, item_num, factor_num):
        super(GMF, self).__init__()
        # Separate embedding tables for users and items
        self.embed_user_GMF = nn.Embedding(user_num, factor_num)
        self.embed_item_GMF = nn.Embedding(item_num, factor_num)
        # A learnable linear output layer replaces the plain inner product
        self.predict_layer = nn.Linear(factor_num, 1)
        self._init_weight_()

    def _init_weight_(self):
        # Small random initialization for both embedding tables
        nn.init.normal_(self.embed_item_GMF.weight, std=0.01)
        nn.init.normal_(self.embed_user_GMF.weight, std=0.01)

    def forward(self, user, item):
        embed_user_GMF = self.embed_user_GMF(user)
        embed_item_GMF = self.embed_item_GMF(item)
        # Element-wise (Hadamard) product of the two embeddings
        output_GMF = embed_user_GMF * embed_item_GMF
        prediction = self.predict_layer(output_GMF)
        return prediction.view(-1)
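A quick forward pass through GMF can be exercised as follows (the class is re-stated in condensed form so the snippet is self-contained; all sizes and ids are arbitrary):

```python
import torch
import torch.nn as nn

# Condensed copy of the GMF model above, for a self-contained demo
class GMF(nn.Module):
    def __init__(self, user_num, item_num, factor_num):
        super().__init__()
        self.embed_user = nn.Embedding(user_num, factor_num)
        self.embed_item = nn.Embedding(item_num, factor_num)
        self.predict_layer = nn.Linear(factor_num, 1)

    def forward(self, user, item):
        # Hadamard product of the embeddings, then a linear output layer
        return self.predict_layer(self.embed_user(user) * self.embed_item(item)).view(-1)

model = GMF(user_num=100, item_num=200, factor_num=8)
users = torch.tensor([0, 5, 99])     # a batch of user ids
items = torch.tensor([10, 20, 199])  # the paired item ids
scores = model(users, items)         # one predicted score per (user, item) pair
```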
MLP (Multilayer Perceptron)
Typically, each layer is fully connected to the next: the output of every neuron in one layer becomes an input to the neurons in the next layer. An MLP has at least three layers of neurons, as shown in the figure below.
The formula is as follows, where p_u and q_i are the user and item latent vectors, a(·) is the activation function, and σ is the sigmoid:
z_1 = φ_1(p_u, q_i) = [p_u; q_i]
φ_l(z_{l-1}) = a(W_l^T z_{l-1} + b_l),  l = 2, …, L
ŷ_{ui} = σ(h^T φ_L(z_{L-1}))
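The MLP tower can be sketched in PyTorch as follows (layer widths and names are illustrative, not from the original post): embeddings are concatenated rather than multiplied, then passed through fully connected layers with ReLU activations.

```python
import torch
import torch.nn as nn

class MLPTower(nn.Module):
    def __init__(self, user_num, item_num, factor_num, hidden=(32, 16, 8)):
        super().__init__()
        self.embed_user = nn.Embedding(user_num, factor_num)
        self.embed_item = nn.Embedding(item_num, factor_num)
        # Stack of fully connected layers; input is the concatenation
        # of the two embeddings, hence factor_num * 2
        layers, in_dim = [], factor_num * 2
        for h in hidden:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        self.mlp = nn.Sequential(*layers)
        self.predict_layer = nn.Linear(in_dim, 1)

    def forward(self, user, item):
        # Concatenate (do not multiply) the user and item embeddings
        x = torch.cat([self.embed_user(user), self.embed_item(item)], dim=-1)
        return self.predict_layer(self.mlp(x)).view(-1)

model = MLPTower(user_num=50, item_num=60, factor_num=8)
out = model(torch.tensor([1, 2]), torch.tensor([3, 4]))  # shape (2,)
```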
4. The NeuralCF hybrid model
To make the fused model more flexible, GMF and MLP are allowed to learn independent embeddings, and the two models are combined by concatenating the outputs of their last hidden layers.
The formula is as follows, where p^G_u, q^G_i are the GMF embeddings, p^M_u, q^M_i are the MLP embeddings, and ⊙ is the element-wise product:
φ^{GMF} = p^G_u ⊙ q^G_i
φ^{MLP} = a_L(W_L^T(… a_2(W_2^T [p^M_u; q^M_i] + b_2) …) + b_L)
ŷ_{ui} = σ(h^T [φ^{GMF}; φ^{MLP}])
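The fusion described above can be sketched as follows (a minimal illustration, with arbitrary sizes and names): each branch keeps its own embedding tables, and the GMF output and last MLP hidden output are concatenated before a single prediction layer.

```python
import torch
import torch.nn as nn

class NeuMF(nn.Module):
    def __init__(self, user_num, item_num, factor_num, mlp_hidden=(32, 16)):
        super().__init__()
        # GMF branch with its own embeddings
        self.gmf_user = nn.Embedding(user_num, factor_num)
        self.gmf_item = nn.Embedding(item_num, factor_num)
        # MLP branch with independent embeddings
        self.mlp_user = nn.Embedding(user_num, factor_num)
        self.mlp_item = nn.Embedding(item_num, factor_num)
        layers, in_dim = [], factor_num * 2
        for h in mlp_hidden:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        self.mlp = nn.Sequential(*layers)
        # Fusion: concatenate GMF output (factor_num) and MLP output (in_dim)
        self.predict_layer = nn.Linear(factor_num + in_dim, 1)

    def forward(self, user, item):
        gmf_out = self.gmf_user(user) * self.gmf_item(item)
        mlp_in = torch.cat([self.mlp_user(user), self.mlp_item(item)], dim=-1)
        mlp_out = self.mlp(mlp_in)
        fused = torch.cat([gmf_out, mlp_out], dim=-1)
        return self.predict_layer(fused).view(-1)

model = NeuMF(user_num=30, item_num=40, factor_num=8)
pred = model(torch.tensor([0, 1]), torch.tensor([2, 3]))  # shape (2,)
```

Keeping the two branches' embeddings independent lets each branch choose its own optimal embedding size, which is the flexibility the text refers to.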