
nn.PReLU(planes)

2022-06-12 20:58:00 Human high quality Algorithm Engineer

The PReLU activation function and its internal source-code implementation:

import torch
from torch import Tensor
from torch.nn import Module, Parameter, functional as F
from torch.overrides import has_torch_function, handle_torch_function

class PReLU(Module):
    def __init__(self, num_parameters: int = 1, init: float = 0.25) -> None:
        self.num_parameters = num_parameters
        super(PReLU, self).__init__()
        # one learnable slope per parameter, initialized to `init` (0.25 by default)
        self.weight = Parameter(torch.Tensor(num_parameters).fill_(init))

    def forward(self, input: Tensor) -> Tensor:
        return F.prelu(input, self.weight)

def prelu(input, weight):
    # type: (Tensor, Tensor) -> Tensor
    r"""prelu(input, weight) -> Tensor
    Applies element-wise the function
    :math:`\text{PReLU}(x) = \max(0,x) + \text{weight} * \min(0,x)`,
    where weight is a learnable parameter.
    See :class:`~torch.nn.PReLU` for more details.
    """
    if not torch.jit.is_scripting():
        if type(input) is not Tensor and has_torch_function((input,)):
            return handle_torch_function(prelu, (input,), input, weight)
    return torch.prelu(input, weight)
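
As a quick sanity check (not part of the original post; it assumes a standard PyTorch install), the following sketch compares F.prelu against the formula from the docstring, max(0, x) + weight * min(0, x):

import torch
import torch.nn.functional as F

x = torch.randn(2, 3, 4, 4)          # example NCHW input with 3 channels
weight = torch.full((3,), 0.25)      # one slope per channel, as in the default init

out_builtin = F.prelu(x, weight)

# manual evaluation of the docstring formula; the weight is reshaped so it
# broadcasts over the channel dimension
w = weight.view(1, -1, 1, 1)
out_manual = torch.clamp(x, min=0) + w * torch.clamp(x, max=0)

print(torch.allclose(out_builtin, out_manual))  # expected: True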

Pasting the formula from the docstring into Typora renders it in the following form:
[Figure: the formula rendered as PReLU(x) = max(0, x) + weight · min(0, x)]
Here weight is a learnable parameter. When nn.PReLU(planes) is called, only the number of channels is passed in, and the implementation initializes the weight parameter to 0.25.
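
A minimal usage sketch (the channel count 64 is just an illustrative value, not from the original post) showing that nn.PReLU(planes) creates one slope per channel, each starting at 0.25:

import torch
import torch.nn as nn

planes = 64                        # example number of channels
act = nn.PReLU(planes)

print(act.weight.shape)            # torch.Size([64]) -- one learnable slope per channel
print(act.weight.data.unique())    # tensor([0.2500]) -- all slopes initialized to 0.25

x = torch.randn(1, planes, 8, 8)
y = act(x)                         # slopes broadcast over the channel dimension
print(y.shape)                     # torch.Size([1, 64, 8, 8])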

This is consistent with the PReLU formula we are familiar with:
[Figure: the standard PReLU definition, PReLU(x_i) = x_i for x_i > 0 and PReLU(x_i) = a_i · x_i for x_i ≤ 0, with a learnable coefficient a_i]
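
The two forms agree case by case; expanding the max/min expression (a short check added here, not in the original post):

\max(0, x) + a \min(0, x) =
\begin{cases}
x + a \cdot 0 = x, & x \ge 0 \\
0 + a \cdot x = a x, & x < 0
\end{cases}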
