Loss Functions and Positive/Negative Sample Assignment in Object Detection: RetinaNet and Focal Loss
2022-07-07 05:53:00 【cartes1us】
RetinaNet
RetinaNet was the first single-stage detector in object detection whose accuracy surpassed that of two-stage algorithms.
Network structure:
The network architecture itself is not especially novel; as the paper puts it:
The design of our RetinaNet detector shares many similarities with
previous dense detectors, in particular the concept of ‘anchors’
introduced by RPN [3] and use of feature pyramids as in SSD [9] and
FPN [4]. We emphasize that our simple detector achieves top results
not based on innovations in network design but due to our novel loss.
The detection head decouples classification from BBox regression, and the detector is anchor-based. The FPN outputs feature maps at five scales; each level carries anchors with base sizes from 32 to 512, and each level combines scales and ratios into 9 anchor types, so anchor sizes across the whole network span roughly 32 to 813. Predicted offsets relative to the anchor box are decoded into a BBox the same way as in Faster R-CNN. The figure below is taken from the "thunderbolt" tutorial.
The structure diagram in the paper is shown below; it draws only three of the FPN scales. W, H, K, and A denote the feature map width, the feature map height, the number of classes (excluding the background class), and the number of anchors per location (9), respectively.
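As a concrete sketch of the anchor configuration described above: the base sizes (32–512 over P3–P7), the three sub-octave scales, and the three aspect ratios follow the paper, but the helper names below are my own, not from any official implementation.

```python
import itertools

BASE_SIZES = [32, 64, 128, 256, 512]           # one base size per pyramid level P3..P7
SCALES = [2 ** 0, 2 ** (1 / 3), 2 ** (2 / 3)]  # sub-octave scale multipliers
RATIOS = [0.5, 1.0, 2.0]                       # height/width aspect ratios

def anchor_shapes(base):
    """Return the 9 (width, height) pairs for one pyramid level.

    Each anchor keeps the area (base * scale)^2 while varying the aspect
    ratio, so sqrt(w * h) == base * scale for every ratio.
    """
    shapes = []
    for scale, ratio in itertools.product(SCALES, RATIOS):
        area = (base * scale) ** 2
        w = (area / ratio) ** 0.5
        h = w * ratio
        shapes.append((w, h))
    return shapes
```

The smallest anchor scale is 32 (on P3) and the largest is 512 * 2^(2/3) ≈ 813 (on P7), which is where the 32–813 range quoted above comes from.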
Positive and negative sample matching
Positive samples: anchors whose IoU with a ground-truth box is >= 0.5.
Negative samples: anchors whose maximum IoU with every ground-truth box is < 0.4.
All other anchors (0.4 <= IoU < 0.5) are discarded from the loss.
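The matching rule above can be sketched as follows; the thresholds come from the paper, while the function and variable names are my own.

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def assign_anchors(anchors, gt_boxes, pos_thr=0.5, neg_thr=0.4):
    """Label each anchor 1 (positive), 0 (negative), or -1 (ignored)."""
    labels = []
    for anchor in anchors:
        best = max((iou(anchor, gt) for gt in gt_boxes), default=0.0)
        if best >= pos_thr:
            labels.append(1)
        elif best < neg_thr:
            labels.append(0)
        else:
            labels.append(-1)   # 0.4 <= IoU < 0.5: excluded from the loss
    return labels
```

Note that the assignment is computed on the anchors themselves, before any regression is applied.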
Foreground/background class imbalance
A variant of the CE loss
The biggest innovation of this work is Focal Loss, a rewrite of the classic cross-entropy loss applied to the classification subnet branch. It sharply down-weights the loss contributed by easily classified samples, and it has an elegant form. The paper recommends γ = 2; with γ = 0, FL degenerates to CE.
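For the binary case the paper's formulation is FL(p_t) = -α_t (1 - p_t)^γ log(p_t), where p_t is the predicted probability of the true class. A minimal sketch (function name my own):

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Focal loss for one sample.

    p: predicted probability of the positive class; y: label, 0 or 1.
    With gamma=0 and alpha=1 this reduces to plain cross entropy.
    """
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

An easy sample (p_t close to 1) is scaled down by the (1 - p_t)^γ factor, which is exactly how the flood of well-classified background anchors is kept from dominating the loss.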
Loss:
The classification loss is the Focal Loss computed over all samples (both positive and negative), divided by the number of positive samples N_pos. The BBox regression loss is the smooth L1 loss proposed in Fast R-CNN.
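Putting the two terms together might look like the sketch below. The helper names are my own; normalizing the regression term by N_pos as well follows common implementations rather than anything stated explicitly above.

```python
def smooth_l1(x, beta=1.0):
    """Smooth L1 as in Fast R-CNN: quadratic near zero, linear elsewhere."""
    return 0.5 * x * x / beta if abs(x) < beta else abs(x) - 0.5 * beta

def total_loss(cls_losses, labels, reg_errors, beta=1.0):
    """Combine classification and regression terms.

    cls_losses: per-anchor focal-loss values.
    labels: 1 positive, 0 negative, -1 ignored (excluded from the sum).
    reg_errors: regression residuals for the positive anchors only.
    """
    n_pos = max(1, sum(1 for l in labels if l == 1))  # guard against 0 positives
    cls = sum(fl for fl, l in zip(cls_losses, labels) if l != -1) / n_pos
    reg = sum(smooth_l1(e, beta) for e in reg_errors) / n_pos
    return cls + reg
```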
To be continued