🏅 Top 5% in the 2nd Research & Development Special Zone AI Competition (AI SPARK Challenge)

Overview

AI_SPARK_CHALLENG_Object_Detection

The 2nd Research & Development Special Zone AI Competition: AI SPARK Challenge

🏅 Top 5% in mAP(0.75) (13th place out of 443 participants, mAP: 0.98116)

Competition Description

  • Object detection of livestock (pig, cow) in an edge environment
  • The goal is to develop a lightweight model that can run on edge devices usable in the field (e.g., a Jetson Nano board).
  • The weight file size is limited to 100MB.
  • Submissions whose weight file is 100MB or less are ranked by mAP (IoU 0.75).
  • Every step of this work is run and reproduced in a Colab Pro environment.

Hardware

  • Colab Pro (P100 or T4)

Data

  • Livestock behavior video dataset provided by AI Hub (download link)
  • [원천]소_bbox.zip: cow image files
  • [라벨]소_bbox.zip: cow annotation files
  • [원천]돼지_bbox.zip: pig image files
  • [라벨]돼지_bbox.zip: pig annotation files
  • Note that the "categories" values in the annotation files and the "category_id" of each annotation entry are unrelated to the cow/pig classes, so using them directly can produce wrong labels (a sketch of one workaround follows this list).
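
A minimal sketch of one way to handle this, assuming the annotations follow a COCO-style layout (an "annotations" list whose entries carry a "bbox"); the exact AI Hub field names may differ. The class id is fixed by which archive the file came from, and "category_id" is never read.

import json
from pathlib import Path

# The class id is decided by the source archive, not by "category_id".
CLASS_FROM_SOURCE = {"소": 0, "돼지": 1}  # cow, pig

def load_boxes(json_path, source):
    """Return (class_id, bbox) pairs with the boxes still in pixel units."""
    class_id = CLASS_FROM_SOURCE[source]
    data = json.loads(Path(json_path).read_text(encoding="utf-8"))
    # "category_id" inside the annotation entries is deliberately ignored.
    return [(class_id, ann["bbox"]) for ann in data.get("annotations", [])]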

Code

+- data (.gitignore) => only the zip files from AI Hub are placed here initially; the other archives are generated by the notebooks in the EDA folder
|   +- [라벨]돼지_bbox.zip
|   +- [라벨]소_bbox.zip
|   +- [원천]돼지_bbox.zip
|   +- [원천]소_bbox.zip
|   +- Train_Dataset.tar (created by EDA - Make_Dataset_Multilabel.ipynb)
|   +- Valid_Dataset.tar (created by EDA - Make_Dataset_Multilabel.ipynb)
|   +- Train_Dataset_Full.tar (created by EDA - Make_Dataset_Full.ipynb)
|   +- Train_Dataset_mini.tar (created by EDA - Make_Dataset_Mini.ipynb)
|   +- Valid_Dataset_mini.tar (created by EDA - Make_Dataset_Mini.ipynb)
|   +- plus_image.tar (created by EDA - Data_Augmentation.ipynb)
|   +- plus_lable.tar (created by EDA - Data_Augmentation.ipynb)
+- data_test (.gitignore) => test data used at inference time (downloaded from AI Hub)
|   +- [원천]돼지_bbox.zip
|   +- [원천]소_bbox.zip
+- trained_model (.gitignore) => trained weights
|   +- m6_pretrained_full_b10_e20_hyp_tuning_v1_linear.pt
+- EDA
|   +- Data_Augmentation.ipynb (builds the Plus (silver) dataset)
|   +- Data_Checking.ipynb (error analysis)
|   +- EDA.ipynb
|   +- Make_Dataset_Multilabel.ipynb (builds the Train / Valid datasets)
|   +- Make_Dataset_Full.ipynb (builds the Train + Valid dataset)
|   +- Make_Dataset_Mini.ipynb (builds the Train mini / Valid mini datasets)
+- hyp
|   +- experiment_hyp_v1.yaml (final hyperparameters)
+- exp
|   +- hyp_train.py (modified as shown in the Core Code section below; used to run the tuning experiments)
|   +- YOLOv5_hp_search_lr_momentum.ipynb (hyperparameter tuning on the mini dataset)
+- train
|   +- YOLOv5_ExpandDataset_hp_tune.ipynb (training with the Plus (silver) dataset)
|   +- YOLOv5_FullDataset_hp_tune.ipynb (produces the final model)
|   +- YOLOv5_MultiLabelSplit.ipynb (initial training code)
+- YOLOv5_inference.ipynb
+- answer.csv (final submission csv)

Core Strategy

  • YOLOv5m6 pretrained model (68.3MB)
  • MultiLabelStratified KFold (box count, class, box ratio, box size)
  • Hyperparameter tuning with a genetic algorithm (GA)
  • Data augmentation guided by error analysis
  • Inference tuning (IoU threshold, confidence threshold)

EDA

Details

Cow Dataset vs Pig Dataset

                  PIG     COW
Number of images  4303    12152
  • The data is distributed roughly as Cow : Pig = 3 : 1
  • The train/valid split is made so that both classes stay evenly distributed

Image size distribution

Image size   PIG    COW
1920x1080    3131   12152
1280x960     1172   0
  • Most images are 1920x1080
  • Some of the pig images are 1280x960
  • When converting box coordinates, use each image's actual size (see the sketch below)
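
A minimal sketch of that conversion, assuming COCO-style [x_min, y_min, width, height] boxes in pixels (the function name is illustrative): YOLO labels expect (x_center, y_center, width, height) normalized by the image's own width and height, so 1280x960 pig images must not be divided by 1920x1080.

def coco_to_yolo(bbox, img_w, img_h):
    """Convert a pixel-space [x_min, y_min, w, h] box to normalized YOLO format."""
    x, y, w, h = bbox
    return ((x + w / 2) / img_w, (y + h / 2) / img_h, w / img_w, h / img_h)

# The same pixel box maps to different normalized coordinates per image size:
print(coco_to_yolo([100, 200, 300, 150], 1920, 1080))  # 1920x1080 image
print(coco_to_yolo([100, 200, 300, 150], 1280, 960))   # 1280x960 pig image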

Distribution by number of boxes per image

(figure: distribution of box counts per image, pig vs cow)

  • The pig and cow data have quite different box-count distributions
  • The train/valid split is stratified so that images with different box counts are evenly distributed

Distribution by box aspect ratio

(figure: distribution of box aspect ratios, pig vs cow)

  • The box aspect ratios are similar for the pig and cow data
  • The train/valid split is stratified so that images are also evenly distributed by box aspect ratio

Distribution by box size

(figure: distribution of box sizes, pig vs cow)

  • In both the pig and cow data, small bounding boxes (area 1000~10000) are relatively rare
  • Should the small bounding boxes be removed? => a matter of choice (they are kept in this work)

A closer look at the small bounding boxes

(figure: detailed distribution of the small bounding boxes)

Boxes with area ≤ 4000   PIG     COW
Count                    137     71
Ratio                    0.003   0.0018
  • There are 137 pig boxes and 71 cow boxes with area ≤ 4000
  • Relative to the full dataset this is 0.003 and 0.0018, i.e., 0.3% and 0.18%
  • Should boxes with area ≤ 4000 be removed? => a matter of choice (they are kept in this work)

Images without any boxes

Images without boxes   PIG   COW
Count                  0     3
  • Only the cow data contains such images, 3 in total
  • They are judged to be white noise and are not removed.

Model

  • The YOLOv5m6 pretrained model is used
  • It was chosen among the YOLOv5-family pretrained models whose weights stay under 100MB
               YOLOv5l Pretrained   YOLOv5m6 w/o Pretrained   YOLOv5m6 Pretrained
mAP@0.5        0.9806               0.9756                    0.9838
mAP@0.5:0.95   0.9002               0.8695                    0.9156
  • The YOLOv5m6 pretrained model was selected as the final model

MultiLabelStratified KFold

  • The amount of data differs between PIG and COW
  • The number of boxes per image also differs
  • The train/valid split is stratified on these two image-level labels (see the sketch after the table below)
Cow-Many Cow-Medium Cow-Little Pig-Many Pig-Medium Pig-Little
Train 2739 1097 5886 2190 827 425
Valid 674 259 1497 559 221 81
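
A minimal sketch of such a split, assuming the iterative-stratification package (pip install iterative-stratification); the box-count cutoffs for the little/medium/many buckets below are illustrative, not the exact ones used here.

import numpy as np
from iterstrat.ml_stratifiers import MultilabelStratifiedKFold

def split_train_valid(image_ids, classes, box_counts, n_splits=5, seed=42):
    """classes: 0 = cow, 1 = pig per image; box_counts: number of boxes per image."""
    buckets = np.digitize(box_counts, bins=[6, 16])      # 0=little, 1=medium, 2=many (illustrative cutoffs)
    y = np.zeros((len(image_ids), 5), dtype=int)         # columns: [cow, pig, little, medium, many]
    y[np.arange(len(image_ids)), np.asarray(classes)] = 1
    y[np.arange(len(image_ids)), 2 + buckets] = 1
    mskf = MultilabelStratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    train_idx, valid_idx = next(mskf.split(np.asarray(image_ids), y))
    return train_idx, valid_idx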

HyperParameter Tuning

  • Hyperparameter tuning with the genetic algorithm that YOLOv5 ships by default (--evolve)
  • Because of the Colab Pro runtime limits, a mini dataset (50% of the data) was built and the search was broken into separate runs over small groups of hyperparameters (e.g., lr0 / lrf / momentum, as in the code below)

Core Code Changes

Details
meta = {'lr0': (1, 1e-5, 1e-1),  # initial learning rate (SGD=1E-2, Adam=1E-3)
        'lrf': (1, 0.01, 1.0),  # final OneCycleLR learning rate (lr0 * lrf)
        'momentum': (0.3, 0.6, 0.98),  # SGD momentum/Adam beta1
        }

with open(opt.hyp, errors='ignore') as f:
    hyp = yaml.safe_load(f)  # load hyps dict
    if 'anchors' not in hyp:  # anchors commented in hyp.yaml
        hyp['anchors'] = 3

# Keep only the hyperparameters we actually want to evolve in new_hyp
new_hyp = {}
for k, v in hyp.items():
    if k in meta.keys():
        new_hyp[k] = v

opt.noval, opt.nosave, save_dir = True, True, Path(opt.save_dir)  # only val/save final epoch
# ei = [isinstance(x, (int, float)) for x in hyp.values()]  # evolvable indices
evolve_yaml, evolve_csv = save_dir / 'hyp_evolve.yaml', save_dir / 'evolve.csv'
if opt.bucket:
    os.system(f'gsutil cp gs://{opt.bucket}/evolve.csv {save_dir}')  # download evolve.csv if exists

for _ in range(opt.evolve):  # generations to evolve
    if evolve_csv.exists():  # if evolve.csv exists: select best hyps and mutate
        # Select parent(s)
        parent = 'single'  # parent selection method: 'single' or 'weighted'
        x = np.loadtxt(evolve_csv, ndmin=2, delimiter=',', skiprows=1)
        n = min(5, len(x))  # number of previous results to consider
        x = x[np.argsort(-fitness(x))][:n]  # top n mutations
        w = fitness(x) - fitness(x).min() + 1E-6  # weights (sum > 0)
        if parent == 'single' or len(x) == 1:
            # x = x[random.randint(0, n - 1)]  # random selection
            x = x[random.choices(range(n), weights=w)[0]]  # weighted selection
        elif parent == 'weighted':
            x = (x * w.reshape(n, 1)).sum(0) / w.sum()  # weighted combination

        # Mutate
        mp, s = 0.8, 0.2  # mutation probability, sigma
        npr = np.random
        npr.seed(int(time.time()))
        # Load the mutation gains only for the hyperparameters kept in new_hyp
        g = np.array([meta[k][0] for k in new_hyp.keys()])  # gains 0-1
        ng = len(meta)
        v = np.ones(ng)
        while all(v == 1):  # mutate until a change occurs (prevent duplicates)
            v = (g * (npr.random(ng) < mp) * npr.randn(ng) * npr.random() * s + 1).clip(0.3, 3.0)
        for i, k in enumerate(hyp.keys()):  # plt.hist(v.ravel(), 300)
            if k in new_hyp.keys():  # update only the hyperparameters present in new_hyp
                hyp[k] = float(x[i + 7] * v[i])  # mutate (lr0/lrf/momentum are the first keys in hyp.yaml, so v[i] stays in range)

    # Constrain to limits
    for k, v in meta.items():
        hyp[k] = max(hyp[k], v[1])  # lower limit
        hyp[k] = min(hyp[k], v[2])  # upper limit
        hyp[k] = round(hyp[k], 5)  # significant digits

    # Train mutation
    results = train(hyp.copy(), opt, device, callbacks)
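
A hedged usage sketch: hyp_train.py keeps train.py's command-line interface, so a search run on the mini dataset would look roughly like the following, assuming the script is run from inside the YOLOv5 repo. The data yaml, image size, batch size, hyp file, and number of generations are illustrative, and the exact form of --evolve depends on the YOLOv5 version.

!python hyp_train.py --img 1280 --batch-size 10 --epochs 10 \
    --data cow_pig_mini.yaml --weights yolov5m6.pt \
    --hyp data/hyps/hyp.scratch-low.yaml --evolve 30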

Default vs Tuned Hyperparameters

  • Performance varies widely with the obj, box, and cls loss-weight hyperparameters (NOTE: due to training-environment constraints, the number of epochs differs between the comparison tables, so the absolute numbers differ; use them for relative comparison only)
               Default   Tuned
obj_loss       0.023     0.003
box_loss       0.0095    0.0038
cls_loss       0.00003   0.00001
               Default   Tuned
mAP@0.5        0.9826    0.9824
mAP@0.5:0.95   0.8924    0.9016
  • Optimizer
               Adam     AdamW    SGD
mAP@0.5        0.9635   0.9804   0.9848
mAP@0.5:0.95   0.8302   0.8994   0.914

Final Hyperparameters

optimizer: SGD
lr_scheduler: linear
lr0: 0.009
lrf: 0.08
momentum: 0.94
weight_decay: 0.001
warmup_epochs: 0.11
warmup_momentum: 0.77
warmup_bias_lr: 0.0004
box: 0.02
cls: 0.2
cls_pw: 0.95
obj: 0.2
obj_pw: 0.5
iou_t: 0.2
anchor_t: 4.0
fl_gamma: 0.0
hsv_h: 0.009
hsv_s: 0.1
hsv_v: 0.9
degrees: 0.0
translate: 0.1
scale: 0.5
shear: 0.0
perspective: 0.0
flipud: 0.0095
fliplr: 0.1
mosaic: 1.0
mixup: 0.0
copy_paste: 0.0

Error Analysis

Training Results

Data size    Train   Valid
PIG          3442    881
COW          9722    2430
Validation results   Labels   Precision   Recall   mAP@0.5   mAP@0.5:0.95
PIG                  3291     0.984       0.991    0.993     0.928
COW                  3291     0.929       0.911    0.974     0.889
  • As the tables show, there is more cow data than pig data.
  • The YOLOv5 pretrained model has already seen cow images through the COCO dataset.
  • Despite these two advantages, the model still struggles more with cow detection.

Box Counts and Plotting

Number of boxes

(figure: number of boxes per image in the train/valid sets)

Train - Bounding Box Plotting

(figure: train images plotted with their ground-truth bounding boxes)

Valid - Bounding Box Plotting

(figure: valid images plotted with their ground-truth bounding boxes)

Error Analysis Findings

  • Overall, the cow dataset contains relatively few bounding boxes per image.
  • Plotting the images shows that many cow images are not fully labeled.
    • This can inflate false positives: the model predicts a cow where no label exists, so a correct detection is counted as an error.
  • Based on this, a silver dataset is built and used for retraining:
    • Predict bounding boxes on the cow images with the already trained model.
    • Use those predictions as additional training data.

Data Augmentation with Silver Dataset

  • The YOLOv5m6 model pretrained and trained on the Full dataset (Train + Valid) is reused, i.e., the model already trained on the original data
  • Detection is run on all 12151 cow images (IoU threshold: 0.7, confidence threshold: 0.05)

Visualization of predicted bounding-box counts

(figure: distribution of predicted bounding-box counts on the cow images)

  • Based on the visualization above, only images with 4 or more predicted bounding boxes were kept (the author's own cutoff)
  • In total, 6628 cow images were added as the silver dataset (a sketch of the labeling step follows)
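
A hedged sketch of the silver-labeling step (the actual code lives in EDA - Data_Augmentation.ipynb): YOLOv5's detect.py writes one label file per image when --save-txt is set, and only images with four or more predicted boxes are kept. The source folder, image size, and output path are illustrative.

!python detect.py --weights trained_model/m6_pretrained_full_b10_e20_hyp_tuning_v1_linear.pt \
    --source cow_images/ --img 1280 --conf-thres 0.05 --iou-thres 0.7 --save-txt --save-conf

from pathlib import Path

label_dir = Path("runs/detect/exp/labels")   # default location of --save-txt output
silver = [p for p in label_dir.glob("*.txt") if len(p.read_text().splitlines()) >= 4]
print(len(silver), "cow images kept for the silver dataset")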

Results

Final Model

  • Dataset: trained on Train + Valid
  • YOLOv5m6 pretrained model
  • Hyperparameter tuning (see the table in the Hyperparameter Tuning section above)
  • Inference tuning (IoU threshold: 0.68, confidence threshold: 0.001); a sketch follows the tables below
Silver dataset comparison            mAP@0.75
Final model (w/o silver dataset)     0.98116
Plus model (w/ silver dataset)       0.97965
Full vs Split comparison   mAP@0.5   mAP@0.5:0.95
Full (Train + Valid)       0.9858    0.9271
Split (Train only)         0.9845    0.9215
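
A hedged sketch of the inference-tuning step through the torch.hub interface (the actual submission pipeline is in YOLOv5_inference.ipynb); the sample image path and inference size are illustrative.

import torch

model = torch.hub.load("ultralytics/yolov5", "custom",
                       path="trained_model/m6_pretrained_full_b10_e20_hyp_tuning_v1_linear.pt")
model.conf = 0.001   # confidence threshold used for the final submission
model.iou = 0.68     # NMS IoU threshold used for the final submission

results = model(["data_test/sample_cow.jpg"], size=1280)   # illustrative test image
print(results.pandas().xyxy[0])   # class, confidence, and box coordinates per detection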

Things Tried That Fell Short

Knowledge Distillation

  • 1-stage-to-1-stage knowledge distillation
  • A stronger 1-stage teacher was sought, but even YOLOv5x6 only reached mAP@0.5: 0.9821 / mAP@0.5:0.95: 0.939, not a large improvement.
  • In other words, there was little to gain from using it as a teacher model.

Retrospective

  • Pretrained model
    • What do the cow images in the COCO dataset actually look like?
    • Since pigs (absent from COCO) were detected well anyway, would training without the pretrained weights but with more epochs have given better results?
  • Silver dataset
    • Would optimizing the IoU and confidence thresholds used to build the silver dataset have improved performance?
    • If the test dataset itself is not labeled consistently, could that alone explain why the silver dataset did not help?
  • MultiLabelStratified split
    • What about also stratifying by bounding-box ratio and size?
    • Moreover, each image contains boxes with different ratios and sizes; how could such per-box attributes be turned into image-level labels for a multi-label split?
  • As a surer approach, would manually relabeling the cow images in the training set have improved performance?

Future Work

  • Study how to stratify the MultiLabelStratified split by the ratio and size of each image's bounding boxes
  • Add background images: adding images with no objects to detect is said to reduce false positives
  • Apply more advanced hyperparameter tuning methods (e.g., Bayesian optimization)
  • Check whether building a silver dataset for the training data and training on it additionally (Train gold + Train silver) improves performance
  • Find out whether SGD beating AdamW for object detection is only an empirical observation or is backed by published research
  • Try pruning and tensor decomposition
  • For knowledge distillation in object detection, look into 2-stage-to-1-stage methods