[ICCV 2021] Learning A Single Network for Scale-Arbitrary Super-Resolution


ArbSR

PyTorch implementation of "Learning A Single Network for Scale-Arbitrary Super-Resolution", ICCV 2021

[Project] [arXiv]

Highlights

  • A plug-in module that extends a baseline SR network (e.g., EDSR and RCAN) to a scale-arbitrary SR network at a small additional computational and memory cost (a rough sketch of the idea follows this list).
  • Promising results for scale-arbitrary SR (both non-integer and asymmetric scale factors) while maintaining state-of-the-art performance for SR with integer scale factors.
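
A minimal sketch of the general idea behind scale-arbitrary upsampling, using F.grid_sample to resample features at an arbitrary (possibly asymmetric) scale. This is only an illustration; the actual plug-in module in this repository additionally predicts scale-aware filter weights and is defined in the released code.

import torch
import torch.nn.functional as F

def scale_arbitrary_upsample(feat, scale_h, scale_w):
    # Resample a feature map to an arbitrary (possibly asymmetric) size.
    # Illustrative only: the real ArbSR plug-in module also conditions its
    # filters on the scale factor, which is omitted here.
    b, c, h, w = feat.shape
    out_h, out_w = round(h * scale_h), round(w * scale_w)

    # Build a sampling grid over the output resolution in [-1, 1] coordinates.
    ys = torch.linspace(-1, 1, out_h, device=feat.device)
    xs = torch.linspace(-1, 1, out_w, device=feat.device)
    grid_y, grid_x = torch.meshgrid(ys, xs)        # each (out_h, out_w)
    grid = torch.stack((grid_x, grid_y), dim=-1)   # (out_h, out_w, 2), (x, y) order
    grid = grid.unsqueeze(0).repeat(b, 1, 1, 1)    # (b, out_h, out_w, 2)

    return F.grid_sample(feat, grid, mode='bilinear', align_corners=True)

# Example: upsample a 1x64x50x50 feature map by x2.2 (H) and x3.5 (W).
feat = torch.randn(1, 64, 50, 50)
print(scale_arbitrary_upsample(feat, 2.2, 3.5).shape)  # torch.Size([1, 64, 110, 175])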

Demo

gif

Motivation

Although recent CNN-based single image SR networks (e.g., EDSR, RDN, and RCAN) have achieved promising performance, they are developed for image SR with a single specific integer scale factor (e.g., x2, x3, x4). In real-world applications, non-integer SR (e.g., from 100x100 to 220x220) and asymmetric SR (e.g., from 100x100 to 220x420) are also necessary so that users can zoom in on an image arbitrarily for a better view of details.

Overview

overview

Requirements

  • Python 3.6
  • PyTorch == 1.1.0
  • numpy
  • skimage
  • imageio
  • cv2
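
To quickly verify the environment (illustrative only; the code targets PyTorch 1.1.0, but nearby 1.x versions may also work):

import torch, numpy, skimage, imageio, cv2
print(torch.__version__, numpy.__version__, skimage.__version__,
      imageio.__version__, cv2.__version__)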

Train

1. Prepare training data

1.1 Download DIV2K training data (800 training images) from DIV2K dataset or SNU_CVLab.

1.2 Change to ./utils and run gen_training_data.m in MATLAB to prepare HR/LR images in your_data_path as follows (a rough Python equivalent is sketched after the directory tree):

your_data_path
└── DIV2K
	├── HR
		├── 0001.png
		├── ...
		└── 0800.png
	└── LR_bicubic
		├── X1.10
			├── 0001.png
			├── ...
			└── 0800.png
		├── ...
		└── X4.00_X3.50
			├── 0001.png
			├── ...
			└── 0800.png
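
gen_training_data.m produces the bicubic LR images for all symmetric scale factors (e.g., X1.10) and asymmetric pairs (e.g., X4.00_X3.50). For reference, a rough Python equivalent of this step is sketched below. Note that cv2.resize bicubic differs slightly from MATLAB's imresize (no antialiasing), so the provided .m script should be preferred for reproducing the reported results; the assumption that the height scale comes first in the asymmetric folder name is inferred from the layout above.

import os, glob
import cv2

def gen_lr_images(data_path, scales):
    # Generate bicubic LR images mirroring the layout above (illustrative only).
    hr_dir = os.path.join(data_path, 'DIV2K', 'HR')
    for scale in scales:
        sh, sw = scale if isinstance(scale, tuple) else (scale, scale)
        # Symmetric scales go to X{s}; asymmetric pairs to X{sh}_X{sw} (assumed order: H, W).
        folder = f'X{sh:.2f}' if sh == sw else f'X{sh:.2f}_X{sw:.2f}'
        lr_dir = os.path.join(data_path, 'DIV2K', 'LR_bicubic', folder)
        os.makedirs(lr_dir, exist_ok=True)
        for hr_path in sorted(glob.glob(os.path.join(hr_dir, '*.png'))):
            hr = cv2.imread(hr_path)
            h, w = hr.shape[:2]
            lr = cv2.resize(hr, (round(w / sw), round(h / sh)),
                            interpolation=cv2.INTER_CUBIC)
            cv2.imwrite(os.path.join(lr_dir, os.path.basename(hr_path)), lr)

# Example: one symmetric scale (x1.10) and one asymmetric pair (x4.00, x3.50).
gen_lr_images('your_data_path', [1.10, (4.00, 3.50)])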

2. Begin to train

Run ./main.sh to train on the DIV2K dataset. Please set dir_data in the bash script to your_data_path.
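
Since the data layout above provides LR images for scale factors from x1.1 to x4.0 in steps of 0.1 (plus asymmetric pairs), each training batch can use a randomly sampled scale factor. Below is a minimal sketch of such per-batch scale sampling; the exact sampling strategy (range, step, and how often asymmetric pairs are drawn) is configured in the training scripts, and the values here are only assumptions.

import random

def sample_scale(min_scale=1.1, max_scale=4.0, step=0.1, p_asymmetric=0.5):
    # Sample a (scale_h, scale_w) pair for one training batch (illustrative).
    choices = [round(min_scale + i * step, 2)
               for i in range(int(round((max_scale - min_scale) / step)) + 1)]
    scale_h = random.choice(choices)
    # With probability p_asymmetric, draw an independent width scale.
    scale_w = random.choice(choices) if random.random() < p_asymmetric else scale_h
    return scale_h, scale_w

print(sample_scale())  # e.g. (2.3, 2.3) or (1.5, 3.7)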

Test

1. Prepare test data

1.1 Download benchmark datasets (e.g., Set5, Set14 and other test sets).

1.2 Change to ./utils and run gen_test_data.m in MATLAB to prepare HR/LR images in your_data_path as follows:

your_data_path
└── benchmark
	├── Set5
		├── HR
			├── baby.png
			├── ...
			└── woman.png
		└── LR_bicubic
			├── X1.10
				├── baby.png
				├── ...
				└── woman.png
			├── ...
			└── X4.00_X3.50
				├── baby.png
				├── ...
				└── woman.png
	├── Set14
	├── B100
	├── Urban100
	└── Manga109
		├── HR
			├── AisazuNihalrarenai.png
			├── ...
			└── YouchienBoueigumi.png
		└── LR_bicubic
			├── X1.10
				├── AisazuNihalrarenai.png
				├── ...
				└── YouchienBoueigumi.png
			├── ...
			└── X4.00_X3.50
				├── AisazuNihalrarenai.png
				├── ...
				└── YouchienBoueigumi.png

2. Begin to test

Run ./test.sh to test on benchmark datasets. Please set dir_data in the bash script to your_data_path.
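
For reference, SR benchmark results are commonly reported as PSNR on the Y channel after shaving a scale-sized border. A minimal sketch of that standard metric (not necessarily the exact evaluation code used by test.sh):

import numpy as np

def psnr_y(sr, hr, scale):
    # PSNR on the Y channel with a border crop (illustrative). sr and hr are
    # uint8 RGB arrays of the same shape; for asymmetric factors, shaving
    # ceil(max(scale_h, scale_w)) pixels is an assumption made here.
    def to_y(img):
        img = img.astype(np.float64)
        # RGB -> Y (ITU-R BT.601), as commonly used in SR evaluation.
        return (65.738 * img[..., 0] + 129.057 * img[..., 1]
                + 25.064 * img[..., 2]) / 256 + 16

    b = int(np.ceil(scale))
    diff = to_y(sr)[b:-b, b:-b] - to_y(hr)[b:-b, b:-b]
    return 10 * np.log10(255.0 ** 2 / np.mean(diff ** 2))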

Quick Test on An LR Image

Run ./quick_test.sh to enlarge an LR image to an arbitrary size. Please set dir_img in the bash script to your_img_path.
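
Conceptually, quick_test.sh loads a pretrained model and maps the LR image to the requested output size via per-axis scale factors. A rough sketch of arbitrary-size inference; the calling convention model(lr, scale_h, scale_w) below is hypothetical, and the real entry point and arguments are defined in this repository's scripts.

import torch
import imageio
import numpy as np

@torch.no_grad()
def upscale(model, img_path, out_h, out_w, device='cpu'):
    # Enlarge an LR image to an arbitrary (out_h, out_w) size (illustrative).
    # `model(lr, scale_h, scale_w)` is a hypothetical interface.
    lr = np.asarray(imageio.imread(img_path))        # H x W x 3, uint8
    h, w = lr.shape[:2]
    scale_h, scale_w = out_h / h, out_w / w           # per-axis scale factors
    x = torch.from_numpy(lr).permute(2, 0, 1).float().unsqueeze(0).to(device)
    sr = model(x, scale_h, scale_w)                   # 1 x 3 x out_h x out_w
    return sr.clamp(0, 255).round().byte().squeeze(0).permute(1, 2, 0).cpu().numpy()

# Example (hypothetical): enlarge a 100x100 image to 220x420.
# imageio.imwrite('sr.png', upscale(model, 'your_img_path/lr.png', 220, 420))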

Visual Results

1. SR with Symmetric Scale Factors

non-integer

2. SR with Asymmetric Scale Factors

asymmetric

3. SR with Continuous Scale Factors

Please try our interactive viewer.

Citation

@InProceedings{Wang2020Learning,
  title={Learning A Single Network for Scale-Arbitrary Super-Resolution},
  author={Longguang Wang and Yingqian Wang and Zaiping Lin and Jungang Yang and Wei An and Yulan Guo},
  booktitle={ICCV},
  year={2021}
}

Acknowledgements

This code is built on EDSR (PyTorch) and Meta-SR. We thank the authors for sharing their codes.
