TransPrompt - Towards an Automatic Transferable Prompting Framework for Few-shot Text Classification

Overview


This is the code for our EMNLP 2021 paper "TransPrompt: Towards an Automatic Transferable Prompting Framework for Few-shot Text Classification".

Our proposed TransPrompt is motivated by the combination of prompt-tuning and cross-task transfer learning. The aim is to explore and exploit transferable knowledge from similar tasks in the few-shot scenario, and to make the Pre-trained Language Model (PLM) a better few-shot transfer learner. The framework was accepted to the EMNLP 2021 main conference (long paper track). This code is the default multi-GPU version. The following sections explain how to use it.

P.S.: We have also committed the same code to Alibaba EasyTransfer.

1. Data Preparation

We follow PET and use the same datasets. Please run the script to download the data:

sh data/download_data.sh

or manually download the dataset from https://nlp.cs.princeton.edu/projects/lm-bff/datasets.tar.

Then you will obtain a new directory data/original.
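If the script is unavailable in your environment, a minimal Python alternative is sketched below (the final rename assumes the tarball extracts to a folder named datasets; adjust if your layout differs):

import os
import tarfile
import urllib.request

# Download the LM-BFF dataset tarball into data/ (the data/ folder must exist).
url = "https://nlp.cs.princeton.edu/projects/lm-bff/datasets.tar"
urllib.request.urlretrieve(url, "data/datasets.tar")

# Extract the raw task datasets.
with tarfile.open("data/datasets.tar") as tar:
    tar.extractall("data/")

# The repo expects the raw data under data/original (assumed extracted name):
if os.path.isdir("data/datasets"):
    os.rename("data/datasets", "data/original")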

Our work covers two kinds of scenarios: single-task and cross-task. Each scenario has its own split of examples. By default, we generate few-shot learning examples; you can also generate the full data by editing the parameter (--scene full). Here we only demonstrate few-shot data generation.

1.1 Single-task Few-shot

Please run the script to obtain the single-task few-shot examples:

python3 data_utils/generate_k_shot_data.py --scene few-shot --k 16

Then you will obtain a new folder data/k-shot-single.
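For intuition, --k 16 follows the LM-BFF convention of sampling k examples per class. A minimal sketch of such per-class sampling (with hypothetical field names, not the repository's actual code) looks like:

import random
from collections import defaultdict

def sample_k_shot(examples, k, seed=42):
    """Draw up to k examples per label from a list of {"text": ..., "label": ...} dicts."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for ex in examples:
        by_label[ex["label"]].append(ex)
    subset = []
    for pool in by_label.values():
        # Guard against classes with fewer than k examples.
        subset.extend(rng.sample(pool, min(k, len(pool))))
    return subset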

1.2 Cross-task Few-shot

Run the script:

python3 data_utils/generate_k_shot_cross_task_data.py --scene few-shot --k 16

and you will obtain a new folder data/k-shot-cross.

After generation, similar tasks are placed in the same group. We have three groups:

  • Group1 (Sentiment Analysis): SST-2, MR, CR
  • Group2 (Natural Language Inference): MNLI, SNLI
  • Group3 (Paraphrasing): MRPC, QQP
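In code form, this grouping corresponds to a simple mapping (the variable name is illustrative):

TASK_GROUPS = {
    "sentiment_analysis": ["SST-2", "MR", "CR"],
    "natural_language_inference": ["MNLI", "SNLI"],
    "paraphrasing": ["MRPC", "QQP"],
}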

2. Training

Following our paper, we conduct the experiments below:

  • Single-task few-shot learning: As in LM-BFF and P-tuning, we prompt-tune the PLM on one task only.
  • Cross-task few-shot learning: We mix the similar tasks within a group. We first prompt-tune the PLM on the cross-task data, and then prompt-tune it on each task again. Cross-task learning comes in two settings (see the sketch after this list):
  • (Cross-)Task Adaptation: Within one group, we prompt-tune on all the tasks and then evaluate on each task, both in the few-shot scenario.
  • (Cross-)Task Generalization: Within one group, we randomly choose one task for few-shot evaluation (it is not used for training); the others are used for prompt-tuning.
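The two settings differ only in whether the evaluation task is seen during cross-task prompt-tuning. A hypothetical sketch follows; prompt_tune and evaluate_few_shot are illustrative stand-ins, not the repository's API:

def prompt_tune(model, tasks): ...          # stub: returns a tuned model
def evaluate_few_shot(model, task): ...     # stub: returns a metric

def task_adaptation(plm, group_tasks):
    # Stage 1: prompt-tune on the mixed data of every task in the group.
    model = prompt_tune(plm, tasks=group_tasks)
    # Stage 2: prompt-tune and evaluate on each task separately.
    return {t: evaluate_few_shot(prompt_tune(model, tasks=[t]), t)
            for t in group_tasks}

def task_generalization(plm, group_tasks, unseen_task):
    # Hold out one task entirely; prompt-tune on the remaining tasks.
    train_tasks = [t for t in group_tasks if t != unseen_task]
    model = prompt_tune(plm, tasks=train_tasks)
    # Few-shot evaluation on the task never seen during tuning.
    return evaluate_few_shot(model, unseen_task)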

2.1 Single-task Few-shot Learning

Take MRPC as an example, please run:

CUDA_VISIBLE_DEVICES=0 sh scripts/run_single_task.sh

[Figure: figure1.png]
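Since this is the multi-GPU version of the code, you can also expose several GPUs to the script, e.g.:

CUDA_VISIBLE_DEVICES=0,1 sh scripts/run_single_task.sh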

2.2 Cross-task Few-shot Learning (Task Adaptation)

Take Group1 as an example, please run the script:

CUDA_VISIBLE_DEVICES=0 sh scripts/run_cross_task_adaptation.sh

[Figure: figure2.png]

2.3 Cross-task Few-shot Learning (Task Generalization)

Also take Group1 as an example; here the unseen task is SST-2. Please run the script:

CUDA_VISIBLE_DEVICES=0 sh scripts/run_cross_task_generalization.sh

[Figure: figure3.png]

Citation

Please cite our paper as:

@inproceedings{DBLP:conf/emnlp/0001WQH021,
  author    = {Chengyu Wang and
               Jianing Wang and
               Minghui Qiu and
               Jun Huang and
               Ming Gao},
  editor    = {Marie{-}Francine Moens and
               Xuanjing Huang and
               Lucia Specia and
               Scott Wen{-}tau Yih},
  title     = {TransPrompt: Towards an Automatic Transferable Prompting Framework
               for Few-shot Text Classification},
  booktitle = {Proceedings of the 2021 Conference on Empirical Methods in Natural
               Language Processing, {EMNLP} 2021, Virtual Event / Punta Cana, Dominican
               Republic, 7-11 November, 2021},
  pages     = {2792--2802},
  publisher = {Association for Computational Linguistics},
  year      = {2021},
  url       = {https://aclanthology.org/2021.emnlp-main.221},
  timestamp = {Tue, 09 Nov 2021 13:51:50 +0100},
  biburl    = {https://dblp.org/rec/conf/emnlp/0001WQH021.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}

Acknowledgement

The code is developed based on PET. We appreciate all the authors who made their code public, which greatly facilitates this project. This repository will be continuously updated.
