SoGCN: Second-Order Graph Convolutional Networks

Overview

This is the authors' PyTorch implementation of the paper "SoGCN: Second-Order Graph Convolutional Networks". All hyper-parameters and experiment settings are included in this repo.
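
For reference, the core operation in the paper is a graph convolution that mixes zeroth-, first-, and second-order neighborhood aggregation. The snippet below is only a minimal PyTorch sketch of that idea, assuming a dense normalized adjacency and a ReLU activation; the class name and interface are illustrative, not the exact layer shipped in this repo.

import torch
import torch.nn as nn

class SecondOrderGC(nn.Module):
    """Illustrative sketch: H' = relu(H W0 + A H W1 + A^2 H W2)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w0 = nn.Linear(in_dim, out_dim)               # zeroth-order (self) term
        self.w1 = nn.Linear(in_dim, out_dim, bias=False)   # first-order term
        self.w2 = nn.Linear(in_dim, out_dim, bias=False)   # second-order term

    def forward(self, adj, h):
        # adj: (N, N) dense normalized adjacency, h: (N, in_dim) node features
        ah = adj @ h      # A H
        a2h = adj @ ah    # A^2 H
        return torch.relu(self.w0(h) + self.w1(ah) + self.w2(a2h))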

Requirements

For the GNN benchmarking part, our experiments are based on the GNN Benchmark. Please follow the instructions in Benchmark Installation to set up the running environment. Our code is tested with PyTorch 1.3.1 + CUDA Toolkit 10.0.

For the experiments on the Open Graph Benchmark (OGB), we built our models on the official OGB code. Please follow the instructions in Getting Started to configure the running environment. Our code is tested with PyTorch 1.4.0 + CUDA Toolkit 10.1.

Our experiments were conducted on a 4-core machine with an Nvidia Quadro P6000 GPU, running Ubuntu 18.04.2 LTS.
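
Before launching any runs, a quick sanity check of the installed versions can save time; the three lines below are a small illustrative snippet, not part of the repo.

import torch

print(torch.__version__)          # expect 1.3.1 (GNN benchmark) or 1.4.0 (OGB)
print(torch.version.cuda)         # expect 10.0 or 10.1, respectively
print(torch.cuda.is_available())  # should print True on the GPU machine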

Reproduce Results

For the SGS and GNN benchmark datasets, we provide a script, 'scripts/exp.py', that launches a series of training runs in screen sessions. Run python scripts/exp.py -h to view its usage. For the OGB datasets, we provide the shell scripts 'scripts/run_ogbn_proteins.sh' and 'scripts/run_ogbg_molhiv.sh' to reproduce the results with our hyper-parameter settings.

For convenience, the commands to reproduce our results are listed below.

Synthetic Graph Spectrum Dataset

To train models on SGS datasets, run the following commands:

## High-Pass
python scripts/exp.py -a start -e highpass_sogcn -t SGS -g 1111 --dataset SGS_HIGH_PASS --config 'configs/SGS_node_regression_SoGCN.json'
python scripts/exp.py -a start -e highpass_sogcn -t SGS -g 1111 --dataset SGS_HIGH_PASS --config 'configs/SGS_node_regression_GCN.json'
python scripts/exp.py -a start -e highpass_sogcn -t SGS -g 1111 --dataset SGS_HIGH_PASS --config 'configs/SGS_node_regression_GIN.json'

## Low-Pass
python scripts/exp.py -a start -e lowpass_sogcn -t SGS -g 1111 --dataset SGS_LOW_PASS --config 'configs/SGS_node_regression_SoGCN.json'
python scripts/exp.py -a start -e lowpass_sogcn -t SGS -g 1111 --dataset SGS_LOW_PASS --config 'configs/SGS_node_regression_GCN.json'
python scripts/exp.py -a start -e lowpass_sogcn -t SGS -g 1111 --dataset SGS_LOW_PASS --config 'configs/SGS_node_regression_GIN.json'

## Band-Pass
python scripts/exp.py -a start -e bandpass_sogcn -t SGS -g 1111 --dataset SGS_BAND_PASS --config 'configs/SGS_node_regression_SoGCN.json'
python scripts/exp.py -a start -e bandpass_sogcn -t SGS -g 1111 --dataset SGS_BAND_PASS --config 'configs/SGS_node_regression_GCN.json'
python scripts/exp.py -a start -e bandpass_sogcn -t SGS -g 1111 --dataset SGS_BAND_PASS --config 'configs/SGS_node_regression_GIN.json'

Note that the results will be saved to '_out/SGS_node_regression/'.
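
To inspect what a run produced, listing the output folder is enough; the sketch below assumes only the output path stated above and makes no assumption about the file formats inside it.

from pathlib import Path

# Print every file written under the SGS output directory.
for path in sorted(Path("_out/SGS_node_regression").rglob("*")):
    if path.is_file():
        print(path)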

Open Graph Benchmarks

Running the following commands reproduces our results on the Open Graph Benchmark datasets:

scripts/run_ogbn_proteins.sh <log_dir> [<gpu_id>] [--test]
scripts/run_ogbg_molhiv.sh <log_dir> [<gpu_id>] [--test]

where log_dir specifies the folder to load or save output logs. The downloaded log files will be saved in _out/protein_nodeproppred and _out/molhiv_graphproppred for the ogbn-proteins and ogbg-molhiv datasets, respectively. gpu_id specifies the GPU device on which to run our models. Add --test if you only want to reload the log files and read out the test results. The OGB datasets will be downloaded automatically into the data/OGB directory on the first run.
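
For example, scripts/run_ogbg_molhiv.sh _out/molhiv_graphproppred 0 --test reloads the downloaded logs on GPU 0 and prints the test results. If you only want to trigger (or verify) the dataset download itself, the short Python example below uses the ogb package directly; whether this repo's training code uses the PyG or DGL loader internally is an assumption here, but the root path mirrors the data/OGB directory described above.

from ogb.graphproppred import PygGraphPropPredDataset

# Downloads ogbg-molhiv into data/OGB on the first call, then loads it from disk.
dataset = PygGraphPropPredDataset(name="ogbg-molhiv", root="data/OGB")
split_idx = dataset.get_idx_split()
print(len(dataset), dataset.num_tasks)
print({split: len(idx) for split, idx in split_idx.items()})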

To download our saved log files for the OGB datasets, run the following script:

bash scripts/download_logfiles_ogb.sh

GNN Benchmarks

To test our pretrained models on the GNN benchmarks, follow the steps below:

  1. Download our pretrained models.
# make sure the commands below are executed in the root directory of this project
bash scripts/download_pretrained_molecules.sh
bash scripts/download_pretrained_superpixels.sh
bash scripts/download_pretrained_SBMs.sh

Pretrained models will be downloaded to '_out/molecules_graph_regression', '_out/superpixels_graph_classification', and '_out/SBMs_node_classification', respectively.

  2. Run the commands for the different tasks

Molecules Graph Regression

## ZINC
python main_molecules_graph_regression.py --model SoGCN --dataset ZINC --gpu_id 0 --test True --out_dir _out/molecules_graph_regression/zinc_sogcn
python main_molecules_graph_regression.py --model SoGCN --dataset ZINC --gpu_id 0 --test True --out_dir _out/molecules_graph_regression/zinc_sogcn_gru

Superpixels Graph Classification

## MNIST
python main_superpixels_graph_classification.py --model SoGCN --dataset MNIST --gpu_id 0 --test True --out_dir _out/superpixels_graph_classification/mnist_sogcn
python main_superpixels_graph_classification.py --model SoGCN --dataset MNIST --gpu_id 0 --test True --out_dir _out/superpixels_graph_classification/mnist_sogcn_gru


## CIFAR10
python main_superpixels_graph_classification.py --model SoGCN --dataset CIFAR10 --gpu_id 0 --test True --out_dir _out/superpixels_graph_classification/cifar10_sogcn
python main_superpixels_graph_classification.py --model SoGCN --dataset CIFAR10 --gpu_id 0 --test True --out_dir _out/superpixels_graph_classification/cifar10_sogcn_gru

SBMs Node Classification

## CLUSTER
python main_SBMs_node_classification.py --model SoGCN --dataset SBM_CLUSTER  --verbose True --gpu_id 0 --test True --out_dir _out/SBMs_node_classification/cluster_sogcn
python main_SBMs_node_classification.py --model SoGCN --dataset SBM_CLUSTER  --verbose True --gpu_id 0 --test True --out_dir _out/SBMs_node_classification/cluster_sogcn_gru

## PATTERN
python main_SBMs_node_classification.py --model SoGCN --dataset SBM_PATTERN  --verbose True --gpu_id 0 --test True --out_dir _out/SBMs_node_classification/pattern_sogcn
python main_SBMs_node_classification.py --model SoGCN --dataset SBM_PATTERN  --verbose True --gpu_id 0 --test True --out_dir _out/SBMs_node_classification/pattern_sogcn_gru