GNNLens2 is an interactive visualization tool for graph neural networks (GNNs).

Overview

GNNLens2 is an interactive visualization tool for graph neural networks (GNNs). It integrates seamlessly with the Deep Graph Library (DGL) and can meet various visualization needs for presentation, analysis, and model explanation. It is an open-source version of GNNLens with simplifications and extensions.

Installation

Requirements

GNNLens2 requires Flask-CORS, which you can install with

pip install -U flask-cors

Installation for the latest stable version

pip install gnnlens

Installation from source

If you want to try experimental features, you can install from source as follows:

git clone https://github.com/dmlc/GNNLens2.git
cd GNNLens2/python
python setup.py install

Verifying successful installation

Once you have installed the package, you can verify that the installation succeeded with

import gnnlens

print(gnnlens.__version__)
# 0.1.0

Tutorials

We provide a set of tutorials to get you started with the library:

Team

HKUST VisLab: Zhihua Jin, Huamin Qu

AWS Shanghai AI Lab: Mufei Li, Wanru Zhao (work done during internship), Jian Zhang, Minjie Wang

SMU: Yong Wang

Comments
  • Support for heterogeneous graphs

    Thanks for this wonderful library. I saw that writer.add_graph() only accepts homogeneous graphs. Are there any plans to support heterogeneous graphs?

    opened by Mrugankakarte 2
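Until heterogeneous graphs are supported, a common workaround is to flatten one into a single homogeneous graph while keeping each node's and edge's original type as a feature; DGL itself offers dgl.to_homogeneous for this. A minimal sketch of the idea in plain Python (the function name and the (src_type, etype, dst_type) edge-dict layout here are illustrative, not GNNLens2 or DGL API):

```python
def to_homogeneous(hetero_edges, num_nodes_per_type):
    """Flatten a heterogeneous edge dict into one homogeneous edge list.

    hetero_edges maps (src_type, etype, dst_type) -> list of (u, v) pairs
    with per-type local node IDs; num_nodes_per_type maps type -> count.
    Returns global (u, v) pairs plus the original type of each edge/node.
    """
    # Assign each node type a contiguous block of global IDs.
    offsets, total = {}, 0
    for ntype, n in sorted(num_nodes_per_type.items()):
        offsets[ntype] = total
        total += n
    node_types = [nt for nt, n in sorted(num_nodes_per_type.items())
                  for _ in range(n)]

    edges, edge_types = [], []
    for (src_t, etype, dst_t), pairs in hetero_edges.items():
        for u, v in pairs:
            edges.append((offsets[src_t] + u, offsets[dst_t] + v))
            edge_types.append(etype)
    return edges, edge_types, node_types

edges, etypes, ntypes = to_homogeneous(
    {("user", "clicks", "item"): [(0, 0), (1, 1)]},
    {"user": 2, "item": 2},
)
```

Because global IDs are assigned in contiguous per-type blocks, the original local IDs can be recovered from the offsets after visualization.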
  • Is there any test about the limit of graph size?

    Hi there,

    I'm wondering whether the limits on graph size have been tested, such as the maximum number of nodes and edges in the visualized graph. I'm also curious how memory overhead and simulation time scale with graph size. For example, how big a graph can GNNLens2 process within 10 seconds on a machine with 16 CPUs and 64 GB of memory?

    opened by LspongebobJH 1
  • Several updates

    1. Rename GNNVis to GNNLens
    2. Remove the examples folder, which will be fully replaced by the tutorials
    3. Rename gnnvis to python
    4. Rename gnnvis/mini_serve to python/gnnlens
    5. Update setup.py
    6. Add Writer
    opened by mufeili 0
  • Bump flask-cors from 3.0.0 to 3.0.9 in /gnnvis

    Bumps flask-cors from 3.0.0 to 3.0.9.

    Release notes

    Sourced from flask-cors's releases.

    Release 3.0.9

    Security

    • Escape path before evaluating resource rules (thanks @​praetorian-colby-morgan). Prior to this, flask-cors incorrectly evaluated CORS resource matching before path expansion. E.g. "/api/../foo.txt" would incorrectly match resources for "/api/*" whereas the path actually expands simply to "/foo.txt"
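The traversal problem this fix closes can be seen with plain path normalization: a request path containing ".." expands to a location outside the matched prefix, so matching resource rules before expansion was unsafe. A small illustration (not flask-cors code):

```python
import posixpath

# "/api/../foo.txt" normalizes to "/foo.txt", which lies outside /api/,
# so a rule like "/api/*" must be matched against the expanded path.
expanded = posixpath.normpath("/api/../foo.txt")
print(expanded)  # /foo.txt
```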

    Release 3.0.8

    Fixes DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working

    Thank you @​juanmaneo and @​jdevera!

    Release 3.0.7

    Updated logging.warn to logging.warning (#234) Thanks Vaibhav

    Release 3.0.6

    Manual error in release process. Identical contents at 3.0.5.

    Release 3.0.5

    Fixes incorrect handling of regexes containing '[', and a few other special characters. https://github-redirect.dependabot.com/corydolphin/flask-cors/issues/212

    Release 3.0.4

    Handle response.headers being None. (Fixes issue #217) Thanks @​dusktreader for the improvement!

    Release 3.0.3

    Ensure that an Origin of '*' is never sent if supports_credentials is True (fixes Issue #202)

    • If always_send=True, and '*' is in the allowed origins, and a request is made without an Origin header, no Access-Control-Allow-Origins header will now be returned. This is breaking if you depended on it, but was a bug as it goes against the spec.

    Release 3.0.2

    Fixes Issue #187: regression whereby header (and domain) matching was incorrectly case sensitive. Now it is not, making the behavior identical to 2.X and 1.X.

    Release 3.0.1

    Fixes Issue #183: regression whereby regular expressions for origins with an "?" are not properly matched.

    Thanks @​di for the report!

    Changelog

    Sourced from flask-cors's changelog.

    3.0.9

    Security

    • Escape path before evaluating resource rules (thanks to Colby Morgan). Prior to this, flask-cors incorrectly evaluated CORS resource matching before path expansion. E.g. "/api/../foo.txt" would incorrectly match resources for "/api/*" whereas the path actually expands simply to "/foo.txt"

    3.0.8

    Fixes : DeprecationWarning: Using or importing the ABCs from 'collections' in Python 3.7. Thank you @​juanmaneo and @​jdevera for the contribution.

    3.0.7

    Updated logging.warn to logging.warning (#234) Thanks Vaibhav

    3.0.6

    Manual error in release process. Identical contents at 3.0.5.

    3.0.5

    Fixes incorrect handling of regexes containing [, and a few other special characters. Fixes Issue #212

    3.0.4

    Handle response.headers being None. (Fixes issue #217)

    3.0.3

    Ensure that an Origin of '*' is never sent if supports_credentials is True (fixes Issue #202)

    • If always_send=True, and '*' is in the allowed origins, and a request is made without an Origin header, no Access-Control-Allow-Origins header will now be returned. This is breaking if you depended on it, but was a bug as it goes against the spec.

    3.0.2

    Fixes Issue #187: regression whereby header (and domain) matching was incorrectly case sensitive. Now it is not, making the behavior identical to 2.X and 1.X.

    3.0.1

    Fixes Issue #183: regression whereby regular expressions for origins with an "?" are not properly matched.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies python 
    opened by dependabot[bot] 0
  • Unable to show the graph of ogbn-products

    Hi, I followed the instructions in tutorial 1 and tried to show the graph structure of ogbn-products. No error was shown when I ran the code; however, when I open the GNNLens website and select the ogbn-products graph in the graph selector, it shows nothing. I wonder what may cause this. Is it because the graph is too large, or because the graph does not include labels? I hope I have described it clearly; if needed, I can share my code.

    opened by EntongL 1
  • ImportError: cannot import name 'safe_join' from 'flask'

    Since 2.1.0, Flask has removed safe_join, as elaborated in its release notes here. For now, a workaround is to downgrade Flask to an older version, e.g. pip install Flask==2.0.3. This should be fixed in a future release of GNNLens2 by either restricting the Flask version or following the latest recommended practice.
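For context, the guarantee safe_join provides can be sketched in a few lines of plain Python (an illustrative reimplementation, not the actual Flask/Werkzeug code): join untrusted path components onto a base directory and reject anything that would escape it.

```python
import posixpath

def safe_join_sketch(directory, *pathnames):
    """Illustrative sketch of safe_join: join untrusted components onto
    a base directory, returning None on any path-traversal attempt."""
    parts = [directory]
    for filename in pathnames:
        if filename != "":
            filename = posixpath.normpath(filename)
        if (posixpath.isabs(filename)
                or filename == ".."
                or filename.startswith("../")):
            return None  # would escape the base directory
        parts.append(filename)
    return posixpath.join(*parts)

print(safe_join_sketch("static", "css/app.css"))    # static/css/app.css
print(safe_join_sketch("static", "../secrets.txt")) # None
```

On Flask >= 2.1, the same helper is still available as werkzeug.utils.safe_join, which is commonly used as the replacement import.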

    Credit to @SherylHYX for reporting the issue.

    bug 
    opened by mufeili 0
Releases(v0.1.0)
Owner
Distributed (Deep) Machine Learning Community
A Community of Awesome Machine Learning Projects