A FAIR dataset of TCV experimental results for validating edge/divertor turbulence models.


TCV-X21 validation for divertor turbulence simulations

Quick links

Badges in the repository header link to the arXiv PDF, the Binder DOI, the dataset and software licences, the test/CI status and the code coverage report.

Intro

Welcome to TCV-X21. We're glad you've found us!

This repository is designed to let you perform the analysis presented in Oliveira and Body et al., Nuclear Fusion, 2021, both using the data given in the paper and with a turbulence simulation of your own. We hope that, by providing the analysis, the TCV-X21 case can be used as a standard validation and benchmarking case for turbulence simulations of the divertor in fusion experiments. The repository lets you scrutinise and suggest improvements to the analysis (there's always room for improvement), directly interact with and explore the data in greater depth than is possible in a paper, and, we hope, test a simulation of your own against this case.

To use this repository, you'll need either to use the mybinder.org link below OR to have user rights on a computer with Python 3, conda and git-lfs pre-installed.
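For the local route, you can check that the prerequisites are available with commands like the following:

```bash
python3 --version   # Python 3 interpreter
conda --version     # conda package manager
git lfs version     # git-lfs, needed for the large data files
```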

Video tutorial

This quick tutorial shows you how to navigate the repository and use some of the functionality of the library.

Video_tutorial.mp4

What can you find in this repository

  • 1.experimental_data: data from the TCV experimental campaign, in NetCDF, MATLAB and IMAS formats, as well as information about the reference scenario, and the reference magnetic geometry (in .eqdsk, IMAS and PARALLAX-nc formats)
  • 2.simulation_data: data from simulations of the TCV-X21 case, in NetCDF format, as well as raw data files and conversion routines
  • 3.results: high-resolution PNGs and LaTeX-ready tables, ready for use in a paper
  • tcvx21: a Python library of software, which includes
    • record_c: a class to interface with NetCDF/HDF5 formatted data files
    • observable_c: a class to interact with and plot observables
    • file_io: tools to interact with MATLAB and JSON files
    • quant_validation: routines to perform the quantitative validation
    • analysis: statistics, curve-fitting, bootstrap algorithms, contour finding
    • units_m.py: sets up pint-based unit-aware analysis (it's difficult to overstate how cool this library is; see the short sketch after this list)
    • grillix_post: a set of routines used for post-processing GRILLIX simulation data, which might help if you're trying to post-process your own simulation. You can see a worked example in simulation_postprocessing.ipynb
  • notebooks: Jupyter notebooks, which allow us to provide code with outputs and comments together
    • simulation_setup.ipynb: what you might need to set up a simulation to test
    • simulation_postprocessing.ipynb: how to post-process the data
    • data_exploration.ipynb: some examples to get you started exploring the data
    • bulk_process.ipynb: runs over every observable to produce the results, which you'll need to do if you're writing a paper based on them
  • tests: tests to make sure that we haven't broken anything in the analysis routines
  • README.md: this file, which helps you get the software up and running and explains where you can find everything you need. It also provides the details of the licensing (below). There are more specific README.md files in several of the subfolders.

and lots more files. If you're not a developer, you can safely ignore these.
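As a taste of the unit-aware workflow mentioned above, here is a minimal sketch of opening one of the experimental NetCDF files and attaching pint units to a variable. The file name, variable name and "units" attribute are assumptions for illustration (and xarray/pint are assumed to be in your environment); the notebooks and the tcvx21 library show the supported way of doing this.

```python
# Minimal sketch only: the file name, variable name and "units" attribute
# below are assumptions -- check 1.experimental_data and the notebooks for
# the real names and for the tcvx21 classes that wrap this up properly.
import xarray as xr
import pint

ureg = pint.UnitRegistry()

# Open one of the experimental NetCDF files (hypothetical file name)
ds = xr.open_dataset("1.experimental_data/TCV_forward_field.nc")
print(ds)  # list the variables and attributes stored in the file

# Attach pint units to a variable so that later arithmetic is unit-checked
var = ds["density"]
density = var.values * ureg(var.attrs.get("units", "1/m^3"))
print(density.to("1/cm^3"))  # unit conversions become one-liners
```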

What can't you find in this repository

Due to licensing issues, the source code of the simulations is not provided. Sorry!

Also, the raw simulations are not provided here due to space limitations (some runs have more than a terabyte of data), but they are all backed up on archive servers. If you'd like to access the raw data, get in contact.

Licence and attribution notice

The TCV-X21 datasets are licensed under a Creative Commons Attribution 4.0 licence, given in LICENCE. The source code of the analysis routines and Python library is licensed under an MIT licence, given in tcvx21/LICENCE.

For the datasets, we ask that you provide attribution when using the data, via the citation given in the CITATION.cff file. We additionally require that you mark any changes to the dataset and state explicitly that the authors do not endorse your work unless such endorsement has been expressly given.

For the software, you can use, modify and share it without attribution or marking changes.

Running the Jupyter notebooks (installation as non-root user)

To run the Jupyter notebooks, you have two options. The first is to use the mybinder.org interface, which lets you interact with the notebooks via a web interface. You can launch the binder for this repository by clicking the binder badge in the repository header. Note that not all of the repository content is copied to the Docker image (this is specified in .dockerignore). The large checkpoint files are not included in the image, although they can be found in the repository at 2.simulation_data/GRILLIX/checkpoints_for_1mm. Additionally, the default Docker image will not work with git.
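If you instead clone the repository locally, the large files should arrive via git-lfs (which is why it appears in the prerequisite list). A sketch, assuming the checkpoint and data files are tracked with git-lfs:

```bash
git clone <repository-url> tcv-x21   # placeholder: use the repository's actual clone address
cd tcv-x21
git lfs pull                         # fetch the large checkpoint/data files
```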

Alternatively, if you'd like to run the notebooks locally or to extend the repository, you'll need to install additional Python packages. First of all, you need Python 3 and conda installed (latest versions recommended). Then, to install the necessary packages, we make a sandbox environment. This has a few advantages over installing packages globally: sudo rights are not required, you can install specific package versions without risking breaking other Python scripts, and if everything goes terribly wrong you can simply delete the environment and start again. We've included a simple shell script that performs the necessary steps, which you can execute with

./install_env.sh

This will install the library and its dependencies into an environment in a subfolder of the TCV-X21 repository called tcvx21_env. It will also add a kernel to your global Jupyter installation. To remove the installation, delete the tcvx21_env folder and run jupyter kernelspec uninstall tcvx21.
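If you'd rather see what the script does (or run the steps by hand), it corresponds roughly to the following. This is a sketch based on the description above, not a copy of install_env.sh, so check the script itself if anything differs.

```bash
# Rough manual equivalent of ./install_env.sh (sketch only -- the script is
# the authoritative version and may differ in detail)
conda env create --prefix ./tcvx21_env --file environment.yml   # create the sandbox environment
conda activate ./tcvx21_env
pip install --editable .                            # install the tcvx21 package itself, if environment.yml doesn't already
python -m ipykernel install --user --name tcvx21    # register the Jupyter kernel
```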

To run tests and open Jupyter

Once the environment is installed, you can activate it with conda activate ./tcvx21_env. To deactivate it, run conda deactivate.

Then, it is recommended to run the test suite with pytest, which ensures that everything is installed and working correctly. If something fails, let us know in the issues. Note that this executes all of the analysis notebooks, so it might take a while to run.

Finally, run jupyter lab to open a Jupyter server in the TCV-X21 repository. You can then open any of the notebooks (.ipynb extension) by clicking them in the sidebar.
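Putting those steps together, a typical local session looks like:

```bash
conda activate ./tcvx21_env   # activate the sandbox environment
pytest                        # run the test suite (it executes the notebooks, so this can take a while)
jupyter lab                   # open a Jupyter server in the TCV-X21 repository
conda deactivate              # when you're done
```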

A note on pinned dependencies

To ensure that the results are reproducible, the environment.yml file has pinned dependencies. However, if you want to use this software as a library, pinned dependencies are unnecessarily restrictive. You can remove the versions after the = sign in the environment.yml, but be warned that things might break.
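For example, switching an entry in environment.yml from pinned to unpinned looks like this (the package and version here are made up for illustration; edit the real entries in the file):

```yaml
dependencies:
  - numpy=1.21.2   # pinned: reproduces the published results exactly
  - numpy          # unpinned: conda picks a compatible version (things might break)
```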
