A FAIR dataset of TCV experimental results for validating edge/divertor turbulence models.

Overview

TCV-X21 validation for divertor turbulence simulations

Quick links

  • arXiv PDF
  • Binder DOI
  • Dataset licence
  • Software licence
  • Test Python package
  • codecov

Intro

Welcome to TCV-X21. We're glad you've found us!

This repository is designed to let you perform the analysis presented in Oliveira and Body et al., Nuclear Fusion, 2021, both using the data given in the paper and with a turbulence simulation of your own. We hope that, by providing the analysis, the TCV-X21 case can be used as a standard validation and benchmarking case for turbulence simulations of the divertor in fusion experiments. The repository allows you to scrutinise and suggest improvements to the analysis (there's always room for improvement), to directly interact with and explore the data in greater depth than is possible in a paper, and, we hope, to use this case to test a simulation of your own.

To use this repository, you'll need either to use the mybinder.org link below, or to have user rights on a computer with Python 3, conda and git-lfs pre-installed.

Video tutorial

This quick tutorial shows you how to navigate the repository and use some of the functionality of the library.

Video_tutorial.mp4

What can you find in this repository

  • 1.experimental_data: data from the TCV experimental campaign, in NetCDF, MATLAB and IMAS formats, as well as information about the reference scenario and the reference magnetic geometry (in .eqdsk, IMAS and PARALLAX-nc formats); a minimal loading sketch is given just after this list
  • 2.simulation_data: data from simulations of the TCV-X21 case, in NetCDF format, as well as raw data files and conversion routines
  • 3.results: high-resolution PNGs and LaTeX-ready tables for a paper
  • tcvx21: a Python library of software, which includes
    • record_c: a class to interface with NetCDF/HDF5 formatted data files
    • observable_c: a class to interact with and plot observables
    • file_io: tools to interact with MATLAB and JSON files
    • quant_validation: routines to perform the quantitative validation
    • analysis: statistics, curve-fitting, bootstrap algorithms, contour finding
    • units_m.py: setting up pint-based unit-aware analysis (it's difficult to overstate how cool this library is; a short example is given after this list)
    • grillix_post: a set of routines used for post-processing GRILLIX simulation data, which might help if you're trying to post-process your own simulation. You can see a worked example in simulation_postprocessing.ipynb
  • notebooks: Jupyter notebooks, which allow us to provide code with outputs and comments together
    • simulation_setup.ipynb: what you might need to set up a simulation to test
    • simulation_postprocessing.ipynb: how to post-process the data
    • data_exploration.ipynb: some examples to get you started exploring the data
    • bulk_process.ipynb: runs over every observable to regenerate the results in 3.results, which you'll need to do if you're writing a paper based on them
  • tests: tests to make sure that we haven't broken anything in the analysis routines
  • README.md: this file, which helps you to get the software up and running and explains where you can find everything you need. It also provides the details of the licensing (below). There are more specific README.md files in several of the subfolders.

and lots more files. If you're not a developer, you can safely ignore these.
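
If you want a quick look at the experimental data outside of the notebooks, the NetCDF files in 1.experimental_data can be opened with standard tools. The snippet below is a minimal sketch using xarray; the filename is a placeholder rather than a real file in the repository, so substitute whichever file you want to inspect, and see data_exploration.ipynb for the intended workflow via the tcvx21 library itself.

import xarray as xr

# Placeholder path: replace with an actual NetCDF file from 1.experimental_data
dataset = xr.open_dataset("1.experimental_data/some_file.nc")

# Print the variables, coordinates and attributes stored in the root group of the file
print(dataset)

# If the file uses nested groups, select one explicitly with the group keyword, e.g.
# xr.open_dataset("1.experimental_data/some_file.nc", group="some_group")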
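To give a flavour of the pint-based, unit-aware analysis that units_m.py sets up, here is a tiny standalone example using pint directly (not the tcvx21 wrappers):

import pint

ureg = pint.UnitRegistry()

# Quantities carry their units through arithmetic...
electron_temperature = 25.0 * ureg.electron_volt
density = 1e19 * ureg.metre**-3
pressure = density * electron_temperature

# ...and unit conversions (and dimensional errors) are handled automatically
print(pressure.to("pascal"))  # roughly 40 Pa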

What can't you find in this repository

Due to licensing issues, the source code of the simulations is not provided. Sorry!

Also, the raw simulations are not provided here due to space limitations (some runs have more than a terabyte of data), but they are all backed up on archive servers. If you'd like to access the raw data, get in contact.

Licence and attribution notice

The TCV-X21 datasets are licensed under a Creative Commons Attribution 4.0 licence, given in LICENCE. The source code of the analysis routines and Python library is licensed under an MIT licence, given in tcvx21/LICENCE.

For the datasets, we ask that you provide attribution if you use this data, via the citation given in the CITATION.cff file. We additionally require that you mark any changes to the dataset, and state specifically that the authors do not endorse your work unless such endorsement has been expressly given.

For the software, you can use, modify and share it without attribution or marking changes.

Running the Jupyter notebooks (installation as non-root user)

To run the Jupyter notebooks, you have two options. The first is to use the mybinder.org interface, which lets you interact with the notebooks via a web interface. You can launch the Binder for this repository by clicking the Binder badge in the repository header. Note that not all of the repository content is copied to the Docker image (this is specified in .dockerignore). The large checkpoint files are not included in the image, although they can be found in the repository at 2.simulation_data/GRILLIX/checkpoints_for_1mm. Additionally, the default Docker image will not work with git.

Alternatively, if you'd like to run the notebooks locally or to extend the repository, you'll need to install additional Python packages. First of all, you need Python 3 and conda installed (the latest versions are recommended). Then, to install the necessary packages, we make a sandbox environment. This has a few advantages over installing packages globally: sudo rights are not required, you can install package versions without risking breaking other Python scripts, and if everything goes terribly wrong you can easily delete everything and restart. We've included a simple shell script to perform the necessary steps, which you can execute with

./install_env.sh

This will install the conda environment (including the tcvx21 library) into a subfolder of the TCV-X21 repository called tcvx21_env. It will also add a kernel to your global Jupyter installation. To remove the installation, delete the tcvx21_env folder and run jupyter kernelspec uninstall tcvx21.

To run tests and open Jupyter

Once you've installed via either option, you can activate the Python environment with conda activate ./tcvx21_env. To deactivate, run conda deactivate.

Then, it is recommended to run the test suite with pytest, which ensures that everything is installed and working correctly. If something fails, let us know in the issues. Note that this executes all of the analysis notebooks, so it might take a while to run.

Finally, run jupyter lab to open a Jupyter server in the TCV-X21 repository. Then you can open any of the notebooks (.ipynb extension) by clicking them in the sidebar.

A note on pinned dependencies

To ensure that the results are reproducible, the environment.yml file has pinned dependencies. However, if you want to use this software as a library, pinned dependencies are unnecessarily restrictive. You can remove the versions after the = sign in the environment.yml, but be warned that things might break.
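
If you do want to loosen the pins, one quick way is to strip the version suffixes from a copy of environment.yml, as sketched below. This is just a convenience sketch, not part of the repository, and it assumes the usual package=version (conda) or package==version (pip) style of entries.

import re
from pathlib import Path

# Read the pinned environment file shipped with the repository
text = Path("environment.yml").read_text()

# Drop everything after the "=" on each dependency line,
# e.g. "  - numpy=1.21.2" or "    - scipy==1.7.1" become bare package names
unpinned = re.sub(r"^(\s*-\s*[A-Za-z0-9_.\-]+)==?[^\s]+$", r"\1", text, flags=re.MULTILINE)

# Write to a separate file so the original, reproducible environment is untouched
Path("environment_unpinned.yml").write_text(unpinned)

You can then create the environment from environment_unpinned.yml instead, keeping in mind the warning above that things might break.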
