Additional tools for particle accelerator data analysis and machine information

Overview

PyLHC Tools

This package is a collection of useful scripts and tools for the Optics Measurements and Corrections group (OMC) at CERN.

Documentation

Getting Started

This package is Python 3.7+ compatible, and can be installed through pip:

pip install pylhc

One can also install from VCS:

git clone https://github.com/pylhc/PyLHC
pip install /path/to/PyLHC

Or simply from the online master branch, which is stable:

pip install git+https://github.com/pylhc/PyLHC.git#egg=pylhc

After installing, scripts can be run with either python -m pylhc.SCRIPT --FLAG ARGUMENT or by calling the .py files directly.
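
For example, one could extract the machine settings at a given time with the machine settings info script, as sketched below. The --knobs options are documented in the 0.7.4 release notes further down; the --time flag name and timestamp format are assumptions here, so check the script's --help output for the authoritative interface.

python -m pylhc.machine_settings_info --time "2022-05-20T10:30:00" --knobs default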

Note: some scripts access functionality only available on the CERN Technical Network. To use those, make sure to install the relevant extra dependencies with pip install /path/to/PyLHC[cern].

Description

This package provides tools for particle accelerator data analysis, simulation management and machine information extraction, complementing the optics measurement analysis tools of the omc3 package.

Functionality

  • Forced DA Analysis - Script to analyse forced dynamic aperture (DA) data. (forced_da_analysis.py)
  • Machine Settings Info - Prints an overview of the machine settings at a given time. (machine_settings_info.py)
  • BSRT Logger and BSRT Analysis - Saves data coming straight from the LHC BSRT FESA class and allows subsequent analysis. (bsrt_logger.py & bsrt_analysis.py)
  • BPM Calibration Factors - Computes the BPM calibration factors using ballistic optics. Two methods are available: using the beta function and using the dispersion. (bpm_calibration.py)

Quality checks

  • Unit and accuracy tests are run automatically through our GitHub Actions CI workflows.
  • Additional checks for code complexity, design rules, test coverage and duplication are made through CodeClimate.
  • Pull requests implementing functionality or fixes are merged into the master branch after passing CI and receiving a reviewer's approval.

Changelog

See the CHANGELOG file.

Hints for Developers

In case you want to contribute to PyLHC's development, you should install it in editable mode:

git clone https://github.com/pylhc/PyLHC
pip install --editable PyLHC

You can install extra dependencies (as defined in setup.py) suited to your use case with the following commands:

pip install --editable PyLHC[cern]
pip install --editable PyLHC[test]
pip install --editable PyLHC[test,doc]
pip install --editable PyLHC[all]

Open an issue, make your changes in a branch and submit a pull request.

Authors

  • pyLHC/OMC-Team - Working Group - pyLHC

License

This project is licensed under the MIT License - see the LICENSE file for details.

Comments
  • BPM Calibration

    opened by Mael-Le-Garrec 5
  • Properly organise extra dependencies for the CERN GPN

    This is essentially https://github.com/pylhc/omc3/issues/272 applied to this package. I will also rename the extra from tech to cern as agreed in https://github.com/pylhc/omc3/pull/273.

    Enhancement Feature Release 
    opened by fsoubelet 2
  • Refactor for consistency

    Refactor for consistency with other pylhc packages, which should close #41

    There are some import changes in there from PyCharm's optimize imports, but the important changes are in setup.py and conf.py. This also includes dependency version updates.

    Important change regarding pyjapc, which is not kept up to date on PyPI (and installing from master can mess things up badly in builds): it is now declared as an extra dependency ([tech]) as we discussed on Mattermost, but this can still change. To be decided in this PR.

    Moving to GA will be in another issue / PR.

    opened by fsoubelet 2
  • Update setup for consistency with pylhc packages

    Would also be a good time to think about the dependencies' versions.

    Since we're all fine with pandas 1.x in other packages, we probably shouldn't require 0.25.x here.

    Enhancement 
    opened by fsoubelet 2
  • SDDS update for llong

    Hi,

    The new format for SDDS files often uses the llong format. I was able to use your code with the following modifications: in classes.py, lines 17-19, add the llong format (>i8, 8, int); in reader.py, line 139, convert num_dims to int (int(num_dims)).
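
    A minimal sketch of the kind of change described above, assuming the numeric-type definitions live in a name-to-(format, size, cast) mapping; the actual layout of classes.py and reader.py in the sdds package may differ:

    # Illustration only: not the actual sdds source.
    NUMERIC_TYPES = {
        "short": (">i2", 2, int),
        "long": (">i4", 4, int),
        "llong": (">i8", 8, int),  # new entry: 64-bit signed integer
        "float": (">f4", 4, float),
        "double": (">f8", 8, float),
    }

    def read_num_dims(raw_value) -> int:
        # Mirror the reader.py fix: cast the dimension count to a plain int before use.
        return int(raw_value)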

    Thank you for the good work! Is there any plan to add ascii compatibility?

    opened by pbelange 2
  • LSA knob to MAD-X script converter

    Adds a script which can take an LSA knob and an optics it is defined for to create both a definition TFS file and a MAD-X script which will reproduce the knob in simulations.

    The script can also be fed a text file with many knob definitions as well as their trim values, and run for all of these knobs. See examples in the module docstring.

    opened by fsoubelet 1
  • Remove submitter scripts

    Closes #71

    Removed:

    • Entrypoint scripts for job_submitter and autosix
    • The htc and sixdesk_tools modules
    • Tests for the above
    • Documentation files for the modules and the entrypoints
    • Mentions in the README
    Release Request 
    opened by fsoubelet 1
  • Import ABCs from the proper modules

    Current job_submitter raises the following:

    DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
      from collections import OrderedDict, Iterable
    

    Should be a very quick change to become compliant (the warning is for Iterable only though).
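
    A minimal sketch of the corrected imports, keeping OrderedDict where it is and taking Iterable from collections.abc:

    from collections import OrderedDict
    from collections.abc import Iterable  # ABCs must come from collections.abc since Python 3.3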

    Enhancement 
    opened by fsoubelet 1
  • Fix lsa to madx writer

    • LSA to MAD-X sign convention
    • Better trim naming
    • Check MAD-X names for allowed characters
    • Option to init all variables
    • Machine settings info takes ISO time
    opened by JoschD 0
  • CI Updates

    Update to the CI workflows, making use of newer versions of certain official workflows.

    • Caching and cache management left to the setup-python action
    • Properly call pip as module everywhere
    • Do not run the build (and build check) twice in the publish repo
    CI/CD 
    opened by fsoubelet 0
  • Rewrite Forced_DA to da_analysis

    Allow input of measurements from blown-up beams, (single) kicked beams and excited beams (forced DA) via switches.

    • add missing formulas
    • input to be checked for heated (no kick)
    • input also nominal emittance (HL-LHC)
    opened by JoschD 0
  • add script to calculate RDT from tracking data

    Complementary to metaclass/opticsclass: create a script which takes PTC trackone data and returns processed RDTs.

    Compared to the omc3 RDT reconstruction, no px reconstruction is necessary.

    Enhancement 
    opened by mihofer 3
Releases (0.7.4)
  • 0.7.4 (Oct 19, 2022)

    Patch release 0.7.4:

    Changes in Machine Settings Info

    • Default behaviour when no knobs are given (no --knobs flag or knobs=None): no knobs are extracted.
    • The old behaviour of extracting all knobs is restored by giving knobs=["all"] (CLI: --knobs all).
    • The option ["default"] is available for the default knobs as used in OMC3 (CLI: --knobs default).
    • Additional debug logging

    What's Changed

    • Machine settings debug logging and knobs by @JoschD in https://github.com/pylhc/PyLHC/pull/105

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.3...0.7.4

  • 0.7.3 (Oct 11, 2022)

    Release 0.7.3 is a patch release which fixes:

    • LSA to MAD-X sign convention
    • Better trim naming
    • Check MAD-X names for allowed characters
    • Option to init all variables
    • Machine settings info takes ISO time

    What's Changed

    • CI Updates by @fsoubelet in https://github.com/pylhc/PyLHC/pull/103
    • Fix lsa to madx writer by @JoschD in https://github.com/pylhc/PyLHC/pull/104

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.2...0.7.3

  • 0.7.2 (May 31, 2022)

    Release 0.7.2 brings a fix to the lsa_to_madx module, ensuring it does not make the user run into a MAD-X bug later on when using the created knobs.

    Fixed:

    • Trim variable names generated in the MAD-X script are guaranteed to be no longer than 47 characters (a hard MAD-X limit) and to not start with an underscore or a digit (see the sketch below).
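
    A rough sketch of the kind of check this implies; the helper name is hypothetical and this is illustrative only, not the actual lsa_to_madx implementation:

    import re

    MADX_NAME_LIMIT = 47  # hard MAD-X limit on variable name length

    def is_valid_madx_name(name: str) -> bool:
        # Reject names that exceed the limit or start with an underscore or a digit.
        return len(name) <= MADX_NAME_LIMIT and re.match(r"^[_0-9]", name) is None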

    What's Changed

    • Fix: MAD-X Variable Name Length Limit by @fsoubelet in https://github.com/pylhc/PyLHC/pull/102

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.1...0.7.2

  • 0.7.1 (May 23, 2022)

    Release 0.7.1 brings a fix to the lsa_to_madx module.

    Fixed:

    • The script will no longer attempt to write to disk knobs that were not found by LSA in the provided optics.

    What's Changed

    • Fix: Do not attempt to write when knob isn't found in LSA optics by @fsoubelet in https://github.com/pylhc/PyLHC/pull/101

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.7.0...0.7.1

  • 0.7.0 (May 23, 2022)

    Release 0.7.0 contains the following changes:

    Added:

    • Added a new module, pylhc.lsa_to_madx, with functionality to parse LSA knobs from the command line or a text file, retrieve relevant information from LSA and create MAD-X files with the commands necessary to reproduce these knobs in simulations. This is of particular use when trying to reproduce a specific machine configuration in simulations.
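
    For instance, a possible command line is sketched below; the flag names are assumptions and the optics/knob names are placeholders, so refer to the module docstring for the authoritative usage:

    python -m pylhc.lsa_to_madx --optics "SOME_LSA_OPTICS_NAME" --knobs "LHCBEAM/SOME_KNOB_NAME"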

    What's Changed

    • LSA knob to MAD-X script converter by @fsoubelet in https://github.com/pylhc/PyLHC/pull/100

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.2...0.7.0

  • 0.6.2 (Apr 23, 2022)

    Release 0.6.2 adds a flag to the info functionality of pylhc.kickgroups to display a copy-pastable list of kick files one can use in the GUI to load them all at once.

    What's Changed

    • Kickgroups file list by @JoschD and @fsoubelet in https://github.com/pylhc/PyLHC/pull/99

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.1...0.6.2

  • 0.6.1 (Apr 22, 2022)

    Release 0.6.1 brings a fix to the kickgroups module.

    Fixed:

    • Correctly detect the plane of the excitationSettings being read.
    • Better handling of kickgroups with no kickfiles.

    Changed:

    • The command-line commands have been renamed to list and info.

    What's Changed

    • fix by @JoschD in https://github.com/pylhc/PyLHC/pull/98

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.6.0...0.6.1

  • 0.6.0 (Apr 22, 2022)

    Release 0.6.0 contains the following changes:

    Added:

    • Added a new module, pylhc.kickgroups, with functionality to query available kickgroup files from a location, retrieve information on a given kickgroup, and retrieve relevant information for all kicks in a kickgroup. It can be called as a script (python -m pylhc.kickgroups) to print out copy-pastable information to put in the OMC logbook (see the example below).
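
    For instance, using the list and info subcommands introduced by the 0.6.1 rename (see above); the positional kickgroup argument and any root-directory option are assumptions here, so check the script's --help output:

    python -m pylhc.kickgroups list
    python -m pylhc.kickgroups info SOME_KICKGROUP_NAME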

    What's Changed

    • Added KickGroup Infos by @JoschD and @fsoubelet in https://github.com/pylhc/PyLHC/pull/97

    Full Changelog: https://github.com/pylhc/PyLHC/compare/v0.5.0...0.6.0

  • v0.5.0 (Apr 20, 2022)

    What's Changed

    • Cron workflow fix by @fsoubelet in https://github.com/pylhc/PyLHC/pull/95
    • Removed IRNL RDT Correction by @JoschD in https://github.com/pylhc/PyLHC/pull/96 which can now be found as its own package in https://github.com/pylhc/irnl_rdt_correction

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.4.2...v0.5.0

  • 0.4.2 (Mar 31, 2022)

  • v0.4.1 (Feb 20, 2022)

    Minor bugfixes in machine_settings_info.

    • Added:

      • time and start_time can now be given as AccDatetime objects.
    • Fixed:

      • The trims variable is now initialized as None. Previously it was not initialized if no trims were found, but was used later on.
  • v0.4.0 (Feb 16, 2022)

    What's Changed

    • Add Zenodo DOI to README by @fsoubelet in https://github.com/pylhc/PyLHC/pull/89
    • Adds check for non-existing knobs by @JoschD in https://github.com/pylhc/PyLHC/pull/90
    • Update CI by @fsoubelet in https://github.com/pylhc/PyLHC/pull/91
    • Lsa with timerange by @JoschD in https://github.com/pylhc/PyLHC/pull/92

    Release 0.4.0 brings the trim-history option to the machine-info extractor. To enable this, one needs to provide a start_time. The return values are now organized into a dictionary.

    Full Changelog: https://github.com/pylhc/PyLHC/compare/0.3.0...v0.4.0

  • 0.3.0 (Nov 16, 2021)

    Release 0.3.0 brings the following:

    Added:

    • Non-linear correction script for the (HL)LHC Insertion Regions Resonance Driving Terms, including feed-down effects.

    Changed:

    • The package's license has been moved from GPLv3 to MIT.

    Note: if one wishes to extend the IRNL correction script to a different accelerator, there are valuable pointers in the following PR comment.

  • 0.2.0 (Nov 3, 2021)

    This is the first release of pylhc since its omc3 dependency became available on PyPI.

    Added:

    • BPM calibration script to get calibration factors from different BPMs
    • Proper mocking of CERN TN packages (functionality imported from omc3)

    Changed:

    • Minimum required tfs-pandas version is now 3.0.2
    • Minimum required generic-parser version is now 1.0.8
    • Minimum required omc3 version is now 0.2.0
    • Extras related to the CERN TN are now installed with python -m pip install pylhc[cern]

    Removed:

    • The HTCondor and AutoSix functionality has been removed and extracted into its own package at https://github.com/pylhc/submitter
  • 0.1.0 (Dec 11, 2020)

    • Added:

      • Job submitter script to easily generate and schedule jobs through HTCondor.
      • Autosix script to easily generate and submit parametric SixDesk studies through HTCondor.
      • Script to analyse forced dynamic aperture data.
      • Scripts for logging and analysis of LHC BSRT data.
      • Utility modules supporting functionality for the above scripts.
    • Changed:

      • License moved to GNU GPLv3 to comply with the use of the omc3 package.
    • Miscellaneous:

      • Introduced extra dependencies tailored to different use cases of the package.
      • Reworked package organisation for consistency.
      • Set minimum requirements versions.
      • Moved CI/CD setup to Github Actions.
      • Improved testing and test coverage.