shindo.py

Calculates the JMA (Japan Meteorological Agency) seismic intensity (shindo) scale from acceleration data stored in a NumPy array

Introduction

Japan is a country known for frequent earthquakes. A special scale for evaluating seismic intensity, called shindo (震度), has been defined by the Japan Meteorological Agency (JMA). Originally, the shindo scale was determined by personnel who inspected a damaged area in person, observing the ratio of collapsed houses and similar indicators. A few decades ago, however, JMA began calculating the shindo scale from digital acceleration data recorded by accelerometers. The shindo scale runs from 0 to 7, where 5 and 6 are each divided into weaker and stronger halves (5-, 5+, 6-, and 6+); 0 is the weakest and 7 the strongest. For example, the 1995 Hanshin-Awaji Earthquake (1.17) and the 2011 Great East Japan Earthquake (3.11), together with several other of the strongest earthquakes, recorded shindo 7. In contrast, earthquakes of shindo 3 or below are frequent.

This Python module, shindo.py, calculates the shindo scale from 3-D acceleration data stored in a NumPy array in the unit of gal, i.e., cm/s2.

Calculation method

Usually, acceleration data for the north-south, east-west, and up-down axes are acquired every 10 ms to calculate shindo. If a NumPy array stores such acceleration data in gal, this Python module can calculate shindo from it.

Calculation steps

There are seven steps to calculate shindo from acceleration data.

  1. Each of the three acceleration components is transformed into the frequency domain by DFT or FFT.
  2. Three special filters are applied to the three acceleration spectra.
  3. The spectra are transformed back into the time domain by inverse DFT or FFT.
  4. The root-sum-square (RSS) acceleration (i.e., the absolute value of the vector sum) is calculated from the time-domain north-south, east-west, and up-down data obtained in Step 3.
  5. A value a is found such that the RSS acceleration exceeds a for a cumulative total of 0.3 seconds.
  6. A value I is obtained by I = 2 log10(a) + 0.94.
  7. I is rounded at the third decimal place, and the digits from the second decimal place onward are truncated, leaving a value with one decimal place.
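
As a rough illustration, Steps 4 to 7 can be written in a few lines of NumPy. The sketch below is not the module's actual code: it assumes a_filt already holds the filtered (N, 3) record produced by Steps 1 to 3, and replaces it with dummy data.

    import numpy as np

    Ts = 0.01                                      # sampling period [s]
    rng = np.random.default_rng(0)
    a_filt = 100.0 * rng.normal(size=(6000, 3))    # stand-in for the filtered record [gal]

    # Step 4: root-sum-square of the three components at each sample.
    rss = np.sqrt(np.sum(a_filt**2, axis=1))

    # Step 5: a = the level exceeded for a cumulative 0.3 s,
    # i.e. the n-th largest sample where n spans 0.3 s.
    n = int(round(0.3 / Ts))
    a_level = np.sort(rss)[-n]

    # Step 6: instrumental seismic intensity.
    I = 2.0 * np.log10(a_level) + 0.94

    # Step 7: round at the 3rd decimal place, then cut off everything below
    # the 1st decimal place. (np.round rounds half to even, while JMA rounds
    # half up; the difference is negligible for a sketch.)
    I = np.floor(np.round(I, 2) * 10.0) / 10.0
    print(I)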

Special filters

Three filters are applied to the spectra, namely, the periodic-effect filter, the high-cut filter, and the low-cut filter. The mathematical expressions of these filters can be found in the Wikipedia article, and the gains of the filters are shown below.

[Figure: gains of the periodic-effect, high-cut, and low-cut filters]

In the frequency domain, “applying filters” simply means multiplying the spectra by the gains shown in the figure above, i.e., taking the element-wise product of two NumPy arrays if the filters are also expressed as NumPy arrays.
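
The sketch below shows Steps 1 to 3 under this view, using the gain formulas commonly published for the JMA method: a periodic-effect gain of sqrt(1/f), a 12th-order polynomial high-cut rolling off around 10 Hz, and an exponential low-cut around 0.5 Hz. These gain formulas are assumptions here, so verify them against shindo.py or the Wikipedia article before relying on them.

    import numpy as np

    Ts = 0.01
    rng = np.random.default_rng(0)
    a = 100.0 * rng.normal(size=(6000, 3))     # dummy NS, EW, UD record [gal]

    N = a.shape[0]
    f = np.fft.rfftfreq(N, d=Ts)               # frequency axis [Hz]

    # Gains of the three filters (commonly published formulas, assumed here).
    x = f / 10.0
    high_cut = (1 + 0.694*x**2 + 0.241*x**4 + 0.0557*x**6
                + 0.009664*x**8 + 0.00134*x**10 + 0.000155*x**12) ** -0.5
    with np.errstate(divide='ignore'):
        periodic = np.sqrt(np.where(f > 0.0, 1.0 / f, 0.0))   # sqrt(1/f), zero at DC
    low_cut = np.sqrt(1.0 - np.exp(-(f / 0.5)**3))
    gain = periodic * high_cut * low_cut

    # Steps 1-3: FFT each axis, multiply by the combined gain, inverse FFT.
    spectra = np.fft.rfft(a, axis=0)
    a_filt = np.fft.irfft(spectra * gain[:, None], n=N, axis=0)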

How to use

shindo.getShindo(a: numpy.ndarray, Ts: float) -> float

Giving a NumPy array a whose shape is (N, 3) to the shindo.getShindo() function returns the JMA instrumental shindo value, which corresponds to the result I after Step 7 above. Ts is the sampling period in seconds. Usually, Ts = 0.01 when you give this function past seismic records downloaded from the JMA website.

The number of data points in the NumPy array, N, is arbitrary. However, the record should span enough time, e.g., 5 seconds, for an accurate shindo calculation. If Ts = 10 ms, N = 500 corresponds to 5 seconds of acceleration data.
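
For example, a minimal call with a synthetic record (dummy values standing in for a real JMA record):

    import numpy as np
    import shindo

    # Dummy 60 s record: NS, EW, UD acceleration in gal, sampled every 10 ms.
    rng = np.random.default_rng(0)
    a = 50.0 * rng.normal(size=(6000, 3))

    I = shindo.getShindo(a, 0.01)   # Ts = 0.01 s
    print(I)                        # JMA instrumental shindo value (one decimal place)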

shindo.getShindoName(I: float, lang: str = 'jp') -> str

This function converts the JMA instrumental shindo value, which may have a fractional part, into the actual shindo scale. If lang = 'jp' is given, shindo 5-, 5+, 6-, and 6+ become 5弱, 5強, 6弱, and 6強. If lang != 'jp', they are returned as '5-', '5+', '6-', and '6+'. Shindo 0-4 are also returned as strings.
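
For example, assuming the JMA boundary values (e.g., 5.0 <= I < 5.5 maps to 5+):

    import shindo

    print(shindo.getShindoName(5.1))              # '5強'
    print(shindo.getShindoName(5.1, lang='en'))   # '5+'
    print(shindo.getShindoName(3.4))              # '3'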

Test bench

This module contains an if __name__ == '__main__': section so that it can be run as a self-test. Data from a past major earthquake, observed in Yonago, Tottori, Japan, are automatically downloaded as a CSV file, and the acceleration data are loaded into a NumPy array. The calculation shows shindo 5.1, which equals the value published on the JMA website. You can test a different past earthquake if you know the URL of its CSV file. See this website for past major earthquake acceleration data.
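
The flow of the self-test looks roughly like the sketch below; the URL is a placeholder, and the character encoding and CSV layout are assumptions (the real ones are embedded in shindo.py).

    import io
    import urllib.request

    import numpy as np
    import shindo

    # Placeholder URL, not the real one used by the module.
    url = 'https://example.com/yonago_acc.csv'
    raw = urllib.request.urlopen(url).read().decode('shift_jis', errors='replace')

    # Assumed layout: one header row followed by NS, EW, UD columns in gal.
    acc = np.loadtxt(io.StringIO(raw), delimiter=',', skiprows=1)

    print(shindo.getShindo(acc, 0.01))   # the Yonago record should give 5.1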
