International Space Station data with Python research 🌎


Plotting the ISS trajectory, calculating its velocity over the Earth, and more.


Plotting the trajectory:

We are going to plot the trajectory of the ISS over a window of N minutes, where N is chosen by the user according to their preferences. This means that the program will run for N minutes, recording one position per minute in a list.
We will use an API to retrieve the ISS's current position in latitude and longitude:

http://open-notify.org/Open-Notify-API/ISS-Location-Now/
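
If you are curious about what the endpoint returns, the short sketch below fetches it directly with requests and prints the raw JSON. The exact field layout shown in the comment is an assumption, based on how the response is indexed later in this post (an iss_position object holding latitude and longitude as strings, plus a message and a Unix timestamp):

import requests

response = requests.get("http://api.open-notify.org/iss-now.json")
print(response.json())
# Expected shape (assumption):
# {"message": "success", "timestamp": 1650000000,
#  "iss_position": {"latitude": "12.3456", "longitude": "-45.6789"}}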

First we need to import the following Python modules: pandas to read the JSON data from the ISS API, Plotly Express to plot the trajectory, and time for the time.sleep function.

import pandas as pd
import plotly.express as px
import time

Second we must initialize the lists that will store the latitude and longitude points (one pair every sixty seconds). We also have to initialize the N variable with the duration in minutes:

latitudes = []
longitudes = []
N = 60 # Sixty for one hour trajectory

Then we will create the following for loop to keep recording latitude-longitude points separated by one minute.

We use for i in range(N), so N is the number of minutes the script will keep running, because there is a time.sleep(60) at the end of each iteration:
for i in range(N):
    url = "http://api.open-notify.org/iss-now.json" # API URL

    df = pd.read_json(url) # Pandas reads the JSON data from the API

    # The API returns the coordinates as strings, so we cast them to float
    latitudes.append(float(df["iss_position"]["latitude"]))  # Append the ISS latitude to the latitudes list
    longitudes.append(float(df["iss_position"]["longitude"])) # Append the ISS longitude to the longitudes list

    time.sleep(60) # This separates the recorded points by one minute

When the for loop finishes iterating we will have a record of N minutes of the ISS trajectory. Now we can plot it with Plotly: px.line_geo draws the line on top of a world map.

fig = px.line_geo(lat=latitudes, lon=longitudes) # Pass our latitudes and longitudes lists as parameters
fig.show()  
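
If you want to keep the result, a Plotly figure can also be exported to a standalone HTML file (the file name here is just an illustration):

fig.write_html("iss_trajectory.html") # Saves an interactive copy of the plot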

[Figure: two-hour trajectory plot]

We can update our plot to an orthographic projection with this code:

fig.update_geos(projection_type="orthographic")
fig.update_layout(height=300, margin={"r":0,"t":0,"l":0,"b":0})
fig.show()  

[Figure: 30-minute trajectory plot]

[Figure: two-hour trajectory plot GIF]

Estimating the ISS velocity:

We will estimate the ISS velocity using two different latitude-longitude points separated by one minute (sixty seconds). We can get the distance between those two points and then use the physics formula velocity (m/s) = distance (in meters) / time (in seconds). A quick worked example follows below.
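
As a sanity check of the formula (the numbers here are purely illustrative, not measured values): if the ISS covered roughly 460 km of ground track in 60 seconds, the estimate would be

distance = 460_000 # meters covered in one minute (illustrative value)
elapsed = 60 # seconds
speed = distance / elapsed # about 7667 m/s
print(round(speed * 3.6, 1), "KM/H") # 27600.0 KM/H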

First, import the following Python modules:

import pandas as pd # Pandas to read the API data
import time # Time for time.sleep
import geopy.distance # Geopy to get the distance between two lat-lon points
import requests # Requests to query another API for the ISS altitude
import json # json to parse that response

We need to initialize two empty lists to store the latitudes and longitudes:
lat = []
long = []
Next we will use a for loop to get the two latitude-longitude points separated by 60 seconds (time.sleep(60)):

for i in range(2):  # range(2) because we want two lat-lon points

    url = "http://api.open-notify.org/iss-now.json" # API URL

    df = pd.read_json(url) # Read the API JSON data with Pandas

    lat.append(float(df["iss_position"]["latitude"])) # Append the latitude to the lat list (cast from string)
    long.append(float(df["iss_position"]["longitude"])) # Append the longitude to the long list (cast from string)

    time.sleep(60) # Wait 60 seconds before recording the second lat-lon point
When this for loop finishes we will have a lat list with two latitude values and a long list with two longitude values. Together these four numbers give us two lat-lon points at different moments in time (separated by one minute).
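
One refinement worth noting: the HTTP request itself takes some time, so the real interval between the two samples is slightly more than 60 seconds. A minimal sketch of a more careful version, under the assumption that the open-notify response also carries a Unix timestamp field, looks like this:

import requests

timestamps = []
for i in range(2):
    data = requests.get("http://api.open-notify.org/iss-now.json").json()
    lat.append(float(data["iss_position"]["latitude"]))
    long.append(float(data["iss_position"]["longitude"]))
    timestamps.append(data["timestamp"]) # Unix time of the reading (assumed field)
    time.sleep(60)

elapsed = timestamps[1] - timestamps[0] # actual seconds between the two samples

Dividing the distance by elapsed instead of a hard-coded 60 would then remove that small bias.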

Then we must get the distance between these points.

We create the two points: the first one from lat[0] and long[0], the second one from lat[1] and long[1].
coords_1 = (lat[0], long[0]) 
coords_2 = (lat[1], long[1])
Then calculate the distance with the geopy library:

distance = geopy.distance.distance(coords_1, coords_2).m # Distance between the points in meters
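
By default geopy.distance.distance computes the geodesic distance on the WGS-84 ellipsoid, and the result exposes .m and .km attributes. A quick sanity check (one degree of longitude along the equator should come out at roughly 111 km):

print(geopy.distance.distance((0, 0), (0, 1)).km) # approximately 111.3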
But we must make a small correction: the ISS isn't moving on the Earth's surface, it is orbiting approximately 400 km above it. Because the orbital radius is larger, the distance actually traveled is a little longer than the ground distance. To apply this correction we need the ISS's current altitude, which we get with the following code:

iss_alt_url = "https://api.wheretheiss.at/v1/satellites/25544" # Another ISS API that reports altitude
r = requests.get(iss_alt_url) # Request the data
data = json.loads(r.text) # Parse the JSON response

iss_alt = int(data["altitude"]) # Altitude in km

Now apply the correction: the ground distance and the orbital distance span the same angle at the Earth's center, so the orbital distance is the ground distance scaled by (earth_radius + iss_alt) / earth_radius.
earth_radius = 6371 # in km
distance_corrected = distance * (earth_radius + iss_alt) / earth_radius # Scale the ground distance up to the orbital radius
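
For a rough sense of scale (assuming an altitude around 420 km, which is within the ISS's typical range), the correction factor is only about 6-7%:

factor = (6371 + 420) / 6371
print(round(factor, 3)) # 1.066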
Now finish the calculation with the speed formula explained above:

speed = distance_corrected / 60 # meters per second

print(round(speed * 3.6, 3), "KM/H") # Multiplied by 3.6 to convert from m/s to km/h, rounded to 3 decimal places

Output:

26367.118 KM/h
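
That is in the right ballpark: the ISS's published orbital speed is roughly 27,600 km/h, and the gap comes from the one-minute sampling, the geodesic approximation, and the assumption of exactly 60 seconds between readings. As an independent cross-check, the wheretheiss.at endpoint used above appears to also report a velocity field in km/h; treat the field name as an assumption and inspect the response before relying on it:

print(data["velocity"], "KM/H reported by the API") # cross-check, assuming the field exists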
Owner
Facundo Pedaccio
Studying computer engineering and economics. I like computer science, physics, astrophysics, rocket science. Or rather the perfect combination of them.