# Academic-DeepNeuralNetsFromScratch

A framework that constructs deep neural networks, autoencoders, logistic regressors, and linear networks without the use of any outside machine learning libraries - all from scratch.

## Overview

This project was constructed for the Introduction to Machine Learning course, class 605.649 section 84, at Johns Hopkins University. FranceLab4 is a machine learning toolkit that implements several algorithms for classification and regression tasks. Specifically, the toolkit coordinates a linear network, a logistic regressor, an autoencoder, and a neural network that implements backpropagation; it also leverages data structures built in the preceding labs. FranceLab4 is a software module written in Python 3.7 that facilitates such algorithms.

## Notes for Graders

All files of concern for this project (with the exception of main.py) may be found in the Linear_Network, Logistic_Regression, and Neural_Network folders. I kept most of my files from Projects 1, 2, and 3 because I ended up reusing cross validation, encoding, and other helper methods. However, these three folders contain the neural network algorithms of interest.

I have created blocks of code for you to test and run each algorithm if you choose to do so. In __main__.py, scroll to the bottom and find the main function. Simply comment or uncomment blocks of code to test whichever algorithms you like.

Each neural network and autoencoder is subclassed from (inherits from) the NeuralNet class in neural_net.py. I simply initialize the class differently in order to construct an autoencoder, a feed-forward neural network, or a combination of both.
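
For intuition, the pattern looks roughly like the sketch below. The constructor arguments shown here (layer_sizes, activation) are illustrative assumptions, not the actual signature of NeuralNet in neural_net.py:

```python
# Illustrative sketch only: the argument names below (layer_sizes, activation)
# are assumptions, not the real signature of NeuralNet in neural_net.py.
from Neural_Network.neural_net import NeuralNet

n_features = 10

# Feed-forward classifier: input layer, one hidden layer, prediction layer.
feed_forward = NeuralNet(layer_sizes=[n_features, 8, 2], activation="sigmoid")

# Autoencoder: the same class, but the output layer mirrors the input layer so
# the network is trained to reconstruct its own inputs.
autoencoder = NeuralNet(layer_sizes=[n_features, 4, n_features], activation="sigmoid")
```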

The data reported in my paper were produced with k-fold cross validation (KFCV). However, within the main program you may notice that the number of folds k has been reduced to 2 to make the analysis quicker and the console output easier to follow.
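
For reference, KFCV partitions the data into k folds and rotates which fold is held out. The snippet below is a minimal, self-contained sketch of the idea; it is independent of the toolkit's own cross-validation helpers:

```python
# Minimal sketch of k-fold cross validation (KFCV); shown only to illustrate
# the idea, not a copy of the toolkit's implementation.
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs for each of the k folds."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    for fold in range(k):
        start = fold * fold_size
        stop = n_samples if fold == k - 1 else (fold + 1) * fold_size
        test_idx = indices[start:stop]
        train_idx = indices[:start] + indices[stop:]
        yield train_idx, test_idx

# k = 2 keeps the run fast and the console output short; the paper used a larger k.
for train_idx, test_idx in k_fold_splits(n_samples=100, k=2):
    print(len(train_idx), len(test_idx))   # prints "50 50" twice
```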

The construction of a linear network begins on line 84 in __main__.py.

The construction of a logistic regressor begins on line 102 in __main__.py.

The construction of a standalone autoencoder begins on line 128 in __main__.py.

The construction of a standalone feed-forward neural network begins on line 141 in __main__.py.

The construction of the combined network, in which an autoencoder is trained, its decoder is removed, and its encoder is attached to a new hidden layer and prediction layer to form a new neural network, begins on line 221 in __main__.py.
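
Conceptually, that pipeline follows the pattern sketched below. The layer lists and sizes are arbitrary stand-ins used only to make the encoder-reuse idea concrete; they are not the toolkit's actual objects or API:

```python
# Hedged sketch of the pretrain-then-classify pattern; the lists below stand in
# for real layers and the sizes are arbitrary examples.

# 1. Build and train an autoencoder: an encoder plus a mirrored decoder.
encoder = [10, 4]                       # input -> bottleneck
autoencoder = encoder + [10]            # bottleneck -> reconstruction of the input
# ... train `autoencoder` to reconstruct its inputs ...

# 2. Remove the decoder, keeping the trained encoder.
trained_encoder = autoencoder[:len(encoder)]    # [10, 4]

# 3. Attach a new hidden layer and a prediction layer to form a new network.
new_network = trained_encoder + [8, 2]          # [10, 4, 8, 2]
print(new_network)
```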

The code for the weight updates and for forward and backward propagation may be found in the following files within the Neural_Network folder (a minimal illustration of the idea follows the list):

  • layer.py
  • optimizer_function.py
  • neural_net.py
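
As a point of reference, the core idea those files implement can be shown with a single sigmoid unit. The snippet below is a self-contained example of a forward pass, a backward pass for squared error, and a gradient-descent weight update; it is not a copy of the code in layer.py, optimizer_function.py, or neural_net.py:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One training example with two features and a binary target.
x, target = [0.5, -1.0], 1.0
weights, bias, learning_rate = [0.1, -0.2], 0.0, 0.5

# Forward propagation: weighted sum of the inputs, then the activation.
z = sum(w * xi for w, xi in zip(weights, x)) + bias
prediction = sigmoid(z)

# Backward propagation for squared error:
# dE/dw_i = (prediction - target) * prediction * (1 - prediction) * x_i
delta = (prediction - target) * prediction * (1.0 - prediction)

# Gradient-descent weight update.
weights = [w - learning_rate * delta * xi for w, xi in zip(weights, x)]
bias -= learning_rate * delta

print(prediction, weights, bias)
```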

__main__.py is the driver behind importing the dataset, cleaning the data, coordinating KFCV, and initializing each of the neural network algorithms.

## Running FranceLab4

  1. Ensure Python 3.7 is installed on your computer.
  2. Navigate to the Lab4 directory. For example, cd User\Documents\PythonProjects\FranceLab4. Do NOT cd into the Lab4 module.
  3. Run the program as a module: python3 -m Lab4.
  4. Input and output files are located in the io_files subdirectory.

## FranceLab4 Usage

usage: python3 -m Lab4

## Owner

Kordel K. France - Artificial Intelligence Engineer, Algorithmic Trader. I build software that finds order within chaos.