Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/abogascended/feedforwardimplementation
Basic feed-forward neural network implementation with ELU activations and MSE loss/cross-entropy loss for univariate regression. Uses Optuna for hyperparameter search.
deep-learning kaggle neural-networks optuna univariate-regressions
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/abogascended/feedforwardimplementation
- Owner: AbogAscended
- Created: 2024-06-26T21:08:22.000Z (6 months ago)
- Default Branch: master
- Last Pushed: 2024-08-05T16:04:48.000Z (5 months ago)
- Last Synced: 2024-08-05T18:58:31.399Z (5 months ago)
- Topics: deep-learning, kaggle, neural-networks, optuna, univariate-regressions
- Language: Jupyter Notebook
- Homepage:
- Size: 12.6 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README
Awesome Lists containing this project
README
# FeedForwardImplementation with PyTorch
Basic feed-forward neural network implementation with ELU activations and MSE loss for univariate regression. It uses the dataset from a housing price competition on Kaggle that is a bit dated now.
Both the training and test CSV files are included in the repository; all you need to do is clone the repo and run the code via Jupyter Notebook and it will work.
The actual neural network model is in its own Python file and is imported into the Jupyter notebook. It's incredibly simple and just uses a sequential container of linear layers, with 287 input neurons, 5 hidden layers of 500 hidden units each, and a single output neuron in the output layer, since this is univariate regression.
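As a rough illustration, a network with that shape could be written as follows. This is a minimal sketch rather than the repository's exact file; the class name `FeedForwardNet` and the default argument names are assumptions, while the layer sizes follow the description above.

```python
import torch.nn as nn

class FeedForwardNet(nn.Module):
    """Plain feed-forward regressor: 287 inputs -> 5 hidden layers of 500 units -> 1 output."""

    def __init__(self, in_features: int = 287, hidden: int = 500, n_hidden_layers: int = 5):
        super().__init__()
        layers = [nn.Linear(in_features, hidden), nn.ELU()]
        for _ in range(n_hidden_layers - 1):
            layers += [nn.Linear(hidden, hidden), nn.ELU()]
        layers.append(nn.Linear(hidden, 1))  # single output neuron for univariate regression
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```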
The model is set up for hyperparameter search with Optuna. To prepare the data, it uses pandas both to fill in missing data points with the column mean and to one-hot encode categorical features. I then use scikit-learn to standardize the data, which helps with exploding gradients, a problem I ran into when training this model.
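The preparation and search steps could look roughly like the sketch below. The column names (`Id`, `SalePrice`), the Optuna search space, and the short full-batch training loop are illustrative assumptions, not the notebook's exact code; it reuses the `FeedForwardNet` sketch from above.

```python
import pandas as pd
import torch
import torch.nn as nn
import optuna
from sklearn.preprocessing import StandardScaler

# --- Data preparation (assumed Kaggle housing column names) ---
train = pd.read_csv("train.csv")
X = train.drop(columns=["Id", "SalePrice"])
y = train["SalePrice"].to_numpy(dtype="float32")

# Fill missing numeric values with the column mean, then one-hot encode categoricals.
numeric_cols = X.select_dtypes(include="number").columns
X[numeric_cols] = X[numeric_cols].fillna(X[numeric_cols].mean())
X = pd.get_dummies(X, dummy_na=True)

# Standardize features to help with exploding gradients.
X = StandardScaler().fit_transform(X).astype("float32")
X_t, y_t = torch.from_numpy(X), torch.from_numpy(y).unsqueeze(1)

# --- Optuna objective (illustrative search space) ---
def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
    model = FeedForwardNet(in_features=X_t.shape[1])  # class from the sketch above
    optim = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(50):  # short full-batch loop, just for the sketch
        optim.zero_grad()
        loss = loss_fn(model(X_t), y_t)
        loss.backward()
        optim.step()
    return loss.item()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
```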
It also automatically creates a submission CSV file with pandas for submission to Kaggle.
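The submission step boils down to something like the following. The `Id` and `SalePrice` column names are assumptions based on the Kaggle House Prices format, and `X_test_t` is a hypothetical tensor of test features prepared with the same fill/one-hot/standardize steps as the training data.

```python
import pandas as pd
import torch

# Assumes `model` is already trained and `X_test_t` holds the preprocessed test features.
test = pd.read_csv("test.csv")
with torch.no_grad():
    preds = model(X_test_t).squeeze(1).numpy()

pd.DataFrame({"Id": test["Id"], "SalePrice": preds}).to_csv("submission.csv", index=False)
```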
# Required Python modules
1. torch
2. torch.nn
3. torch.optim
4. numpy
5. pandas
6. torch.utils.data
7. scikit-learn
8. optuna