{"id":25819003,"url":"https://github.com/pngo1997/neural-network-modifications-hyperparameter-experiments","last_synced_at":"2025-02-28T08:14:22.271Z","repository":{"id":275307570,"uuid":"925709985","full_name":"pngo1997/Neural-Network-Modifications-Hyperparameter-Experiments","owner":"pngo1997","description":"Modifies a neural network's hyperparameters, activation functions, cost functions, and regularization methods to improve training performance and generalization.","archived":false,"fork":false,"pushed_at":"2025-02-01T15:04:17.000Z","size":0,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-02-01T15:39:35.520Z","etag":null,"topics":["activation","deep-learning","dropout-rates","epoch","hyperparameter-optimization","leaky-relu","neural-network","neural-network-training","python","regularization","relu","sigmoid-function","tanh"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/pngo1997.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2025-02-01T14:58:54.000Z","updated_at":"2025-02-01T15:05:56.000Z","dependencies_parsed_at":"2025-02-01T15:39:37.455Z","dependency_job_id":"a251e687-8b9f-4de8-b136-529d221a7a21","html_url":"https://github.com/pngo1997/Neural-Network-Modifications-Hyperparameter-Experiments","commit_stats":null,"previous_names":["pngo1997/neural-network-modifications-hyperparameter-experiments"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://re
pos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pngo1997%2FNeural-Network-Modifications-Hyperparameter-Experiments","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pngo1997%2FNeural-Network-Modifications-Hyperparameter-Experiments/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pngo1997%2FNeural-Network-Modifications-Hyperparameter-Experiments/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pngo1997%2FNeural-Network-Modifications-Hyperparameter-Experiments/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/pngo1997","download_url":"https://codeload.github.com/pngo1997/Neural-Network-Modifications-Hyperparameter-Experiments/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":241122310,"owners_count":19913454,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["activation","deep-learning","dropout-rates","epoch","hyperparameter-optimization","leaky-relu","neural-network","neural-network-training","python","regularization","relu","sigmoid-function","tanh"],"created_at":"2025-02-28T08:14:21.763Z","updated_at":"2025-02-28T08:14:22.256Z","avatar_url":"https://github.com/pngo1997.png","language":"Jupyter Notebook","readme":"# 🧠 Neural Network Modifications \u0026 Hyperparameter Experiments  \n\n## 📜 Overview  \nThis project involves modifying a neural network's **hyperparameters, activation functions, cost functions, and regularization methods** to improve training performance and generalization. 
Additionally, we conduct **experiments** to analyze the impact of different configurations on model accuracy.

📌 **Tasks**:
- Modify the network to support **customizable hyperparameters**.
- Implement additional **cost functions, activation functions, and regularization methods**.
- Add **dropout** for regularization.
- Conduct **experiments** on the **Iris dataset** using different configurations.

📌 **Programming Language**: `Python 3`
📌 **Libraries Used**: `NumPy`, `pandas`, `Jupyter Notebook`

## 🚀 1️⃣ Network Code Modifications

### **Added Hyperparameters**
- **Cost Functions**: Quadratic, CrossEntropy, LogLikelihood
- **Activation Functions**: Sigmoid, Tanh, ReLU, LeakyReLU, Softmax
- **Regularization**: L1, L2
- **Dropout Rate**

### **Modified Functions**
- `set_model_parameters()` → Supports **custom cost & activation functions**.
- `feedforward()` → Implements **dropout for hidden layers**.
- `backprop()` → Supports **L1/L2 regularization & dropout in weight updates**.
- `update_mini_batch()` → Incorporates **new regularization methods**.
- `total_cost()` → Adjusted for the **regularization term's impact on the cost function**.

📌 **Dropout Implementation**:
- Applied **only during training** (not evaluation).
- Uses a **binary mask** to randomly drop units.
- Ensures **scaling to maintain expected activations**.

## 🎯 2️⃣ Experimental Setup

### **Dataset**
- **Training Set**: `iris-train-2.csv`
- **Test Set**: `iris-test-2.csv`
- **Pretrained Models**: `iris-423.dat`, `iris4-20-7-3.dat`

### **Experimental Parameters**
- **Epochs**: 30
- **Mini-batch Size**: 8
- **Learning Rate (η)**: 0.1
- **Regularization & Dropout**: Various configurations tested.

📌 **Key Observations**:
- **LeakyReLU & ReLU** show improved accuracy over **Sigmoid/Tanh**.
- **Regularization (L1/L2)** reduces **overfitting**.
\n- **Dropout (0.3)** helps generalization but **affects training stability**.  \n\n## 📌 Summary  \n✅ Added support for multiple cost \u0026 activation functions.  \n\n✅ Implemented L1/L2 regularization \u0026 dropout.  \n\n✅ Modified key functions for new hyperparameters.  \n\n✅ Conducted extensive experiments on the Iris dataset.  \n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpngo1997%2Fneural-network-modifications-hyperparameter-experiments","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fpngo1997%2Fneural-network-modifications-hyperparameter-experiments","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fpngo1997%2Fneural-network-modifications-hyperparameter-experiments/lists"}