{"id":15173886,"url":"https://github.com/sameetasadullah/neural-network-implementation","last_synced_at":"2026-01-30T14:17:51.567Z","repository":{"id":136453881,"uuid":"525112074","full_name":"SameetAsadullah/Neural-Network-Implementation","owner":"SameetAsadullah","description":"Neural Network implemented with different Activation Functions, i.e., sigmoid, ReLU, leaky ReLU, and softmax, and different Optimizers, i.e., Gradient Descent, AdaGrad, RMSProp, and Adam. You can also choose from different loss functions, i.e., cross-entropy loss, hinge loss, and mean squared error (MSE).","archived":false,"fork":false,"pushed_at":"2022-08-15T19:29:19.000Z","size":6,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-05-30T12:21:19.006Z","etag":null,"topics":["activation-functions","adagrad","adam-optimizer","cross-entropy-loss","gradient-descent","hinge-loss","jupyter-notebook","leaky-relu","loss-functions","mean-squared-error","neural-network","optimizers","pycharm","pycharm-ide","python","python3","relu-activation","rmsprop","sigmoid-activation","softmax-activation"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/SameetAsadullah.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-08-15T19:24:57.000Z","updated_at":"2022-08-15T19:32:01.000Z","dependencies_parsed_at":null,"dependency_job_id":"8ad29f03-b4e9-4bee-9c7d-1a8c65f28e0d","html_url":"https://github.com/SameetAsadullah/Neural-Network-Implementation","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/SameetAsadullah/Neural-Network-Implementation","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SameetAsadullah%2FNeural-Network-Implementation","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SameetAsadullah%2FNeural-Network-Implementation/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SameetAsadullah%2FNeural-Network-Implementation/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SameetAsadullah%2FNeural-Network-Implementation/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/SameetAsadullah","download_url":"https://codeload.github.com/SameetAsadullah/Neural-Network-Implementation/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SameetAsadullah%2FNeural-Network-Implementation/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":259952390,"owners_count":22936950,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["activation-functions","adagrad","adam-optimizer","cross-entropy-loss","gradient-descent","hinge-loss","jupyter-notebook","leaky-relu","loss-functions","mean-squared-error","neural-network","optimizers","pycharm","pycharm-ide","python","python3","relu-activation","rmsprop","sigmoid-activation","softmax-activation"],"created_at":"2024-09-27T11:04:05.537Z","updated_at":"2026-01-30T14:17:51.524Z","avatar_url":"https://github.com/SameetAsadullah.png","language":"Jupyter Notebook","readme":"\u003ch1 align=\"center\"\u003eNeural Network Implementation\u003c/h1\u003e\n\n### Description\n`Neural Network` implemented with different `Activation Functions`, `Optimizers`, and `Loss Functions`.\n\n### Activation Functions\n- Sigmoid\n- ReLU\n- Leaky ReLU\n- Softmax\n\n### Optimizers\n- Gradient Descent\n- AdaGrad\n- RMSProp\n- Adam\n\n### Loss Functions\n- Cross-Entropy Loss\n- Hinge Loss\n- Mean Squared Error (MSE)\n\n### Contributors\n- [Sameet Asadullah](https://github.com/SameetAsadullah)\n- [Aysha Noor](https://github.com/ayshanoorr)\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsameetasadullah%2Fneural-network-implementation","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fsameetasadullah%2Fneural-network-implementation","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fsameetasadullah%2Fneural-network-implementation/lists"}