Projects in Awesome Lists tagged with relu-activation
A curated list of projects in awesome lists tagged with relu-activation.
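For orientation, the rectified linear unit (ReLU) that these projects share is simply f(x) = max(0, x), with a subgradient of 1 for positive inputs and 0 elsewhere. A minimal NumPy sketch (illustrative only, not taken from any listed repository):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: passes positive values, zeroes out the rest."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Subgradient of ReLU: 1 for positive inputs, 0 elsewhere."""
    return (x > 0).astype(x.dtype)

# quick sanity check
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
assert relu(x).tolist() == [0.0, 0.0, 0.0, 1.5, 3.0]
assert relu_grad(x).tolist() == [0.0, 0.0, 0.0, 1.0, 1.0]
```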
https://github.com/luca-parisi/quantum_relu
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
activation activation-functions deep-learning deep-neural-networks keras keras-classification-models keras-models keras-neural-networks keras-tensorflow neural-network neural-networks pytorch quantum quantum-algorithms quantum-machine-learning relu relu-activation relu-layer tensorflow torch
Last synced: 07 May 2025
https://github.com/epsoft/layers
layers
attention conv2d dense flatten globalmaxpooling1d layers maxpooling2d noisy-regularization relu relu-activation relu-layer sequential sequential-models tensorflow
Last synced: 01 May 2025
https://github.com/fahdseddik/feedforward-neuralnetwork-from-scratch
A feed-forward neural network with back-propagation, written from scratch in C++ with no external libraries.
backpropagation backpropagation-neural-network cpp deep-learning feedforward-neural-network neural-network neural-networks neuralnetwork-creation object-oriented-programming relu-activation sigmoid-activation
Last synced: 19 Feb 2025
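A from-scratch training step like the one this project implements (in C++) can be sketched in Python/NumPy: one hidden ReLU layer, a sigmoid output, and plain gradient descent. All names, sizes, and the toy data below are illustrative assumptions, not code from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: 4 samples, 3 features, binary targets
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# one hidden ReLU layer, sigmoid output
W1 = rng.normal(scale=0.5, size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)
lr = 0.1
losses = []

for _ in range(200):
    # forward pass
    z1 = X @ W1 + b1
    a1 = np.maximum(0.0, z1)          # ReLU
    z2 = a1 @ W2 + b2
    p = 1.0 / (1.0 + np.exp(-z2))     # sigmoid
    losses.append(float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))))
    # backward pass (binary cross-entropy gradient)
    dz2 = (p - y) / len(X)
    dW2 = a1.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)     # ReLU gradient gates the error signal
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The `(z1 > 0)` mask in the backward pass is the whole of ReLU's derivative, which is why from-scratch implementations like this one favor it over sigmoid or tanh hidden layers.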
https://github.com/dcarpintero/nn-image-classifier
Python from-scratch implementation of a Neural Network Classifier. Dive into the fundamentals of approximation, non-linearity, regularization, gradients, and backpropagation.
backpropagation deep-dives deep-learning dropout-layers fashion-mnist gradient-descent image-classification linear-layers multilayer-perceptron python relu-activation
Last synced: 20 Nov 2024
https://github.com/dcarpintero/deep-learning-notebooks
Get Started with Deep Learning
backpropagation data-science deep-learning education gradient-descent linear-layers perceptron python pytorch relu-activation
Last synced: 14 Mar 2025
https://github.com/sameetasadullah/neural-network-implementation
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). Different loss functions are available as well: cross-entropy loss, hinge loss, and mean squared error (MSE).
activation-functions adagrad adam-optimizer cross-entropy-loss gradient-descent hinge-loss jupyter-notebook leaky-relu loss-functions mean-squared-error neural-network optimizers pycharm pycharm-ide python python3 relu-activation rmsprop sigmoid-activation softmax-activation
Last synced: 20 Feb 2025
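The activation functions this project offers alongside ReLU can be sketched in NumPy (a generic illustration, not the repository's actual API):

```python
import numpy as np

def sigmoid(z):
    """Squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def leaky_relu(z, alpha=0.01):
    """Like ReLU, but lets a small slope alpha through for negative inputs."""
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    """Normalizes a vector of logits into a probability distribution."""
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))  # shift for stability
    return e / e.sum(axis=-1, keepdims=True)

assert sigmoid(0.0) == 0.5
assert np.isclose(softmax(np.array([1.0, -2.0, 0.5])).sum(), 1.0)
```

The max-shift in `softmax` avoids overflow for large logits without changing the result, a standard trick in from-scratch implementations.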
https://github.com/nazli-d/binary-classification-using-cnn
This project trains and tests a CNN model to classify cat and dog images. The model is built with the Keras library on the TensorFlow backend.
binary-classification cat-and-dog-classifier cnn convolutional-neural-networks flatten keras max-pooling numpy opencv python relu-activation sigmoid-function tensorflow
Last synced: 21 Feb 2025
https://github.com/isa1asn/mnist-neural-net
Applying neural networks to the MNIST dataset (from scratch and using TensorFlow).
from-scratch keras mnist-classification neural-networks numpy pandas relu-activation softmax tensorflow
Last synced: 14 Mar 2025
https://github.com/epsoft/text-generation
Text Generation
farsi generation pandas persian relu relu-activation relu-layer relu-network text text-generation
Last synced: 28 Feb 2025
https://github.com/elaheghiyabi96/fashion_mnist_nn_torch
"Simple neural network model using Torch for classifying the Fashion MNIST dataset, implemented with Torch."
adam-optimizer cross-entropy-loss deep-learning dropout fashion-mnist feedforward-neural-network image-classification machine-learning model-evaluation model-training neural-network pytorch relu-activation test-accuracy training-loss
Last synced: 06 May 2025
https://github.com/psgebeline/continuum_neural_network
An artificial neural network for distinguishing between signal and continuum B events in Belle II Monte Carlo, based on event topology. Submission to the Belle II CNN competition; placed 5th with 95% efficiency.
callbacks convolutional-neural-networks feature-selection keras python relu-activation tensorflow
Last synced: 14 Mar 2025
https://github.com/yugoff/course-skillfactory-clothing-classification
First task in the course.
convolutional-neural flatten relu-activation single-vector
Last synced: 27 Mar 2025
https://github.com/jelhamm/activation-functions
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
activation-functions algorithms artificial-intelligence-algorithms artificial-neural-networks elu-activation gaussian gelu leakyrelu maxout prelude python relu relu-activation selu sigmoid sigmoid-activation sigmoid-activation-function swish tanh tanh-activation
Last synced: 04 Apr 2025
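Several of the smoother ReLU variants tagged on this project (ELU, GELU, swish/SiLU) can be sketched in NumPy; these are standard textbook forms, not the repository's code, and the GELU here is the common tanh approximation:

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit: smooth negative tail instead of a hard zero."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x, beta=1.0):
    """Swish/SiLU: x * sigmoid(beta * x)."""
    return x / (1.0 + np.exp(-beta * x))

def gelu(x):
    """GELU, tanh approximation."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-1.0, 0.0, 2.0])
# all three agree with ReLU for large positive inputs and pass through zero
assert elu(x)[1] == 0.0 and swish(x)[1] == 0.0 and gelu(x)[1] == 0.0
```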
https://github.com/himanshumahajan138/transfervision
Transfer learning for image classification using pre-trained models like ResNet50, ResNet100, EfficientNetB0, and VGG16 in Keras. Fine-tunes the last layers, applies image augmentation, and evaluates with Precision, Recall, AUC, F1 score, and early stopping for improved performance.
adam-optimizer batch-normalization data-pre-processing deep-learning early-stopping efficientnetb0 image-augmentation image-classification keras multiclass-classification precision-recall-auc-f1-score regularization relu-activation resnet100 resnet50 transfer-learning vgg16
Last synced: 06 Mar 2025
https://github.com/tahirzia-1/neural-networks-mnist-in-python
Implementations of neural networks in Python for classification of the MNIST dataset.
ai artificial-intelligence ipynb ipynb-jupyter-notebook mnist mnist-classification mnist-dataset networks neural python relu relu-activation sigmoid sigmoid-function softmax softmax-function tensorflow
Last synced: 05 Mar 2025