https://github.com/alchemz/carnd-alexnet-feature-extraction
- Host: GitHub
- URL: https://github.com/alchemz/carnd-alexnet-feature-extraction
- Owner: alchemz
- License: MIT
- Created: 2018-02-22T01:12:08.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-02-24T00:49:32.000Z (over 7 years ago)
- Last Synced: 2025-01-07T07:16:35.560Z (5 months ago)
- Language: Python
- Size: 243 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Lab-AlexNet
AlexNet Architecture
--------------------------
AlexNet splits the network across two GPUs, which allows for a larger model than would fit on a single GPU of the era. Most of the computation runs in parallel, but the GPUs communicate with each other in certain layers. The original AlexNet paper reported that this two-GPU scheme reduced the top-1 error rate by 1.7% (and the top-5 rate by 1.2%) compared with a network with half as many kernels in each convolutional layer trained on one GPU.
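
Modern single-GPU implementations merge the two towers into one network. As a quick illustration of the feature-extraction idea this lab is built around, here is a minimal sketch using torchvision's pretrained AlexNet, truncated before the final fc8 layer so it emits 4096-dimensional fc7 features. Note this is an assumption for illustration only: the lab itself uses TensorFlow with Caffe-converted weights, and the torchvision model and the file name `example.jpg` are not part of this repo.

```python
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image

# Load a pretrained AlexNet and freeze it for use as a fixed feature extractor.
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
alexnet.eval()
for p in alexnet.parameters():
    p.requires_grad = False

# Rebuild the forward pass, dropping the final 1000-way fc8 layer so the
# network emits the 4096-dimensional fc7 activations instead of class logits.
feature_extractor = torch.nn.Sequential(
    alexnet.features,
    alexnet.avgpool,
    torch.nn.Flatten(),
    *list(alexnet.classifier.children())[:-1],
)

# Standard ImageNet preprocessing (the paper used 227x227 crops; torchvision's
# AlexNet expects 224x224).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("example.jpg")  # hypothetical input image
with torch.no_grad():
    features = feature_extractor(preprocess(img).unsqueeze(0))
print(features.shape)  # torch.Size([1, 4096])
```

These fc7 features can then feed a small new classifier trained on a different dataset (traffic signs, in the CarND lab), which is the essence of feature extraction as a form of transfer learning.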