https://github.com/shreyansh26/experiments-with-the-neural-tangent-kernel
Learning more about the NTK through existing paper implementations
- Host: GitHub
- URL: https://github.com/shreyansh26/experiments-with-the-neural-tangent-kernel
- Owner: shreyansh26
- Created: 2023-01-04T04:46:46.000Z (almost 3 years ago)
- Default Branch: master
- Last Pushed: 2023-01-04T04:46:49.000Z (almost 3 years ago)
- Last Synced: 2025-01-14T02:14:22.726Z (9 months ago)
- Language: Jupyter Notebook
- Size: 112 KB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Experiments with Neural Tangent Kernel (NTK)
These are paper implementations and other code snippets that help me learn more about the Neural Tangent Kernel (NTK). Most of the code is adapted from existing implementations, and I maintain this repository purely for learning purposes.
The notebooks currently in the repository cover:
* Eigendecomposition of the empirical NTK (see the sketch after the references below)
* Training neural networks (non-linear) and linear models (after linearization) on binary eigenfunctions of the NTK at initialization
* The training dynamics and loss landscape of linear models linearized at initialization, vs. neural networks, vs. linear models whose kernel is extracted after non-linear pretraining

## References
- [Neural Tangents library by Google](https://github.com/google/neural-tangents)
- [Source code of "What can linearized neural networks actually say about generalization?"](https://github.com/gortizji/linearized-networks)
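
The notebooks themselves are not reproduced here. As an illustration of the first two items above, the following is a minimal sketch, assuming JAX and Google's `neural-tangents` library are installed: it computes the empirical NTK of a small MLP at initialization, eigendecomposes it, and builds a linearized model around the same parameters. The architecture and sizes are placeholders, not the repository's actual settings.

```python
# Minimal sketch (not the repository's exact code), assuming `jax` and
# `neural-tangents` are installed: empirical NTK of a small MLP at
# initialization, its eigendecomposition, and a linearized model.
import jax
import jax.numpy as jnp
import neural_tangents as nt
from neural_tangents import stax

# Placeholder architecture; the notebooks may use a different network.
init_fn, apply_fn, _ = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (64, 10))     # 64 inputs of dimension 10
_, params = init_fn(key, x.shape)

# Empirical (finite-width) NTK at initialization:
# Theta[i, j] = <df(x_i)/dtheta, df(x_j)/dtheta>, traced over the output axis.
ntk_fn = nt.empirical_ntk_fn(apply_fn)
theta = ntk_fn(x, None, params)          # shape (64, 64) for a scalar output

# Eigendecomposition of the symmetric empirical NTK (eigh returns ascending order).
eigvals, eigvecs = jnp.linalg.eigh(theta)
print("top-5 NTK eigenvalues:", eigvals[::-1][:5])

# First-order Taylor expansion of the network around `params`; training this
# is the "linear model at initialization" compared against in the notebooks.
f_lin = nt.linearize(apply_fn, params)
print("linearized == non-linear at init:",
      jnp.allclose(f_lin(params, x), apply_fn(params, x), atol=1e-4))
```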