# CIFAR-10 Image Classification

This project implements deep learning models for image classification on the CIFAR-10 dataset. It provides a complete pipeline for training, evaluating, and visualizing the performance of different CNN architectures.

## Project Overview

The CIFAR-10 dataset consists of 60,000 32x32 color images in 10 classes, with 6,000 images per class.
This project offers:

- Training of custom CNN architectures from scratch
- Transfer learning using a pre-trained ResNet50
- Comprehensive evaluation metrics and visualizations
- A modular codebase for easy experimentation

## Project Structure

```
CIFAR/
│
├── main.py                   # Entry point for training and evaluation
├── README.md                 # This file
│
└── src/
    ├── data_processing/      # Data loading and augmentation
    │   ├── augment.py        # Data augmentation functions
    │   └── utils.py          # Data utilities
    │
    ├── networks/             # Model architectures
    │   ├── classic_network.py  # Custom CNN architecture
    │   ├── transfer_network.py # Transfer learning with ResNet50
    │   └── utils.py            # Network utilities
    │
    ├── training/             # Training functionality
    │   ├── trainer.py        # Training loop implementation
    │   └── utils.py          # Training utilities
    │
    └── evaluation/           # Evaluation functionality
        ├── evaluate.py       # Model evaluation
        ├── utils.py          # Evaluation utilities
        └── visualize.py      # Visualization functions
```

## Usage

### Training a Model

To train the custom CNN from scratch:

```bash
python main.py --model classic --epochs 30 --batch_size 128 --lr 0.001 --gpu
```

To train with transfer learning using ResNet50:

```bash
python main.py --model transfer --epochs 20 --batch_size 64 --lr 0.0001 --gpu
```

### Command Line Arguments

- `--model`: Model architecture to use (`classic` or `transfer`)
- `--epochs`: Number of training epochs (default: 30)
- `--batch_size`: Batch size for training (default: 128)
- `--lr`: Learning rate (default: 0.001)
- `--weight_decay`: Weight decay for the optimizer (default: 1e-4)
- `--seed`: Random seed (default: 42)
- `--gpu`: Use the GPU if available (flag)
- `--evaluate_only`: Only run evaluation on a trained model (flag)
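The flags above map naturally onto a standard `argparse` setup. The following is a minimal sketch of how `main.py` might wire them up, using only the defaults documented above; the repository's actual parser may differ:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Build a CLI parser matching the flags documented above (illustrative only)."""
    parser = argparse.ArgumentParser(description="CIFAR-10 training and evaluation")
    parser.add_argument("--model", choices=["classic", "transfer"], required=True,
                        help="Model architecture to use")
    parser.add_argument("--epochs", type=int, default=30,
                        help="Number of training epochs")
    parser.add_argument("--batch_size", type=int, default=128,
                        help="Batch size for training")
    parser.add_argument("--lr", type=float, default=0.001,
                        help="Learning rate")
    parser.add_argument("--weight_decay", type=float, default=1e-4,
                        help="Weight decay for the optimizer")
    parser.add_argument("--seed", type=int, default=42,
                        help="Random seed")
    parser.add_argument("--gpu", action="store_true",
                        help="Use the GPU if available")
    parser.add_argument("--evaluate_only", action="store_true",
                        help="Only run evaluation on a trained model")
    return parser


# Parse the first example command from the Usage section.
args = build_parser().parse_args(
    ["--model", "classic", "--epochs", "30", "--batch_size", "128",
     "--lr", "0.001", "--gpu"]
)
```

Flags like `--gpu` default to `False` and flip to `True` when present, which matches how they are used in the training commands above.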
### Evaluation Only

To evaluate a trained model without retraining:

```bash
python main.py --model classic --evaluate_only --gpu
```

## Features

### Data Augmentation

The project implements several data augmentation techniques:

- Random cropping
- Random horizontal flips
- Random rotation
- Color jitter

### Model Architectures

1. **ClassicCNN**: A custom CNN architecture with:
   - 4 convolutional blocks with increasing filter counts
   - Batch normalization
   - Max pooling
   - Dropout for regularization
   - Fully connected layers

2. **TransferResNet50**: A transfer learning approach using:
   - A pre-trained ResNet50 as the feature extractor
   - A custom classification head for CIFAR-10

### Evaluation Metrics

- Accuracy (overall and per-class)
- Precision, recall, and F1 score
- Confusion matrix
- Feature embedding visualization (t-SNE and PCA)
- Visualization of misclassified samples
- Training and validation curves

## Output

Results are saved in an `output/session_X` directory, where `X` is the session number. Each session directory contains:

- `checkpoints/`: Model weights for each epoch and the best model
- `plots/`: Visualization plots (confusion matrices, embeddings, etc.)
- `test_results_*.txt`: Detailed evaluation metrics
- `training_summary.txt`: Summary of the training process
- `stats_*.json`: Training statistics for plotting

## Requirements

- Python 3.6+
- PyTorch
- torchvision
- numpy
- matplotlib
- scikit-learn
- tqdm
- pytorch-lightning

## License

[MIT License](LICENSE)
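For the "overall and per-class" accuracy listed under Evaluation Metrics, the computation can be illustrated without any dependencies. This is a standalone sketch, not the repository's `evaluate.py` (which likely uses scikit-learn):

```python
from collections import Counter, defaultdict


def accuracy_report(y_true, y_pred):
    """Return (overall_accuracy, per_class_accuracy) for parallel label lists.

    y_true and y_pred are equal-length sequences of integer class labels.
    Per-class accuracy is correct predictions for a class divided by the
    number of true samples of that class.
    """
    assert len(y_true) == len(y_pred) and len(y_true) > 0
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    per_class_correct = defaultdict(int)
    per_class_total = Counter(y_true)
    for t, p in zip(y_true, y_pred):
        if t == p:
            per_class_correct[t] += 1
    per_class = {c: per_class_correct[c] / per_class_total[c]
                 for c in per_class_total}
    return correct / len(y_true), per_class


# Tiny worked example: class 0 gets one of two right, class 1 gets both right.
overall, per_class = accuracy_report([0, 0, 1, 1], [0, 1, 1, 1])
# overall -> 0.75; per_class -> {0: 0.5, 1: 1.0}
```

The same true/predicted label lists also feed the confusion matrix and the precision/recall/F1 metrics, which is why evaluation collects them once per test pass.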
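The `output/session_X` layout described above implies some logic that picks the next free session number and creates the `checkpoints/` and `plots/` subdirectories. A possible stdlib-only implementation is sketched below; the helper name `next_session_dir` and the numbering scheme are assumptions, not taken from the repository's code:

```python
import re
import tempfile
from pathlib import Path


def next_session_dir(output_root: Path) -> Path:
    """Create output/session_X with the next unused session number.

    Scans existing session_<N> directories and uses max(N) + 1, starting
    at session_1 when the output directory is empty.
    """
    output_root.mkdir(parents=True, exist_ok=True)
    taken = []
    for d in output_root.iterdir():
        m = re.fullmatch(r"session_(\d+)", d.name)
        if d.is_dir() and m:
            taken.append(int(m.group(1)))
    session = output_root / f"session_{max(taken, default=0) + 1}"
    # Pre-create the subdirectories listed in the Output section.
    (session / "checkpoints").mkdir(parents=True)
    (session / "plots").mkdir()
    return session


# Demonstrate in a throwaway directory so nothing touches a real output/.
root = Path(tempfile.mkdtemp()) / "output"
first = next_session_dir(root)   # -> .../output/session_1
second = next_session_dir(root)  # -> .../output/session_2
```

Numbering from the maximum existing index (rather than counting directories) keeps sessions unique even if earlier ones were deleted.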