# *CortexNet*

This repo contains the *PyTorch* implementation of *CortexNet*.
Check the [project website](https://engineering.purdue.edu/elab/CortexNet/) for further information.

## Project structure

The project consists of the following folders and files:

 - [`data/`](data): contains *Bash* scripts and a *Python* class definition for video data loading;
 - [`image-pretraining/`](image-pretraining/): hosts the code for pre-training TempoNet's discriminative branch;
 - [`model/`](model): stores several network architectures, including [*PredNet*](https://coxlab.github.io/prednet/), an additive feedback *Model01*, and a modulatory feedback *Model02* ([*CortexNet*](https://engineering.purdue.edu/elab/CortexNet/));
 - [`notebook/`](notebook): collection of *Jupyter Notebook*s for data exploration and results visualisation;
 - [`utils/`](utils): scripts for
   - (current or former) training error plotting,
   - experiments `diff`,
   - multi-node synchronisation,
   - generative predictions visualisation,
   - network architecture graphing;
 - `results@`: link to the location where experimental results will be saved, within 3-digit folders;
 - [`new_experiment.sh*`](new_experiment.sh): creates a new experiment folder, updates `last@`, and prints a memo about the last used settings;
 - `last@`: symbolic link pointing to the newest results sub-directory created by `new_experiment.sh`;
 - [`main.py`](main.py): training script for *CortexNet* in *MatchNet* or *TempoNet* configuration.

## Dependencies

 + [*scikit-video*](https://github.com/scikit-video/scikit-video): accessing images / videos

```bash
pip install sk-video
```

 + [*tqdm*](https://github.com/tqdm/tqdm): progress bar

```bash
conda config --add channels conda-forge
conda update --all
conda install tqdm
```

## IDE

This project has been realised with [*PyCharm*](https://www.jetbrains.com/pycharm/) by *JetBrains* and the [*Vim*](http://www.vim.org/) editor.
[*Grip*](https://github.com/joeyespo/grip) has also been fundamental for crafting decent documentation locally.

## Initialise environment

Once you've determined where you'd like to save your experimental results (call this directory `<my saving location>`), run the following commands from the project's root directory:

```bash
ln -s <my saving location> results  # replace <my saving location>
mkdir results/000 && touch results/000/train.log  # init. placeholder
ln -s results/000 last  # create pointer to the most recent result
```

## Setup new experiment

Ready to run your first experiment?
Type the following:

```bash
./new_experiment.sh
```

### GPU selection

Let's say your machine has `N` GPUs.
You can choose any of them by specifying the index `n = 0, ..., N-1`.
To do so, prepend `CUDA_VISIBLE_DEVICES=n` to the `python ...` commands in the following sections.

## Train *MatchNet*

 + Download *e-VDS35* (*e.g.* `e-VDS35-May17.tar`) from [here](https://engineering.purdue.edu/elab/eVDS/).
 + Use [`data/resize_and_split.sh`](data/resize_and_split.sh) to prepare your (video) data for training.
   It resizes the videos found in folders of folders (*i.e.* a directory of classes) and can split them into a training and a validation set.
   It can also skip short videos and trim longer ones.
   Check [`data/README.md`](data/README.md#matchnet-mode) for more details.
 + Run the [`main.py`](main.py) script to start training.
   Use `-h` to print the command line interface (CLI) arguments help.

```bash
python -u main.py --mode MatchNet <CLI arguments> | tee last/train.log
```

## Train *TempoNet*

 + Download *e-VDS35* (*e.g.* `e-VDS35-May17.tar`) from [here](https://engineering.purdue.edu/elab/eVDS/).
 + Pre-train the forward branch (see [`image-pretraining/`](image-pretraining)) on an image data set (*e.g.* `33-image-set.tar` from [here](https://engineering.purdue.edu/elab/eVDS/)).
 + Use [`data/resize_and_sample.sh`](data/resize_and_sample.sh) to prepare your (video) data for training.
   It resizes the videos found in folders of folders (*i.e.* a directory of classes) and samples them.
   Videos are then distributed across the training and validation sets.
   It can also skip short videos and trim longer ones.
   Check [`data/README.md`](data/README.md#temponet-mode) for more details.
 + Run the [`main.py`](main.py) script to start training.
   Use `-h` to print the CLI arguments help.

```bash
python -u main.py --mode TempoNet --pre-trained <path> <CLI args> | tee last/train.log
```
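The project-structure list contrasts *Model01*'s additive feedback with *Model02*'s modulatory feedback. As a toy illustration only (scalars instead of feature maps, and not the repository's actual layer definitions), the difference can be sketched as feedback that is summed with versus feedback that multiplies the feedforward signal:

```python
def relu(x):
    return max(0.0, x)

def additive_feedback(feedforward, feedback):
    # Model01-style (schematic): feedback is summed with the feedforward signal.
    return relu(feedforward + feedback)

def modulatory_feedback(feedforward, feedback):
    # Model02 / CortexNet-style (schematic): feedback gates, i.e. multiplies,
    # the feedforward signal instead of adding to it.
    return relu(feedforward * feedback)

print(additive_feedback(1.0, 0.5))    # → 1.5
print(modulatory_feedback(1.0, 0.5))  # → 0.5
```

With the same feedback value, the additive path shifts the activation while the modulatory path rescales it; see the `model/` sources for the real architectures.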
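The data-preparation steps above (skip short videos, trim long ones, split into training and validation sets) can be sketched in Python. This is a hedged mock-up of the behaviour described for `data/resize_and_split.sh`, not a port of it: the thresholds, split fraction, and function name are all hypothetical.

```python
import random

def split_videos(durations, min_len=3.0, max_len=60.0, val_frac=0.1, seed=0):
    """Schematic split: skip clips shorter than min_len seconds, trim clips
    longer than max_len, then shuffle and split into train / validation.
    All parameter values here are illustrative, not the script's defaults."""
    kept = {name: min(d, max_len) for name, d in durations.items() if d >= min_len}
    names = sorted(kept)
    random.Random(seed).shuffle(names)
    n_val = max(1, int(len(names) * val_frac)) if names else 0
    val = {n: kept[n] for n in names[:n_val]}
    train = {n: kept[n] for n in names[n_val:]}
    return train, val

train, val = split_videos({"a.mp4": 2.0, "b.mp4": 90.0, "c.mp4": 30.0, "d.mp4": 45.0})
# "a.mp4" is skipped (too short) and "b.mp4" is trimmed to 60.0 seconds.
```

For the actual resizing and sampling logic, defer to the shell scripts and `data/README.md`.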
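The GPU-selection convention used throughout (`CUDA_VISIBLE_DEVICES=n python ...`) works because CUDA only enumerates the devices listed in that environment variable. The same restriction can be applied from inside a script, provided it happens before any CUDA library is initialised; the index `2` below is just an example:

```python
import os

# Make only physical GPU 2 visible to this process; PyTorch will then
# see it as cuda:0. Must run before torch initialises CUDA.
os.environ["CUDA_VISIBLE_DEVICES"] = "2"

print(os.environ["CUDA_VISIBLE_DEVICES"])  # → 2
```

Prepending the variable on the command line, as the training sections show, is equivalent and avoids any ordering pitfalls.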