<h1 align="center">Martian time-series unraveled: A multi-scale nested approach with factorial variational autoencoders</h1>

## Installation

Run the commands below to install the required packages.
Make sure to adapt the `pytorch-cuda` version in `environment.yml` to match your CUDA version.

```bash
git clone https://github.com/alisiahkoohi/facvae
cd facvae/
conda env create -f environment.yml
source activate facvae
pip install -e .
```

Also add the following to your `~/.bashrc`:

```bash
export MARSCONVERTER=/PATH_TO_REPO/facvae/facvae/marsconverter
```

After completing the steps above, you can run the example scripts in subsequent sessions by simply activating the environment with `conda activate facvae`.

## Data

### Training data

The data required for training, i.e., the pyramidal scattering spectra, can be downloaded with the following command:

```bash
mkdir -p data/mars/scat_covs_h5/
wget -O "data/mars/scat_covs_h5/pyramid_full-mission_window_size-65536_q-1-1_j-8-8_use_day_data-1_avgpool_base-4_avgpool_exp-5-6-7-8_model_type-scat-cov_filter_key-true.h5" "https://www.dropbox.com/scl/fi/pwv4hwf0mu43b256dvt0q/pyramid_full-mission_window_size-65536_q-1-1_j-8-8_use_day_data-1_avgpool_base-4_avgpool_exp-5-6-7-8_model_type-scat-cov_filter_key-true.h5?rlkey=f3g0q2y5vrnpj6oaz68edf813&dl=0" --no-check-certificate
```

### Raw data for visualization and source separation

To visualize the results, including the aligned waveforms, time histograms, and latent space, the raw data is also required. It can be downloaded from [here](https://www.dropbox.com/scl/fo/38tr0k9kghtben1mwv3qs/h?rlkey=tlccygf71nutreqakq9p54a0w&dl=0) and must be placed in the `data/mars/raw/` directory.

After downloading the raw data, run the following command to extract the raw unprocessed UVW data:

```bash
bash facvae/utils/bash-utils/extract-mars-waveforms_raw_UVW.sh
```

### Pretrained model

The pretrained model can be downloaded with the following command.
Note that for the visualization and source separation scripts to use this model, the default values in the associated configuration JSON files must be used.

```bash
mkdir -p "data/checkpoints/nature_full-mission_max_epoch-1000_batchsize-16384_lr-0.001_lr_final-0.001_ncluster-9_latent_dim-32_w_rec-0.15_wd-0.0_hidden_dim-1024_nlayer-4_window_size-65536_scales-1024-4096-16384-65536_seed-29/"
wget -O "data/checkpoints/nature_full-mission_max_epoch-1000_batchsize-16384_lr-0.001_lr_final-0.001_ncluster-9_latent_dim-32_w_rec-0.15_wd-0.0_hidden_dim-1024_nlayer-4_window_size-65536_scales-1024-4096-16384-65536_seed-29/checkpoint_999.pth" "https://www.dropbox.com/scl/fi/7v7zjgzjn67t2ukp27ilr/checkpoint_999.pth?rlkey=nh6tap4xsc6p9e5b37660btpb&dl=0" --no-check-certificate
```

## Usage

To run the example scripts, use the following commands. The command-line arguments and their default values can be found in the configuration JSON files in `configs/`.

### Training the fVAE on the full-mission data

For a full list of command-line arguments, see `configs/facvae_full-mission.json`.

```bash
python scripts/train_facvae.py
```

### Visualizing the fVAE: aligned waveforms, time histograms, and latent space

```bash
python scripts/train_facvae.py --phase test
```

### Source separation using the trained fVAE

For a full list of command-line arguments, see `configs/source_separation.json`. Results will be saved in the `plots/` directory.
Note that the values of `cluster_n` and `scale_n` below correspond to the pretrained model and should be set accordingly when a new model is trained.

**Glitch example:**

```bash
python scripts/separate_facvae.py --cluster_n "5" --cluster_g "4" --scale_n "1024" --scale_g "65536"
```

**Wind example:**

```bash
python scripts/separate_facvae.py --cluster_n "1,6" --cluster_g "3" --scale_n "1024,1024" --scale_g "65536"
```

## Questions

Please contact alisk@rice.edu with any questions.

## Author

Ali Siahkoohi
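For intuition about the scale values that appear above (e.g., `scales-1024-4096-16384-65536` in the checkpoint name and the `--scale_n`/`--scale_g` flags), the multi-scale nested approach analyzes windows of several sizes over the same time series. The sketch below is illustrative only and is not the repository's implementation; `nested_windows` is a hypothetical helper that extracts, for each scale, a trailing window ending at the same sample.

```python
# Illustrative sketch only: NOT part of the facvae codebase.
# The "multi-scale nested" idea involves windows of several sizes
# (here 1024, 4096, 16384, and 65536 samples) over the same trace;
# this hypothetical helper returns, for each scale, the trailing
# window that ends at the same final sample.

def nested_windows(series, scales=(1024, 4096, 16384, 65536)):
    """Return {scale: trailing window of that length} for a 1-D sequence."""
    if len(series) < max(scales):
        raise ValueError("series is shorter than the largest scale")
    return {s: series[-s:] for s in scales}

if __name__ == "__main__":
    trace = list(range(65536))  # stand-in for a seismic trace
    for scale, window in sorted(nested_windows(trace).items()):
        # smaller scales cover only the most recent samples
        print(scale, len(window), window[0])
```

Larger scales thus nest the smaller ones: the 1024-sample window is contained in the 4096-sample window, and so on up to the full 65536-sample window.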