{"id":18968196,"url":"https://github.com/aisuko/notebooks","last_synced_at":"2025-06-30T03:34:42.715Z","repository":{"id":176870453,"uuid":"658592625","full_name":"Aisuko/notebooks","owner":"Aisuko","description":"Implementation for the different ML tasks on Kaggle platform with GPUs. ","archived":false,"fork":false,"pushed_at":"2025-05-01T10:29:55.000Z","size":167792,"stargazers_count":20,"open_issues_count":0,"forks_count":3,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-05-01T11:30:39.464Z","etag":null,"topics":["accelerator","computer-vision","fine-tuning","kaggle","large-language-models","multimodal","natural-language-processing","neural-network","peft","pytorch","quantization","renforcement-learning","tensorboard","transformers","visulization","wandb"],"latest_commit_sha":null,"homepage":"https://www.kaggle.com/aisuko/code","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Aisuko.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2023-06-26T05:46:04.000Z","updated_at":"2025-05-01T10:29:58.000Z","dependencies_parsed_at":null,"dependency_job_id":"12c215e8-1dea-476f-9b7e-bb6593c17e0b","html_url":"https://github.com/Aisuko/notebooks","commit_stats":null,"previous_names":["aisuko/notebooks"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/Aisuko/notebooks","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Aisuko%2Fnotebooks","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Aisuko%2Fnoteb
ooks/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Aisuko%2Fnotebooks/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Aisuko%2Fnotebooks/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Aisuko","download_url":"https://codeload.github.com/Aisuko/notebooks/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Aisuko%2Fnotebooks/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":262704599,"owners_count":23351100,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["accelerator","computer-vision","fine-tuning","kaggle","large-language-models","multimodal","natural-language-processing","neural-network","peft","pytorch","quantization","renforcement-learning","tensorboard","transformers","visulization","wandb"],"created_at":"2024-11-08T14:46:30.717Z","updated_at":"2025-06-30T03:34:37.700Z","avatar_url":"https://github.com/Aisuko.png","language":"Jupyter Notebook","readme":"# Overview\n\n\u003e We might agree that operating LLMs will be embedded in daily programming in the future. So, we use these notebooks to familiarize ourselves with the LLM tooling ecosystem and quantization techniques. I believe that cloud quantum computing is needed for the future of LLMs; maybe something like Qubernetes.\n\nAll these notebooks have been run to completion on the [Kaggle](https://www.kaggle.com/aisuko/code) platform with its free GPUs. 
Some of the notebooks use a single P100 GPU, some use dual T4 GPUs (T4x2), and others use CPUs only.\n\n\u003e Note: Some of the large notebooks, such as [Topic Modeling with BERTopic](nlp/sentence-similarity/clustering/topic-modeling-with-bertopic.ipynb), may not render completely in GitHub's preview. You can open them on the Kaggle platform by clicking the link in the notebook's title.\n\n\n# What this project is interested in\n\nI want to use deep neural networks to do GenAI research on consumer-grade hardware.\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/The Field of AI in Layers.svg\" width=\"100%\" height=\"100%\" alt=\"the field of AI in layers\"\u003e\u003c/div\u003e\n\n\n# The ML tasks covered in this project\n\nThis project's notebooks cover some of the following tasks:\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/ML Tasks.svg\" width=\"100%\" height=\"100%\" alt=\"ML Tasks\"\u003e\u003c/div\u003e\n\n\n# The LLMs used in this project\n\nSome of the LLMs used in this project are as follows:\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/LLMs ecosystems-Transformers Architecture.svg\" width=\"100%\" height=\"100%\" alt=\"LLMs ecosystem LLMs\"\u003e\u003c/div\u003e\n\n\n# Metrics of fine-tuning\n\n\u003e Note: All the fine-tuning here is done under limited computing resources, so the metrics are not the best; in most cases `num_train_epochs` is not large enough. However, the fine-tuning process is the same as a standard one.\n\nYou can check the fine-tuning metrics on [wandb.ai](https://wandb.ai/causal_language_trainer?shareProfileType=copy). 
It includes many useful metrics, such as training metrics, evaluation metrics, and system power usage, as shown below:\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/The metrics of fine-tuning.png\" width=\"100%\" height=\"100%\" alt=\"The metrics of fine-tuning\"\u003e\u003c/div\u003e\n\n\n# The LLM tools used in this project\n\nThe tools we cover in this project are as follows:\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/LLMs ecosystems-LLMs ecosystems.svg\" width=\"100%\" height=\"100%\" alt=\"LLMs ecosystems-LLMs ecosystem\"\u003e\u003c/div\u003e\n\n\n# The quantization techniques used in this project\n\nThe quantization techniques we used in this project are as follows:\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/Quantization.svg\" width=\"100%\" height=\"100%\" alt=\"quantization techniques\"\u003e\u003c/div\u003e\n\n\n# The Concepts From Papers We Should Know\n\n\u003cdiv style=\"text-align: center\"\u003e\u003cimg src=\"images/LLMs ecosystems-Concepts From Papers.svg\" width=\"100%\" height=\"100%\" alt=\"the concepts from papers\"\u003e\u003c/div\u003e\n\n# License\n\nLicensed under the Apache License, Version 2.0 (the \"License\"); you may not use this file except in compliance with the License. See the [LICENSE](LICENSE) file for details.\n\n\n# Credits\n\nMany of the notebooks are based on articles from Medium, The New Stack, Hugging Face, and other open-source projects. Thanks for these great works.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faisuko%2Fnotebooks","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Faisuko%2Fnotebooks","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faisuko%2Fnotebooks/lists"}