{"id":18785271,"url":"https://github.com/andreped/dss","last_synced_at":"2025-04-13T12:33:52.270Z","repository":{"id":60806140,"uuid":"533821880","full_name":"andreped/DSS","owner":"andreped","description":":vibration_mode: From training of transformers to real-time development in cross-platform mobile apps!","archived":false,"fork":false,"pushed_at":"2024-07-05T10:13:58.000Z","size":47882,"stargazers_count":8,"open_issues_count":6,"forks_count":4,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-04-13T05:36:26.031Z","etag":null,"topics":["android","cnn","data-visualization","deep-learning","dss","flutter","internet-of-things","ios","iot","mobile","real-time","recording","rnn","sensor","tensorboard","tensorflow","tf2","tflite","vision-transformer","vit"],"latest_commit_sha":null,"homepage":"","language":"Dart","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/andreped.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-09-07T15:16:25.000Z","updated_at":"2024-07-05T10:14:02.000Z","dependencies_parsed_at":"2024-11-07T20:47:44.913Z","dependency_job_id":"c2bc0688-67f0-4c44-a6f7-e8a369c252d4","html_url":"https://github.com/andreped/DSS","commit_stats":null,"previous_names":[],"tags_count":10,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/andreped%2FDSS","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/andreped%2FDSS/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/andreped%2FDSS/releases","mani
fests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/andreped%2FDSS/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/andreped","download_url":"https://codeload.github.com/andreped/DSS/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248714736,"owners_count":21149958,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["android","cnn","data-visualization","deep-learning","dss","flutter","internet-of-things","ios","iot","mobile","real-time","recording","rnn","sensor","tensorboard","tensorflow","tf2","tflite","vision-transformer","vit"],"created_at":"2024-11-07T20:46:07.026Z","updated_at":"2025-04-13T12:33:47.260Z","avatar_url":"https://github.com/andreped.png","language":"Dart","readme":"\u003cdiv align=\"center\"\u003e\r\n    \u003cimg src=\"assets/sketch.png\" alt=\"drawing\" width=\"400\"\u003e\r\n\u003c/div\u003e\r\n\u003cdiv align=\"center\"\u003e\r\n\u003ch1 align=\"center\"\u003eDSS: Deep Sensor Systems\u003c/h1\u003e\r\n\u003ch3 align=\"center\"\u003e:vibration_mode: From training of transformers to real-time development in cross-platform mobile apps!\u003c/h3\u003e\r\n\r\n[![License](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)\r\n[![GitHub 
Downloads](https://img.shields.io/github/downloads/andreped/DSS/total?label=GitHub%20downloads\u0026logo=github)](https://github.com/andreped/DSS/releases)\r\n[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.7568040.svg)](https://doi.org/10.5281/zenodo.7568040)\r\n[![codecov](https://codecov.io/gh/andreped/DSS/branch/main/graph/badge.svg?token=Nf2GKXXYXE)](https://codecov.io/gh/andreped/DSS)\r\n\r\n**DSS** was developed by SINTEF Medical Image Analysis with the aim of integrating AI into sensor systems.\r\n\u003c/div\u003e\r\n\r\nThis project serves as a demonstration of how to do this, and does not claim to be a generic framework.\r\n\r\nSome of the key features of this project are described below; to see what else is possible, see [the wiki](https://github.com/andreped/DSS/wiki).\r\n\r\n## [Continuous integration](https://github.com/andreped/DSS#continuous-integration)\r\n\r\n| Build Type | Status |\r\n| - | - |\r\n| **Test Training** | ![CI](https://github.com/andreped/DSS/workflows/Test%20Training/badge.svg) |\r\n| **Test Flutter** | ![CI](https://github.com/andreped/DSS/workflows/Test%20Flutter/badge.svg) |\r\n| **Build APK** | ![CI](https://github.com/andreped/DSS/workflows/Build%20APK/badge.svg) |\r\n\r\n\r\n## [How to train your own model?](https://github.com/andreped/DSS#how-to-train-your-own-model)\r\n\r\n\u003cdetails\u003e\r\n\u003csummary\u003e\r\n\r\n### [Setup](https://github.com/andreped/DSS#setup)\u003c/summary\u003e\r\n\r\nWhen using this framework, it is a good idea to set up a virtual environment:\r\n```\r\nvirtualenv -ppython3 venv --clear\r\nsource venv/bin/activate\r\npip install -r requirements.txt\r\n```\r\n\r\nThe following dependencies will be installed:\r\n\r\n* `pandas\u003c=1.5.3`\r\n* `tensorflow\u003c=2.12.0`\r\n* `tensorflow-addons\u003c=0.19.0`\r\n* `tensorflow-datasets\u003c=4.8.3`\r\n\r\nTested with Python 3.7.9 on Windows 10, macOS, and Ubuntu Linux. 
Also tested with Python 3.10.4 on GitHub Codespaces.\r\n\r\nNote that on Windows, the virtual environment is instead activated with `./venv/Scripts/activate`.\r\n\r\n\u003c/details\u003e\r\n\r\n\r\n\u003cdetails\u003e\r\n\u003csummary\u003e\r\n\r\n### [Usage](https://github.com/andreped/DSS#usage)\u003c/summary\u003e\r\n\r\nTo train a model, simply run:\r\n```\r\npython main.py\r\n```\r\n\r\nThe script supports multiple arguments; to see the supported arguments, run `python main.py -h`.\r\n\r\n\u003c/details\u003e\r\n\r\n\r\n\u003cdetails open\u003e\r\n\u003csummary\u003e\r\n\r\n### [Training history](https://github.com/andreped/DSS#training-history)\u003c/summary\u003e\r\n\r\nTo visualize the training history, use TensorBoard, for example:\r\n```\r\ntensorboard --logdir ./output/logs/gesture_classifier_arch_rnn/\r\n```\r\n\r\nAn example of the training history for a Recurrent Neural Network (RNN) is shown below:\r\n\r\n\u003cimg src=\"assets/RNN_training_curve.png\"\u003e\r\n\r\nThe figure shows the macro-averaged F1-score at each step during training, with the black curve for the training set and the blue curve for the validation set.\r\nThe best model reached a macro-averaged F1-score of 99.66% on the validation set, across all 20 classes.\r\n\r\n**Disclaimer:** This model was only trained for testing purposes. The input features were stratified at sample level and not patient level, and thus the validation performance will likely not represent the true performance on new data. 
However, having a trained model enables us to test it in a mobile app.\r\n\r\n\u003c/details\u003e\r\n\r\n\r\n\u003cdetails\u003e\r\n\u003csummary\u003e\r\n\r\n### [Available datasets](https://github.com/andreped/DSS#available-datasets)\u003c/summary\u003e\r\n\r\n#### [SmartWatch Gestures](https://github.com/andreped/DSS#smartwatch-gestures)\r\n\r\nThe data currently used to train the AI model is the SmartWatch Gestures dataset,\r\nwhich is available in [tensorflow-datasets](https://www.tensorflow.org/datasets/catalog/smartwatch_gestures). The dataset has the\r\nfollowing structure:\r\n```\r\nFeaturesDict({\r\n    'attempt': tf.uint8,\r\n    'features': Sequence({\r\n        'accel_x': tf.float64,\r\n        'accel_y': tf.float64,\r\n        'accel_z': tf.float64,\r\n        'time_event': tf.uint64,\r\n        'time_millis': tf.uint64,\r\n        'time_nanos': tf.uint64,\r\n    }),\r\n    'gesture': ClassLabel(shape=(), dtype=tf.int64, num_classes=20),\r\n    'participant': tf.uint8,\r\n})\r\n```\r\n\u003c/details\u003e\r\n\r\n\r\n## [How to test the model in a mobile app?](https://github.com/andreped/DSS#how-to-test-the-model-in-a-mobile-app)\r\n\r\n\u003cdetails\u003e\r\n\u003csummary\u003e\r\n\r\n### [Converting model to TF-Lite](https://github.com/andreped/DSS#converting-model-to-tf-lite)\u003c/summary\u003e\r\n\r\nTo use the trained model in a mobile app, it must first be converted to a compatible format. TensorFlow Lite is an inference engine tailored for mobile devices. 
To convert the model to TF-Lite, simply run this command:\r\n\r\n```\r\npython dss/keras2tflite.py -m /path/to/pretrained/saved_model/ -o /path/to/save/converted/model.tflite\r\n```\r\n\r\n\u003c/details\u003e\r\n\r\n\r\n\u003cdetails open\u003e\r\n\u003csummary\u003e\r\n\r\n### [Model integration and testing in app](https://github.com/andreped/DSS#model-integration-and-testing-in-app)\u003c/summary\u003e\r\n\r\nA simple mobile app was developed in Flutter, demonstrating the AI in action using the phone's accelerometer data in real time. The data can also be stored and deleted locally.\r\n\r\n\u003cp align=\"center\" width=\"100%\"\u003e\r\n\u003cimg src=\"sw_app/assets/HomeScreen.jpg\" width=\"18%\" height=\"20%\"\u003e \u003cimg src=\"sw_app/assets/Prediction.jpg\" width=\"18%\" height=\"20%\"\u003e \u003cimg src=\"sw_app/assets/ChartWithFPS.jpg\" width=\"18%\" height=\"20%\"\u003e\r\n\u003cimg src=\"sw_app/assets/Recording.jpg\" width=\"18%\" height=\"20%\"\u003e \u003cimg src=\"sw_app/assets/Database.jpg\" width=\"18%\" height=\"20%\"\u003e\r\n\u003c/p\u003e\r\n\r\nTo use the app, you need an Android phone with developer mode enabled (see [here](https://developer.android.com/studio/debug/dev-options) for how to enable it). 
Then simply download the APK from [here](https://github.com/andreped/DSS/releases), double-click to install, and use the app as you normally would.\r\n\r\nInfo on how the mobile app was developed (and how to make your own app) can be found [in the wiki](https://github.com/andreped/DSS/wiki/Getting-started-with-mobile-development).\r\n\r\n\u003c/details\u003e\r\n\r\n## [Acknowledgements](https://github.com/andreped/DSS#acknowledgements)\r\n\r\nThe training framework was mainly developed using [Keras](https://github.com/keras-team/keras) with the [TensorFlow](https://github.com/tensorflow/tensorflow) backend.\r\n\r\nThe mobile app was developed using [Flutter](https://github.com/flutter/flutter), which is a framework developed by Google.\r\nFor the app, the following _open-source_ packages were used (either MIT, BSD-2, or BSD-3 licensed):\r\n* [flutter_sensors](https://pub.dev/packages/flutter_sensors)\r\n* [tflite_flutter](https://pub.dev/packages/tflite_flutter)\r\n* [wakelock](https://pub.dev/packages/wakelock)\r\n* [sqflite](https://pub.dev/packages/sqflite)\r\n* [intl](https://pub.dev/packages/intl)\r\n* [csv](https://pub.dev/packages/csv)\r\n* [path_provider](https://pub.dev/packages/path_provider)\r\n\r\n## [How to cite?](https://github.com/andreped/DSS#how-to-cite)\r\n\r\nIf you found this project useful, please consider citing it in your research article:\r\n\r\n```\r\n@software{andre_pedersen_2023_7701510,\r\n  author       = {André Pedersen and Ute Spiske and Javier Pérez de Frutos},\r\n  title        = {andreped/DSS: v0.2.0},\r\n  month        = mar,\r\n  year         = 2023,\r\n  publisher    = {Zenodo},\r\n  version      = {v0.2.0},\r\n  doi          = {10.5281/zenodo.7701510},\r\n  url          = 
{https://doi.org/10.5281/zenodo.7701510}\r\n}\r\n```\r\n\r\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fandreped%2Fdss","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fandreped%2Fdss","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fandreped%2Fdss/lists"}