{"id":28710847,"url":"https://github.com/arm-software/ml-zoo","last_synced_at":"2025-06-14T21:08:25.419Z","repository":{"id":39637923,"uuid":"304070372","full_name":"ARM-software/ML-zoo","owner":"ARM-software","description":null,"archived":false,"fork":false,"pushed_at":"2023-03-31T09:47:56.000Z","size":64402,"stargazers_count":221,"open_issues_count":10,"forks_count":55,"subscribers_count":11,"default_branch":"master","last_synced_at":"2025-04-02T04:35:37.921Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ARM-software.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-10-14T16:17:33.000Z","updated_at":"2025-03-27T02:25:35.000Z","dependencies_parsed_at":"2024-11-03T02:31:51.809Z","dependency_job_id":"0158c029-d37c-4966-b43e-89a5b2324579","html_url":"https://github.com/ARM-software/ML-zoo","commit_stats":null,"previous_names":[],"tags_count":5,"template":false,"template_full_name":null,"purl":"pkg:github/ARM-software/ML-zoo","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ARM-software%2FML-zoo","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ARM-software%2FML-zoo/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ARM-software%2FML-zoo/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ARM-software%2FML-zoo/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ARM-software","download_url":"https://cod
eload.github.com/ARM-software/ML-zoo/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ARM-software%2FML-zoo/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":259884525,"owners_count":22926446,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-06-14T21:08:24.448Z","updated_at":"2025-06-14T21:08:25.402Z","avatar_url":"https://github.com/ARM-software.png","language":"Python","readme":"# Model Zoo\n![version](https://img.shields.io/badge/version-21.08-0091BD)\n\u003e A collection of machine learning models optimized for Arm IP.\n\n\n## Anomaly Detection\n\n\u003ctable\u003e\n    \u003ctr\u003e\n        \u003cth width=\"250\"\u003eNetwork\u003c/th\u003e\n        \u003cth width=\"100\"\u003eType\u003c/th\u003e\n        \u003cth width=\"160\"\u003eFramework\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-A\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-M\u003c/th\u003e\n        \u003cth width=\"120\"\u003eMali GPU\u003c/th\u003e\n        \u003cth width=\"120\"\u003eEthos U\u003c/th\u003e\n        \u003cth width=\"90\"\u003eScore (AUC)\u003c/th\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/anomaly_detection/micronet_large/tflite_int8\"\u003eMicroNet Large INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        
\u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.968\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/anomaly_detection/micronet_medium/tflite_int8\"\u003eMicroNet Medium INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.963\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/anomaly_detection/micronet_small/tflite_int8\"\u003eMicroNet Small INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.955\u003c/td\u003e\n    \u003c/tr\u003e\n\u003c/table\u003e\n\n**Dataset**: DCASE 2020 Task 2 Slide Rail\n\n## Image Classification\n\n\u003ctable\u003e\n    \u003ctr\u003e\n        \u003cth width=\"250\"\u003eNetwork\u003c/th\u003e\n        \u003cth width=\"100\"\u003eType\u003c/th\u003e\n        \u003cth width=\"160\"\u003eFramework\u003c/th\u003e\n        
\u003cth width=\"120\"\u003eCortex-A\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-M\u003c/th\u003e\n        \u003cth width=\"120\"\u003eMali GPU\u003c/th\u003e\n        \u003cth width=\"120\"\u003eEthos U\u003c/th\u003e\n        \u003cth width=\"90\"\u003eScore (Top 1 Accuracy)\u003c/th\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/image_classification/mobilenet_v2_1.0_224/tflite_int8\"\u003eMobileNet v2 1.0 224 INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.697\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/image_classification/mobilenet_v2_1.0_224/tflite_uint8\"\u003eMobileNet v2 1.0 224 UINT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eUINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.708\u003c/td\u003e\n    \u003c/tr\u003e\n\u003c/table\u003e\n\n**Dataset**: ILSVRC 2012\n\n## Keyword Spotting\n\n\u003ctable\u003e\n    \u003ctr\u003e\n        \u003cth width=\"250\"\u003eNetwork\u003c/th\u003e\n        \u003cth width=\"100\"\u003eType\u003c/th\u003e\n        \u003cth 
width=\"160\"\u003eFramework\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-A\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-M\u003c/th\u003e\n        \u003cth width=\"120\"\u003eMali GPU\u003c/th\u003e\n        \u003cth width=\"120\"\u003eEthos U\u003c/th\u003e\n        \u003cth width=\"90\"\u003eScore (Accuracy)\u003c/th\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_large/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eCNN Large INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.923\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_medium/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eCNN Medium INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.905\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_small/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eCNN Small INT8 *\u003c/a\u003e\u003c/td\u003e\n        
\u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.902\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/dnn_large/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDNN Large INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.860\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/dnn_medium/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDNN Medium INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.839\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca 
href=\"models/keyword_spotting/dnn_small/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDNN Small INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.821\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_large/model_package_tf/model_archive/TFLite/tflite_clustered_fp32\"\u003eDS-CNN Large Clustered FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.948\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_large/model_package_tf/model_archive/TFLite/tflite_clustered_int8\"\u003eDS-CNN Large Clustered INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd 
align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.939\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_large/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDS-CNN Large INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: HERO\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.945\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_medium/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDS-CNN Medium INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: HERO\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.939\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_small/model_package_tf/model_archive/TFLite/tflite_int8\"\u003eDS-CNN Small INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        
\u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.931\u003c/td\u003e\n    \u003c/tr\u003e\n        \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_small/model_package_tf/model_archive/TFLite/tflite_int16\"\u003eDS-CNN Small INT16 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT16\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.934\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_large/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eCNN Large FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.934\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_medium/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eCNN Medium FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n 
       \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.918\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/cnn_small/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eCNN Small FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.922\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/dnn_large/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDNN Large FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.867\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca 
href=\"models/keyword_spotting/dnn_medium/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDNN Medium FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.850\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/dnn_small/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDNN Small FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.836\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_large/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDS-CNN Large FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: HERO\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd 
align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.950\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_medium/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDS-CNN Medium FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: HERO\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.943\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/ds_cnn_small/model_package_tf/model_archive/TFLite/tflite_fp32\"\u003eDS-CNN Small FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: HERO\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.939\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/micronet_large/tflite_int8\"\u003eMicroNet Large INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd 
align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.965\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/micronet_medium/tflite_int8\"\u003eMicroNet Medium INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.958\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/keyword_spotting/micronet_small/tflite_int8\"\u003eMicroNet Small INT8 \u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.953\u003c/td\u003e\n    \u003c/tr\u003e\n\u003c/table\u003e\n\n**Dataset**: Google Speech Commands Test Set\n\n## Noise Suppression\n\n\u003ctable\u003e\n    \u003ctr\u003e\n        \u003cth width=\"250\"\u003eNetwork\u003c/th\u003e\n        \u003cth width=\"100\"\u003eType\u003c/th\u003e\n        \u003cth width=\"160\"\u003eFramework\u003c/th\u003e\n        \u003cth 
width=\"120\"\u003eCortex-A\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-M\u003c/th\u003e\n        \u003cth width=\"120\"\u003eMali GPU\u003c/th\u003e\n        \u003cth width=\"120\"\u003eEthos U\u003c/th\u003e\n        \u003cth width=\"90\"\u003eScore (Average Pesq)\u003c/th\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/noise_suppression/RNNoise/tflite_int8\"\u003eRNNoise INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e2.945\u003c/td\u003e\n    \u003c/tr\u003e\n\u003c/table\u003e\n\n**Dataset**: Noisy Speech Database For Training Speech Enhancement Algorithms And Tts Models\n\n## Object Detection\n\n\u003ctable\u003e\n    \u003ctr\u003e\n        \u003cth width=\"250\"\u003eNetwork\u003c/th\u003e\n        \u003cth width=\"100\"\u003eType\u003c/th\u003e\n        \u003cth width=\"160\"\u003eFramework\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-A\u003c/th\u003e\n        \u003cth width=\"120\"\u003eCortex-M\u003c/th\u003e\n        \u003cth width=\"120\"\u003eMali GPU\u003c/th\u003e\n        \u003cth width=\"120\"\u003eEthos U\u003c/th\u003e\n        \u003cth width=\"90\"\u003eScore (mAP)\u003c/th\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/object_detection/ssd_mobilenet_v1/tflite_fp32\"\u003eSSD MobileNet v1 FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eFP32\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd 
align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.210\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/object_detection/ssd_mobilenet_v1/tflite_int8\"\u003eSSD MobileNet v1 INT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.234\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/object_detection/ssd_mobilenet_v1/tflite_uint8\"\u003eSSD MobileNet v1 UINT8 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eUINT8\u003c/td\u003e\n        \u003ctd align=\"center\"\u003eTensorFlow Lite\u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_check_mark: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e:heavy_multiplication_x: \u003c/td\u003e\n        \u003ctd align=\"center\"\u003e0.180\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003ctd\u003e\u003ca href=\"models/object_detection/yolo_v3_tiny/tflite_fp32\"\u003eYOLO v3 Tiny FP32 *\u003c/a\u003e\u003c/td\u003e\n        \u003ctd 
align="center">FP32</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">0.331</td>
    </tr>
</table>

**Dataset**: COCO Validation 2017

## Speech Recognition

<table>
    <tr>
        <th width="250">Network</th>
        <th width="100">Type</th>
        <th width="160">Framework</th>
        <th width="120">Cortex-A</th>
        <th width="120">Cortex-M</th>
        <th width="120">Mali GPU</th>
        <th width="120">Ethos U</th>
        <th width="90">Score (LER)</th>
    </tr>
    <tr>
        <td><a href="models/speech_recognition/wav2letter/tflite_int8">Wav2letter INT8</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.0877</td>
    </tr>
    <tr>
        <td><a href="models/speech_recognition/wav2letter/tflite_pruned_int8">Wav2letter Pruned INT8 *</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.0783</td>
    </tr>
    <tr>
        <td><a href="models/speech_recognition/tiny_wav2letter/tflite_int8">Tiny Wav2letter INT8 *</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.0348</td>
    </tr>
    <tr>
        <td><a href="models/speech_recognition/tiny_wav2letter/tflite_pruned_int8">Tiny Wav2letter Pruned INT8 *</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.0283</td>
    </tr>
</table>

**Dataset**: LibriSpeech, Fluent Speech

## Superresolution

<table>
    <tr>
        <th width="250">Network</th>
        <th width="100">Type</th>
        <th width="160">Framework</th>
        <th width="120">Cortex-A</th>
        <th width="120">Cortex-M</th>
        <th width="120">Mali GPU</th>
        <th width="120">Ethos U</th>
        <th width="90">Score (PSNR)</th>
    </tr>
    <tr>
        <td><a href="models/superresolution/SESR/tflite_int8">SESR INT8 **</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">35.00 dB</td>
    </tr>
</table>

**Dataset**: DIV2K

## Visual Wake Words

<table>
    <tr>
        <th width="250">Network</th>
        <th width="100">Type</th>
        <th width="160">Framework</th>
        <th width="120">Cortex-A</th>
        <th width="120">Cortex-M</th>
        <th width="120">Mali GPU</th>
        <th width="120">Ethos U</th>
        <th width="90">Score (Accuracy)</th>
    </tr>
    <tr>
        <td><a href="models/visual_wake_words/micronet_vww2/tflite_int8">MicroNet VWW-2 INT8</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.768</td>
    </tr>
    <tr>
        <td><a href="models/visual_wake_words/micronet_vww3/tflite_int8">MicroNet VWW-3 INT8</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.855</td>
    </tr>
    <tr>
        <td><a href="models/visual_wake_words/micronet_vww4/tflite_int8">MicroNet VWW-4 INT8</a></td>
        <td align="center">INT8</td>
        <td align="center">TensorFlow Lite</td>
        <td align="center">:heavy_multiplication_x:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">:heavy_check_mark:</td>
        <td align="center">0.822</td>
    </tr>
</table>

**Dataset**: Visual Wake Words


### Key
* :heavy_check_mark: - Will run on this platform.
* :heavy_multiplication_x: - Will not run on this platform.
* `*` - Code to recreate model available.
* `**` - This model has a large memory footprint – it will not run on all platforms.

## License
[Apache-2.0](https://spdx.org/licenses/Apache-2.0.html) unless otherwise explicitly stated.