{"id":13504896,"url":"https://github.com/ntedgi/node-efficientnet","last_synced_at":"2025-03-29T22:31:22.715Z","repository":{"id":38290253,"uuid":"324599509","full_name":"ntedgi/node-efficientnet","owner":"ntedgi","description":"tensorflowJS implementation of EfficientNet 🚀","archived":false,"fork":false,"pushed_at":"2024-02-15T00:52:38.000Z","size":65256,"stargazers_count":260,"open_issues_count":22,"forks_count":36,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-03-28T19:08:35.549Z","etag":null,"topics":["classification","deep-learning","efficientnet","efficientnet-model","hacktoberfest","image-classification","image-recognition","imagenet","mobilenet","pretrained-models","tensorflow","tfjs"],"latest_commit_sha":null,"homepage":"https://www.npmjs.com/package/node-efficientnet","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ntedgi.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":{"custom":["https://paypal.me/tedgi"]}},"created_at":"2020-12-26T17:05:02.000Z","updated_at":"2025-03-03T23:22:05.000Z","dependencies_parsed_at":"2024-11-01T02:31:46.422Z","dependency_job_id":"b6103e11-ffeb-4656-849f-9ede8ebb1e98","html_url":"https://github.com/ntedgi/node-efficientnet","commit_stats":{"total_commits":306,"total_committers":23,"mean_commits":"13.304347826086957","dds":0.5359477124183006,"last_synced_commit":"eca87e545cc027798964d5e574e0ef725f326ae8"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://re
pos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ntedgi%2Fnode-efficientnet","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ntedgi%2Fnode-efficientnet/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ntedgi%2Fnode-efficientnet/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ntedgi%2Fnode-efficientnet/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ntedgi","download_url":"https://codeload.github.com/ntedgi/node-efficientnet/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246254077,"owners_count":20747946,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["classification","deep-learning","efficientnet","efficientnet-model","hacktoberfest","image-classification","image-recognition","imagenet","mobilenet","pretrained-models","tensorflow","tfjs"],"created_at":"2024-08-01T00:00:53.020Z","updated_at":"2025-03-29T22:31:17.699Z","avatar_url":"https://github.com/ntedgi.png","language":"TypeScript","funding_links":["https://paypal.me/tedgi"],"categories":["TypeScript"],"sub_categories":["Misc"],"readme":"# TensorflowJS EfficientNet\n\n![npm](https://img.shields.io/npm/v/node-efficientnet) ![Node.js CI](https://github.com/ntedgi/node-efficientnet/workflows/Node.js%20CI/badge.svg?branch=main) [![codecov](https://codecov.io/gh/ntedgi/node-efficientnet/branch/main/graph/badge.svg?token=gkNPxmyry3)](https://codecov.io/gh/ntedgi/node-efficientnet)\n[![Codacy 
Badge](https://app.codacy.com/project/badge/Grade/09917d9ddf9c42648eb60d7d917f5026)](https://www.codacy.com/gh/ntedgi/node-efficientnet/dashboard?utm_source=github.com\u0026utm_medium=referral\u0026utm_content=ntedgi/node-efficientnet\u0026utm_campaign=Badge_Grade)\n[![Run on Repl.it](https://repl.it/badge/github/ntedgi/node-efficientnet)](https://repl.it/github/ntedgi/node-efficientnet)\n\u003ca href=\"https://www.npmjs.com/package/node-efficientnet\"\u003e\u003cimg src=\"https://img.shields.io/npm/l/node-efficientnet.svg\" alt=\"License\"\u003e\u003c/a\u003e  \n\u003ca href=\"https://npmcharts.com/compare/node-efficientnet?minimal=true\"\u003e\u003cimg src=\"https://img.shields.io/npm/dm/node-efficientnet.svg\" alt=\"Downloads\"\u003e\u003c/a\u003e\n\u003ca href=\"https://npmcharts.com/compare/node-efficientnet?minimal=true\"\u003e\u003cimg src=\"https://img.shields.io/npm/dt/node-efficientnet.svg\" alt=\"Downloads\"\u003e\u003c/a\u003e\n[![Gitter](https://badges.gitter.im/node-efficientnet/community.svg)](https://gitter.im/node-efficientnet/community?utm_source=badge\u0026utm_medium=badge\u0026utm_campaign=pr-badge)\n\nThis repository contains a TensorFlow.js implementation of **EfficientNet**, an image classification model trained on [ImageNet](http://www.image-net.org/) that can recognize [1000 different objects](https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt).\n\nEfficientNet is a lightweight convolutional neural network architecture that achieves [state-of-the-art accuracy with an order of magnitude fewer parameters and FLOPS](https://arxiv.org/abs/1905.11946) on both ImageNet and five other commonly used transfer learning datasets.\n\nThe codebase is heavily inspired by the [TensorFlow implementation](https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet).\n\n![Demo](https://raw.githubusercontent.com/ntedgi/node-efficientnet/main/media/sample.gif)\n\n#\n\n## 👏 Supporters\n\n### \u0026#8627; 
Stargazers\n\n[![Stargazers repo roster for @ntedgi/node-efficientnet](https://reporoster.com/stars/ntedgi/node-efficientnet)](https://github.com/ntedgi/node-efficientnet/stargazers)\n\n### \u0026#8627; Forkers\n\n[![Forkers repo roster for @ntedgi/node-efficientnet](https://reporoster.com/forks/ntedgi/node-efficientnet)](https://github.com/ntedgi/node-efficientnet/network/members)\n\n## Multilingual status\n\n| locale  |           status           |             translated by 👑               | \n|:-------:|:--------------------------:|:------------------------------------------:|\n|  `en`   |             ✅              |                                            |\n|  `zh`   |             ✅              |  [@luoye-fe](https://github.com/luoye-fe)  |\n|  `es`   |             ✅              |     [@h383r](https://github.com/h383r)     |\n|  `ar`   |             ✅              |   [@lamamyf](https://github.com/lamamyf)   |\n|  `ru`   |             ✅              | [@Abhighyaa](https://github.com/Abhighyaa) |\n|  `he`   |             ✅              | [@jhonDoe15](https://github.com/jhonDoe15) |\n|  `fr`   |             ✅              |  [@burmanp](https://github.com/burmanp)    |\n| `other` | ⏩ (need help, PRs welcome) |                                            |\n\n## Table of Contents\n\n1.  [Just Want to Play With The Model](#how-i-run-this-project-locally-)\n2.  [Usage](#usage)\n3.  [Installation](#installation)\n4.  [API](#api)\n5.  [Examples](#examples)\n6.  [About EfficientNet Models](#about-efficientnet-models)\n7.  [Models](#models)\n8.  
[Multilingual status](#multilingual-status)\n\n## How I Run This Project Locally ?\n\n- Clone this repository\n- Just want to play?\n  - From the project root, go to the `playground` directory and run `docker-compose up`\n  - Navigate to http://localhost:8080\n\n## Usage\n\nEfficientNet has 8 different model checkpoints; each checkpoint has a different input-layer resolution. The larger the input resolution, the higher the accuracy and the slower the inference.\n\nFor example, let's take these images:\n\n\u003ctable border=\"0\"\u003e\n\u003ctr\u003e\n    \u003ctd\u003e\n      \u003cimg src=\"https://raw.githubusercontent.com/ntedgi/node-efficientnet/main/samples/panda.jpg\" width=\"100%\" /\u003e\n    \u003c/td\u003e\n    \u003ctd\u003e\n       \u003ctable border=\"0\"\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eModel\u003c/td\u003e\n          \u003ctd\u003ePrediction\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB0\u003c/td\u003e\n          \u003ctd\u003e('Giant panda',83.607) , ( 'Skunk',11.61) , ('hog',4.772)\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB7\u003c/td\u003e\n          \u003ctd\u003e('Giant panda',90.406) , ( 'American black bear',7.07) , ('Badger',2.5192)\u003c/td\u003e\n        \u003c/tr\u003e\n      \u003c/table\u003e\n    \u003c/td\u003e\n\u003c/tr\u003e\n\u003ctr\u003e\n    \u003ctd\u003e\n      \u003cimg src=\"https://raw.githubusercontent.com/ntedgi/node-efficientnet/main/samples/fish.jpg\" width=\"100%\" /\u003e\n    \u003c/td\u003e\n    \u003ctd\u003e\n       \u003ctable border=\"0\"\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eModel\u003c/td\u003e\n          \u003ctd\u003ePrediction\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB3\u003c/td\u003e\n          \u003ctd\u003e('goldfish, Carassius auratus',82.5) , ( 'starfish, sea star',9.26) , 
('corn',7.33)\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB7\u003c/td\u003e\n          \u003ctd\u003e('goldfish, Carassius auratus',97.5) , ( 'starfish, sea star',1.46) , ('corn',0.93)\u003c/td\u003e\n        \u003c/tr\u003e\n      \u003c/table\u003e\n    \u003c/td\u003e\n\u003c/tr\u003e\n\u003ctr\u003e\n    \u003ctd\u003e\n      \u003cimg src=\"https://raw.githubusercontent.com/ntedgi/node-efficientnet/main/samples/car.jpg\" width=\"100%\" /\u003e\n    \u003c/td\u003e\n    \u003ctd\u003e\n       \u003ctable border=\"0\"\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eModel\u003c/td\u003e\n          \u003ctd\u003ePrediction\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB0\u003c/td\u003e\n          \u003ctd\u003e('Sports Car',88.02) , ( 'racing car',6.647) , ('car wheel',5.32)\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB7\u003c/td\u003e\n          \u003ctd\u003e('Sports Car',87.68) , ( 'convertible',7.831) , ('car wheel',4.485)\u003c/td\u003e\n        \u003c/tr\u003e\n      \u003c/table\u003e\n    \u003c/td\u003e\n\u003c/tr\u003e\n\u003ctr\u003e\n    \u003ctd\u003e\n      \u003cimg src=\"https://raw.githubusercontent.com/ntedgi/node-efficientnet/main/samples/gun.jpg\" width=\"100%\" /\u003e\n    \u003c/td\u003e\n    \u003ctd\u003e\n       \u003ctable border=\"0\"\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eModel\u003c/td\u003e\n          \u003ctd\u003ePrediction\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB0\u003c/td\u003e\n          \u003ctd\u003e('revolver',85.52) , ( 'assault rifle',9.85) , ('rifle',4.6197)\u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n          \u003ctd\u003eEfficientNetB7\u003c/td\u003e\n          \u003ctd\u003e('revolver',88.13) , ( 'rifle',8.29) , ('assault 
rifle',3.56)\u003c/td\u003e\n        \u003c/tr\u003e\n      \u003c/table\u003e\n    \u003c/td\u003e\n\u003c/tr\u003e\n\u003c/table\u003e\n\n#\n\n## Installation\n\n```bash\nnpm i --save node-efficientnet\n```\n\n## API\n\n### `EfficientNetCheckPointFactory.create(checkPoint: EfficientNetCheckPoint, options?: EfficientNetCheckPointFactoryOptions): Promise\u003cEfficientNetModel\u003e`\n\nExample: to create an EfficientNet model you need to pass an `EfficientNetCheckPoint`\n(available checkpoints: B0..B7); each one represents a different model.\n\n```javascript\nconst {\n  EfficientNetCheckPointFactory,\n  EfficientNetCheckPoint,\n} = require(\"node-efficientnet\");\n\nconst model = await EfficientNetCheckPointFactory.create(\n  EfficientNetCheckPoint.B7\n);\n\nconst path2image = \"...\";\n\nconst topResults = 5;\n\nconst result = await model.inference(path2image, {\n  topK: topResults,\n  locale: \"zh\",\n});\n```\n\nOf course, you can use a local model file to speed up loading.\n\nYou can download the model files from [efficientnet-tensorflowjs-binaries](https://github.com/ntedgi/efficientnet-tensorflowjs-binaries); please keep the directory structure consistent, like so:\n\n```\nlocal_model\n  └── B0\n    ├── group1-shard1of6.bin\n    ├── group1-shard2of6.bin\n    ├── group1-shard3of6.bin\n    ├── group1-shard4of6.bin\n    ├── group1-shard5of6.bin\n    ├── group1-shard6of6.bin\n    └── model.json\n```\n\n```javascript\nconst path = require(\"path\");\nconst {\n  EfficientNetCheckPointFactory,\n  EfficientNetCheckPoint,\n} = require(\"node-efficientnet\");\n\nconst model = await EfficientNetCheckPointFactory.create(\n  EfficientNetCheckPoint.B7,\n  {\n    localModelRootDirectory: path.join(__dirname, \"local_model\"),\n  }\n);\n\nconst path2image = \"...\";\n\nconst topResults = 5;\n\nconst result = await model.inference(path2image, {\n  topK: topResults,\n  locale: \"zh\",\n});\n```\n\n#\n\n## Examples\n\nDownload files from a remote source and predict using the 
model\n\n```js\nconst fs = require(\"fs\");\nconst nodeFetch = require(\"node-fetch\");\n\nconst {\n  EfficientNetCheckPointFactory,\n  EfficientNetCheckPoint,\n} = require(\"node-efficientnet\");\n\nconst images = [\"car.jpg\", \"panda.jpg\"];\nconst imageDir = \"./samples\";\nconst imageDirRemoteUri =\n  \"https://raw.githubusercontent.com/ntedgi/node-EfficientNet/main/samples\";\n\nif (!fs.existsSync(imageDir)) {\n  fs.mkdirSync(imageDir);\n}\n\n// Download one sample image and invoke `cb` once it is written to disk\nasync function download(image, cb) {\n  const response = await nodeFetch.default(`${imageDirRemoteUri}/${image}`);\n  const buffer = await response.buffer();\n  fs.writeFile(`${imageDir}/${image}`, buffer, cb);\n}\n\nEfficientNetCheckPointFactory.create(EfficientNetCheckPoint.B2)\n  .then((model) =\u003e {\n    images.forEach(async (image) =\u003e {\n      await download(image, () =\u003e {\n        model.inference(`${imageDir}/${image}`).then((result) =\u003e {\n          console.log(result.result);\n        });\n      });\n    });\n  })\n  .catch((e) =\u003e {\n    console.error(e);\n  });\n```\n\nOutput:\n\n```js\n[\n  { label: \"sports car, sport car\", precision: 88.02440940394301 },\n  { label: \"racer, race car, racing car\", precision: 6.647441678387659 },\n  { label: \"car wheel\", precision: 5.3281489176693295 }\n]\n[\n  {\n    label: \"giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\",\n    precision: 83.60747593436018\n  },\n  { label: \"skunk, poleca\", precision: 11.61300759424677 },\n  { label: \"hog, pig, grunter, squealer, Sus scrofa\", precision: 4.779516471393051 }\n]\n```\n\n#\n\n## About EfficientNet Models\n\nEfficientNets rely on AutoML and compound scaling to achieve superior performance without compromising resource efficiency. 
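The compound scaling idea can be sketched in a few lines of JavaScript (illustrative only; `compoundScale` below is not part of this package's API): a single coefficient phi scales network depth, width, and input resolution together, using the constants alpha = 1.2, beta = 1.1, gamma = 1.15 reported in the paper, chosen so that alpha * beta^2 * gamma^2 ≈ 2 and FLOPS roughly double per unit of phi:

```javascript
// Sketch of the compound scaling rule from the EfficientNet paper.
// Illustrative only -- `compoundScale` is a hypothetical helper, not an
// export of node-efficientnet. depth ~ alpha^phi, width ~ beta^phi,
// resolution ~ gamma^phi, with alpha * beta^2 * gamma^2 ~= 2.
function compoundScale(phi, alpha = 1.2, beta = 1.1, gamma = 1.15) {
  return {
    depth: Math.pow(alpha, phi),      // multiplier on the number of layers
    width: Math.pow(beta, phi),       // multiplier on the number of channels
    resolution: Math.pow(gamma, phi), // multiplier on the input resolution
  };
}

// phi = 1 roughly corresponds to the step from B0 to B1
const { depth, width, resolution } = compoundScale(1);
console.log(depth.toFixed(2), width.toFixed(2), resolution.toFixed(2)); // 1.20 1.10 1.15
```

Larger checkpoints (B2..B7) correspond to larger phi, which is why their accuracy rises together with their input resolution and inference time.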
The [AutoML Mobile framework](https://ai.googleblog.com/2018/08/mnasnet-towards-automating-design-of.html) has helped develop a mobile-size baseline network, **EfficientNet-B0**, which is then improved by the compound scaling method to obtain EfficientNet-B1 to B7.\n\n\u003ctable border=\"0\"\u003e\n\u003ctr\u003e\n    \u003ctd\u003e\n    \u003cimg src=\"https://raw.githubusercontent.com/tensorflow/tpu/master/models/official/efficientnet/g3doc/params.png\" width=\"100%\" /\u003e\n    \u003c/td\u003e\n    \u003ctd\u003e\n    \u003cimg src=\"https://raw.githubusercontent.com/tensorflow/tpu/master/models/official/efficientnet/g3doc/flops.png\" width=\"90%\" /\u003e\n    \u003c/td\u003e\n\u003c/tr\u003e\n\u003c/table\u003e\n\n#\n\nEfficientNets achieve state-of-the-art accuracy on ImageNet with an order of magnitude better efficiency:\n\n- In the high-accuracy regime, EfficientNet-B7 achieves state-of-the-art 84.4% top-1 / 97.1% top-5 accuracy on ImageNet with 66M parameters and 37B FLOPS. 
At the same time, the model is 8.4x smaller and 6.1x faster on CPU inference than the former leader, [GPipe](https://arxiv.org/abs/1811.06965).\n\n- In the middle-accuracy regime, EfficientNet-B1 is 7.6x smaller and 5.7x faster on CPU inference than [ResNet-152](https://arxiv.org/abs/1512.03385), with similar ImageNet accuracy.\n\n- Compared to the widely used [ResNet-50](https://arxiv.org/abs/1512.03385), EfficientNet-B4 improves top-1 accuracy from ResNet-50's 76.3% to 82.6% (+6.3%) under similar FLOPS constraints.\n\n#\n\n## Models\n\nThe performance of each model variant, using the pre-trained weights converted from checkpoints provided by the authors, is as follows:\n\n| Architecture   | @top1\\* Imagenet | @top1\\* Noisy-Student |\n| -------------- | :--------------: | :-------------------: |\n| EfficientNetB0 |      0.772       |         0.788         |\n| EfficientNetB1 |      0.791       |         0.815         |\n| EfficientNetB2 |      0.802       |         0.824         |\n| EfficientNetB3 |      0.816       |         0.841         |\n| EfficientNetB4 |      0.830       |         0.853         |\n| EfficientNetB5 |      0.837       |         0.861         |\n| EfficientNetB6 |      0.841       |         0.864         |\n| EfficientNetB7 |      0.844       |         0.869         |\n\n**\\*** - top-1 accuracy score for the converted models (ImageNet `val` set)\n\n---\n\n```ts\nif (this.repo.isAwesome || this.repo.isHelpful) {\n  Star(this.repo);\n}\n```\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fntedgi%2Fnode-efficientnet","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fntedgi%2Fnode-efficientnet","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fntedgi%2Fnode-efficientnet/lists"}