{"id":20657807,"url":"https://github.com/visual-layer/visuallayer","last_synced_at":"2025-04-19T13:17:50.418Z","repository":{"id":176814985,"uuid":"623323921","full_name":"visual-layer/visuallayer","owner":"visual-layer","description":"Simplify Your Visual Data Ops. Find and visualize issues with your computer vision datasets such as duplicates, anomalies, data leakage, mislabels and others.","archived":false,"fork":false,"pushed_at":"2023-09-27T09:50:45.000Z","size":94525,"stargazers_count":67,"open_issues_count":5,"forks_count":2,"subscribers_count":7,"default_branch":"main","last_synced_at":"2025-03-29T08:11:33.324Z","etag":null,"topics":["cleaning","computer","computer-vision","data","data-science","dataset","datasets-preparation","generative","machine-learning","python","vision"],"latest_commit_sha":null,"homepage":"https://www.visual-layer.com/","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/visual-layer.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2023-04-04T06:30:34.000Z","updated_at":"2024-10-22T19:29:44.000Z","dependencies_parsed_at":"2023-09-27T15:46:46.512Z","dependency_job_id":null,"html_url":"https://github.com/visual-layer/visuallayer","commit_stats":null,"previous_names":["visual-layer/visuallayer","visual-layer/vl-datasets"],"tags_count":15,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/visual-layer%2Fvisuallayer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/visual-layer%2Fvisuallayer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub
/repositories/visual-layer%2Fvisuallayer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/visual-layer%2Fvisuallayer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/visual-layer","download_url":"https://codeload.github.com/visual-layer/visuallayer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":249702524,"owners_count":21312761,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cleaning","computer","computer-vision","data","data-science","dataset","datasets-preparation","generative","machine-learning","python","vision"],"created_at":"2024-11-16T18:23:20.656Z","updated_at":"2025-04-19T13:17:50.379Z","avatar_url":"https://github.com/visual-layer.png","language":"Jupyter Notebook","readme":"\n\u003c!-- PROJECT SHIELDS --\u003e\n\u003c!--\n*** I'm using markdown \"reference style\" links for readability.\n*** Reference links are enclosed in brackets [ ] instead of parentheses ( ).\n*** See the bottom of this document for the declaration of the reference variables\n*** for contributors-url, forks-url, etc. 
This is an optional, concise syntax you may use.\n*** https://www.markdownguide.org/basic-syntax/#reference-style-links\n--\u003e\n\n[![PyPi][pypi-shield]][pypi-url]\n[![PyPi][pypiversion-shield]][pypi-url]\n[![PyPi][downloads-shield]][downloads-url]\n[![License][license-shield]][license-url]\n[![TestedOn][testedon-shield]][pypi-url]\n\u003c!-- [![Contributors][contributors-shield]][contributors-url] --\u003e\n\n\n\u003c!-- MARKDOWN LINKS \u0026 IMAGES --\u003e\n\u003c!-- https://www.markdownguide.org/basic-syntax/#reference-style-links --\u003e\n[pypi-shield]: https://img.shields.io/badge/Python-3.7%20--%203.11-blue?style=for-the-badge\n[pypi-url]: https://pypi.org/project/visuallayer/\n[pypiversion-shield]: https://img.shields.io/pypi/v/visuallayer?style=for-the-badge\n[downloads-shield]: https://img.shields.io/badge/dynamic/json?style=for-the-badge\u0026label=downloads\u0026query=%24.total_downloads\u0026url=https%3A%2F%2Fapi.pepy.tech%2Fapi%2Fv2%2Fprojects%2Fvisuallayer\u0026color=lightblue\n[downloads-url]: https://pypi.org/project/visuallayer/\n\u003c!-- [contributors-shield]: https://img.shields.io/github/contributors/visual-layer/fastdup?style=for-the-badge --\u003e\n\u003c!-- [contributors-url]: https://github.com/othneildrew/Best-README-Template/graphs/contributors --\u003e\n[license-shield]: https://img.shields.io/badge/License-Apache%202.0-purple.svg?style=for-the-badge\n[license-url]: https://github.com/visual-layer/visuallayer/blob/main/LICENSE\n[testedon-shield]: https://img.shields.io/badge/Tested%20on-Ubuntu--22.04%20%7C%20MacOS--10.16%20Intel%20%7C%20Windows%2010-brightgreen?style=for-the-badge\n\n\n\u003c!-- PROJECT LOGO --\u003e\n\u003cbr /\u003e\n\u003cdiv align=\"center\"\u003e\n\u003ca href=\"https://www.visual-layer.com\"\u003e\n  \u003cimg alt=\"Visual Layer Logo\" src=\"https://github.com/visual-layer/visuallayer/blob/main/imgs/vl_horizontal_logo.png\" alt=\"Logo\" width=\"450\"\u003e\n\u003c/a\u003e\n\u003ch3 align=\"center\"\u003eUnleash 
the Full Power of Your Visual Data\u003c/h3\u003e\n  \u003cp align=\"center\"\u003e\n  \u003cbr /\u003e\n    \u003ca href=\"https://docs.visual-layer.com/docs/getting-started\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\u003cstrong\u003eExplore the docs »\u003c/strong\u003e\u003c/a\u003e\n    \u003cbr /\u003e\n    \u003ca href=\"https://github.com/visual-layer/visuallayer/issues\" target=\"_blank\" rel=\"noopener noreferrer\"\u003eReport Issues\u003c/a\u003e\n    ·\n    \u003ca href=\"https://medium.com/visual-layer/\" target=\"_blank\" rel=\"noopener noreferrer\"\u003eRead Blog\u003c/a\u003e\n    ·\n    \u003ca href=\"mailto:info@visual-layer.com?subject=Sign-up%20for%20access\" target=\"_blank\" rel=\"noopener noreferrer\"\u003eGet In Touch\u003c/a\u003e\n    ·\n    \u003ca href=\"https://visual-layer.com/\" target=\"_blank\" rel=\"noopener noreferrer\"\u003eAbout Us\u003c/a\u003e\n    \u003cbr /\u003e\n    \u003cbr /\u003e \n    💫 Check out \u003ca href=\"https://medium.com/visual-layer/introducing-vl-datasets-d85adfa93f0f\" target=\"_blank\" rel=\"noopener noreferrer\"\u003eVL Datasets release blog post\u003c/a\u003e\n    \u003cbr /\u003e \n    \u003cbr /\u003e \n    \u003ca href=\"https://visualdatabase.slack.com/join/shared_invite/zt-19jaydbjn-lNDEDkgvSI1QwbTXSY6dlA#/shared-invite/email\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/JOIN US ON SLACK-4A154B?style=for-the-badge\u0026logo=slack\u0026logoColor=white\" alt=\"Logo\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://visual-layer.readme.io/discuss\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/DISCUSSION%20FORUM-brightgreen?style=for-the-badge\u0026logo=discourse\u0026logoWidth=20\" alt=\"Logo\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://www.linkedin.com/company/visual-layer/\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\n    \u003cimg 
src=\"https://img.shields.io/badge/LinkedIn-0077B5?style=for-the-badge\u0026logo=linkedin\u0026logoColor=white\" alt=\"Logo\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://twitter.com/visual_layer\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/Twitter-1DA1F2?style=for-the-badge\u0026logo=twitter\u0026logoColor=white\" alt=\"Logo\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://www.youtube.com/@visual-layer\" target=\"_blank\" rel=\"noopener noreferrer\"\u003e\n    \u003cimg src=\"https://img.shields.io/badge/-YouTube-black.svg?style=for-the-badge\u0026logo=youtube\u0026colorB=red\" alt=\"Logo\"\u003e\n    \u003c/a\u003e\n  \u003c/p\u003e\n\u003c/div\u003e\n\n## Description\n`visuallayer SDK` is an open-source Python package that offers access and extensibility to the Visual Layer platform from your code. \n\nWhile the platform offers a high-level overview and visualization of your data, the SDK affords you the flexibility to integrate into your favorite machine learning frameworks and environments (e.g. Jupyter Notebook) using Python.\n\n## Installation\n\nThe easiest way to use the `visuallayer SDK` is to install it from [PyPI](https://pypi.org/project/visuallayer/). 
On your machine, run:\n\n```shell\npip install visuallayer\n```\n\nOptionally, you can also install the bleeding-edge version from [GitHub](https://github.com/visual-layer/visuallayer) by running:\n\n```shell\npip install git+https://github.com/visual-layer/visuallayer.git@main --upgrade\n```\n\n## VL Datasets\nThe `visuallayer SDK` also lets you access [VL Datasets](https://docs.visual-layer.com/docs/what-are-vl-datasets) - a collection of clean versions of widely used computer vision datasets.\n\n\nFor example, with only two lines of code you can load the clean VL Datasets version of the [ImageNet-1k](https://www.image-net.org/) dataset:\n```python\nimport visuallayer as vl\ndataset = vl.datasets.zoo.load('vl-imagenet-1k')\n\n# Export to PyTorch\ntrain_dataset = dataset.export(output_format='pytorch', split='train')\n\n# PyTorch training loop\n```\n\n\u003e **Note**: `visuallayer` does not automatically download the ImageNet dataset; make sure you have obtained usage rights to the dataset and downloaded it into your current working directory first.\n\nWhen we say \"clean\", we mean that the datasets loaded by the `visuallayer SDK` were screened for common issues such as [duplicates](https://docs.visual-layer.com/docs/duplicate-imagesobjects), [mislabels](https://docs.visual-layer.com/docs/mislabeled-imagesobjects), [outliers](https://docs.visual-layer.com/docs/outlier-imagesobjects), \n[dark](https://docs.visual-layer.com/docs/blurry-imagesobjects-copy)/[bright](https://docs.visual-layer.com/docs/dark-imagesobjects-copy)/\n[blurry](https://docs.visual-layer.com/docs/outlier-imagesobjects-copy) images, and data leakage.\nSee the full description of supported issues in our [documentation](https://docs.visual-layer.com/docs/mislabeled-imagesobjects).\n\n\n\n## Dataset Zoo\nWe provide a [Dataset Zoo](https://docs.visual-layer.com/docs/available-datasets) where you can find detailed information on each VL Dataset.\n\nFor each dataset in the zoo, we ran an analysis using [VL 
Profiler](https://app.visual-layer.com) and found issues pertaining to the original dataset. \nThe following table is a detailed breakdown of the issues for each dataset.\n\n\n\u003ctable\u003e\n    \u003ctr\u003e\n      \u003cth align=\"left\"\u003e\u003cstrong\u003eDataset Name\u003c/strong\u003e\u003c/th\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eTotal Images\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eTotal Issues (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eTotal Issues (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eDuplicates (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eDuplicates (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eOutliers (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eOutliers (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eBlur (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eBlur (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eDark (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eDark (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eBright (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eBright (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eMislabels (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eMislabels 
(Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eLeakage (%)\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd style=\"text-align:left;\"\u003e\u003cstrong\u003eLeakage (Count)\u003c/strong\u003e\u003c/td\u003e\n      \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eImageNet-21K\u003c/th\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e13,153,500\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e14.58%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,917,948\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e10.53%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,385,074\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.09%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e11,119\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.29%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e38,463\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.18%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e23,575\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.43%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e56,754\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv 
align=\"left\"\u003e3.06%\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e402,963\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/div\u003e\u003c/td\u003e\n         \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/div\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eImageNet-1K\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,431,167\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1.31%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e17,492\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.57%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7,522\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.09%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,199\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.19%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2,478\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.24%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e3,174\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.06%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e770\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.11%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv 
align=\"left\"\u003e1,480\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.07%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e869\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eLAION-1B\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,000,000,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e10.40%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e104,942,474\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e8.89%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e89,349,899\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.63%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e6,350,368\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.77%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7,763,266\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.02%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e242,333\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.12%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,236,608\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd 
style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eKITTI\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e12,919\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e18.32%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2,748\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e15.29%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2,294\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e3.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e452\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eCOCO\u003c/th\u003e\n        \u003ctd 
style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e330,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.31%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e508\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.12%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e201\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.09%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e143\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.03%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e47\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.05%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e76\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e21\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e20\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eDeepFashion\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e800,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7.89%\u003c/td\u003e\n        
\u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e22,824\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e5.11%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e14,773\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.04%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e108\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2.75%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7,943\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eCelebA-HQ\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e30,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2.36%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e4,786\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv 
align=\"left\"\u003e1.67%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e3,389\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.08%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e157\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.51%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,037\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.00%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e13\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.09%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e188\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003ePlaces365\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1,800,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2.09%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e37,644\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1.53%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e27,520\u003c/td\u003e\n        \u003ctd 
style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.40%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7,168\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.16%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e2,956\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eFood-101\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e101,000\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.62%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e627\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.23%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e235\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.08%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e77\u003c/td\u003e\n        \u003ctd 
style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.18%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e185\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.04%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e43\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n        \u003cth align=\"left\"\u003eOxford-IIIT Pet\u003c/th\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7,349\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1.48%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e132\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e1.01%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e75\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.10%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e7\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd 
style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.05%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e4\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e-\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e0.31%\u003c/td\u003e\n        \u003ctd style=\"text-align:left;\"\u003e\u003cdiv align=\"left\"\u003e23\u003c/td\u003e\n    \u003c/tr\u003e\n\u003c/table\u003e\n\nHere we provide full details on the issues removed from each VL Dataset (its VL Dataset card).\nThe clean version of a dataset is prefixed with `vl-` to distinguish it from the original dataset.\nYou can also freely download a CSV of all the issues found.\n\n\u003ctable\u003e\n  \u003cthead\u003e\n    \u003ctr\u003e\n      \u003cth align=\"left\"\u003eVL Dataset Card\u003c/th\u003e\n      \u003cth align=\"left\"\u003eOriginal Dataset\u003c/th\u003e\n      \u003cth align=\"left\"\u003eExplore\u003c/th\u003e\n      \u003cth align=\"left\"\u003eIssues CSV\u003c/th\u003e\n      \u003cth align=\"left\"\u003eHugging Face Dataset\u003c/th\u003e\n    \u003c/tr\u003e\n  \u003c/thead\u003e\n  \u003ctbody\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-imagenet-21k\"\u003evl-imagenet-21k\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eImageNet-21K\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/abef3dc1-fc1d-4685-8c98-e9e6cd8adb77?p=1\"\u003e\u003cimg 
src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/ImageNet-21K_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"#\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-imagenet-1k\"\u003evl-imagenet-1k\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eImageNet-1K\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/7a1fe79b-a241-4473-9000-13570b4189d8?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/ImageNet-1K_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/visual-layer/vl-imagenet-1k\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-Click%20Here-blue.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-laion-1b\"\u003evl-laion-1b\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eLAION-1B\u003c/td\u003e\n      \u003ctd 
align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/vl-datasets\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/Laion1B_issues.parquet\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"#\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-kitti\"\u003evl-kitti\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eKITTI\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/3008d9ae-1f3f-11ee-9c7b-be856b858048?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/Kitti_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"#\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-coco\"\u003evl-coco\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eCOCO\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca 
href=\"https://app.visual-layer.com/dataset/a0e925b8-216d-11ee-a49e-067f3294af8b?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/Coco_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"#\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-deepfashion\"\u003evl-deepfashion\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eDeepFashion\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/30bfd668-1f3f-11ee-9c7b-be856b858048?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/DeepFashion_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/vl-deepfashion\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-celeba-hq\"\u003evl-celeba-hq\u003c/a\u003e\u003c/td\u003e\n  
    \u003ctd align=\"left\"\u003eCelebA-HQ\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/302bf43e-1f3f-11ee-82ac-527775b84146?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/CelebA_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/visual-layer/vl-celeba-hq\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-Click%20Here-blue.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-places365\"\u003evl-places365\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003ePlaces365\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/vl-datasets\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/Places365_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/vl-places365\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-WIP-red.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca 
href=\"https://docs.visual-layer.com/docs/available-datasets#vl-food101\"\u003evl-food-101\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eFood-101\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/304b20d4-1f3f-11ee-8c0d-067f3294af8b?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/food101_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/visual-layer/vl-food101\"\u003e\u003cimg src=\"https://img.shields.io/badge/🤗%20Dataset-Click%20Here-blue.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n    \u003ctr\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://docs.visual-layer.com/docs/available-datasets#vl-oxford-iiit-pets\"\u003evl-oxford-iiit-pet\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003eOxford-IIIT Pet\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://app.visual-layer.com/dataset/308fa5a6-1f3f-11ee-8d7b-aa19a58e1b08?p=1\"\u003e\u003cimg src=\"https://img.shields.io/badge/VL%20Profiler-Explore-purple?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://sharedvisuallayer.s3.us-east-2.amazonaws.com/visual-layer-sdk/oxford-iiit-pet_images_issue_file_list.csv\"\u003e\u003cimg src=\"https://img.shields.io/badge/DOWNLOAD-brightgreen?style=for-the-badge\" alt=\"download\"\u003e\u003c/a\u003e\u003c/td\u003e\n      \u003ctd align=\"left\"\u003e\u003ca href=\"https://huggingface.co/datasets/visual-layer/vl-oxford-iiit-pets\"\u003e\u003cimg 
src=\"https://img.shields.io/badge/🤗%20Dataset-Click%20Here-blue.svg?style=for-the-badge\"\u003e\u003c/a\u003e\u003c/td\u003e\n    \u003c/tr\u003e\n  \u003c/tbody\u003e\n\u003c/table\u003e\n\n\nWe will continue to support more datasets. Here are a few currently on our roadmap:\n+ EuroSAT\n+ Flickr30k\n+ INaturalist\n+ SVHN\n+ Cityscapes\n+ RVL-CDIP\n+ DocLayNet\n\n[Let us know](https://forms.gle/8jxPkyzeKj82kPed8) if you'd like us to support a specific dataset.\n\n\u003e **Note**: If you'd like to use our cloud tool to discover issues in your own dataset, [sign up](https://app.visual-layer.com/) to use our cloud platform for free.\n\n## Usage\nThe following sections show how to use the `visuallayer` SDK to load, inspect, and export a VL Dataset.\n\n### Loading a dataset\nWe offer handy functions to load datasets from the Dataset Zoo.\nFirst, let's list the datasets in the zoo:\n\n```python\nimport visuallayer as vl\nvl.datasets.zoo.list_datasets()\n```\n\nwhich currently outputs:\n\n```shell\n['vl-oxford-iiit-pets',\n 'vl-imagenet-21k',\n 'vl-imagenet-1k',\n 'vl-food101',\n 'oxford-iiit-pets',\n 'imagenet-21k',\n 'imagenet-1k',\n 'food101']\n```\n\nTo load a dataset:\n\n```python\nmy_pets = vl.datasets.zoo.load('vl-oxford-iiit-pets')\n```\n\nThis loads the clean version of the Oxford-IIIT Pet dataset, with all of the problematic images excluded.\n\nTo load the original Oxford-IIIT Pet dataset, simply drop the `vl-` prefix:\n\n```python\noriginal_pets_dataset = vl.datasets.zoo.load('oxford-iiit-pets')\n```\n\nThis loads the original dataset with no modifications.\n\n### Inspecting a dataset\nNow that you have a dataset loaded, you can view information pertaining to it with:\n\n```python\nmy_pets.info\n```\n\nThis prints out high-level information about the dataset. 
In this example, we use the Oxford-IIIT Pet dataset.\n\n```shell\nMetadata:\n--\u003e Name - vl-oxford-iiit-pets\n--\u003e Description - A modified version of the original Oxford IIIT Pets Dataset removing dataset issues.\n--\u003e License - Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)\n--\u003e Homepage URL - https://www.robots.ox.ac.uk/~vgg/data/pets/\n--\u003e Number of Images - 7349\n--\u003e Number of Images with Issues - 109\n```\n\nIf you'd like to view the issues found in the dataset, run:\n\n```python\nmy_pets.report\n```\n\nwhich outputs:\n\n```shell\n| Reason    | Count | Pct   |\n|-----------|-------|-------|\n| Duplicate | 75    | 1.016 |\n| Outlier   | 7     | 0.095 |\n| Dark      | 4     | 0.054 |\n| Leakage   | 23    | 0.627 |\n| Total     | 109   | 1.792 |\n```\n\nNow that you've seen the issues with the dataset, you can visualize them on screen. There are two options for doing so.\n\n\u003e **Option 1** - Using the Visual Layer Profiler (VL Profiler) - Provides extensive capabilities to view, group, sort, and filter the dataset issues. [Sign up](https://app.visual-layer.com) for free.\n\nHere's the visualization using the VL Profiler:\n\n![profiler](./imgs/vl_profiler.gif)\n\n\u003e **Option 2** - In a Jupyter Notebook - Provides a limited but convenient way to view the dataset without leaving your notebook.\n\nTo visualize the issues using **Option 2** in your notebook, run:\n\n```python\nmy_pets.explore()\n```\n\nThis outputs an interactive table in your Jupyter Notebook like the following.\n\n![explore](./imgs/explore.gif)\n\nIn the interactive table, you can view the issues, sort, filter, search, and compare the images side by side.\n\nBy default, `.explore()` loads the top 50 issues from the dataset, covering all issue types. 
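The per-image details behind these issue counts are also published as the downloadable issues CSVs linked in the tables above, so you can exclude flagged files even without the SDK. Below is a minimal sketch using only the Python standard library; the CSV excerpt and its `filename`/`issue` column names are hypothetical, so check the header of the file you actually download:

```python
import csv
import io

# Hypothetical excerpt of an issues CSV; the real files are linked in the
# tables above and may use different column names.
issues_csv = """\
filename,issue
images/Abyssinian_1.jpg,Duplicate
images/beagle_12.jpg,Dark
"""

# Collect the set of flagged file names.
with io.StringIO(issues_csv) as f:
    flagged = {row["filename"] for row in csv.DictReader(f)}

# Keep only the images that were not flagged.
all_images = ["images/Abyssinian_1.jpg", "images/beagle_12.jpg", "images/pug_3.jpg"]
clean_images = [p for p in all_images if p not in flagged]
print(clean_images)  # ['images/pug_3.jpg']
```

Instead of dropping the flagged paths outright, the same lookup can route them to a review queue for re-labeling.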
If you'd like more granular control, you can change the `num_images` and `issue` arguments.\n\nFor example:\n\n```python\nmy_pets.explore(num_images=100, issue='Duplicate')\n```\n\nThe interactive table provides a convenient but limited way to visualize dataset issues.\nFor a more extensive visualization, view the issues using the Visual Layer Profiler.\n\nCheck out the Visual Layer Profiler [documentation](https://docs.visual-layer.com/docs/introduction) and blog for more info.\n\n\n### Exporting a dataset\nIf you'd like to use a loaded dataset to train a model, you can conveniently export it with:\n\n```python\ntest_dataset = my_pets.export(output_format=\"pytorch\", split=\"test\")\n```\n\nThis exports the dataset into a PyTorch `Dataset` object that can be used readily in a PyTorch training loop.\n\nAlternatively, you can export the dataset to a DataFrame with:\n\n```python\ntest_dataset = my_pets.export(output_format=\"csv\", split=\"test\")\n```\n\n\n## Learn from Examples\nIn this section, we show end-to-end examples of how to load, inspect, and export a dataset and then train models using the PyTorch and fastai frameworks.\n\n\u003ctable\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"4\" width=\"160\"\u003e\n      \u003ca href=\"https://visual-layer.readme.io/docs/getting-started\"\u003e\n        \u003cimg src=\"./imgs/food.jpg\" width=\"256\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd rowspan=\"4\"\u003e\n      \u003cul\u003e\n        \u003cli\u003e\u003cb\u003eDataset:\u003c/b\u003e \u003ccode\u003evl-food101\u003c/code\u003e\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eFramework:\u003c/b\u003e PyTorch.\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eDescription:\u003c/b\u003e Load a dataset and train a PyTorch model.\u003c/li\u003e\n      \u003c/ul\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"80\"\u003e\n      \u003ca 
href=\"https://nbviewer.org/github/visual-layer/visuallayer/blob/main/notebooks/train-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/nbviewer_logo.svg\" height=\"34\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://github.com/visual-layer/visuallayer/blob/main/notebooks/train-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/github_logo.png\" height=\"32\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://colab.research.google.com/github/visual-layer/visuallayer/blob/main/notebooks/train-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/colab_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n    \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://kaggle.com/kernels/welcome?src=https://github.com/visual-layer/visuallayer/blob/main/notebooks/train-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/kaggle_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003c!-- ------------------------------------------------------------------- --\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"4\" width=\"160\"\u003e\n      \u003ca href=\"https://visual-layer.readme.io/docs/getting-started\"\u003e\n        \u003cimg src=\"./imgs/hf-food101.jpg\" width=\"256\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd rowspan=\"4\"\u003e\n      \u003cul\u003e\n        \u003cli\u003e\u003cb\u003eDataset:\u003c/b\u003e \u003ccode\u003evl-food101\u003c/code\u003e\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eFrameworks:\u003c/b\u003e PyTorch + Hugging Face Dataset\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eDescription:\u003c/b\u003e Load VL Datasets using Hugging Face Datasets and train a PyTorch 
model.\u003c/li\u003e\n      \u003c/ul\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"80\"\u003e\n      \u003ca href=\"https://nbviewer.org/github/visual-layer/visuallayer/blob/main/notebooks/hf-vl-datasets-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/nbviewer_logo.svg\" height=\"34\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://github.com/visual-layer/visuallayer/blob/main/notebooks/hf-vl-datasets-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/github_logo.png\" height=\"32\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://colab.research.google.com/github/visual-layer/visuallayer/blob/main/notebooks/hf-vl-datasets-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/colab_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n    \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://kaggle.com/kernels/welcome?src=https://github.com/visual-layer/visuallayer/blob/main/notebooks/hf-vl-datasets-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/kaggle_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003c!-- ------------------------------------------------------------------- --\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"4\" width=\"160\"\u003e\n      \u003ca href=\"https://visual-layer.readme.io/docs/objects-and-bounding-boxes\"\u003e\n        \u003cimg src=\"./imgs/pet.jpg\" width=\"256\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd rowspan=\"4\"\u003e\n      \u003cul\u003e\n        \u003cli\u003e\u003cb\u003eDataset:\u003c/b\u003e \u003ccode\u003evl-oxford-iiit-pet\u003c/code\u003e\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eFramework:\u003c/b\u003e 
fast.ai.\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eDescription:\u003c/b\u003e Finetune a pretrained TIMM model using fastai.\u003c/li\u003e\n      \u003c/ul\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"80\"\u003e\n      \u003ca href=\"https://nbviewer.org/github/visual-layer/visuallayer/blob/main/notebooks/train-fastai.ipynb\"\u003e\n        \u003cimg src=\"./imgs/nbviewer_logo.svg\" height=\"34\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://github.com/visual-layer/visuallayer/blob/main/notebooks/train-fastai.ipynb\"\u003e\n        \u003cimg src=\"./imgs/github_logo.png\" height=\"32\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://colab.research.google.com/github/visual-layer/visuallayer/blob/main/notebooks/train-fastai.ipynb\"\u003e\n        \u003cimg src=\"./imgs/colab_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n    \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://kaggle.com/kernels/welcome?src=https://github.com/visual-layer/visuallayer/blob/main/notebooks/train-fastai.ipynb\"\u003e\n        \u003cimg src=\"./imgs/kaggle_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003c!-- ------------------------------------------------------------------- --\u003e\n  \u003ctr\u003e\n    \u003ctd rowspan=\"4\" width=\"160\"\u003e\n      \u003ca href=\"https://visual-layer.readme.io/docs/getting-started\"\u003e\n        \u003cimg src=\"./imgs/imagenet.jpg\" width=\"256\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd rowspan=\"4\"\u003e\n      \u003cul\u003e\n        \u003cli\u003e\u003cb\u003eDataset:\u003c/b\u003e \u003ccode\u003evl-imagenet-1k\u003c/code\u003e\u003c/li\u003e\n  
      \u003cli\u003e\u003cb\u003eFramework:\u003c/b\u003e PyTorch.\u003c/li\u003e\n        \u003cli\u003e\u003cb\u003eDescription:\u003c/b\u003e Load cleaned ImageNet dataset and train a PyTorch model.\u003c/li\u003e\n      \u003c/ul\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"80\"\u003e\n      \u003ca href=\"https://nbviewer.org/github/visual-layer/visuallayer/blob/main/notebooks/imagenet-1k-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/nbviewer_logo.svg\" height=\"34\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://github.com/visual-layer/visuallayer/blob/main/notebooks/imagenet-1k-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/github_logo.png\" height=\"32\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://colab.research.google.com/github/visual-layer/visuallayer/blob/main/notebooks/imagenet-1k-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/colab_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n    \u003ctr\u003e\n    \u003ctd align=\"center\"\u003e\n      \u003ca href=\"https://kaggle.com/kernels/welcome?src=https://github.com/visual-layer/visuallayer/blob/main/notebooks/imagenet-1k-pytorch.ipynb\"\u003e\n        \u003cimg src=\"./imgs/kaggle_logo.png\" height=\"28\" /\u003e\n      \u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \n\u003c/table\u003e\n\n\n## License\nThe `visuallayer` SDK is licensed under the Apache 2.0 License. See [LICENSE](./LICENSE).\n\nHowever, you are also bound to the usage license of the original dataset. It is your responsibility to determine whether you have permission to use the dataset under that license. 
We provide no warranty or guarantee of accuracy or completeness.\n\n## Telemetry\n\n\u003cdetails\u003e\n\u003csummary\u003eUsage Tracking\u003c/summary\u003e\n\nThis repository incorporates usage tracking using [Sentry.io](https://sentry.io) to monitor and collect information about how the application is used.\n\nUsage tracking gives us insight into how the application is used in real-world scenarios, helping us understand user behavior, identify potential issues, and make informed decisions to improve the application.\n\nWe DO NOT collect folder names, user names, image names, image content, or other personally identifiable information.\n\n**What data is tracked?**\n\n- Errors and Exceptions: Sentry captures errors and exceptions that occur in the application, providing detailed stack traces and relevant information to help diagnose and fix issues.\n- Performance Metrics: Sentry collects performance metrics, such as response times, latency, and resource usage, enabling us to monitor and optimize the application's performance.\n\nTo opt out, define an environment variable named `SENTRY_OPT_OUT`.\n\nOn Linux/macOS, run the following:\n```shell\nexport SENTRY_OPT_OUT=True\n```\n\nRead more on [Sentry's official webpage](https://sentry.io).\n\n\u003c/details\u003e\n\n## Getting Help\nGet help from the Visual Layer team or community members via the following channels:\n+ [Slack](https://visualdatabase.slack.com/join/shared_invite/zt-19jaydbjn-lNDEDkgvSI1QwbTXSY6dlA#/shared-invite/email)\n+ GitHub [issues](https://github.com/visual-layer/visuallayer/issues)\n+ Discussion [forum](https://visual-layer.readme.io/discuss)\n\n\n## About Visual Layer\n\n\u003cdiv align=\"center\"\u003e\n\u003ca href=\"https://www.visual-layer.com\"\u003e\n  \u003cimg alt=\"Visual Layer Logo\" src=\"https://github.com/visual-layer/visuallayer/blob/main/imgs/vl_horizontal_logo.png\" width=\"250\"\u003e\n\u003c/a\u003e\n\u003c/div\u003e\n\n\nVisual Layer was founded by the authors of [XGBoost](https://github.com/dmlc/xgboost), [Apache TVM](https://github.com/apache/tvm) \u0026 [Turi Create](https://github.com/apple/turicreate) - [Danny Bickson](https://www.linkedin.com/in/dr-danny-bickson-835b32), [Carlos Guestrin](https://www.linkedin.com/in/carlos-guestrin-5352a869) and [Amir Alush](https://www.linkedin.com/in/amiralush).\n\nLearn 
more about Visual Layer [here](https://visual-layer.com).\n