{"id":18704890,"url":"https://github.com/xiaohk/facedata","last_synced_at":"2025-04-12T10:06:13.775Z","repository":{"id":57959979,"uuid":"112115785","full_name":"xiaohk/FaceData","owner":"xiaohk","description":"A macOS app to parse face landmarks from a video for training GANs","archived":false,"fork":false,"pushed_at":"2021-08-31T22:00:47.000Z","size":23570,"stargazers_count":80,"open_issues_count":0,"forks_count":9,"subscribers_count":4,"default_branch":"master","last_synced_at":"2025-04-12T10:02:03.412Z","etag":null,"topics":["annotator","gans","macos","swift","vision"],"latest_commit_sha":null,"homepage":"","language":"Swift","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/xiaohk.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":"CITATION.cff","codeowners":null,"security":null,"support":null}},"created_at":"2017-11-26T20:52:58.000Z","updated_at":"2025-04-08T04:00:25.000Z","dependencies_parsed_at":"2022-09-20T08:10:50.701Z","dependency_job_id":null,"html_url":"https://github.com/xiaohk/FaceData","commit_stats":null,"previous_names":[],"tags_count":3,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/xiaohk%2FFaceData","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/xiaohk%2FFaceData/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/xiaohk%2FFaceData/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/xiaohk%2FFaceData/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/xiaohk","download_url":"https://codeload.github.com/xiaohk/FaceData/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com",
"kind":"github","repositories_count":248550634,"owners_count":21122933,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["annotator","gans","macos","swift","vision"],"created_at":"2024-11-07T12:09:00.364Z","updated_at":"2025-04-12T10:06:13.733Z","avatar_url":"https://github.com/xiaohk.png","language":"Swift","readme":"# Face Data\n\nA macOS application that auto-annotates face landmarks from a video. These landmarks can then be used as training data for Generative Adversarial Networks (GANs).\n\n[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.5348316.svg)](https://doi.org/10.5281/zenodo.5348316)\n[![License](https://img.shields.io/badge/License-MIT-red)](https://github.com/xiaohk/FaceData/blob/master/LICENSE)\n\n\u003cp align=\"center\"\u003e\n\t\u003cimg src=\"./result.gif\" height=\"250\"\u003e\n\u003c/p\u003e\n\n## Getting Started\n\n### Installing\n\nYou can either download the binary from the [`Releases`](https://github.com/xiaohk/FaceData/releases) page or build the source code using Xcode.\n\n### Use\n\n\u003cp align=\"center\"\u003e\n\t\u003cimg src=\"https://i.imgur.com/FEVY2Pu.png\" height=\"250\"\u003e\n\u003c/p\u003e\n\n\n|              | Description |\n|--------------|-------------|\n| Video Path   | Path to the video file; currently only `.mp4` files are supported. 
Use `Select File` to set the path via a file browsing panel. |\n| Output Path  | Path to the output directory; the app will create two sub-directories, `origin` and `landmark`. Use `Select Folder` to set the path via a file browsing panel. |\n| Start Second | The second at which to start capturing frames from the video; default is 0 (the beginning). |\n| End Second   | The app will not extract frames after this second; default is the duration of the video. |\n| # of Frames  | The number of frames to generate; default is 100. |\n| Start        | Start the process. |\n| Cancel       | Stop the process. |\n\n### Output\n\n- Two sub-directories, `origin` and `landmark`, will be created in the specified output directory.\n- `origin` contains the original frames extracted from the video, with file names like `img001.png`.\n- `landmark` contains the landmark image drawn from the corresponding frame in `origin`, with file names like `img001lm.png`.\n- If no face is detected in an original frame, the corresponding file in `landmark` is named `no_face_img001lm.png`.\n\n### Output Images Processing\n\nYou will probably want to process the generated images to fit the size restrictions of your GAN model. 
You can refer to the Python script `crop.py`.\n\n## Built With\n\n* [Apple Vision Library](https://developer.apple.com/documentation/vision) - Makes it easy to reproduce the landmarks on iOS devices\n* [Apple AV Foundation](https://developer.apple.com/av-foundation/) - Uses the lower-level image format (`CGImage`) to make the code portable to Cocoa Touch\n\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fxiaohk%2Ffacedata","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fxiaohk%2Ffacedata","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fxiaohk%2Ffacedata/lists"}