{"id":22969704,"url":"https://github.com/thohemp/nitec","last_synced_at":"2025-07-19T01:04:16.522Z","repository":{"id":206476408,"uuid":"712887090","full_name":"thohemp/nitec","owner":"thohemp","description":"NITEC: Versatile Hand-Annotated Eye Contact Dataset for Ego-Vision Interaction (WACV24)","archived":false,"fork":false,"pushed_at":"2024-07-17T08:00:27.000Z","size":428,"stargazers_count":16,"open_issues_count":0,"forks_count":2,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-04-01T21:48:13.829Z","etag":null,"topics":["attention","computer","contact","ego-vision","eye","focus","gaze","human","human-robot-collaboration","human-robot-interaction","interaction","mutual","of","pytorch","robot","vision","visual","wacv"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/thohemp.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE.md","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-11-01T12:09:36.000Z","updated_at":"2025-03-29T16:13:17.000Z","dependencies_parsed_at":"2024-11-19T11:00:51.652Z","dependency_job_id":"559046cf-987f-42f9-baa9-d522d655652d","html_url":"https://github.com/thohemp/nitec","commit_stats":{"total_commits":9,"total_committers":2,"mean_commits":4.5,"dds":0.2222222222222222,"last_synced_commit":"ad84570af75eca490d4adae5bc9d9d2fdd413af4"},"previous_names":["thohemp/nitec"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/thohemp/nitec","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/thohemp%2Fnitec","tags_url":"https://repos.ecosys
te.ms/api/v1/hosts/GitHub/repositories/thohemp%2Fnitec/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/thohemp%2Fnitec/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/thohemp%2Fnitec/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/thohemp","download_url":"https://codeload.github.com/thohemp/nitec/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/thohemp%2Fnitec/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":265866264,"owners_count":23840937,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["attention","computer","contact","ego-vision","eye","focus","gaze","human","human-robot-collaboration","human-robot-interaction","interaction","mutual","of","pytorch","robot","vision","visual","wacv"],"created_at":"2024-12-14T21:37:39.575Z","updated_at":"2025-07-19T01:04:16.500Z","avatar_url":"https://github.com/thohemp.png","language":"Python","readme":"# \u003cdiv align=\"center\"\u003e **NITEC: Versatile Hand-Annotated Eye Contact Dataset for Ego-Vision Interaction (WACV24)** \u003c/div\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/thohemp/archive/blob/main/nitec.gif\" alt=\"animated\" /\u003e\n\u003c/p\u003e\n\n## **Citing**\n\nIf you find our work useful, please cite the paper:\n\n```BibTeX\n@INPROCEEDINGS{10484276,\n  author={Hempel, Thorsten and Jung, Magnus and Abdelrahman, Ahmed A. 
and Al-Hamadi, Ayoub},\n  booktitle={2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)}, \n  title={NITEC: Versatile Hand-Annotated Eye Contact Dataset for Ego-Vision Interaction}, \n  year={2024},\n  pages={4425-4434},\n  doi={10.1109/WACV57701.2024.00438}}\n```\n## \u003cdiv align=\"center\"\u003e **Paper**\u003c/div\u003e\n\u003e [Thorsten Hempel, Magnus Jung, Ahmed A. Abdelrahman and Ayoub Al-Hamadi, \"NITEC: Versatile Hand-Annotated Eye Contact Dataset for Ego-Vision Interaction\", *WACV 2024*.](https://openaccess.thecvf.com/content/WACV2024/papers/Hempel_NITEC_Versatile_Hand-Annotated_Eye_Contact_Dataset_for_Ego-Vision_Interaction_WACV_2024_paper.pdf)\n\n## \u003cdiv align=\"center\"\u003e **Abstract**\u003c/div\u003e\n\u003eEye contact is a crucial non-verbal interaction modality and plays an important role in our everyday social life. While humans are very sensitive to eye contact, the capabilities of machines to capture a person's gaze are still mediocre. We tackle this challenge and present NITEC, a hand-annotated eye contact dataset for ego-vision interaction. NITEC exceeds existing datasets for ego-vision eye contact in size and variety of demographics, social contexts, and lighting conditions, making it a valuable resource for advancing ego-vision-based eye contact research. Our extensive evaluations on NITEC demonstrate strong cross-dataset performance, emphasizing its effectiveness and adaptability in various scenarios, that allows seamless utilization to the fields of computer vision, human-computer interaction, and social robotics. 
We make our NITEC dataset publicly available to foster reproducibility and further exploration in the field of ego-vision interaction.\n\n\n# \u003cdiv align=\"center\"\u003e Quick Usage \u003c/div\u003e\n\n```sh\npip install face_detection@git+https://github.com/elliottzheng/face-detection\npip install nitec\n```\n\nExample usage:\n\n```py\nfrom pathlib import Path\n\nimport cv2\nimport torch\n\nfrom nitec import NITEC_Classifier, visualize\n\nCWD = Path(__file__).parent  # directory containing the downloaded model snapshots\n\nnitec_pipeline = NITEC_Classifier(\n    weights=CWD / 'models' / 'nitec_rs18_e20.pth',\n    device=torch.device('cuda')  # or torch.device('cpu')\n)\n\ncap = cv2.VideoCapture(0)\n\n_, frame = cap.read()\n# Run eye-contact prediction on the frame and draw the results\nresults = nitec_pipeline.predict(frame)\nframe = visualize(frame, results, confidence=0.5)\n```\n\n# \u003cdiv align=\"center\"\u003e Train / Test \u003c/div\u003e\n\n## NITEC Dataset\nPrepare the dataset as explained [here](data/README.MD).\n\n## Snapshots\n\nDownload from here: https://drive.google.com/drive/folders/1zc6NZZ6yA4NJ52Nn0bgky1XpZs9Z0hSJ?usp=sharing\n\n## Train\n```sh\npython train.py \\\n    --gpu 0 \\\n    --num_epochs 50 \\\n    --batch_size 64 \\\n    --lr 0.0001\n```\n\n## Test\n\n```sh\npython test.py \\\n    --snapshot models/nitec_rs18_20.pth \\\n    --gpu 0\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fthohemp%2Fnitec","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fthohemp%2Fnitec","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fthohemp%2Fnitec/lists"}