https://github.com/sripranav9/nomo
Nomo is a playful desk‑companion robot built on Raspberry Pi and Arduino, combining computer vision–based gesture detection with multimodal feedback to guide you through balanced work sessions and mindful breaks.
arduino capstone-project google-mediapipe interactive-media machine-learning raspberry-pi-5 robot-companion robotics wellbeing
- Host: GitHub
- URL: https://github.com/sripranav9/nomo
- Owner: sripranav9
- License: MIT
- Created: 2025-04-15T06:11:06.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2025-04-23T06:14:46.000Z (2 months ago)
- Last Synced: 2025-04-23T07:23:56.871Z (2 months ago)
- Topics: arduino, capstone-project, google-mediapipe, interactive-media, machine-learning, raspberry-pi-5, robot-companion, robotics, wellbeing
- Language: C++
- Homepage: https://nomo-capstone.framer.website/
- Size: 17.8 MB
- Stars: 0
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Nomo
[nomo-capstone.framer.website](https://nomo-capstone.framer.website/)

**A Little Friend for a Big Focus**
Nomo is an interactive desk companion robot built to encourage healthier work habits through computer vision, gesture recognition, and playful nudges. Designed as a capstone project at NYU Abu Dhabi, Nomo blends real-time software with expressive hardware to promote mindful breaks and deep focus—making your workflow a little more human.
## 🏷 Topics
`interactive-media` `robotics` `robot-companion` `machine-learning` `google-mediapipe` `raspberry-pi-5` `arduino` `wellbeing` `capstone-project` `pomodoro`

## Features
- Real-time gesture recognition (wave, closed fist, thumbs-up), sketched in code after this list
- LED feedback to guide breaks and focus sessions
- Raspberry Pi + Arduino integration
- Pettable with responsive purring vibration
- Location-based behavior (work and break zones)
- Designed with routines, attention, and well-being in mind
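As a rough illustration of the gesture-recognition feature, here is a minimal sketch using MediaPipe's off-the-shelf Gesture Recognizer task. The model path, camera index, and printed output are assumptions for illustration, not Nomo's actual code; the canned model reports gestures such as `Closed_Fist` and `Thumb_Up`, while a wave would need custom motion logic on top.

```python
# Minimal sketch: recognize a hand gesture from one webcam frame with MediaPipe.
# Assumes the pre-trained model file 'gesture_recognizer.task' has been
# downloaded from Google's MediaPipe model page; paths/indices are illustrative.
import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

base_options = mp_python.BaseOptions(model_asset_path="gesture_recognizer.task")
recognizer = vision.GestureRecognizer.create_from_options(
    vision.GestureRecognizerOptions(base_options=base_options)
)

cap = cv2.VideoCapture(0)  # camera index 0 is an assumption
ok, frame_bgr = cap.read()
if ok:
    # MediaPipe expects SRGB; OpenCV delivers BGR.
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    mp_image = mp.Image(image_format=mp.ImageFormat.SRGB, data=frame_rgb)
    result = recognizer.recognize(mp_image)
    if result.gestures:
        top = result.gestures[0][0]  # best guess for the first detected hand
        print(f"Gesture: {top.category_name} ({top.score:.2f})")
cap.release()
```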
## Built With
- [MediaPipe](https://github.com/google/mediapipe) – real-time hand tracking and gesture recognition
- [OpenCV](https://opencv.org/) – image capture, annotation, and video streaming
- [Flask](https://flask.palletsprojects.com/) – lightweight web framework for video streaming (see the streaming sketch after this list)
- [Picamera2](https://github.com/raspberrypi/picamera2) – camera interface for Raspberry Pi
- [Arduino](https://www.arduino.cc/) – microcontroller used for motion, sound, and feedback
- [Python](https://www.python.org/) – control logic, CV processing, and server orchestration
- [Pyserial](https://pypi.org/project/pyserial/) – serial communication for Pi-to-Arduino command flow
- [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5/) – main compute unit for processing gestures, state transitions, and control logic
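To show how Picamera2, OpenCV, and Flask can fit together for the video stream, here is a hedged sketch of the common MJPEG streaming pattern; the resolution, route name, and port are illustrative assumptions rather than Nomo's actual configuration.

```python
# Sketch of an MJPEG stream: Picamera2 captures frames, OpenCV encodes
# them as JPEG, and Flask serves them as a multipart HTTP response.
import cv2
from flask import Flask, Response
from picamera2 import Picamera2

app = Flask(__name__)
picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(
    main={"format": "RGB888", "size": (640, 480)}))  # size is an assumption
picam2.start()

def gen_frames():
    while True:
        frame = picam2.capture_array()  # numpy array from the Pi camera
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
               + jpeg.tobytes() + b"\r\n")

@app.route("/stream")  # route name is an assumption
def stream():
    return Response(gen_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```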
## How It Works
Nomo runs on a Raspberry Pi using Python for vision and logic, paired with an Arduino for motor and sensor control. When placed in a predefined location, it can detect gestures and trigger break/focus states accordingly. Each mode is designed to reflect a balance between productivity and self-care.
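The Pi-to-Arduino command flow can be illustrated with a small pyserial sketch. The port, baud rate, and single-character protocol below are assumptions for demonstration; the real commands live in the `pi/` and `arduino/` sources.

```python
# Sketch of the Pi -> Arduino side of the serial link. The command bytes
# ('F' for focus, 'B' for break) are hypothetical, not Nomo's real protocol.
import time
import serial

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port/baud are assumptions
time.sleep(2)  # give the Arduino time to reset after the port opens

def send_mode(mode: str) -> None:
    commands = {"focus": b"F\n", "break": b"B\n"}
    ser.write(commands[mode])

send_mode("focus")  # the Arduino sketch would switch LEDs/servo accordingly
```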
> For a full overview of modes, gestures, and setup, see the [Product Manual](/docs/nomo-product-manual.pdf).
> For the underlying behavior logic, see the [State Machine Diagram](/docs/nomo-state-machine-diagram.jpg).
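To give a flavor of what the state machine diagram encodes, here is an illustrative Python sketch; the states and gesture-to-transition mapping are assumed for demonstration, and the actual transitions are documented in the diagram linked above.

```python
# Illustrative state machine: gestures drive transitions between modes.
# The specific states and triggers here are assumptions, not from the diagram.
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    FOCUS = auto()
    BREAK = auto()

TRANSITIONS = {
    (Mode.IDLE, "thumbs_up"): Mode.FOCUS,    # confirm: start a focus session
    (Mode.FOCUS, "wave"): Mode.BREAK,        # wave: step away for a break
    (Mode.BREAK, "closed_fist"): Mode.FOCUS, # fist: back to work
}

def step(mode: Mode, gesture: str) -> Mode:
    """Return the next mode, staying put on unrecognized gestures."""
    return TRANSITIONS.get((mode, gesture), mode)

assert step(Mode.IDLE, "thumbs_up") is Mode.FOCUS
```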
## Repository Structure
```bash
nomo/
├── pi/ # Raspberry Pi scripts (python - gesture, logic)
├── arduino/ # Arduino sketches (servo, vibration, LEDs, sounds)
├── assets/ # Gesture samples, hero images, mockups
├── docs/ # Product manual
└── README.md
```
## Credits
This project was made possible with the support and guidance of:
- **Professor Nimrah Syed** – *Capstone Advisor*
  Provided continuous mentorship and feedback from the ideation stage through final execution.
- **Professor Michael Shiloh** – *Technical Mentor*
  Offered invaluable assistance in hardware design, debugging, and user testing throughout development.
- **Daniel Nivia** – *NYU Abu Dhabi Alumnus*
  Contributed custom audio cues, lending Nomo its personality and expressive audio interactions.

We deeply appreciate their time, expertise, and encouragement in bringing Nomo to life.
We also extend our heartfelt thanks to the **Interactive Media faculty, instructors, lab staff, and the NYUAD Art Gallery** for their continuous support, resources, and encouragement throughout the capstone journey.
## 🔒 Project Use Notice
This project was developed as part of a Capstone Project at **NYU Abu Dhabi** (2025).
The code and associated materials are made available under the terms of the [MIT License](LICENSE).
They may be freely used, modified, and distributed for personal, academic, and non-commercial purposes.

**Commercial use, resale, or distribution of derivative works for profit is not permitted without prior written consent of both project authors.**