https://github.com/m4yh3m-dev/project-astra
A powerful, emotion-aware mobile assistant robot built for real-world intelligence, research, interaction, and multi-domain AI applications.
- Host: GitHub
- URL: https://github.com/m4yh3m-dev/project-astra
- Owner: M4YH3M-DEV
- License: apache-2.0
- Created: 2025-04-18T16:15:48.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-04-21T13:27:43.000Z (6 months ago)
- Last Synced: 2025-04-21T14:45:58.293Z (6 months ago)
- Topics: 3d-printing, arduino, jetson-orin-nano, lidar-slam, opencv, robotics, ros2, yolov8
- Homepage:
- Size: 1.34 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
**Work in progress.**
---

# 🤖 ASTRA: Autonomous Smart Tech Robotic Assistant (MVP)

A powerful, emotion-aware mobile assistant robot built for real-world intelligence, research, interaction, and multi-domain AI applications.
## Vision
**ASTRA (Autonomous Smart Tech Robotic Assistant)** is designed to be a modular, scalable robotic system combining computer vision, real-time ML, intelligent mobility, and emotional intelligence to interact with the physical world in an intuitive, human-centered way.
The MVP serves as a proof-of-concept platform for robotics + AI/ML + embedded systems + HCI research and innovation.
---
## Hardware Overview
| Component | Description |
|----------|-------------|
| **Main Controller** | NVIDIA Jetson Orin Nano Super (6-core Arm Cortex-A78AE CPU, up to 67 TOPS AI) |
| **Support Boards** | Raspberry Pi 5 (Sensor I/O & lightweight ML), Arduino Mega (actuators), ESP32 (wireless, BLE, sensor fusion) |
| **Camera** | Intel RealSense D435i (Depth + RGB) |
| **Lidar** | RPLidar A1/A2 (for mapping & navigation) |
| **Display** | 64x64 LED Matrix or OLED screen for Emotion UI |
| **Mobility** | 2WD or 4 Omni-wheel base (with geared motors + motor drivers) |
| **Sensors** | Ultrasonic, IMU (MPU6050), IR, ToF, GPS, Compass |
| **Connectivity** | GSM module (SIM800L), WiFi (ESP32 + Jetson), GPS (Neo6M) |
| **Power System** | 12V Li-ion battery (8,000–10,000 mAh), 5V and 3.3V regulators |
| **Chassis** | 3D printed + lightweight aluminum/ABS custom frame |
| **Optional** | 6-DOF Robotic Arm (MG996R / Dynamixel motors) |

---
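On the split-controller wiring above, the Jetson would typically ingest sensor frames from the Arduino Mega over USB serial. As a minimal sketch (the port name, baud rate, and newline-delimited JSON message format are illustrative assumptions, not this project's actual protocol):

```python
import json


def parse_sensor_frame(raw: bytes) -> dict:
    """Parse one newline-delimited JSON telemetry frame from the Arduino.

    The Arduino sketch is assumed (hypothetically) to print lines such as
    {"ultrasonic_cm": 42.0, "imu_yaw_deg": 1.5} at a fixed rate.
    """
    line = raw.decode("utf-8", errors="ignore").strip()
    return json.loads(line) if line else {}


def read_loop():
    """Continuously print readings. Requires pyserial and the robot's
    USB wiring, so it is defined but not invoked here."""
    import serial  # third-party: pip install pyserial

    with serial.Serial("/dev/ttyACM0", 115200, timeout=1.0) as link:
        while True:
            frame = parse_sensor_frame(link.readline())
            if frame:
                print(frame.get("ultrasonic_cm"), frame.get("imu_yaw_deg"))
```

Newline-delimited JSON keeps the Mega's firmware trivial and makes the link easy to log and replay during bring-up.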
## Estimated BOM (INR)

| Item | Approx. Cost |
|-----------------------------|--------------|
| Jetson Orin Nano Super | ₹25,000–28,000 |
| Raspberry Pi 5 | ₹7,000 |
| Arduino Mega + ESP32 | ₹1,500 |
| Intel RealSense D435i | ₹15,000–20,000 |
| RPLidar A1 | ₹10,000 |
| GSM + GPS Modules | ₹1,500 |
| Emotion Display (LED/OLED) | ₹2,000 |
| Motors + Wheels + Drivers | ₹4,000–6,000 |
| Battery + BMS + Power Mgmt | ₹3,500–4,500 |
| Chassis & Structure | ₹8,000–10,000 |
| Misc Sensors, Wiring, PCB | ₹3,000 |
| **Total** | **₹90,000–1,30,000** |

**Overall budget: about ₹1,50,000 INR.**
---
## Software Stack
| Layer | Tools / Languages / Frameworks |
|-------|-------------------------------|
| **OS & Middleware** | Ubuntu 20.04 LTS (Jetson), Raspberry Pi OS |
| **Languages** | Python, C++, Bash, Arduino (C), JavaScript (web UI) |
| **AI/ML** | PyTorch, TensorFlow Lite, OpenCV, YOLOv8, ONNX |
| **Robotics** | ROS2 Foxy, RPLidar SDK, Jetson GPIO |
| **CV & Perception** | RealSense SDK, DepthAI, MediaPipe |
| **Voice Control** | Vosk / Whisper / Google STT |
| **Emotion Engine** | Custom expression-to-LED rendering logic |
| **Cloud/Comms** | MQTT, WebSocket, GSM API, GPS tracker |
| **Web UI (Optional)** | Flask / Node.js dashboard for remote monitoring |

---
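The "Emotion Engine" row above refers to custom expression-to-LED rendering logic. A hardware-agnostic sketch of that idea follows; the emotion names, eye rectangles, and 64x64 geometry are illustrative assumptions, and a real driver (e.g., a matrix or OLED library) would push the resulting frame to the panel:

```python
WIDTH = HEIGHT = 64  # matches the 64x64 LED matrix option

# Hypothetical eye shapes per emotion, as (top, left, height, width) rectangles.
EXPRESSIONS = {
    "happy":     [(20, 10, 8, 16), (20, 38, 8, 16)],    # wide-open eyes
    "sleepy":    [(26, 10, 2, 16), (26, 38, 2, 16)],    # narrow slits
    "surprised": [(16, 12, 14, 12), (16, 40, 14, 12)],  # tall round eyes
}


def render(emotion: str):
    """Return a HEIGHT x WIDTH binary frame (1 = lit pixel) for an emotion.

    Unknown emotions render a blank frame rather than raising, so the
    display degrades gracefully if the perception side emits a new label.
    """
    frame = [[0] * WIDTH for _ in range(HEIGHT)]
    for top, left, h, w in EXPRESSIONS.get(emotion, []):
        for r in range(top, top + h):
            for c in range(left, left + w):
                frame[r][c] = 1
    return frame
```

Keeping expressions as pure data makes new emotions a one-line addition and lets the same frames drive either the LED matrix or the OLED fallback.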
## Core Features
- ✅ Real-time Object Detection & Tracking (YOLOv8)
- ✅ Emotion Display System (LED Matrix)
- ✅ Voice Command Execution (offline-capable)
- ✅ Mobility & Navigation with Lidar & Depth Cam
- ✅ Environment Mapping & Obstacle Avoidance
- ✅ Sensor Fusion with IMU + GPS + GSM
- ✅ ROS2 Integration for modular system design
- ✅ Data logging, remote diagnostics (via web dashboard)
- ✅ Modular design (sensor/arm additions possible)

---
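For the IMU side of the sensor-fusion feature, a common baseline is a complementary filter that blends fast-but-drifting gyro integration with noisy-but-drift-free accelerometer tilt. A one-axis sketch (the 0.98 gain and the MPU6050 axis convention are assumptions, not tuned values from this project):

```python
import math

ALPHA = 0.98  # weight on the integrated gyro estimate vs. accelerometer tilt


def fuse_pitch(prev_pitch: float, gyro_rate: float,
               accel_x: float, accel_z: float, dt: float) -> float:
    """One complementary-filter step for pitch, in radians.

    gyro_rate is the angular rate about the pitch axis (rad/s);
    accel_x / accel_z are accelerometer readings used to recover tilt.
    """
    gyro_pitch = prev_pitch + gyro_rate * dt    # responsive, but drifts
    accel_pitch = math.atan2(accel_x, accel_z)  # noisy, but drift-free
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
```

Called once per IMU sample, this keeps the estimate responsive to motion while the small accelerometer term continuously bleeds off gyro drift; a Kalman filter would be the natural upgrade once GPS and odometry are fused in.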
## Design Inspiration
- **WALL-E's** compact & expressive form
- **NVIDIA's humanoid research robot (GTC 2025)** for a futuristic, emotion-aware presence
- Designed for modularity, extensibility & research-grade experimentation

---
## Dimensions
| Part | Size (approx) |
|----------------|---------------|
| Height | 60 cm |
| Width | 35 cm |
| Depth | 30 cm |
| Wheel Base | 25–28 cm |
| Emotion Display | 10x10 cm |
| Arm Length (opt) | ~25–30 cm |

---
## Research Domains It Touches
- Robotics & Mechatronics
- AI/ML & Real-time Inference
- SLAM & Autonomous Navigation
- Computer Vision & Emotion Recognition
- Human-Robot Interaction (HRI)
- Edge Computing & IoT Fusion
- Multi-Agent Coordination (future expansion)

---
## Future Goals
- Integrate GPT-4 Vision or LLaVA-style perception models
- Add autonomous path planning & swarm logic
- Upgrade to custom-built PCB for cleaner internals
- Publish white paper on ASTRAβs system design
- Open-source ROS2 packages for education/research

---
## Target Audience
- Professors & Researchers in AI/Robotics
- Labs like MIT CSAIL, Stanford AI Lab, ETH Zurich ASL
- Hackathons, Expo Showcases (e.g., Aero India 2026)
- Open-source robotics & edge-AI community

---
## Built With ❤️ By

**M4YH3M-DEV**
[GitHub](https://github.com/M4YH3M-DEV)
BTech '28 | AI/Robotics/OS/LLMs | Building for the Future