https://github.com/developer239/samurai-notebook
Google Colab notebook for SAMURAI Zero-Shot Visual Tracking with Motion-Aware Memory
- Host: GitHub
- URL: https://github.com/developer239/samurai-notebook
- Owner: developer239
- Created: 2024-11-24T21:50:01.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-25T00:25:26.000Z (about 1 year ago)
- Last Synced: 2025-01-25T21:26:27.794Z (12 months ago)
- Language: Jupyter Notebook
- Size: 24.6 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1LE0BT0H6RTJ6OuALZf6lNopQm6vSzJkE?usp=sharing)
# SAMURAI Demo with Custom Video - Jupyter Notebook
This Jupyter Notebook is designed to help you run [**SAMURAI**](https://github.com/yangchris11/samurai), a zero-shot visual tracker with motion-aware memory, on your own videos in Google Colab.

## Features
- **Custom Video Upload:** Easily upload your own video files directly into the notebook environment.
- **First Frame Extraction:** Automatically extract the first frame of your video for annotation.
- **Bounding Box Input:** Input the bounding box coordinates from your annotated frame back into the notebook.
- **SAMURAI Execution:** Run the SAMURAI demo script with your video and bounding box to perform zero-shot visual tracking.
- **Result Download:** Download the output video for your own use.
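The bounding-box input and SAMURAI execution steps above can be sketched in a few lines of Python. This is a minimal illustration, not the notebook's exact code: the file names are hypothetical, and it assumes the upstream SAMURAI demo script at `scripts/demo.py` accepts `--video_path` and `--txt_path` flags and reads the first-frame box as a single `x,y,w,h` line (verify against the SAMURAI repository before relying on this).

```python
import shlex
from pathlib import Path

# Hypothetical inputs: an uploaded video and the box you read off the
# annotated first frame (top-left x, y plus width, height, in pixels).
video_path = "my_video.mp4"
x, y, w, h = 320, 180, 200, 150

# Write the bounding box in the "x,y,w,h" text format the demo reads.
bbox_path = "first_frame_bbox.txt"
Path(bbox_path).write_text(f"{x},{y},{w},{h}\n")

# Build the demo invocation; in Colab you would run it with `!{cmd}`.
cmd = (
    f"python scripts/demo.py "
    f"--video_path {shlex.quote(video_path)} "
    f"--txt_path {shlex.quote(bbox_path)}"
)
print(cmd)
```

Quoting the paths with `shlex.quote` keeps the command safe if an uploaded file name contains spaces.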
## Usage Instructions
### Prerequisites
- A Google account to use [Google Colab](https://colab.research.google.com/).
- A custom video file (e.g., `.mp4` or `.avi`), or use the sample video provided in `/assets/`.
## Annotating the First Frame
You can annotate the first frame of your video with the [VGG Image Annotator](https://www.robots.ox.ac.uk/~vgg/software/via/via_demo.html) to obtain the bounding box coordinates for the object you want to track.
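To move an annotation from VIA back into the notebook, you need the rectangle's coordinates. As a sketch (assuming a rectangle region exported from VIA, which stores `x`, `y`, `width`, and `height` in its region shape attributes; the JSON snippet below is a hand-made stand-in, so check it against your actual export):

```python
import json

# Stand-in for one region's shape attributes from a VIA JSON export.
region = json.loads(
    '{"name": "rect", "x": 320, "y": 180, "width": 200, "height": 150}'
)

# Convert to the single "x,y,w,h" line used for the first-frame box.
bbox = (region["x"], region["y"], region["width"], region["height"])
bbox_line = ",".join(str(v) for v in bbox)
print(bbox_line)  # → 320,180,200,150
```

Note that this is the top-left corner plus width and height, not corner-to-corner coordinates, so no conversion is needed from VIA's rectangle format.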