[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1LE0BT0H6RTJ6OuALZf6lNopQm6vSzJkE?usp=sharing)

# SAMURAI Demo with Custom Video - Jupyter Notebook

This Jupyter Notebook is designed to help you run [**SAMURAI**](https://github.com/yangchris11/samurai), a zero-shot visual tracker with motion-aware memory, on your own video in Google Colab.

![image](/result.gif)

## Features

- **Custom Video Upload:** Easily upload your own video files directly into the notebook environment.
- **First Frame Extraction:** Automatically extract the first frame of your video for annotation (a minimal sketch of this step follows the list).
- **Bounding Box Input:** Input the bounding box coordinates from your annotated frame back into the notebook.
- **SAMURAI Execution:** Run the SAMURAI demo script with your video and bounding box to perform zero-shot visual tracking.
- **Result Download:** Download the output video for your own use.
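
For reference, the first-frame extraction step can be done with a few lines of OpenCV. The snippet below is a minimal sketch rather than the notebook's exact code, and it assumes your video has already been uploaded under a placeholder name (`my_video.mp4`).

```python
# Minimal sketch of first-frame extraction (file names are placeholders).
import cv2

video_path = "my_video.mp4"      # your uploaded video
frame_path = "first_frame.jpg"   # image you will annotate

cap = cv2.VideoCapture(video_path)
ok, frame = cap.read()           # grab only the first frame
cap.release()

if not ok:
    raise RuntimeError(f"Could not read a frame from {video_path}")

cv2.imwrite(frame_path, frame)
print(f"Saved first frame to {frame_path} ({frame.shape[1]}x{frame.shape[0]} px)")
```

You can then download `first_frame.jpg` and annotate it to get the bounding box of the object you want to track.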

## Usage Instructions

### Prerequisites

- A Google account to use [Google Colab](https://colab.research.google.com/).
- A custom video file (e.g., in `.mp4` or `.avi` format), or you can use the sample video in `/assets/`.

## Other

You can annotate the first frame of your video with the [VGG Image Annotator](https://www.robots.ox.ac.uk/~vgg/software/via/via_demo.html) to obtain the bounding box coordinates of the object you want to track.
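
Once you have the box, the coordinates are typically passed to the SAMURAI demo as `x,y,w,h` in a text file. The sketch below is an illustrative example, not the notebook's exact cell: the coordinate values, file names, and the demo script's path and flags (`scripts/demo.py`, `--video_path`, `--txt_path`) are assumptions and should be checked against the notebook and the SAMURAI repository.

```python
# Hedged sketch: coordinates, file names, and CLI flags are assumptions;
# verify them against the notebook and the SAMURAI repo before running.
import subprocess

x, y, w, h = 120, 80, 200, 150   # example box from your annotation (pixels)

# Write the first-frame bounding box as "x,y,w,h".
with open("first_frame_bbox.txt", "w") as f:
    f.write(f"{x},{y},{w},{h}\n")

# Invoke the SAMURAI demo script on the uploaded video.
subprocess.run(
    [
        "python", "scripts/demo.py",
        "--video_path", "my_video.mp4",
        "--txt_path", "first_frame_bbox.txt",
    ],
    check=True,
)
```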