Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
UCL Computer Science MEng Master's Project: Using machine vision to predict motion relationships between entities in videos.
- Host: GitHub
- URL: https://github.com/mastergalen/motion-relationships
- Owner: Mastergalen
- Created: 2017-12-22T18:01:07.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2018-04-30T17:07:20.000Z (almost 7 years ago)
- Last Synced: 2024-11-21T11:41:25.633Z (3 months ago)
- Topics: computer-vision, keras, machine-vision, python, ucl
- Language: Python
- Homepage:
- Size: 7.45 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Motion Relationships
This repository contains all the code for my Master's thesis at UCL: **Predicting motion relationships in video**.
The report can be viewed [here](https://cdn.galenhan.com/documents/university/thesis/Thesis-Predicting_Motion_Relationships_in_Video.pdf).
![Motion class example](figures/motion-class-examples.jpg)
## Prerequisites
* Copy `.env.example` to `.env`
* Fill in API keys
* Install Python packages: `pip install -r requirements.txt`
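As a rough illustration of how the API keys from `.env` might be consumed, here is a minimal sketch using `python-dotenv`; the package choice and the `AMT_API_KEY` variable name are assumptions for illustration, not taken from this repository:

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is available

# Read key/value pairs from .env into the process environment.
load_dotenv()

# Hypothetical key name; use whatever keys .env.example actually lists.
amt_api_key = os.environ["AMT_API_KEY"]
```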
## Annotation Interface
The annotation interface used to collect annotations on Amazon Mechanical Turk (AMT) can be found inside `annotation-ui`.
## Scripts
Executable scripts for various pre-processing tasks can be found within the `scripts/` directory.
Each script is individually documented.

## Models
To train or run the motion relationship models, see `lib/model/architectures`.
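For orientation, the sketch below shows the general shape of a pairwise motion-classification model in Keras. The input shapes, layer sizes, and class count are placeholders and do not reflect the actual architectures in `lib/model/architectures`.

```python
from tensorflow.keras import layers, Model

# Placeholder dimensions: 30 timesteps, 4 values per box (x, y, w, h), 5 motion classes.
TIMESTEPS, BOX_DIMS, NUM_CLASSES = 30, 4, 5

# One trajectory of bounding boxes per entity in the pair.
entity_a = layers.Input(shape=(TIMESTEPS, BOX_DIMS), name="entity_a")
entity_b = layers.Input(shape=(TIMESTEPS, BOX_DIMS), name="entity_b")

# Fuse both trajectories and summarise them with a recurrent encoder.
pair = layers.Concatenate(axis=-1)([entity_a, entity_b])
encoded = layers.LSTM(64)(pair)

# Classify the motion relationship between the two entities.
output = layers.Dense(NUM_CLASSES, activation="softmax", name="motion_class")(encoded)

model = Model(inputs=[entity_a, entity_b], outputs=output)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```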
## Dataset
The annotated dataset can be downloaded [here](https://cdn.galenhan.com/documents/university/thesis/motion-relationship-dataset.zip).
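To fetch and unpack the dataset programmatically, a minimal Python snippet along the following lines should work (the local file and directory names are arbitrary):

```python
import urllib.request
import zipfile

URL = "https://cdn.galenhan.com/documents/university/thesis/motion-relationship-dataset.zip"

# Download the archive and extract it into a local "dataset/" directory.
archive_path, _ = urllib.request.urlretrieve(URL, "motion-relationship-dataset.zip")
with zipfile.ZipFile(archive_path) as archive:
    archive.extractall("dataset")
```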