# Shot Prediction
[![dataset](https://huggingface.co/datasets/huggingface/badges/resolve/main/dataset-on-hf-sm.svg)](https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/blob/main/raycasting_normalised_angles.csv)

A neural-network based approach to predicting the binary outcome (hit, miss) of shooting an object into a hole or at a
target.

## Introduction

This project provides a neural-network based approach to predicting the binary outcome (hit, miss) of shooting an object
into a hole or at a target. The model is trained on a dataset of shots and their outcomes.

## Simulation

The notebook `data_gen.ipynb` contains a simulation of the shooting process. The simulation assumes a linear shot and
uses raycasting to decide whether each shot passes through the goal.
Here's an example of the simulation (red points are made shots; blue points mark the corners of the rectangle representing the goal):
![Simulated shots](other/sim.png)
You can grab the data
from [https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/blob/main/raycasting_normalised_angles.csv](https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/blob/main/raycasting_normalised_angles.csv).
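The notebook itself isn't reproduced here, but the core idea of a linear-shot raycast is straightforward. The sketch below is a minimal, illustrative version of casting a ray against a rectangular goal; the function name, goal geometry, and coordinate conventions are assumptions for illustration, not the exact setup in `data_gen.ipynb`.

```python
import numpy as np

def shot_hits_goal(origin, direction, corner, u_edge, v_edge):
    """Return True if the ray origin + t*direction (t > 0) passes
    through the rectangle spanned by u_edge and v_edge at corner."""
    normal = np.cross(u_edge, v_edge)           # plane normal of the goal
    denom = np.dot(normal, direction)
    if abs(denom) < 1e-9:                       # ray parallel to the goal plane
        return False
    t = np.dot(normal, corner - origin) / denom
    if t <= 0:                                  # goal is behind the shooter
        return False
    hit = origin + t * direction                # intersection with the plane
    rel = hit - corner
    u = np.dot(rel, u_edge) / np.dot(u_edge, u_edge)
    v = np.dot(rel, v_edge) / np.dot(v_edge, v_edge)
    return 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0  # inside the rectangle?

# Example: a shot from the origin, 45 degrees above the horizon
angle = np.radians(45)
direction = np.array([np.cos(angle), 0.0, np.sin(angle)])
corner = np.array([2.0, -0.5, 2.0])             # one corner of the goal
u_edge = np.array([0.0, 1.0, 0.0])              # edge along the goal's width
v_edge = np.array([0.0, 0.0, 0.5])              # edge along the goal's height
print(shot_hits_goal(np.zeros(3), direction, corner, u_edge, v_edge))  # True
```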

## Usage: FRC Team 2713's 2024 robot

*(Image: FRC 2713's 2024 robot)*

FRC Team 2713, [Red Hawk Robotics](https://www.thebluealliance.com/team/2713), uses this project to predict the outcomes
of shooting orange rings, or "notes", on their 2024 robot. Their robot includes a pivot shooter and chassis which
functions as a turret.

*(Image: a note)*

Data from the robot's sensors, annotated after the fact with whether each shot went in, will be used as input to the
model, and the model's output will be used to control the robot's position and velocity to make the "optimal shot". For
now, the data is [simulated](#simulation); the same dataset is available
as [raycasting_normalised_angles.csv on Hugging Face](https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/blob/main/raycasting_normalised_angles.csv).
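For a quick look at the data, the CSV can be read directly with pandas. One caveat (an assumption worth checking): Hugging Face's `blob` URLs serve an HTML page, so the sketch below swaps in `resolve` to fetch the raw file; the printed header will show the actual column layout.

```python
import pandas as pd

# "resolve" (not "blob") serves the raw CSV from Hugging Face.
url = ("https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/"
       "resolve/main/raycasting_normalised_angles.csv")
df = pd.read_csv(url)
print(df.shape)
print(df.head())
```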
Here's the model's output over a grid of positions, where the x and y axes are the shot's x and y position, the shot is
angled 45 degrees above the horizon, the chassis points slightly to the right, and the velocity is 0; red marks the
rough location of the goal:

![Model output](other/predictions.png)

Training is fairly simple and can be found in `train.ipynb`.
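`train.ipynb` has the project's actual pipeline; the sketch below is only a minimal stand-in to show the shape of the problem. The feature and label column names (`x`, `y`, `pitch`, `yaw`, `velocity`, `hit`), the normalised values chosen for the fixed inputs, and the Keras architecture are all assumptions for illustration.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from tensorflow import keras

url = ("https://huggingface.co/datasets/sid220/2713-2024-shot-prediction/"
       "resolve/main/raycasting_normalised_angles.csv")
df = pd.read_csv(url)

feature_cols = ["x", "y", "pitch", "yaw", "velocity"]  # assumed column names
features = df[feature_cols].to_numpy(dtype="float32")
labels = df["hit"].to_numpy(dtype="float32")           # assumed binary label

# Small MLP mapping shot parameters to a hit probability.
model = keras.Sequential([
    keras.Input(shape=(len(feature_cols),)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(features, labels, epochs=20, batch_size=64, validation_split=0.2)

# Sweep x and y with the other inputs fixed to produce a plot in the
# spirit of predictions.png (the fixed pitch/yaw/velocity values are guesses).
xs, ys = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([
    xs.ravel(), ys.ravel(),
    np.full(xs.size, 0.5),   # 45-degree pitch, normalised (assumed)
    np.full(xs.size, 0.55),  # slight right yaw, normalised (assumed)
    np.zeros(xs.size),       # zero velocity
]).astype("float32")
probs = model.predict(grid).reshape(xs.shape)
plt.contourf(xs, ys, probs)
plt.colorbar(label="predicted hit probability")
plt.xlabel("x")
plt.ylabel("y")
plt.show()
```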