Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/guibrandt/osulearn
An attempt at using machine learning to create a neural network that learns how to play osu! like a human from replay data
- Host: GitHub
- URL: https://github.com/guibrandt/osulearn
- Owner: GuiBrandt
- License: mit
- Created: 2018-12-18T01:25:11.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2021-03-21T15:39:25.000Z (almost 4 years ago)
- Last Synced: 2023-03-08T21:42:09.447Z (almost 2 years ago)
- Topics: jupyter-notebook, keras, machine-learning, neural-network, osugame
- Language: Jupyter Notebook
- Homepage:
- Size: 33 MB
- Stars: 42
- Watchers: 2
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
OsuLearn
========
### An attempt at creating a Neural Network that learns how to play osu!std like a human from replays
###### (Please don't judge me too much, I'm new to machine learning and my English isn't great)

Introduction
------------

> osu! is a free and open-source rhythm game developed and published by Australian-based company PPY Developments PTY, created by Dean Herbert (also known as peppy). Originally released for Microsoft Windows on September 16, 2007, the game has also been ported to macOS (this version might be unstable) and Windows Phone. Its gameplay is based on titles including Osu! Tatakae! Ouendan, Elite Beat Agents, Taiko no Tatsujin, Beatmania IIDX, O2Jam, and DJMax.
>
> -- [Wikipedia](https://en.wikipedia.org/wiki/Osu!)

The goal here is to model and train a Neural Network to generate replays for any osu! beatmap it is given, based on a dataset of recorded human replays (`.osr` files) and their respective beatmap (`.osu`) files.
To accomplish that, I've trained a Recurrent Neural Network with my replays and beatmaps.
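For a rough sense of what that setup could look like, here is a minimal Keras sketch; the window length, feature count, and layer sizes are assumptions for illustration, not values taken from the notebooks:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical shapes: each training sample is a window of per-frame beatmap
# features (time to the next object, object x/y, slider flag, etc.) paired with
# the cursor position recorded in the corresponding .osr replay.
WINDOW = 64      # frames per training window (assumed)
N_FEATURES = 6   # beatmap-derived features per frame (assumed)

def build_model():
    # Recurrent network: beatmap feature sequence in, cursor (x, y) sequence out.
    model = keras.Sequential([
        keras.Input(shape=(None, N_FEATURES)),    # variable-length frame sequence
        layers.LSTM(128, return_sequences=True),
        layers.LSTM(64, return_sequences=True),
        layers.TimeDistributed(layers.Dense(2)),  # predicted (x, y) for every frame
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# X: (n_windows, WINDOW, N_FEATURES) built by parsing the .osu files
# y: (n_windows, WINDOW, 2) cursor positions extracted from the .osr replays
model = build_model()
# model.fit(X, y, epochs=50, batch_size=32, validation_split=0.1)
```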
Results
-------

This is a preview of a replay generated for a map the AI had never seen before:
![AI-generated replay](https://media.giphy.com/media/cYDD6KQP0dqK1XcXpu/giphy.gif)
Pretty good, actually!
It has figured out how to aim without looking like a robot and can even hit some jumps. Of course it is not perfect, but neither is the data set it has been trained on, so I am considering this a success.
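For context, generating a replay for an unseen map would look roughly like the following; `beatmap_to_features` and `frames_to_replay` are hypothetical helper names standing in for the .osu parsing and .osr writing steps, and `model` is the trained network from the sketch above:

```python
import numpy as np

# Sketch of inference on a map the model has never seen (assumed helpers).
features = beatmap_to_features("unseen_map.osu")          # (n_frames, N_FEATURES)
cursor_xy = model.predict(features[np.newaxis, ...])[0]   # (n_frames, 2) cursor path
replay = frames_to_replay(cursor_xy)                      # pack frames into an .osr
```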
Future
------

The next step is to transform this into a GAN, so it can generate multiple different replays for a given map, mimicking a human play style.
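Very loosely, and purely as a sketch of the idea rather than anything implemented here, the discriminator half of such a GAN might score whether a cursor trajectory looks human:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_discriminator(n_features=6):
    # Scores a (beatmap features + cursor x/y) sequence: human (1) vs. generated (0).
    return keras.Sequential([
        keras.Input(shape=(None, n_features + 2)),
        layers.LSTM(64),
        layers.Dense(1, activation="sigmoid"),
    ])
```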
This might take some time though, so that's it for now x).