https://github.com/abhigyan126/minecraft-skin-generator
A WGAN-GP model that generates a unique Minecraft skin on every run.
- Host: GitHub
- URL: https://github.com/abhigyan126/minecraft-skin-generator
- Owner: Abhigyan126
- License: cc0-1.0
- Created: 2025-01-30T16:50:24.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-02-02T13:50:23.000Z (4 months ago)
- Last Synced: 2025-03-30T10:29:51.767Z (2 months ago)
- Topics: generative-adversarial-network, minecraft, minecraft-skin, minecraft-skin-generator, torch, wgan-gp
- Language: Jupyter Notebook
- Homepage: https://blockface.vercel.app/
- Size: 56.7 MB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Minecraft Skin Generator using WGAN-GP
A deep learning model that generates unique Minecraft character skins using a Wasserstein GAN with Gradient Penalty (WGAN-GP).

## Implementation
Check out the Next.js implementation: https://blockface.vercel.app/
## Sample

(Sample images of generated skins.)
## Overview
This project implements a generative adversarial network to create unique Minecraft character skins. It uses advanced techniques including self-attention mechanisms, adaptive residual blocks, and spectral normalization to produce high-quality, diverse character skins.
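
As an illustration of how those pieces can fit together, below is a minimal PyTorch sketch of a spectral-normalized residual block with a squeeze-and-excitation gate; the layer sizes, normalization choice, and class names are illustrative assumptions, not the repository's exact modules.

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class SEBlock(nn.Module):
    """Squeeze-and-excitation: rescale each channel by a learned gate in (0, 1)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                         # squeeze: global average pool
            nn.Conv2d(channels, channels // reduction, 1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                    # excitation: per-channel weights
        )

    def forward(self, x):
        return x * self.gate(x)


class ResidualBlock(nn.Module):
    """conv -> norm -> LeakyReLU -> conv -> norm -> SE, added to a 1x1 shortcut."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            spectral_norm(nn.Conv2d(in_ch, out_ch, 3, padding=1)),
            nn.InstanceNorm2d(out_ch),
            nn.LeakyReLU(0.2, inplace=True),
            spectral_norm(nn.Conv2d(out_ch, out_ch, 3, padding=1)),
            nn.InstanceNorm2d(out_ch),
            SEBlock(out_ch),
        )
        self.shortcut = spectral_norm(nn.Conv2d(in_ch, out_ch, 1))

    def forward(self, x):
        return self.body(x) + self.shortcut(x)


# ResidualBlock(64, 128)(torch.randn(1, 64, 16, 16)).shape  # -> torch.Size([1, 128, 16, 16])
```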

## Architecture
```mermaid
graph TD
subgraph Data_Pipeline
A[Raw Minecraft Skins] -->|Preprocessing| B[MinecraftImageDataset]
B -->|Validation| C{Image Check}
C -->|Valid| D[DataLoader]
C -->|Invalid| E[Error Log]
D -->|Batch Processing| F[Normalized Tensors]
end

subgraph Generator_Architecture
G[Latent Space] -->|Dense Layer| H[Initial Features]
H -->|Reshape| I[4x4 Feature Maps]
I -->|Block 1| J[8x8 Features + Self-Attention]
J -->|Block 2| K[16x16 Features + Self-Attention]
K -->|Block 3| L[32x32 Features]
L -->|Block 4| M[64x64 Features]
M -->|Final Conv| N[Generated Skin]
subgraph Residual_Block
R1[Input] -->|Conv1| R2[Norm + LeakyReLU]
R2 -->|Conv2| R3[Norm]
R1 -->|Shortcut| R4[1x1 Conv]
R3 -->|SE Block| R5[Channel Recalibration]
R4 --> R6[Add]
R5 --> R6
end
end

subgraph Critic_Architecture
O[Input Image] -->|Initial Block| P[Feature Maps]
P -->|Block 1| Q[Deep Features + Self-Attention]
Q -->|Block 2| S[Deeper Features]
S -->|Block 3| T[Final Features]
T -->|Flatten| U[Vector]
U -->|Linear| V[Criticism Score]
end

subgraph Training_Loop
W[Real & Fake Pairs] -->|Forward Pass| X[Loss Computation]
X -->|Gradient Penalty| Y[WGAN-GP Loss]
Y -->|Backprop| Z[Parameter Updates]
Z -->|New Batch| W
end

subgraph Evaluation_Metrics
AA[Generated Samples] -->|SSIM| AB[Structural Similarity]
AA -->|Visual Check| AC[Sample Gallery]
AB -->|Tracking| AD[Loss Curves]
AC -->|Save| AE[Sample Archive]
end

subgraph Post_Processing
AF[Raw Generated Skin] -->|Format Convert| AG[PNG Output]
AG -->|SkinPy| AH[3D Render]
AH -->|Multiple Views| AI[Final Visualization]
end

F -->|Real Images| W
N -->|Fake Images| W
N -->|Output| AA
V -->|Feedback| X
```
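
To make the generator path above concrete (latent vector → dense layer → 4×4 feature maps → repeated upsampling to a 64×64 skin), here is a rough PyTorch skeleton; the channel counts, the plain upsample-plus-conv blocks, and the RGBA output are assumptions, and the real model additionally uses the self-attention and residual blocks described earlier.

```python
import torch
import torch.nn as nn


class UpBlock(nn.Module):
    """One resolution step: nearest-neighbour upsample followed by a convolution."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.LeakyReLU(0.2, inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class Generator(nn.Module):
    """Latent z -> dense -> 4x4 -> 8x8 -> 16x16 -> 32x32 -> 64x64 RGBA skin."""
    def __init__(self, latent_dim=128, base=256):
        super().__init__()
        self.base = base
        self.fc = nn.Linear(latent_dim, base * 4 * 4)
        self.blocks = nn.Sequential(
            UpBlock(base, base // 2),       # 4x4   -> 8x8   (self-attention sits here in the diagram)
            UpBlock(base // 2, base // 4),  # 8x8   -> 16x16 (and here)
            UpBlock(base // 4, base // 8),  # 16x16 -> 32x32
            UpBlock(base // 8, base // 8),  # 32x32 -> 64x64
        )
        self.to_rgba = nn.Sequential(nn.Conv2d(base // 8, 4, 3, padding=1), nn.Tanh())

    def forward(self, z):
        x = self.fc(z).view(z.size(0), self.base, 4, 4)
        return self.to_rgba(self.blocks(x))


# z = torch.randn(8, 128); Generator()(z).shape  # -> torch.Size([8, 4, 64, 64])
```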

## Features

- WGAN-GP architecture with gradient penalty for stable training (see the sketch after this list)
- Self-attention mechanisms for better global coherence
- Adaptive residual blocks with squeeze-and-excitation
- Spectral normalization for improved training stability
- SSIM (Structural Similarity Index) monitoring during training
- Multi-GPU support for faster training
- Custom dataset handling with image validation
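
The gradient penalty mentioned in the first bullet can be written as follows; this is the standard WGAN-GP term (the critic's gradient norm pushed towards 1 on real/fake interpolations) with an assumed penalty weight of 10, not necessarily this project's exact training code.

```python
import torch


def gradient_penalty(critic, real, fake, device="cpu"):
    """WGAN-GP penalty: push the critic's gradient norm towards 1 on interpolated samples."""
    batch = real.size(0)
    eps = torch.rand(batch, 1, 1, 1, device=device)            # per-sample mixing weight
    mixed = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(mixed)
    grads = torch.autograd.grad(
        outputs=scores, inputs=mixed,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    grad_norm = grads.view(batch, -1).norm(2, dim=1)
    return ((grad_norm - 1) ** 2).mean()


# Typical critic loss:
# loss_c = fake_scores.mean() - real_scores.mean() + 10.0 * gradient_penalty(critic, real, fake)
```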

### Environment Setup

It's recommended to use Python 3.11.x with Conda:
```bash
conda create -n minecraft-gan python=3.11
conda activate minecraft-gan
pip install -r requirements.txt
```

## Project Structure
```
minecraft-skin-generator/
├── test/
│   ├── save/
│   │   └── *.png
│   └── *.py
├── model/
│   ├── docs/
│   │   └── *.txt
│   ├── samples/
│   │   ├── model v5/
│   │   │   └── *.png
│   │   └── model v3/
│   │       └── *.png
│   └── *.pth
├── train/
│   └── *.py
├── ipynb_files/
│   └── *.ipynb
├── generated_skins/
│   ├── *.txt
│   └── *.png
└── README.md
```

## Usage
```bash
# for training
python train/train.py

# for testing
python test/test.py
```

## Files Description

### Training Files (`train/`)
- Main training scripts for the GAN
- Data loading and preprocessing utilities
- Model configuration and hyperparameter settings

### Model Files (`model/`)
- Saved model checkpoints (`.pth`)
- Sample outputs in version-specific directories
- Documentation and model architecture details

### Testing Files (`test/`)
- Scripts for generating and testing skins (see the sketch after this list)
- Save directory for test outputs
- Validation utilities
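
As a rough idea of what such a test script can look like, the sketch below loads a saved generator checkpoint and writes one skin as a 64×64 RGBA PNG; the checkpoint filename, latent size, and `Generator` class are assumptions carried over from the earlier sketches, not the repository's actual files or API.

```python
import torch
from PIL import Image

# `Generator` refers to a class like the sketch in the Architecture section;
# the project's real generator lives in its own training code (exact module unknown).
LATENT_DIM = 128                                              # assumed latent size

generator = Generator(latent_dim=LATENT_DIM)
generator.load_state_dict(torch.load("model/generator.pth",  # hypothetical checkpoint name
                                     map_location="cpu"))
generator.eval()

with torch.no_grad():
    z = torch.randn(1, LATENT_DIM)
    skin = generator(z)[0]                                    # (4, 64, 64), values in [-1, 1]

# Map from [-1, 1] back to [0, 255] and save as a 64x64 RGBA skin.
array = ((skin.clamp(-1, 1) + 1) * 127.5).byte().permute(1, 2, 0).numpy()
Image.fromarray(array, mode="RGBA").save("generated_skins/skin.png")
```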

### Jupyter Notebooks (`ipynb_files/`)

- Development and experimentation notebooks
- Training visualization and analysis

### Generated Skins (`generated_skins/`)
- Output directory for generated sample Minecraft skins
- Associated metadata

## Monitoring and Evaluation
The training process includes:
- Generator and Critic loss tracking
- SSIM score monitoring (see the sketch after this list)
- Regular sample generation for visual inspection
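
One way to compute the SSIM score referenced above is with scikit-image, as in the small sketch below; whether the project uses this implementation or another one is not stated in the repository.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim


def batch_ssim(fake_batch, real_batch):
    """Mean SSIM between generated and real skins, given uint8 arrays of shape (N, H, W, C)."""
    scores = [
        ssim(f, r, channel_axis=-1, data_range=255)
        for f, r in zip(fake_batch, real_batch)
    ]
    return float(np.mean(scores))
```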