https://github.com/lemniscate-world/neural
Neural is a domain-specific language (DSL) designed for defining, training, debugging, and deploying neural networks. With declarative syntax, cross-framework support, and built-in execution tracing (NeuralDbg), it simplifies deep learning development.
automation data-science data-visualization diagrams dsl hyperparameter-optimization lark llms machine-learning neural-architecture-search neural-networks neural-networks-and-deep-learning neural-networks-from-scratch nocode onnx pytorch tensorflow visual-programming-language visualization
- Host: GitHub
- URL: https://github.com/lemniscate-world/neural
- Owner: Lemniscate-world
- License: mit
- Created: 2025-02-18T05:52:39.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-04-06T17:49:27.000Z (3 months ago)
- Last Synced: 2025-04-06T18:20:30.650Z (3 months ago)
- Topics: automation, data-science, data-visualization, diagrams, dsl, hyperparameter-optimization, lark, llms, machine-learning, neural-architecture-search, neural-networks, neural-networks-and-deep-learning, neural-networks-from-scratch, nocode, onnx, pytorch, tensorflow, visual-programming-language, visualization
- Language: HTML
- Homepage: https://lemniscate-world.github.io/Neural/
- Size: 497 MB
- Stars: 9
- Watchers: 1
- Forks: 0
- Open Issues: 186
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Funding: .github/FUNDING.yml
- License: LICENSE.md
- Security: docs/security.md

⚠️ **WARNING:** Neural-dsl is a WIP DSL and debugger; bugs exist and feedback is welcome! This project is under active development and not yet production-ready!

# Neural: A Neural Network Programming Language
[License](LICENSE) | [Python](https://www.python.org/) | [Discord](https://discord.gg/KFku4KvS) | [Pylint](https://github.com/Lemniscate-SHA-256/Neural/actions/workflows/pylint.yml) | [Python Package](https://github.com/Lemniscate-SHA-256/Neural/actions/workflows/python-package.yml) | [CodeQL](https://github.com/Lemniscate-SHA-256/Neural/actions/workflows/codeql.yml) | [Pytest to Issues](https://github.com/Lemniscate-SHA-256/Neural/actions/workflows/pytest-to-issues.yml) | [Codecov](https://codecov.io/gh/Lemniscate-SHA-256/Neural)



## 🎯 Pain Points Solved
Neural addresses deep learning challenges across **Criticality** (how essential) and **Impact Scope** (how transformative):
| Criticality / Impact | Low Impact | Medium Impact | High Impact |
|----------------------|------------|---------------|-------------|
| **High**   |  |  | **Shape Mismatches**: Pre-runtime validation stops runtime errors.<br>**Debugging Complexity**: Real-time tracing & anomaly detection. |
| **Medium** |  | **Steep Learning Curve**: No-code GUI eases onboarding. | **Framework Switching**: One-flag backend swaps.<br>**HPO Inconsistency**: Unified tuning across frameworks. |
| **Low**    | **Boilerplate**: Clean DSL syntax saves time. | **Model Insight**: FLOPs & diagrams.<br>**Config Fragmentation**: Centralized setup. |  |

### Why It Matters
- **Core Value**: Fix critical blockers like shape errors and debugging woes with game-changing tools.
- **Strategic Edge**: Streamline framework switches and HPO for big wins.
- **User-Friendly**: Lower barriers and enhance workflows with practical features.

Neural is a domain-specific language (DSL) designed for defining, training, debugging, and deploying neural networks, whether via code, CLI, or a no-code interface. With **declarative syntax**, **cross-framework support**, and **built-in execution tracing (NeuralDbg)**, it simplifies deep learning development.
## Feedback
Help us improve Neural DSL! Share your feedback: [Typeform link](https://form.typeform.com/to/xcibBdKD#name=xxxxx&email=xxxxx&phone_number=xxxxx&user_id=xxxxx&product_id=xxxxx&auth_code=xxxxx).
## 🚀 Features
- **YAML-like Syntax**: Define models intuitively without framework boilerplate.
- **Shape Propagation**: Catch dimension mismatches *before* runtime.
  - ✅ Interactive shape flow diagrams included.
- **Multi-Framework HPO**: Optimize hyperparameters for both PyTorch and TensorFlow with a single DSL config (#434).
- **Multi-Backend Export**: Generate code for **TensorFlow**, **PyTorch**, or **ONNX**.
- **Training Orchestration**: Configure optimizers, schedulers, and metrics in one place.
- **Visual Debugging**: Render interactive 3D architecture diagrams.
- **Extensible**: Add custom layers/losses via Python plugins.

### **🔍 NeuralDbg: Built-in Neural Network Debugger**
NeuralDbg provides **real-time execution tracing, profiling, and debugging**, allowing you to visualize and analyze deep learning models in action.

✅ **Real-Time Execution Monitoring** – Track activations, gradients, memory usage, and FLOPs.






✅ **Shape Propagation Debugging** – Visualize tensor transformations at each layer.
✅ **Gradient Flow Analysis** – Detect **vanishing & exploding gradients**.
✅ **Dead Neuron Detection** – Identify inactive neurons in deep networks.
✅ **Anomaly Detection** – Spot **NaNs, extreme activations, and weight explosions**.
✅ **Step Debugging Mode** – Pause execution and inspect tensors manually.

## 📦 Installation
```bash
# Clone the repository
git clone https://github.com/yourusername/neural.git
cd neural

# Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # Linux/macOS
venv\Scripts\activate     # Windows

# Install dependencies
pip install -r requirements.txt
```

Or install the released package from PyPI:

```bash
pip install neural-dsl
```

See v0.2.4 for bug fixes.
**Prerequisites**: Python 3.8+, pip
## 🛠️ Quick Start
### 1. Define a Model
Create `mnist.neural`:
```yaml
network MNISTClassifier {
  input: (28, 28, 1)  # Channels-last format
  layers:
    Conv2D(filters=32, kernel_size=(3,3), activation="relu")
    MaxPooling2D(pool_size=(2,2))
    Flatten()
    Dense(units=128, activation="relu")
    Dropout(rate=0.5)
    Output(units=10, activation="softmax")
  loss: "sparse_categorical_crossentropy"
  optimizer: Adam(learning_rate=0.001)
  metrics: ["accuracy"]
  train {
    epochs: 15
    batch_size: 64
    validation_split: 0.2
  }
}
```

### 2. Run or Compile the Model
```bash
neural run mnist.neural --backend tensorflow --output mnist_tf.py
# Or for PyTorch:
neural run mnist.neural --backend pytorch --output mnist_torch.py
```

### 3. Visualize Architecture
```bash
neural visualize mnist.neural --format png
```

This will create `architecture.png`, `shape_propagation.html`, and `tensor_flow.html` for inspecting the network structure and shape propagation.
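For intuition, the shape propagation Neural performs can be mimicked with a small standalone calculator. This is a simplified sketch, not Neural's implementation; it assumes "valid" padding, stride 1, and the channels-last layout used in the model above:

```python
# Toy shape propagation for the MNIST model above (illustrative only).

def conv2d(shape, kernel, filters):
    # "valid" padding, stride 1: each spatial dim shrinks by kernel-1.
    h, w, _ = shape
    kh, kw = kernel
    return (h - kh + 1, w - kw + 1, filters)

def maxpool2d(shape, pool):
    h, w, c = shape
    return (h // pool[0], w // pool[1], c)

def flatten(shape):
    n = 1
    for d in shape:
        n *= d
    return (n,)

shape = (28, 28, 1)                # input
shape = conv2d(shape, (3, 3), 32)  # -> (26, 26, 32)
shape = maxpool2d(shape, (2, 2))   # -> (13, 13, 32)
shape = flatten(shape)             # -> (5408,)
shape = (128,)                     # Dense(units=128)
shape = (10,)                      # Output(units=10)
print(shape)
```

Checking dimensions like this before training is exactly the kind of mismatch Neural catches ahead of runtime.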
![MNIST Architecture]()
### 4. Debug with NeuralDbg
```bash
neural debug mnist.neural
```

Open your browser to http://localhost:8050 to monitor execution traces, gradients, and anomalies interactively.
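NeuralDbg's anomaly checks can be approximated in spirit with a simple scan over tensor values for NaNs and explosions. This is a toy stand-in, not NeuralDbg's API:

```python
import math

# Toy anomaly scan over per-layer activation values (illustrative only;
# NeuralDbg's real checks run on live execution traces).
def find_anomalies(activations, limit=1e3):
    """activations: dict mapping layer name -> list of floats."""
    report = {}
    for layer, values in activations.items():
        issues = []
        if any(math.isnan(v) for v in values):
            issues.append("NaN")
        if any(abs(v) > limit for v in values):
            issues.append("extreme activation")
        if issues:
            report[layer] = issues
    return report

traces = {
    "conv1": [0.3, -0.7, 1.2],
    "dense1": [float("nan"), 0.1],
    "output": [5e4, 0.2],
}
print(find_anomalies(traces))
# {'dense1': ['NaN'], 'output': ['extreme activation']}
```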
### 5. Use the No-Code Interface
```bash
neural --no_code
```

Open your browser to http://localhost:8051 to build and compile models via a graphical interface.
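Conceptually, the `--backend` flag boils down to mapping each DSL layer onto the corresponding framework call. A toy illustration of that idea (a hypothetical helper, not Neural's actual code generator, which also handles shapes, parameters, and wiring):

```python
# Toy DSL-layer -> framework-call mapping (illustrative only).
LAYER_MAP = {
    "tensorflow": {
        "Conv2D": "tf.keras.layers.Conv2D({filters})",
        "Dense": "tf.keras.layers.Dense({units})",
    },
    "pytorch": {
        "Conv2D": "nn.Conv2d(in_channels, {filters})",
        "Dense": "nn.Linear(in_features, {units})",
    },
}

def emit(layer, backend, **params):
    """Render the framework call for one DSL layer."""
    return LAYER_MAP[backend][layer].format(**params)

print(emit("Conv2D", "pytorch", filters=32))   # nn.Conv2d(in_channels, 32)
print(emit("Dense", "tensorflow", units=128))  # tf.keras.layers.Dense(128)
```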
---
## **🐞 Debugging with NeuralDbg**

### **🔹 1️⃣ Start Real-Time Execution Tracing**
```bash
neural debug mnist.neural
```
**Features:**
✅ Layer-wise execution trace
✅ Memory & FLOP profiling
✅ Live performance monitoring

### **🔹 2️⃣ Analyze Gradient Flow**
```bash
neural debug --gradients mnist.neural
```
📉 **Detect vanishing/exploding gradients** with interactive charts.

### **🔹 3️⃣ Identify Dead Neurons**
```bash
neural debug --dead-neurons mnist.neural
```
🔍 **Find layers with inactive neurons (common in ReLU networks).**

### **🔹 4️⃣ Detect Training Anomalies**
```bash
neural debug --anomalies mnist.neural
```
🔥 **Flag NaNs, weight explosions, and extreme activations.**

### **🔹 5️⃣ Step Debugging (Interactive Tensor Inspection)**
```bash
neural debug --step mnist.neural
```
🔎 **Pause execution at any layer and inspect tensors manually.**
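The vanishing/exploding checks above can be approximated with a heuristic over per-layer gradient norms. The thresholds here are arbitrary and the function is illustrative, not NeuralDbg's actual logic:

```python
import math

# Classify per-layer gradient health from L2 norms (toy thresholds).
def grad_health(grad_norms, vanish=1e-6, explode=1e2):
    """grad_norms: dict mapping layer name -> L2 norm of its gradient."""
    status = {}
    for layer, norm in grad_norms.items():
        if norm < vanish:
            status[layer] = "vanishing"
        elif norm > explode or math.isinf(norm):
            status[layer] = "exploding"
        else:
            status[layer] = "ok"
    return status

print(grad_health({"conv1": 0.4, "dense1": 1e-9, "output": 3e5}))
# {'conv1': 'ok', 'dense1': 'vanishing', 'output': 'exploding'}
```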
## 📊 Why Neural?
| Feature | Neural | Raw TensorFlow/PyTorch |
|-----------------------|-------------|-------------------------|
| Shape Validation | ✅ Auto | ❌ Manual |
| Framework Switching | 1-line flag | Days of rewriting |
| Architecture Diagrams | Built-in | Third-party tools |
| Training Config | Unified | Fragmented configs |

### **🔄 Cross-Framework Code Generation**
| Neural DSL | TensorFlow Output | PyTorch Output |
|---------------------|----------------------------|---------------------------|
| `Conv2D(filters=32)`| `tf.keras.layers.Conv2D(32)`| `nn.Conv2d(in_channels, 32)` |
| `Dense(units=128)` | `tf.keras.layers.Dense(128)` | `nn.Linear(in_features, 128)` |

## 📈 Benchmarks
| Task | Neural | Baseline (TF/PyTorch) |
|----------------------|--------|-----------------------|
| MNIST Training | 1.2x ⚡ | 1.0x |
| Debugging Setup | 5 min 🚀 | 2 hr+ |

## 📚 Documentation
- [DSL Documentation](docs/dsl.md)
Explore advanced features:
- [Custom Layers Guide]()
- [ONNX Export Tutorial]()
- [Training Configuration]()
- [NeuralDbg Debugging Features]()

## 📂 Examples
Explore common use cases in `examples/` with step-by-step guides in `docs/examples/`:
- [MNIST Classifier Guide](docs/examples/mnist_guide.md)
- [Sentiment Analysis Guide](docs/examples/sentiment_guide.md)
- [Transformer for NLP Guide](docs/examples/transformer_guide.md)

## 📸 Architecture Graphs (zoom in a lot for some)

---
## 🤝 Contributing
We welcome contributions! See our:
- [Contributing Guidelines](CONTRIBUTING.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Roadmap](ROADMAP.md)

To set up a development environment:
```bash
git clone https://github.com/yourusername/neural.git
cd neural
pip install -r requirements-dev.txt # Includes linter, formatter, etc.
pre-commit install # Auto-format code on commit
```

## Star History
[Star History Chart](https://www.star-history.com/#Lemniscate-world/Neural&Timeline)
## Support
Please give us a star ⭐️ to increase our chances of getting into GitHub trends; the more attention we get, the higher our chances of actually making a difference.

Please share this project with your friends! Every share helps us reach more developers and grow our community. The more developers we reach, the more likely we are to build something truly revolutionary together. 🚀

## 💬 Community
- [Discord Server](https://discord.gg/KFku4KvS): Chat with developers
- [Twitter @NLang4438](https://x.com/NLang4438): Updates & announcements