Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/igrigorik/decisiontree
ID3-based implementation of the ML Decision Tree algorithm
decision-tree machine-learning ruby rubyml
- Host: GitHub
- URL: https://github.com/igrigorik/decisiontree
- Owner: igrigorik
- Created: 2009-02-23T04:52:20.000Z (almost 16 years ago)
- Default Branch: master
- Last Pushed: 2018-10-31T15:22:42.000Z (about 6 years ago)
- Last Synced: 2024-12-02T02:00:12.017Z (10 days ago)
- Topics: decision-tree, machine-learning, ruby, rubyml
- Language: Ruby
- Homepage:
- Size: 54.7 KB
- Stars: 1,439
- Watchers: 40
- Forks: 130
- Open Issues: 10
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-ruby - decisiontree - A ruby library which implements ID3 (information gain) algorithm for decision tree learning. (Scientific)
README
# Decision Tree
A Ruby library which implements the [ID3 (information gain)](https://en.wikipedia.org/wiki/ID3_algorithm) algorithm for decision tree learning. Currently, continuous and discrete datasets can be learned.
- The discrete model assumes unique labels, and can be graphed and converted into a PNG for visual analysis
- The continuous model looks at all possible values for a variable and iteratively chooses the best threshold between all possible assignments. This results in a binary tree which is partitioned by the threshold at every step (e.g. temperature > 20C)

## Features
- ID3 algorithms for continuous and discrete cases, with support for inconsistent datasets.
- [Graphviz component](http://rockit.sourceforge.net/subprojects/graphr/) to visualize the learned tree
- Support for multiple and symbolic outputs, and graphing of continuous trees.
- Returns the default value when no branch is suitable for the input.

## Implementation
- Ruleset is a class that trains an ID3Tree with 2/3 of the training data, converts it into a set of rules, and prunes the rules with the remaining 1/3 of the training data (in a [C4.5](https://en.wikipedia.org/wiki/C4.5_algorithm) way).
- Bagging is a bagging-based trainer which trains 10 Ruleset trainers and, when predicting, chooses the best output based on voting.

[Blog post with explanation & examples](http://www.igvita.com/2007/04/16/decision-tree-learning-in-ruby/)
## Example
```ruby
require 'decisiontree'

attributes = ['Temperature']
training = [
[36.6, 'healthy'],
[37, 'sick'],
[38, 'sick'],
[36.7, 'healthy'],
[40, 'sick'],
[50, 'really sick'],
]

# Instantiate the tree, and train it based on the data (set default to 'sick')
dec_tree = DecisionTree::ID3Tree.new(attributes, training, 'sick', :continuous)
dec_tree.train

test = [37, 'sick']
decision = dec_tree.predict(test)
puts "Predicted: #{decision} ... True decision: #{test.last}"
# => Predicted: sick ... True decision: sick

# Specify type ("discrete" or "continuous") in the training data
labels = ["hunger", "color"]
training = [
[8, "red", "angry"],
[6, "red", "angry"],
[7, "red", "angry"],
[7, "blue", "not angry"],
[2, "red", "not angry"],
[3, "blue", "not angry"],
[2, "blue", "not angry"],
[1, "red", "not angry"]
]

dec_tree = DecisionTree::ID3Tree.new(labels, training, "not angry", color: :discrete, hunger: :continuous)
dec_tree.train

test = [7, "red", "angry"]
decision = dec_tree.predict(test)
puts "Predicted: #{decision} ... True decision: #{test.last}"
# => Predicted: angry ... True decision: angry
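
# The features list notes that predict returns the constructor's default
# value when no branch matches the input. A hypothetical input with a
# color never seen in training (not from the original README):
unseen = [5, "green", nil]
puts "Unseen input predicted as: #{dec_tree.predict(unseen)}"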
```

## License
The [MIT License](https://opensource.org/licenses/MIT) - Copyright (c) 2006 Ilya Grigorik