https://github.com/deshanadesai/margin-based-adaboost
Vanilla Adaboost implementation and Adaboost with Margin
- Host: GitHub
- URL: https://github.com/deshanadesai/margin-based-adaboost
- Owner: deshanadesai
- Created: 2017-12-04T18:50:26.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2017-12-04T19:16:02.000Z (over 7 years ago)
- Last Synced: 2025-02-12T15:21:20.543Z (4 months ago)
- Language: Jupyter Notebook
- Size: 494 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Margin-based-Adaboost
Vanilla AdaBoost implementation and AdaBoost with margin.

Reference Papers:
1) [Efficient Margin Maximizing with Boosting](http://www.jmlr.org/papers/volume6/ratsch05a/ratsch05a.pdf)
2) [How Boosting the Margin Can Also Boost Classifier Complexity](http://rob.schapire.net/papers/boost_complexity.pdf)

The AdaBoost(ρ) algorithm reduces to vanilla AdaBoost when ρ = 0.
Pseudocode:
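A minimal sketch of the AdaBoost_ρ update from the Rätsch & Warmuth paper, assuming decision stumps as the weak learner (the stump search and function names here are illustrative, not taken from this repository). The hypothesis weight subtracts a term depending on the target margin ρ, so setting ρ = 0 recovers vanilla AdaBoost:

```python
import numpy as np

def train_stump(X, y, d):
    """Pick the (feature, threshold, polarity) stump minimizing weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= thr, pol, -pol)
                err = d[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    j, thr, pol = best
    return lambda Z: np.where(Z[:, j] <= thr, pol, -pol)

def adaboost_rho(X, y, T=20, rho=0.0):
    """AdaBoost_rho sketch; rho=0 is vanilla AdaBoost. Labels y must be in {-1, +1}."""
    n = len(y)
    d = np.full(n, 1.0 / n)          # uniform initial example weights
    hs, alphas = [], []
    for _ in range(T):
        h = train_stump(X, y, d)
        pred = h(X)
        gamma = np.sum(d * y * pred)  # edge of the weak hypothesis
        if gamma >= 1.0:              # weak learner is already perfect
            hs.append(h); alphas.append(1.0)
            break
        # margin-adjusted hypothesis weight: the second term penalizes
        # by the target margin rho (zero when rho = 0)
        alpha = 0.5 * np.log((1 + gamma) / (1 - gamma)) \
              - 0.5 * np.log((1 + rho) / (1 - rho))
        if alpha <= 0:                # edge no longer exceeds rho; stop
            break
        d *= np.exp(-alpha * y * pred)
        d /= d.sum()                  # renormalize to a distribution
        hs.append(h); alphas.append(alpha)
    def predict(Z):
        return np.sign(sum(a * h(Z) for a, h in zip(alphas, hs)))
    return predict
```

For example, on roughly linearly separable data, `adaboost_rho(X, y, T=15, rho=0.1)` returns a weighted vote of stumps that fits the training set well while the positive ρ pushes toward larger margins.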
