https://github.com/lurenss/spam-filter
Second assignment of the Artificial Intelligence course held by Professor Andrea Torsello at Ca' Foscari University of Venice: spam detectors built with SVM classification (linear, polynomial of degree 2, and RBF kernels), Naive Bayes, and k-NN.
- Host: GitHub
- URL: https://github.com/lurenss/spam-filter
- Owner: lurenss
- Created: 2021-04-10T09:12:40.000Z (about 4 years ago)
- Default Branch: main
- Last Pushed: 2021-09-12T12:38:33.000Z (over 3 years ago)
- Last Synced: 2025-02-14T12:19:57.844Z (4 months ago)
- Topics: k-nn, linear-kernel, machine-learning, naive-bayes-classifier, polynomial-kernel, rbf-kernel, svm-classifier
- Language: Python
- Homepage:
- Size: 142 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Support: SupportVectorMachine.py
README
# ASSIGNMENT
Write a spam filter using discriminative and generative classifiers. Use the Spambase dataset, which already represents spam/ham messages through a bag-of-words representation over a dictionary of 48 highly discriminative words and 6 characters. The first 54 features correspond to word/symbol frequencies; ignore features 55-57; feature 58 is the class label (1 = spam, 0 = ham).
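
A minimal loading sketch for reference, assuming a local copy of the UCI Spambase file saved as `spambase.data` (58 comma-separated columns, no header); the file name is an assumption, not part of the assignment:

```python
import numpy as np

# Assumed local path to the UCI Spambase data (58 comma-separated columns).
data = np.loadtxt("spambase.data", delimiter=",")

X = data[:, :54]             # features 1-54: word/character frequencies
# columns 55-57 (capital-run-length statistics) are ignored per the assignment
y = data[:, 57].astype(int)  # feature 58: class label (1 = spam, 0 = ham)
```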
Perform SVM classification using linear, polynomial of degree 2, and RBF kernels over the TF/IDF representation.
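
One possible sketch of this step, using scikit-learn's `TfidfTransformer` and `SVC` as stand-ins for whichever TF/IDF and SVM implementations the solution actually uses; `X` and `y` are the arrays from the loading sketch above:

```python
from sklearn.feature_extraction.text import TfidfTransformer
from sklearn.svm import SVC

# Re-weight the raw word/character frequencies with TF/IDF.
X_tfidf = TfidfTransformer().fit_transform(X).toarray()

# The three kernels requested by the assignment.
svms = {
    "linear": SVC(kernel="linear"),
    "poly-2": SVC(kernel="poly", degree=2),
    "rbf": SVC(kernel="rbf"),
}
for name, clf in svms.items():
    clf.fit(X_tfidf, y)   # fit on the full set here; see the 10-fold CV sketch below
```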
Can you transform the kernels to make use of angular information only (i.e., no length)? Are they still positive definite kernels?
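
One way to explore this question (an illustration, not necessarily the repository's approach) is to L2-normalize every TF/IDF vector, so that all three kernels see only the direction of each sample; equivalently, each kernel can be replaced by its normalized form k(x, y) / sqrt(k(x, x) k(y, y)), which remains positive definite because it is the original kernel evaluated on unit-length feature-space vectors.

```python
import numpy as np
from sklearn.preprocessing import normalize
from sklearn.svm import SVC
from sklearn.metrics.pairwise import polynomial_kernel

# Option 1: project the samples onto the unit sphere, then use the usual kernels;
# kernel values now depend only on the angle between the original vectors.
X_angular = normalize(X_tfidf, norm="l2")

# Option 2: normalize the kernel itself, k(x, y) / sqrt(k(x, x) k(y, y)),
# shown here for the degree-2 polynomial kernel via a precomputed Gram matrix.
K = polynomial_kernel(X_tfidf, degree=2)
d = np.sqrt(np.diag(K))
K_angular = K / np.outer(d, d)
clf = SVC(kernel="precomputed").fit(K_angular, y)
```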
Classify the same data also through a Naive Bayes classifier for continuous inputs, modelling each feature with a Gaussian distribution.
Perform k-NN classification with k = 5.
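
Since the Naive Bayes has to be hand-written, a minimal Gaussian Naive Bayes sketch could look like the following; inheriting scikit-learn's base classes is only a convenience so the class plugs into `cross_val_score` in the sketch after the next paragraph, and the `1e-9` variance floor is an arbitrary choice to avoid division by zero:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin

class GaussianNaiveBayes(BaseEstimator, ClassifierMixin):
    """Naive Bayes for continuous inputs: one Gaussian per feature and class."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = np.array([(y == c).mean() for c in self.classes_])
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        # variance floor avoids division by zero on (nearly) constant features
        self.vars_ = np.array([X[y == c].var(axis=0) for c in self.classes_]) + 1e-9
        return self

    def predict(self, X):
        # log P(c) + sum_j log N(x_j | mu_cj, var_cj), evaluated for every class c
        log_lik = -0.5 * np.sum(
            np.log(2 * np.pi * self.vars_[:, None, :])
            + (X[None, :, :] - self.means_[:, None, :]) ** 2 / self.vars_[:, None, :],
            axis=2,
        )
        return self.classes_[np.argmax(np.log(self.priors_)[:, None] + log_lik, axis=0)]
```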
Provide the code, the models on the training set, and the respective performances in 10-way cross validation.
Explain the differences between the three models.
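
A sketch of the requested 10-fold evaluation, assuming `X_tfidf`, `y`, and the `GaussianNaiveBayes` class from the sketches above:

```python
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

models = {
    "SVM (linear)": SVC(kernel="linear"),
    "SVM (poly, degree 2)": SVC(kernel="poly", degree=2),
    "SVM (RBF)": SVC(kernel="rbf"),
    "Gaussian Naive Bayes": GaussianNaiveBayes(),   # hand-written class above
    "k-NN (k = 5)": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    scores = cross_val_score(model, X_tfidf, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f} (std {scores.std():.3f})")
```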
P.S. you can use a library implementation for SVM, but do implement the Naive Bayes on your own. As for k-NN, you can use libraries if you want, but it might just be easier to do it on your own.