{"id":13548324,"url":"https://github.com/erelsgl/limdu","last_synced_at":"2025-05-14T09:10:17.322Z","repository":{"id":8914284,"uuid":"10640288","full_name":"erelsgl/limdu","owner":"erelsgl","description":"Machine-learning for Node.js","archived":false,"fork":false,"pushed_at":"2025-02-19T14:23:53.000Z","size":1769,"stargazers_count":1053,"open_issues_count":7,"forks_count":99,"subscribers_count":51,"default_branch":"master","last_synced_at":"2025-05-14T09:09:55.858Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"lgpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/erelsgl.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2013-06-12T09:58:56.000Z","updated_at":"2025-05-04T02:57:12.000Z","dependencies_parsed_at":"2024-06-18T18:13:06.210Z","dependency_job_id":"2f1e352b-41dd-4ca4-8bce-7c07764bc4f0","html_url":"https://github.com/erelsgl/limdu","commit_stats":{"total_commits":505,"total_committers":11,"mean_commits":45.90909090909091,"dds":0.4693069306930693,"last_synced_commit":"1e9141daec791a220317fb8576a17d057ee0e62d"},"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/erelsgl%2Flimdu","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/erelsgl%2Flimdu/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/erelsgl%2Flimdu/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/erelsgl%2Flimdu/ma
nifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/erelsgl","download_url":"https://codeload.github.com/erelsgl/limdu/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254110374,"owners_count":22016391,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T12:01:08.844Z","updated_at":"2025-05-14T09:10:17.269Z","avatar_url":"https://github.com/erelsgl.png","language":"JavaScript","readme":"# Limdu.js\n\nLimdu is a machine-learning framework for Node.js. It supports **multi-label classification**, **online learning**, and **real-time classification**. Therefore, it is especially suited for natural language understanding in dialog systems and chat-bots.\n\nLimdu is in an \"alpha\" state - some parts are working (see this readme), but some parts are missing or not tested. Contributions are welcome. 
\n\nLimdu currently runs on Node.js 0.12 and later versions.\n\n## Installation\n\n\tnpm install limdu\n\n## Demos\n\nYou can run the demos from this project: [limdu-demo](https://github.com/erelsgl/limdu-demo).\n\n**Table of Contents**  *generated with [DocToc](http://doctoc.herokuapp.com/)*\n\n- [Binary Classification](#binary-classification)\n\t- [Batch Learning - learn from an array of input-output pairs:](#batch-learning---learn-from-an-array-of-input-output-pairs)\n\t- [Online Learning](#online-learning)\n\t- [Binding](#binding)\n\t- [Explanations](#explanations)\n\t- [Other Binary Classifiers](#other-binary-classifiers)\n- [Multi-Label Classification](#multi-label-classification)\n\t- [Other Multi-label classifiers](#other-multi-label-classifiers)\n- [Feature engineering](#feature-engineering)\n\t- [Feature extraction - converting an input sample into feature-value pairs:](#feature-extraction---converting-an-input-sample-into-feature-value-pairs)\n\t- [Input Normalization](#input-normalization)\n\t- [Feature lookup table - convert custom features to integer features](#feature-lookup-table---convert-custom-features-to-integer-features)\n- [Serialization](#serialization)\n- [Cross-validation](#cross-validation)\n- [Back-classification (aka Generation)](#back-classification-aka-generation)\n- [SVM wrappers](#svm-wrappers)\n- [Undocumented features](#undocumented-features)\n- [License](#license)\n- [Contributions](#contributions)\n\n## Binary Classification\n\n### Batch Learning - learn from an array of input-output pairs:\n\n```js\nvar limdu = require('limdu');\n\nvar colorClassifier = new limdu.classifiers.NeuralNetwork();\n\ncolorClassifier.trainBatch([\n\t{input: { r: 0.03, g: 0.7, b: 0.5 }, output: 0},  // black\n\t{input: { r: 0.16, g: 0.09, b: 0.2 }, output: 1}, // white\n\t{input: { r: 0.5, g: 0.5, 
b: 1.0 }, output: 1}   // white\n\t]);\n\nconsole.log(colorClassifier.classify({ r: 1, g: 0.4, b: 0 }));  // 0.99 - almost white\n```\n\nCredit: this example uses [brain.js, by Heather Arthur](https://github.com/harthur/brain).\n\n\n### Online Learning\n```js\nvar birdClassifier = new limdu.classifiers.Winnow({\n\tdefault_positive_weight: 1,\n\tdefault_negative_weight: 1,\n\tthreshold: 0\n});\n\nbirdClassifier.trainOnline({'wings': 1, 'flight': 1, 'beak': 1, 'eagle': 1}, 1);  // eagle is a bird (1)\nbirdClassifier.trainOnline({'wings': 0, 'flight': 0, 'beak': 0, 'dog': 1}, 0);    // dog is not a bird (0)\nconsole.dir(birdClassifier.classify({'wings': 1, 'flight': 0, 'beak': 0.5, 'penguin':1})); // initially, penguin is mistakenly classified as 0 - \"not a bird\"\nconsole.dir(birdClassifier.classify({'wings': 1, 'flight': 0, 'beak': 0.5, 'penguin':1}, /*explanation level=*/4)); // why? because it does not fly.\n\nbirdClassifier.trainOnline({'wings': 1, 'flight': 0, 'beak': 1, 'penguin':1}, 1);  // learn that penguin is a bird, although it doesn't fly \nbirdClassifier.trainOnline({'wings': 0, 'flight': 1, 'beak': 0, 'bat': 1}, 0);     // learn that bat is not a bird, although it does fly\nconsole.dir(birdClassifier.classify({'wings': 1, 'flight': 0, 'beak': 1, 'chicken': 1})); // now, chicken is correctly classified as a bird, although it does not fly.  \nconsole.dir(birdClassifier.classify({'wings': 1, 'flight': 0, 'beak': 1, 'chicken': 1}, /*explanation level=*/4)); // why?  because it has wings and beak.\n```\n\nCredit: this example uses Modified Balanced Margin Winnow ([Carvalho and Cohen, 2006](http://www.citeulike.org/user/erelsegal-halevi/article/2243777)). 
\n\nThe \"explanation\" feature is explained below.\n\n\n### Binding\n\nUsing JavaScript's binding capabilities, it is possible to create custom classes that combine existing classes with pre-specified parameters:\n\n```js\nvar MyWinnow = limdu.classifiers.Winnow.bind(0, {\n\tdefault_positive_weight: 1,\n\tdefault_negative_weight: 1,\n\tthreshold: 0\n});\n\nvar birdClassifier = new MyWinnow();\n...\n// continue as above\n```\n\n### Explanations\n\nSome classifiers can return \"explanations\" - additional information that explains how the classification result was derived:\n\n```js\nvar colorClassifier = new limdu.classifiers.Bayesian();\n\ncolorClassifier.trainBatch([\n\t{input: { r: 0.03, g: 0.7, b: 0.5 }, output: 'black'},\n\t{input: { r: 0.16, g: 0.09, b: 0.2 }, output: 'white'},\n\t{input: { r: 0.5, g: 0.5, b: 1.0 }, output: 'white'},\n\t]);\n\nconsole.log(colorClassifier.classify({ r: 1, g: 0.4, b: 0 }, \n\t\t/* explanation level = */1));\n```\nCredit: this example uses code from [classifier.js, by Heather Arthur](https://github.com/harthur/classifier).\n\nThe explanation feature is experimental and is supported differently for different classifiers. For example, for the Bayesian classifier it returns the probabilities for each category:\n\n```js\n{ classes: 'white',\n\texplanation: [ 'white: 0.0621402182289608', 'black: 0.031460948468170505' ] }\n```\n\nFor the Winnow classifier, it returns the relevance (feature-value times feature-weight) of each feature:\n\n```js\n{ classification: 1,\n\texplanation: [ 'bias+1.12', 'r+1.08', 'g+0.25', 'b+0.00' ] }\n```\n\nWARNING: The internal format of the explanations might change without notice. The explanations should be used for presentation purposes only (and not, for example, for extracting the actual numbers).
\n\n### Other Binary Classifiers\n\nIn addition to Winnow and NeuralNetwork, version 0.2 includes the following binary classifiers:\n\n* Bayesian - uses [classifier.js, by Heather Arthur](https://github.com/harthur/classifier).\n* Perceptron - loosely based on [perceptron.js, by John Chesley](https://github.com/chesles/perceptron).\n* SVM - uses [svm.js, by Andrej Karpathy](https://github.com/karpathy/svmjs).\n* Linear SVM - wrappers around SVM-Perf and Lib-Linear (see below).\n* Decision Tree - based on [node-decision-tree-id3 by Ankit Kuwadekar](https://github.com/bugless/nodejs-decision-tree-id3) or [ID3-Decision-Tree by Will Kurt](https://github.com/willkurt/ID3-Decision-Tree).\n\nThis library is still under construction, and not all features work for all classifiers. For a full list of the features that do work, see the \"test\" folder.\n\n\n## Multi-Label Classification\n\nIn binary classification, the output is 0 or 1.\n\nIn multi-label classification, the output is a set of zero or more labels.\n\n```js\nvar MyWinnow = limdu.classifiers.Winnow.bind(0, {retrain_count: 10});\n\nvar intentClassifier = new limdu.classifiers.multilabel.BinaryRelevance({\n\tbinaryClassifierType: MyWinnow\n});\n\nintentClassifier.trainBatch([\n\t{input: {I:1,want:1,an:1,apple:1}, output: \"APPLE\"},\n\t{input: {I:1,want:1,a:1,banana:1}, output: \"BANANA\"},\n\t{input: {I:1,want:1,chips:1}, output: \"CHIPS\"}\n\t]);\n\nconsole.dir(intentClassifier.classify({I:1,want:1,an:1,apple:1,and:1,a:1,banana:1}));  // ['APPLE','BANANA']\n```\n\n### Other Multi-label classifiers\n\nIn addition to BinaryRelevance, version 0.2 includes the following multi-label classifier types (see the multilabel folder):\n\n* Cross-Lingual Language Model Classifier (based on [Anton Leuski and David Traum, 2008](http://www.citeulike.org/user/erelsegal-halevi/article/12540655))\n* HOMER - Hierarchy Of Multi-label classifiERs (based on [Tsoumakas et al., 
2007](http://www.citeulike.org/user/erelsegal-halevi/article/3170786))\n* Meta-Labeler (based on [Lei Tang, Suju Rajan, Vijay K. Narayanan, 2009](http://www.citeulike.org/user/erelsegal-halevi/article/4860265)) \n* Joint identification and segmentation (based on [Fabrizio Morbini, Kenji Sagae, 2011](http://www.citeulike.org/user/erelsegal-halevi/article/10259046))\n* Passive-Aggressive (based on [Koby Crammer, Ofer Dekel, Joseph Keshet, Shai Shalev-Shwartz, Yoram Singer, 2006](http://www.citeulike.org/user/erelsegal-halevi/article/5960770))\n* Threshold Classifier (converting multi-class classifier to multi-label classifier by finding the best appropriate threshold)\n\nThis library is still under construction, and not all features work for all classifiers. For a full list of the features that do work, see the \"test\" folder. \n\n## Feature engineering\n\n### Feature extraction - converting an input sample into feature-value pairs:\n\n```js\n// First, define our base classifier type (a multi-label classifier based on winnow):\nvar TextClassifier = limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\tbinaryClassifierType: limdu.classifiers.Winnow.bind(0, {retrain_count: 10})\n});\n\n// Now define our feature extractor - a function that takes a sample and adds features to a given features set:\nvar WordExtractor = function(input, features) {\n\tinput.split(\" \").forEach(function(word) {\n\t\tfeatures[word]=1;\n\t});\n};\n\n// Initialize a classifier with the base classifier type and the feature extractor:\nvar intentClassifier = new limdu.classifiers.EnhancedClassifier({\n\tclassifierType: TextClassifier,\n\tfeatureExtractor: WordExtractor\n});\n\n// Train and test:\nintentClassifier.trainBatch([\n\t{input: \"I want an apple\", output: \"apl\"},\n\t{input: \"I want a banana\", output: \"bnn\"},\n\t{input: \"I want chips\", output:    \"cps\"},\n\t]);\n\nconsole.dir(intentClassifier.classify(\"I want an apple and a banana\"));  // 
['apl','bnn']\nconsole.dir(intentClassifier.classify(\"I WANT AN APPLE AND A BANANA\"));  // []\n```\n\nAs you can see from the last example, by default feature extraction is case-sensitive. \nWe will take care of this in the next example.\n\nInstead of defining your own feature extractor, you can use those already bundled with limdu:\n\n```js\nlimdu.features.NGramsOfWords\nlimdu.features.NGramsOfLetters\nlimdu.features.HypernymExtractor\n```\n\nYou can also make 'featureExtractor' an array of several feature extractors, that will be executed in the order you include them.\n\n### Input Normalization\n\n```js\n//Initialize a classifier with a feature extractor and a case normalizer:\nintentClassifier = new limdu.classifiers.EnhancedClassifier({\n\tclassifierType: TextClassifier,  // same as in previous example\n\tnormalizer: limdu.features.LowerCaseNormalizer,\n\tfeatureExtractor: WordExtractor  // same as in previous example\n});\n\n//Train and test:\nintentClassifier.trainBatch([\n\t{input: \"I want an apple\", output: \"apl\"},\n\t{input: \"I want a banana\", output: \"bnn\"},\n\t{input: \"I want chips\", output: \"cps\"},\n\t]);\n\nconsole.dir(intentClassifier.classify(\"I want an apple and a banana\"));  // ['apl','bnn']\nconsole.dir(intentClassifier.classify(\"I WANT AN APPLE AND A BANANA\"));  // ['apl','bnn'] \n```\n\nOf course you can use any other function as an input normalizer. For example, if you know how to write a spell-checker, you can create a normalizer that corrects typos in the input.\n\nYou can also make 'normalizer' an array of several normalizers. These will be executed in the order you include them.\n\n### Feature lookup table - convert custom features to integer features\n\nThis example uses the quadratic SVM implementation [svm.js, by Andrej Karpathy](https://github.com/karpathy/svmjs). 
\nThis SVM (like most SVM implementations) works with integer features, so we need a way to convert our string-based features to integers.\n\n```js\nvar limdu = require('limdu');\n\n// First, define our base classifier type (a multi-label classifier based on svm.js):\nvar TextClassifier = limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\tbinaryClassifierType: limdu.classifiers.SvmJs.bind(0, {C: 1.0})\n});\n\n// Initialize a classifier with a feature extractor and a lookup table:\nvar intentClassifier = new limdu.classifiers.EnhancedClassifier({\n\tclassifierType: TextClassifier,\n\tfeatureExtractor: limdu.features.NGramsOfWords(1),  // each word (\"1-gram\") is a feature  \n\tfeatureLookupTable: new limdu.features.FeatureLookupTable()\n});\n\n// Train and test:\nintentClassifier.trainBatch([\n\t{input: \"I want an apple\", output: \"apl\"},\n\t{input: \"I want a banana\", output: \"bnn\"},\n\t{input: \"I want chips\", output: \"cps\"},\n\t]);\n\nconsole.dir(intentClassifier.classify(\"I want an apple and a banana\"));  // ['apl','bnn']\n```\n\nThe FeatureLookupTable takes care of the numbers, while you may continue to work with texts! \n\n## Serialization\n\nSay you want to train a classifier on your home computer, and use it on a remote server. 
To do this, you should somehow convert the trained classifier to a string, send the string to the remote server, and deserialize it there.\n\nYou can do this with the \"serialization.js\" package:\n\n\tnpm install serialization\n\t\nOn your home machine, do the following:\n\n```js\nvar serialize = require('serialization');\n\n// First, define a function that creates a fresh  (untrained) classifier.\n// This code should be stand-alone - it should include all the 'require' statements\n//   required for creating the classifier.\nfunction newClassifierFunction() {\n\tvar limdu = require('limdu');\n\tvar TextClassifier = limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\t\tbinaryClassifierType: limdu.classifiers.Winnow.bind(0, {retrain_count: 10})\n\t});\n\n\tvar WordExtractor = function(input, features) {\n\t\tinput.split(\" \").forEach(function(word) {\n\t\t\tfeatures[word]=1;\n\t\t});\n\t};\n\t\n\t// Initialize a classifier with a feature extractor:\n\treturn new limdu.classifiers.EnhancedClassifier({\n\t\tclassifierType: TextClassifier,\n\t\tfeatureExtractor: WordExtractor,\n\t\tpastTrainingSamples: [], // to enable retraining\n\t});\n}\n\n// Use the above function for creating a new classifier:\nvar intentClassifier = newClassifierFunction();\n\n// Train and test:\nvar dataset = [\n\t{input: \"I want an apple\", output: \"apl\"},\n\t{input: \"I want a banana\", output: \"bnn\"},\n\t{input: \"I want chips\", output: \"cps\"},\n\t];\nintentClassifier.trainBatch(dataset);\n\nconsole.log(\"Original classifier:\");\nintentClassifier.classifyAndLog(\"I want an apple and a banana\");  // ['apl','bnn']\nintentClassifier.trainOnline(\"I want a doughnut\", \"dnt\");\nintentClassifier.classifyAndLog(\"I want chips and a doughnut\");  // ['cps','dnt']\nintentClassifier.retrain();\nintentClassifier.classifyAndLog(\"I want an apple and a banana\");  // ['apl','bnn']\nintentClassifier.classifyAndLog(\"I want chips and a doughnut\");  // ['cps','dnt']\n\n// Serialize the 
classifier (convert it to a string)\nvar intentClassifierString = serialize.toString(intentClassifier, newClassifierFunction);\n\n// Save the string to a file, and send it to a remote server.\n```\n\nOn the remote server, do the following:\n\n```js\n// retrieve the string from a file and then:\n\nvar intentClassifierCopy = serialize.fromString(intentClassifierString, __dirname);\n\nconsole.log(\"Deserialized classifier:\");\nintentClassifierCopy.classifyAndLog(\"I want an apple and a banana\");  // ['apl','bnn']\nintentClassifierCopy.classifyAndLog(\"I want chips and a doughnut\");  // ['cps','dnt']\nintentClassifierCopy.trainOnline(\"I want an elm tree\", \"elm\");\nintentClassifierCopy.classifyAndLog(\"I want doughnut and elm tree\");  // ['dnt','elm']\n```\n\nCAUTION: Serialization was not tested for all possible combinations of classifiers and enhancements. Test well before use!\n\n## Cross-validation\n\n```js\n// create a dataset with a lot of input-output pairs:\nvar dataset = [ ... 
];\n\n// Decide how many folds you want in your   k-fold cross-validation:\nvar numOfFolds = 5;\n\n// Define the type of classifier that you want to test:\nvar IntentClassifier = limdu.classifiers.EnhancedClassifier.bind(0, {\n\tclassifierType: limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\t\tbinaryClassifierType: limdu.classifiers.Winnow.bind(0, {retrain_count: 10})\n\t}),\n\tfeatureExtractor: limdu.features.NGramsOfWords(1),\n});\n\nvar microAverage = new limdu.utils.PrecisionRecall();\nvar macroAverage = new limdu.utils.PrecisionRecall();\n\nlimdu.utils.partitions.partitions(dataset, numOfFolds, function(trainSet, testSet) {\n\tconsole.log(\"Training on \"+trainSet.length+\" samples, testing on \"+testSet.length+\" samples\");\n\tvar classifier = new IntentClassifier();\n\tclassifier.trainBatch(trainSet);\n\tlimdu.utils.test(classifier, testSet, /* verbosity = */0,\n\t\tmicroAverage, macroAverage);\n});\n\nmacroAverage.calculateMacroAverageStats(numOfFolds);\nconsole.log(\"\\n\\nMACRO AVERAGE:\"); console.dir(macroAverage.fullStats());\n\nmicroAverage.calculateStats();\nconsole.log(\"\\n\\nMICRO AVERAGE:\"); console.dir(microAverage.fullStats());\n```\n\n## Back-classification (aka Generation)\n\nUse this option to get the list of all samples with a given class.\n\n```js\nvar intentClassifier = new limdu.classifiers.EnhancedClassifier({\n\tclassifierType: limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\t\tbinaryClassifierType: limdu.classifiers.Winnow.bind(0, {retrain_count: 10})\n\t}),\n\tfeatureExtractor: limdu.features.NGramsOfWords(1),\n\tpastTrainingSamples: [],\n});\n\n// Train and test:\nintentClassifier.trainBatch([\n\t{input: \"I want an apple\", output: \"apl\"},\n\t{input: \"I want a banana\", output: \"bnn\"},\n\t{input: \"I really want an apple\", output: \"apl\"},\n\t{input: \"I want a banana very much\", output: \"bnn\"},\n\t]);\n\nconsole.dir(intentClassifier.backClassify(\"apl\"));  // [ 'I want an apple', 'I really want an 
apple' ]\n```\n\n## SVM wrappers\n\nThe native svm.js implementation takes a long time to train - quadratic in the number of training samples.\nThere are two common packages whose training time is linear in the number of training samples:\n\n* [SVM-Perf](http://www.cs.cornell.edu/people/tj/svm_light/svm_perf.html) - by Thorsten Joachims;\n* [LibLinear](http://www.csie.ntu.edu.tw/~cjlin/liblinear) - by Fan, Chang, Hsieh, Wang and Lin.\n\nThe limdu.js package provides wrappers for these implementations.\nIn order to use the wrappers, you must have the corresponding training binary on your path:\n\n* **svm\\_perf\\_learn** - from [SVM-Perf](http://www.cs.cornell.edu/people/tj/svm_light/svm_perf.html).\n* **liblinear\\_train** - from [LibLinear](http://www.csie.ntu.edu.tw/~cjlin/liblinear).\n\nOnce you have one of these installed, you can use the corresponding classifier instead of any binary classifier\nused in the previous demos, as long as you have a feature lookup table. For example, with SvmPerf:\n\n```js\nvar intentClassifier = new limdu.classifiers.EnhancedClassifier({\n\tclassifierType: limdu.classifiers.multilabel.BinaryRelevance.bind(0, {\n\t\tbinaryClassifierType: limdu.classifiers.SvmPerf.bind(0, {\n\t\t\tlearn_args: \"-c 20.0\"\n\t\t})\n\t}),\n\tfeatureExtractor: limdu.features.NGramsOfWords(1),\n\tfeatureLookupTable: new limdu.features.FeatureLookupTable()\n});\n```\n\nand similarly with SvmLinear.\n\nSee the files classifiers/svm/SvmPerf.js and classifiers/svm/SvmLinear.js for documentation of the options.\n\n\n## Undocumented features\n\nSome advanced features work but are not yet documented. If you need any of them, open an issue and I will try to document them.\n\n* Custom input normalization, based on regular expressions.\n* Input segmentation for multi-label classification - both manual (with regular expressions) and automatic.\n* Feature extraction for model adaptation.\n* Spell-checker features. 
\n* Hypernym features.\n* Classification based on a cross-lingual language model.\n* Format conversion - ARFF, JSON, svm-light, TSV.\n\n## License\n\nLGPL\n\n## Contributions\n\nCode contributions are welcome. Reasonable pull requests, with appropriate documentation and unit-tests, will be accepted.\n\nDo you like limdu? Remember that you can star it :-)\n","funding_links":[],"categories":["JavaScript","Packages","包"],"sub_categories":["Mad science","神奇的科学"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ferelsgl%2Flimdu","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ferelsgl%2Flimdu","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ferelsgl%2Flimdu/lists"}