{"id":13906995,"url":"https://github.com/geffy/tffm","last_synced_at":"2025-05-16T00:06:47.398Z","repository":{"id":47295173,"uuid":"57906189","full_name":"geffy/tffm","owner":"geffy","description":"TensorFlow implementation of an arbitrary order Factorization Machine","archived":false,"fork":false,"pushed_at":"2022-01-17T20:39:04.000Z","size":461,"stargazers_count":781,"open_issues_count":19,"forks_count":175,"subscribers_count":33,"default_branch":"master","last_synced_at":"2025-05-09T18:12:21.233Z","etag":null,"topics":["factorization-machines","research-project","tensorflow"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/geffy.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2016-05-02T17:06:07.000Z","updated_at":"2025-03-11T03:43:23.000Z","dependencies_parsed_at":"2022-09-10T04:05:06.947Z","dependency_job_id":null,"html_url":"https://github.com/geffy/tffm","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geffy%2Ftffm","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geffy%2Ftffm/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geffy%2Ftffm/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geffy%2Ftffm/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/geffy","download_url":"https://codeload.github.com/geffy/tffm/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","reposit
ories_count":254442854,"owners_count":22071878,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["factorization-machines","research-project","tensorflow"],"created_at":"2024-08-06T23:01:46.012Z","updated_at":"2025-05-16T00:06:42.378Z","avatar_url":"https://github.com/geffy.png","language":"Jupyter Notebook","readme":"This is a TensorFlow implementation of an arbitrary order (\u003e=2) Factorization Machine based on the paper [Factorization Machines with libFM](http://dl.acm.org/citation.cfm?doid=2168752.2168771).\n\nIt supports:\n* dense and sparse inputs\n* different (gradient-based) optimization methods\n* classification/regression via different loss functions (logistic and MSE implemented)\n* logging via TensorBoard\n\nThe inference time is linear with respect to the number of features.\n\nTested on Python 3.5, but it should also work on Python 2.7.\n\nThis implementation is quite similar to the one described in the paper by Blondel et al. [https://arxiv.org/abs/1607.07195], but it was developed independently and prior to the first appearance of that paper.\n\n# Dependencies\n* [scikit-learn](http://scikit-learn.org/stable/)\n* [numpy](http://www.numpy.org/)\n* [tqdm](https://github.com/tqdm/tqdm)\n* [tensorflow 1.0+ (tested on 1.3)](https://www.tensorflow.org/)\n\n# Installation\nThe stable version can be installed via `pip install tffm`.\n\n# Usage\nThe interface is similar to scikit-learn models. 
To train a 6-order FM model with rank=10 for 100 iterations with learning_rate=0.01, use the following example:\n```python\nimport tensorflow as tf\nfrom tffm import TFFMClassifier\n\nmodel = TFFMClassifier(\n    order=6,\n    rank=10,\n    optimizer=tf.train.AdamOptimizer(learning_rate=0.01),\n    n_epochs=100,\n    batch_size=-1,\n    init_std=0.001,\n    input_type='dense'\n)\nmodel.fit(X_tr, y_tr, show_progress=True)\n```\n\nSee `example.ipynb` and `gpu_benchmark.ipynb` for more details.\n\nReading `tffm/core.py` is highly recommended for implementation details.\n\n\n# Testing\nJust run ```python test.py``` in the terminal. ```nosetests``` works too, but you must pass the `--logging-level=WARNING` flag to avoid printing an excessive amount of TensorFlow logs to the screen.\n\n\n# Citation\nIf you use this software in academic research, please cite it using the following BibTeX entry:\n```latex\n@misc{trofimov2016,\nauthor = {Mikhail Trofimov and Alexander Novikov},\ntitle = {tffm: TensorFlow implementation of an arbitrary order Factorization Machine},\nyear = {2016},\npublisher = {GitHub},\njournal = {GitHub repository},\nhowpublished = {\\url{https://github.com/geffy/tffm}},\n}\n```\n","funding_links":[],"categories":["Machine Learning","Tensorflow Utilities","Jupyter Notebook"],"sub_categories":["Kernel Methods"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgeffy%2Ftffm","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgeffy%2Ftffm","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgeffy%2Ftffm/lists"}