Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/emaadmanzoor/time-sensitive-influence-maximisation
Machine learning course project at KAUST
- Host: GitHub
- URL: https://github.com/emaadmanzoor/time-sensitive-influence-maximisation
- Owner: emaadmanzoor
- Created: 2014-04-17T11:52:45.000Z (over 10 years ago)
- Default Branch: master
- Last Pushed: 2014-05-07T13:15:31.000Z (over 10 years ago)
- Last Synced: 2024-04-16T01:44:41.175Z (8 months ago)
- Language: C++
- Size: 14.2 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
Time Sensitive Influence Maximisation
=====================================

Project for the CS229 (machine learning) course at KAUST.
## Abstract
In a diffusion network, the estimated likelihood that one node influences another depends on the transmission rate of the influencer node. Existing approaches learn the transmission rates and influence likelihoods from data, but treat the transmission rate as time-invariant. This project instead models the transmission rate as a distribution over time, specifically over the 24 hours of a day. New influence **estimation** and **maximisation** algorithms are proposed under this model, implemented as modifications of **ConTinEst** and **InfluMax**, respectively.
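As a rough illustration of this modelling change (a minimal sketch, not the project's actual code; the `HourlyRate` struct and `sample_delay` function are hypothetical names), a node's transmission rate can be stored as one value per hour of the day and used to draw edge transmission delays:

```cpp
#include <array>
#include <random>

// One transmission rate per hour of the day, replacing the single
// time-invariant rate of the standard continuous-time diffusion model.
struct HourlyRate {
    std::array<double, 24> alpha;  // alpha[h] > 0: rate in effect at hour h
};

// Draw a transmission delay along an edge, using the rate associated with
// the hour of day at which the source node became infected.
double sample_delay(const HourlyRate& r, double infection_time, std::mt19937& rng) {
    int hour = static_cast<int>(infection_time) % 24;  // time measured in hours
    std::exponential_distribution<double> delay(r.alpha[hour]);
    return delay(rng);
}
```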
## Implementation

This project modifies the influence propagation model so that node transmission rates are not static but are distributions over the 24 hours of a day. The approach is composed of two parts:

1. Influence estimation, built on [ConTinEst](http://www.cc.gatech.edu/~ndu8/DuSonZhaMan-NIPS-2013.html).
2. Influence maximisation, built on [InfluMax](http://people.tuebingen.mpg.de/manuelgr/influmax/).
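The sketch below illustrates both steps in a deliberately naive form, assuming the per-hour rates above: a Monte Carlo estimator of expected influence within a time window `T`, and the usual greedy seed selector. It does not reproduce the ConTinEst estimator or the InfluMax optimisation; names such as `simulate`, `estimate`, and `greedy_seeds` are hypothetical.

```cpp
#include <algorithm>
#include <array>
#include <functional>
#include <limits>
#include <queue>
#include <random>
#include <utility>
#include <vector>

struct Edge { int to; std::array<double, 24> alpha; };  // per-hour transmission rates
using Graph = std::vector<std::vector<Edge>>;           // adjacency list

// One stochastic simulation: sample edge delays and count the nodes whose
// earliest infection time falls inside the observation window T.
int simulate(const Graph& g, const std::vector<int>& seeds, double T, std::mt19937& rng) {
    std::vector<double> t(g.size(), std::numeric_limits<double>::infinity());
    using QE = std::pair<double, int>;
    std::priority_queue<QE, std::vector<QE>, std::greater<QE>> q;
    for (int s : seeds) { t[s] = 0.0; q.push({0.0, s}); }
    int reached = 0;
    while (!q.empty()) {
        auto [ti, u] = q.top();
        q.pop();
        if (ti > t[u]) continue;  // stale queue entry
        ++reached;
        for (const Edge& e : g[u]) {
            int hour = static_cast<int>(ti) % 24;  // rate depends on the hour of infection
            std::exponential_distribution<double> delay(e.alpha[hour]);
            double tj = ti + delay(rng);
            if (tj < T && tj < t[e.to]) { t[e.to] = tj; q.push({tj, e.to}); }
        }
    }
    return reached;
}

// Monte Carlo estimate of the expected influence of a seed set within T.
double estimate(const Graph& g, const std::vector<int>& seeds, double T,
                int samples, std::mt19937& rng) {
    double total = 0.0;
    for (int i = 0; i < samples; ++i) total += simulate(g, seeds, T, rng);
    return total / samples;
}

// Greedy influence maximisation: repeatedly add the node with the largest
// marginal gain in estimated influence (the standard submodular greedy scheme).
std::vector<int> greedy_seeds(const Graph& g, int k, double T, int samples, std::mt19937& rng) {
    std::vector<int> seeds;
    for (int i = 0; i < k; ++i) {
        int best = -1;
        double best_val = -1.0;
        for (int v = 0; v < static_cast<int>(g.size()); ++v) {
            if (std::find(seeds.begin(), seeds.end(), v) != seeds.end()) continue;
            std::vector<int> candidate = seeds;
            candidate.push_back(v);
            double val = estimate(g, candidate, T, samples, rng);
            if (val > best_val) { best_val = val; best = v; }
        }
        seeds.push_back(best);
    }
    return seeds;
}
```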
## Data

Both ConTinEst and InfluMax provide toy data and synthetic data generators. I will evaluate the proposed algorithms on these toy and synthetic data sets, and, if time permits, on a real dataset from Twitter.
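Until those generators are wired in, a toy random network with random per-hour rates can stand in for a quick end-to-end check. The `random_network` helper below is hypothetical and reuses the `Graph`/`Edge` types and the `estimate`/`greedy_seeds` functions from the sketch above; it does not reproduce the ConTinEst or InfluMax data formats.

```cpp
#include <iostream>
#include <random>
#include <vector>

// Sparse random graph with one random transmission rate per hour on each edge.
Graph random_network(int n, double edge_prob, std::mt19937& rng) {
    Graph g(n);
    std::bernoulli_distribution keep(edge_prob);
    std::uniform_real_distribution<double> rate(0.1, 1.0);
    for (int u = 0; u < n; ++u) {
        for (int v = 0; v < n; ++v) {
            if (u == v || !keep(rng)) continue;
            Edge e;
            e.to = v;
            for (double& a : e.alpha) a = rate(rng);  // one random rate per hour
            g[u].push_back(e);
        }
    }
    return g;
}

int main() {
    std::mt19937 rng(42);
    Graph g = random_network(/*n=*/50, /*edge_prob=*/0.1, rng);
    // Pick 3 seeds maximising estimated influence within a 24-hour window.
    std::vector<int> seeds = greedy_seeds(g, /*k=*/3, /*T=*/24.0, /*samples=*/200, rng);
    std::cout << "estimated spread: " << estimate(g, seeds, 24.0, 200, rng) << "\n";
    return 0;
}
```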