https://github.com/jonathan-cook235/misinformationdetection
Combining Temporal Graph Attention Networks and Hawkes Processes to model the dissemination of misinformation on social media as a time series of propagation events.
- Host: GitHub
- URL: https://github.com/jonathan-cook235/misinformationdetection
- Owner: jonathan-cook235
- Created: 2020-07-21T11:44:49.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2021-03-02T13:24:38.000Z (about 4 years ago)
- Last Synced: 2025-01-15T22:35:34.992Z (4 months ago)
- Language: Python
- Size: 490 KB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
README
# MisinformationDetection
Combining Temporal Graph Attention Networks and Hawkes Processes to model the dissemination of misinformation via social media as a time-series event propagation.## Dataset
We use the Twitter15/16 datasets (Ma et al.), the standard benchmarks for evaluating state-of-the-art misinformation detection models. These, along with the extra crawled Twitter features, can be found in the rumor_detection_acl2017 directory here: https://drive.google.com/drive/u/0/folders/1x1aywVkcoArlKiZjLN_If_YcspAI39Np
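As a rough illustration, cascade labels can be read from the release's label files. The sketch below is not the repository's loading code; it assumes each line of `label.txt` has the form `label:tweet_id`, which may need adjusting to the actual layout of the download.

```python
# Hypothetical loader for the Twitter15/16 label files.
# Assumes each line of label.txt has the form "label:tweet_id"
# (e.g. "false:553588178687655936"); adjust if the release differs.
from pathlib import Path

def load_labels(label_path: str) -> dict[str, str]:
    labels = {}
    for line in Path(label_path).read_text().splitlines():
        if not line.strip():
            continue
        label, tweet_id = line.strip().split(":", 1)
        labels[tweet_id] = label
    return labels

if __name__ == "__main__":
    labels = load_labels("rumor_detection_acl2017/twitter15/label.txt")
    print(f"{len(labels)} labelled cascades")
```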
## Dependent packages

'misinformation.yml' is an export of the conda environment used; it can be recreated with `conda env create -f misinformation.yml`.
## Preprocessing

Data preparation is performed in utils.py, text_preprocessing.py and dataset.py. This code was largely built by the authors of 'Fake News Detection Using Machine Learning on Graphs'. We extend dataset.py to build a dynamic graph that captures how the misinformation propagates over time (see the sketch below).
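For intuition only, the following sketch shows one way a retweet cascade could be turned into a chronologically ordered stream of (source, destination, timestamp) events, which is the form a dynamic graph model consumes. The cascade format and field names are assumptions for illustration, not the actual parsing logic in dataset.py.

```python
# Minimal sketch: convert a retweet cascade into a time-ordered event list
# (src_node, dst_node, timestamp) so the graph can be grown event by event.
# The input format (parent_user, child_user, delay) is an assumption.
from typing import NamedTuple

class Event(NamedTuple):
    src: int
    dst: int
    t: float

def cascade_to_events(cascade: list[tuple[int, int, float]]) -> list[Event]:
    """Sort retweet edges by time to obtain the propagation as an event stream."""
    events = [Event(src, dst, t) for src, dst, t in cascade]
    return sorted(events, key=lambda e: e.t)

if __name__ == "__main__":
    # toy cascade: user 0 posts, users 1-3 retweet at increasing delays
    toy = [(0, 2, 5.0), (0, 1, 1.5), (1, 3, 7.2)]
    for e in cascade_to_events(toy):
        print(e)
```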
## Scripts

'encoder_decoder.py' combines the encoder and decoder. This architecture is explained here: https://nlp.seas.harvard.edu/2018/04/03/attention.html. The encoder is built from a temporal graph sum, which produces a set of node embeddings; this method comes from the paper 'Temporal Graph Networks for Deep Learning on Dynamic Graphs'. The veracity prediction task then performs graph learning on the final state of the dynamic graph to predict the veracity of the source claim; this method comes from the paper 'GCAN: Graph-aware Co-Attention Networks for Explainable Fake News Detection on Social Media'.

'train_gnn.py' is the training script for our model.
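The sketch below is a heavily simplified approximation of such an encoder-decoder setup, not the code in encoder_decoder.py: a sum-style encoder aggregates neighbour messages weighted by a learned function of the time gap into node embeddings, and a decoder pools the final node states to classify the claim's veracity. Layer sizes, the four-class output and all names are illustrative assumptions.

```python
# Illustrative encoder-decoder for veracity prediction (not the repo's implementation).
# Encoder: sum of time-aware neighbour messages, a crude stand-in for a temporal graph sum.
# Decoder: mean-pool the final node embeddings and classify the source claim.
import torch
import torch.nn as nn

class TemporalSumEncoder(nn.Module):
    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        self.msg = nn.Linear(feat_dim + 1, hidden_dim)    # neighbour feature + time gap
        self.update = nn.Linear(feat_dim + hidden_dim, hidden_dim)

    def forward(self, x, edge_index, edge_time):
        # x: [N, feat_dim], edge_index: [2, E] (src -> dst), edge_time: [E]
        src, dst = edge_index
        messages = torch.relu(self.msg(torch.cat([x[src], edge_time.unsqueeze(-1)], dim=-1)))
        agg = torch.zeros(x.size(0), messages.size(-1), device=x.device)
        agg.index_add_(0, dst, messages)                  # sum incoming messages per node
        return torch.relu(self.update(torch.cat([x, agg], dim=-1)))

class VeracityDecoder(nn.Module):
    def __init__(self, hidden_dim: int, num_classes: int = 4):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, node_emb):
        return self.classifier(node_emb.mean(dim=0))      # graph-level veracity logits

if __name__ == "__main__":
    x = torch.randn(5, 16)                                # 5 nodes, 16-dim toy features
    edge_index = torch.tensor([[0, 0, 1, 2], [1, 2, 3, 4]])
    edge_time = torch.tensor([1.0, 2.5, 4.0, 6.0])
    enc, dec = TemporalSumEncoder(16, 32), VeracityDecoder(32)
    logits = dec(enc(x, edge_index, edge_time))
    print(logits.shape)                                   # torch.Size([4])
```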
The files 'Week # Model Implementation.ipynb' provide code breakdowns for my supervisory research partner Qiang Zhang (UCL Centre for Artificial Intelligence).
'm2dne' contains the code for the temporal point process-based dynamic graph model used in the misinformation experiments.
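Since the project couples temporal graph attention with Hawkes processes, the sketch below shows the standard exponential-kernel Hawkes conditional intensity, lambda(t) = mu + sum over past events of alpha * exp(-beta * (t - t_i)), which is the self-exciting dynamic such point-process models capture. The parameter values are arbitrary and the code is not taken from m2dne.

```python
# Standard exponential-kernel Hawkes conditional intensity (illustrative only):
#   lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
# Each past event (e.g. a retweet) temporarily raises the rate of future events.
import math

def hawkes_intensity(t: float, history: list[float],
                     mu: float = 0.2, alpha: float = 0.8, beta: float = 1.0) -> float:
    excitation = sum(alpha * math.exp(-beta * (t - ti)) for ti in history if ti < t)
    return mu + excitation

if __name__ == "__main__":
    events = [0.5, 1.2, 1.3, 4.0]          # toy event times (e.g. retweet timestamps)
    for t in (1.0, 2.0, 5.0):
        print(f"lambda({t}) = {hawkes_intensity(t, events):.3f}")
```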