Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
This repository lets you explore, visualize and understand what data Facebook has about you.
- Host: GitHub
- URL: https://github.com/avilum/facebook-archive-analyzer
- Owner: avilum
- License: mit
- Created: 2021-02-01T22:11:10.000Z (almost 4 years ago)
- Default Branch: master
- Last Pushed: 2021-03-10T16:46:40.000Z (over 3 years ago)
- Last Synced: 2024-10-12T11:46:15.848Z (25 days ago)
- Topics: data-visualization, docker, docker-compose, elasticsearch, facebook-archive, kibana, logstash
- Language: Dockerfile
- Homepage:
- Size: 16.6 KB
- Stars: 21
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Facebook ZIP Analyzer - Explore Your Facebook Data Using ELK
This repository lets you explore, visualize and understand what data Facebook has about you.
All you need to do is go to the "Download Your Information" page and download your private archive in JSON format: https://facebook.com/dyi/
## Setup
After you have downloaded the archive, unzip it into the folder "my_data" so that the directory tree looks like this:
```
mkdir my_data
# unzip your archive to my_data directory
# The repo tree should be:
..
docker-compose.yml
/my_data/facebook-/about_you
/my_data/facebook-/ads_and_businesses
...
```
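The steps above can be sketched as a few shell commands. The archive filename "facebook-export.zip" is a placeholder for whatever name Facebook gives your download; replace it with your own:

```shell
# Create the directory the stack reads from
mkdir -p my_data

# Unzip the downloaded archive into it, if present
# ("facebook-export.zip" is a placeholder filename)
if [ -f facebook-export.zip ]; then
  unzip -o facebook-export.zip -d my_data
fi

# Show the directory tree Logstash will read from
find my_data -maxdepth 2 -type d | sort
```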
Then, install Docker and docker-compose locally.
Now simply run the stack and explore your data.
The files you just added to the "my_data" directory will be streamed into the stack by Logstash, indexed in Elasticsearch, and visualized in Kibana.
```
# Optional - Install Docker:
# sudo snap install docker
# (https://docs.docker.com/get-docker/)
# Optional - Install Compose:
# https://docs.docker.com/compose/install/
# Run the stack:
docker-compose up
```
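For orientation, a stack like this is typically wired together along the following lines. This is only a sketch, not the repository's actual docker-compose.yml: the image tags, mount paths, and pipeline directory are assumptions, with Kibana on port 5601 as described in this README:

```yaml
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.1  # version tag is an assumption
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  logstash:
    image: docker.elastic.co/logstash/logstash:7.10.1
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline  # pipeline configs
      - ./my_data:/my_data:ro                             # your unzipped archive
    depends_on:
      - elasticsearch
  kibana:
    image: docker.elastic.co/kibana/kibana:7.10.1
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```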
## Accessing The Dashboard
Your data will now be streamed in, and within a minute or so you should see it in Kibana at http://localhost:5601/.
## See The Stack Logs
```
docker-compose logs -f
```
## Add More Data To Logstash
You can add more files to the pipeline by adding pipeline configuration files under /logstash/pipeline/, adjusting the file path and the field names to split.
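Such a pipeline file would look roughly like this. The filename, the glob path, and the split field "status_updates" are hypothetical examples for illustration, not the repository's actual configuration:

```
# Hypothetical file: /logstash/pipeline/posts.conf
input {
  file {
    path => "/my_data/facebook-*/posts/*.json"   # assumed glob; adjust to your tree
    start_position => "beginning"
    sincedb_path => "/dev/null"                  # re-read files on every restart
    codec => "json"
  }
}
filter {
  # Split one JSON document containing an array into one event per element;
  # the field name depends on the file you are ingesting
  split { field => "status_updates" }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]              # compose service name assumed
    index => "facebook-posts"
  }
}
```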