https://github.com/gsurma/twitter_data_parser
Python scripts that download metadata and tweets for given users.
- Host: GitHub
- URL: https://github.com/gsurma/twitter_data_parser
- Owner: gsurma
- License: mit
- Created: 2018-03-31T09:00:47.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2021-07-09T08:57:55.000Z (over 4 years ago)
- Last Synced: 2025-07-04T08:42:44.036Z (7 months ago)
- Topics: data, machine-learning, parser, python, python2, twitter, twitter-api
- Language: Python
- Homepage: https://gsurma.github.io
- Size: 42 KB
- Stars: 18
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
README
# Twitter Data Parser
## About
Python (Tweepy-based) scripts for downloading metadata and tweets for given users.
## How to use it
1. Configure API keys in `config.py` (a sketch follows this list)
2. Prepare a list of usernames in `users.txt`
3. Adjust additional params in `config.py`
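
Below is a rough sketch of what `config.py` could contain. The variable names here are assumptions based on a typical Tweepy setup, so match them to what the scripts actually import.

```python
# config.py -- hypothetical layout; the real variable names may differ.

# Twitter API credentials (from https://developer.twitter.com)
CONSUMER_KEY = "your-consumer-key"
CONSUMER_SECRET = "your-consumer-secret"
ACCESS_TOKEN = "your-access-token"
ACCESS_TOKEN_SECRET = "your-access-token-secret"

# Additional parameters (names are illustrative)
USERS_FILE = "users.txt"   # one username per line
OUTPUT_DIR = "output"      # where per-user folders and the combined CSV go
```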
### Account Data
1. `python twitter_account_data_parser.py`
2. For each user, a folder named `{user}` is created with a `{user}_account_data.csv` file inside it.
3. Additionally, a single `account_data.csv` file combining all entries is created in the output folder. It's very useful for further data analysis (see the example below).
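
The combined file lends itself to quick analysis, for example with pandas. The path and the presence of a header row are assumptions, so adjust them to the parser's actual output.

```python
import pandas as pd

# Load the combined account data written by twitter_account_data_parser.py.
# The path and header row are assumptions about the output format.
accounts = pd.read_csv("output/account_data.csv")

print(accounts.shape)   # one row per user, one column per metadata field
print(accounts.head())  # inspect the first few entries
```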
### Tweets
1. `python twitter_tweets_parser.py`
2. For each user, a folder named `{user}` is created with a `{user}_tweets.csv` file inside it.
3. The script skips tweets that have already been downloaded (one possible approach is sketched below).
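
One plausible way to skip already-downloaded tweets is to read the newest tweet ID stored on disk and pass it to Tweepy as `since_id`. This is only a sketch of the idea, not necessarily how `twitter_tweets_parser.py` implements it; the `id` column name and the `config` variable names are assumptions.

```python
import csv
import os

import tweepy

import config  # assumes the credential names from the config.py sketch above


def newest_stored_id(csv_path):
    """Return the highest tweet ID already saved in the CSV, or None if absent."""
    if not os.path.exists(csv_path):
        return None
    with open(csv_path) as f:
        ids = [int(row["id"]) for row in csv.DictReader(f)]  # "id" column is an assumption
    return max(ids) if ids else None


auth = tweepy.OAuthHandler(config.CONSUMER_KEY, config.CONSUMER_SECRET)
auth.set_access_token(config.ACCESS_TOKEN, config.ACCESS_TOKEN_SECRET)
api = tweepy.API(auth, wait_on_rate_limit=True)

user = "gsurma"
since_id = newest_stored_id(os.path.join(user, "{}_tweets.csv".format(user)))

# Only tweets newer than since_id are fetched, so previously saved ones are skipped.
for tweet in tweepy.Cursor(api.user_timeline, screen_name=user,
                           since_id=since_id, tweet_mode="extended").items():
    print(tweet.id, tweet.full_text[:80])
```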
## Author
**Greg (Grzegorz) Surma**
[**PORTFOLIO**](https://gsurma.github.io)
[**GITHUB**](https://github.com/gsurma)
[**BLOG**](https://medium.com/@gsurma)