Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dalindev/tweepy
data mining using tweepy, SA using NLTK
- Host: GitHub
- URL: https://github.com/dalindev/tweepy
- Owner: dalindev
- License: mit
- Created: 2018-11-07T14:00:20.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2019-05-17T14:28:37.000Z (over 5 years ago)
- Last Synced: 2024-08-01T13:37:38.694Z (3 months ago)
- Topics: data-mining, sentiment-analysis
- Language: Python
- Homepage:
- Size: 438 KB
- Stars: 11
- Watchers: 1
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
README
## What is this?
* Data mining using tweepy (search by keywords, e.g. 'trump')
* Sentiment analysis using NLTK, TextBlob, etc.
* Word frequency counts from a batch of Twitter messages (with stopword removal and stemming); see the sketch below
* Calling methods through an HTTP API, for example:
```
localhost:5000/search?query=trump&tweet_limit=100&word_freq=20
```
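As a rough sketch of the processing steps above (not the repository's actual code), the word-frequency and sentiment parts could look like this with NLTK and TextBlob; the function names are hypothetical.

```
# Sketch only: stopword removal, stemming, word-frequency counting and
# sentiment scoring for a batch of tweet texts. Names are hypothetical.
from collections import Counter

from nltk.corpus import stopwords        # needs nltk.download('stopwords')
from nltk.stem import PorterStemmer
from nltk.tokenize import word_tokenize  # needs nltk.download('punkt')
from textblob import TextBlob

STOPWORDS = set(stopwords.words("english"))
STEMMER = PorterStemmer()


def word_frequencies(tweets, top_n=20):
    """Count stemmed, non-stopword tokens across a batch of tweet texts."""
    counter = Counter()
    for text in tweets:
        tokens = word_tokenize(text.lower())
        counter.update(
            STEMMER.stem(tok)
            for tok in tokens
            if tok.isalpha() and tok not in STOPWORDS
        )
    return counter.most_common(top_n)


def average_polarity(tweets):
    """Average TextBlob sentiment polarity (-1.0 .. 1.0) over the batch."""
    if not tweets:
        return 0.0
    return sum(TextBlob(t).sentiment.polarity for t in tweets) / len(tweets)
```

Given the example URL, the `/search` endpoint would presumably fetch up to `tweet_limit` tweets for `query` and return the top `word_freq` terms plus a sentiment summary built from helpers like these.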
TODO:
1. MySQL for storage
2. Front-end page
3. Get all tweets from a single user
4. Proper auth

## Mining Twitter Data with Python + MySQL!
---
### To run you need:
1. python 3.7.0
2. MySQL
3. Twitter account with consumer key and access token (get approved by Twitter first)

Example of an `app_config.py` file:
### OAuth Authentication
```
consumer_key=""
consumer_secret=""
access_token=""
access_token_secret=""
```
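With those values defined in `app_config.py`, authentication in tweepy 3.x (the version current when this repo was written) generally looks like the sketch below; this is an illustration, not necessarily the project's exact code (in tweepy 4 the search call is `api.search_tweets`).

```
# Sketch: authenticate with the Twitter API using the keys from app_config.py.
import tweepy

import app_config as cfg

auth = tweepy.OAuthHandler(cfg.consumer_key, cfg.consumer_secret)
auth.set_access_token(cfg.access_token, cfg.access_token_secret)
api = tweepy.API(auth, wait_on_rate_limit=True)

# Example: fetch up to 100 tweets matching a keyword (tweepy 3.x API).
tweets = [status.text for status in tweepy.Cursor(api.search, q="trump").items(100)]
```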
### Database

```
HOST = "127.0.0.1"
USER = "root"
PASSWD = "mypwd"
DATABASE = "twt_database"
```
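The database code itself is not shown in this README; as an illustration, a connection with the settings above could be opened with a driver such as PyMySQL (the project's actual driver and schema may differ, and the table below is an assumption).

```
# Sketch: store tweets in MySQL using the settings above.
# PyMySQL and the tweets table are assumptions, not the project's code.
import pymysql

import db_config as cfg  # hypothetical module holding HOST, USER, PASSWD, DATABASE

conn = pymysql.connect(
    host=cfg.HOST,
    user=cfg.USER,
    password=cfg.PASSWD,
    database=cfg.DATABASE,
    charset="utf8mb4",
)

with conn.cursor() as cur:
    cur.execute(
        """CREATE TABLE IF NOT EXISTS tweets (
               id BIGINT PRIMARY KEY,
               text TEXT,
               polarity FLOAT
           )"""
    )
    cur.execute(
        "INSERT IGNORE INTO tweets (id, text, polarity) VALUES (%s, %s, %s)",
        (1234567890, "example tweet text", 0.12),
    )
conn.commit()
conn.close()
```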
### Installation

1. We need tweepy.
2. We need TextBlob for sentiment analysis (note that the full NLTK data download is about 3.5 GB).

```
pip install tweepy
pip install nltk
pip install textblob

$ python
>>> import nltk
>>> nltk.download()
```

We have to select “all” and click download.
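After the download finishes, a quick check from the Python shell confirms that TextBlob sentiment works; this is just a usage example, not part of the repository.

```
# Quick sanity check of TextBlob sentiment after installing NLTK data.
from textblob import TextBlob

blob = TextBlob("I love this library, it works great!")
print(blob.sentiment)           # Sentiment(polarity=..., subjectivity=...)
print(blob.sentiment.polarity)  # value in [-1.0, 1.0]
```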