https://github.com/scionoftech/textclassification-bert
A Jupyter notebook: text classification using BERT (Bidirectional Encoder Representations from Transformers)
- Host: GitHub
- URL: https://github.com/scionoftech/textclassification-bert
- Owner: scionoftech
- Created: 2019-10-11T12:25:43.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2019-10-11T12:30:18.000Z (almost 6 years ago)
- Last Synced: 2025-05-12T22:17:36.264Z (5 months ago)
- Topics: bert, bert-server, rnn-lstm, text-classificaiton
- Language: Jupyter Notebook
- Size: 1.9 MB
- Stars: 5
- Watchers: 1
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
## BBC Text MultiClass Classification - BERT: Bidirectional Encoder Representations from Transformers
A Jupyter notebook: text classification using BERT (Bidirectional Encoder Representations from Transformers)
BERT stands for Bidirectional Encoder Representations from Transformers. It is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of NLP tasks.
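In practice, that "one additional output layer" can be as small as a single softmax layer over BERT's 768-dimensional sentence representations. The snippet below is a minimal illustrative sketch, assuming the embeddings have already been extracted (for example via the BERT server described further down); it is not necessarily the exact model used in this notebook.

```python
# Minimal sketch: one softmax output layer on top of 768-d BERT sentence
# embeddings (illustrative only; assumptions noted in the comments).
import tensorflow as tf

NUM_CLASSES = 5          # e.g. the five BBC news categories
EMBEDDING_DIM = 768      # pooled output size of bert_uncased_L-12_H-768_A-12

model = tf.keras.Sequential([
    tf.keras.Input(shape=(EMBEDDING_DIM,)),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# x: (n_samples, 768) BERT embeddings, y: integer class labels
# model.fit(x, y, epochs=5, batch_size=32)
```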
Find more details about Bidirectional Encoder Representations from Transformers at [BERT](https://tfhub.dev/google/bert_uncased_L-12_H-768_A-12/1)
### Get BERT server and client
#### server
```bash
$ pip install bert-serving-server
```
#### client, independent of bert-serving-server
```bash
$ pip install bert-serving-client
```
#### download BERT
```bash
$ wget https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip && unzip uncased_L-12_H-768_A-12.zip
```
#### Run BERT Server
```bash
$ bert-serving-start -model_dir uncased_L-12_H-768_A-12/ -num_worker=2 -max_seq_len 50
```
Dataset link: [bbc-fulltext-and-category](https://www.kaggle.com/yufengdev/bbc-fulltext-and-category)
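As a rough end-to-end sketch (assuming the server above is running locally and that the Kaggle CSV exposes `text` and `category` columns; the exact file names and the classifier used in the notebook may differ), the client can encode the BBC articles and feed the resulting vectors to any off-the-shelf classifier:

```python
# Sketch only: encode BBC texts via the running BERT server, then train a
# simple classifier on the 768-d sentence vectors.
import pandas as pd
from bert_serving.client import BertClient
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# "bbc-text.csv" and its column names are assumptions about the Kaggle dataset.
df = pd.read_csv("bbc-text.csv")
texts, labels = df["text"].tolist(), df["category"].tolist()

bc = BertClient()          # connects to localhost on the default ports
X = bc.encode(texts)       # shape: (n_samples, 768); inputs truncated to max_seq_len

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=42)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```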