https://github.com/dillondaudert/pssp_lstm
Recurrent neural network implementations for protein secondary structure prediction and language models
- Host: GitHub
- URL: https://github.com/dillondaudert/pssp_lstm
- Owner: dillondaudert
- License: mit
- Created: 2018-02-27T20:07:57.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2018-09-18T02:19:45.000Z (over 6 years ago)
- Last Synced: 2025-03-24T10:05:22.955Z (about 2 months ago)
- Topics: amino-acid-sequence, deep-learning, deep-neural-networks, jupyter-notebook, language-models, lstm, paper, prediction, pretrained-models, protein, python3, recurrent-neural-networks, rnn, secondary, structure, structure-prediction, tensorflow, unsupervised-learning
- Language: Python
- Homepage:
- Size: 526 KB
- Stars: 7
- Watchers: 2
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Recurrent Neural Networks for Protein Secondary Structure Prediction
This repo contains ongoing work exploring recurrent neural network models for protein secondary structure prediction. The [pssp_lstm](./pssp_lstm/) module contains an implementation of the LSTM RNN specified in [Sonderby & Winther, 2015](https://arxiv.org/pdf/1412.7828.pdf). See the README in that folder for more details and a user guide.
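For orientation, here is a minimal sketch of the kind of bidirectional LSTM architecture such a model uses, written with the TensorFlow Keras API. It is illustrative only, not the implementation in this repo: the layer size, vocabulary size, and 8-state label set are assumptions, not the configuration from Sonderby & Winther, 2015.

```python
import tensorflow as tf

NUM_AMINO_ACIDS = 21   # 20 standard residues + padding/unknown token (assumption)
NUM_CLASSES = 8        # Q8 secondary structure states (assumption)

# Variable-length sequences of one-hot encoded residues.
inputs = tf.keras.Input(shape=(None, NUM_AMINO_ACIDS))
x = tf.keras.layers.Masking(mask_value=0.0)(inputs)            # skip padded positions
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(128, return_sequences=True))(x)       # per-residue hidden states
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)  # label each residue

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```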
The [lm_pretrain](./lm_pretrain/) module allows users to train bidirectional language models that can be combined with bidirectional RNNs for protein secondary structure prediction. See the README in that folder for more details and a user guide.
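The sketch below shows the general idea behind one direction of such a language model, again as a hypothetical example rather than the `lm_pretrain` code: an LSTM is trained to predict the next amino acid at each position, a backward model is trained analogously, and the resulting hidden states can serve as extra features for a secondary structure BDRNN. The vocabulary size and layer widths are assumptions.

```python
import tensorflow as tf

VOCAB_SIZE = 21   # amino-acid vocabulary size, including padding (assumption)

tokens = tf.keras.Input(shape=(None,), dtype="int32")           # integer-encoded residues
x = tf.keras.layers.Embedding(VOCAB_SIZE, 64, mask_zero=True)(tokens)
hidden = tf.keras.layers.LSTM(256, return_sequences=True)(x)    # reusable hidden states
next_aa = tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax")(hidden)

lm = tf.keras.Model(tokens, next_aa)
lm.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# Targets are the input sequence shifted one position to the left, so the
# model learns p(residue_{t+1} | residue_1..t).
```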
## Further Work
This repo is under development. Current work focuses on expanding the functionality of `lm_pretrain` to allow for more flexible models and on exploring different ways of integrating pretrained LMs into BDRNNs. Work on implementing models from other papers is currently paused.