uci-cbcl/DanQ

Name: DanQ

Owner: Computational Biology and Computational Learning @ UCI

Description: A hybrid convolutional and recurrent neural network for predicting the function of DNA sequences

Created: 2015-12-18 22:59:42.0

Updated: 2018-01-04 13:38:48.0

Pushed: 2017-01-14 00:06:06.0

Homepage: null

Size: 58271

Language: Groff

README

README for DanQ

DanQ is a hybrid convolutional and recurrent neural network model for predicting the function of DNA de novo from sequence.
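For orientation, DanQ stacks a convolution/max-pooling stage on one-hot encoded 1000 bp sequences, feeds it into a bidirectional LSTM, and finishes with two dense layers predicting 919 chromatin features. Below is a minimal sketch in modern Keras; the repository itself targets the older Keras 0.x + seya stack, and the exact hyperparameters live in DanQ_train.py, so treat the layer sizes (taken from the paper) as approximate.

# Minimal sketch of the DanQ architecture (modern tf.keras; sizes approximate).
from tensorflow.keras import layers, models

def build_danq(seq_len=1000, n_targets=919):
    inputs = layers.Input(shape=(seq_len, 4))                 # one-hot DNA (A/C/G/T)
    x = layers.Conv1D(320, kernel_size=26, activation='relu')(inputs)
    x = layers.MaxPooling1D(pool_size=13, strides=13)(x)
    x = layers.Dropout(0.2)(x)
    x = layers.Bidirectional(layers.LSTM(320, return_sequences=True))(x)
    x = layers.Dropout(0.5)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(925, activation='relu')(x)
    outputs = layers.Dense(n_targets, activation='sigmoid')(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer='rmsprop', loss='binary_crossentropy')
    return model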

Citing DanQ

Quang, D. and Xie, X. "DanQ: a hybrid convolutional and recurrent neural network for predicting the function of DNA sequences", NAR, 2015.

INSTALL

DanQ uses a lot of bleeding edge software packages, and very often these software packages are not backwards compatible when they are updated. Therefore, I have included the most recent version numbers of the software packages for the configuration that worked for me. For the record, I am using Ubuntu Linux 14.04 LTS with an NVIDIA Titan Z GPU.
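If you want to record the versions in your own environment for comparison, a quick check (assuming the packages below are already installed) is:

import numpy, scipy, h5py, theano, keras
for pkg in (numpy, scipy, h5py, theano, keras):
    print(pkg.__name__, pkg.__version__)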

Required

git clone git://github.com/Theano/Theano.git
cd Theano
python setup.py develop
tar zxvf DanQ_seya.tar.gz
cd DanQ_seya
python setup.py install

I will likely improve DanQ soon and drop the dependency on seya.

Optional

USAGE

You need to first download the training, validation, and testing sets from DeepSEA. You can download the datasets from here. After you have extracted the contents of the tar.gz file, move the 3 .mat files into the data/ folder.
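As a quick sanity check that the files are in place, you can open them directly. The key names below (trainxdata/traindata, validxdata/validdata) are an assumption based on the standard DeepSEA release, so list the keys yourself if they differ; train.mat is a MATLAB v7.3 (HDF5) file, hence h5py, while the smaller valid.mat and test.mat typically load with scipy.io.loadmat.

# Hedged sketch: inspect the DeepSEA .mat files after moving them into data/.
import h5py
import scipy.io

trainmat = h5py.File('data/train.mat', 'r')
print(list(trainmat.keys()))          # expected to include 'traindata' and 'trainxdata'

validmat = scipy.io.loadmat('data/valid.mat')
print({k: v.shape for k, v in validmat.items() if not k.startswith('__')})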

If you have everything installed, you can train a model as follows:

python DanQ_train.py

On my system, each epoch took about 6 hours. Whenever the validation loss reaches a new minimum at the end of a training epoch, the best weights are stored in DanQ_bestmodel.hdf5. I've already uploaded the fully trained model at the hyperlink. You can see motif results, including visualizations and TOMTOM comparisons to known motifs, in the motifs/ folder. Likewise, you can also train a much larger model in which about half of the motifs are initialized with JASPAR motifs:

python DanQ-JASPAR_train.py

Weights are saved to the file DanQ-JASPAR_bestmodel.hdf5 whenever the validation loss is lowered. Motif results for this model are also stored in the motifs/ folder.
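The "store the best weights whenever the validation loss reaches a new minimum" behaviour for both models corresponds to a standard Keras ModelCheckpoint callback. A minimal sketch (the file name comes from this README; the batch size and epoch count are placeholders, the real values live in the training scripts):

from tensorflow.keras.callbacks import ModelCheckpoint

checkpoint = ModelCheckpoint('DanQ_bestmodel.hdf5', monitor='val_loss',
                             save_best_only=True, verbose=1)
# model.fit(X_train, y_train, batch_size=100, epochs=60,
#           validation_data=(X_valid, y_valid), callbacks=[checkpoint])

For the DanQ-JASPAR variant, initializing "about half of the motifs" with JASPAR motifs can be pictured as copying position weight matrices into some of the first convolution layer's kernels. A hedged sketch in modern tf.keras (the original Keras 0.x weight layout differs, and the real script also handles reading and scaling the JASPAR files):

def seed_conv_with_pwms(conv_layer, pwms):
    # tf.keras Conv1D kernel shape: (kernel_width, 4, n_kernels)
    kernel, bias = conv_layer.get_weights()
    for i, pwm in enumerate(pwms):        # each pwm: a (motif_length, 4) array
        L = min(kernel.shape[0], pwm.shape[0])
        kernel[:L, :, i] = pwm[:L, :]
    conv_layer.set_weights([kernel, bias])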

For your convenience, I've posted the current ROC AUC and PR AUC statistics comparing DanQ and DanQ-JASPAR with DeepSEA.

If you do not want to train a model from scratch and just want to do predictions, I've included test scripts for both models and the file example.h5 in the data folder. This is the same hdf5 file that is generated using the example from the DeepSEA package. The test scripts here have the same input and output formats as the prediction script from DeepSEA, so you can replace the prediction step of the DeepSEA pipeline (i.e. the 2_DeepSEA.lua script) with the test scripts here:

python DanQ_test.py data/example.h5 data/example_DanQ_pred.h5
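The output is an HDF5 file. If you just want to peek at what the test script wrote, the dataset names inside the file aren't documented here, so list them rather than assuming one:

import h5py

with h5py.File('data/example_DanQ_pred.h5', 'r') as f:
    for name in f:
        obj = f[name]
        print(name, getattr(obj, 'shape', ''), getattr(obj, 'dtype', ''))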

To-Do

