NetGO: Improving large-scale protein function prediction with massive network information

Ronghui You, Shuwei Yao, Yi Xiong, Xiaodi Huang, Fengzhu Sun, Hiroshi Mamitsuka, Shanfeng Zhu

Research output: Contribution to journal › Article › peer-review

35 Citations (Scopus)
54 Downloads (Pure)

Abstract

Automated function prediction (AFP) of proteins is of great significance in biology. AFP can be regarded as a large-scale multi-label classification problem in which a protein can be associated with multiple Gene Ontology (GO) terms as its labels. Building on GOLabeler, a state-of-the-art method in the third Critical Assessment of Functional Annotation (CAFA3), in this paper we propose NetGO, a web server that further improves the performance of large-scale AFP by incorporating massive protein-protein network information. Specifically, NetGO uses network information in three advantageous ways: (i) it relies on a powerful learning-to-rank framework from machine learning to effectively integrate both sequence and network information of proteins; (ii) it uses the massive network information of all species (>2000) in STRING, rather than only a few specific species; and (iii) it can still use network information to annotate a protein by homology transfer, even if the protein is not contained in STRING. Separating training and testing data under the same time-delayed settings as CAFA, we comprehensively examined the performance of NetGO. Experimental results clearly demonstrate that NetGO significantly outperforms GOLabeler and other competing methods. The NetGO web server is freely available at http://issubmission.sjtu.edu.cn/netgo/.
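The learning-to-rank integration described in point (i) can be illustrated with a minimal sketch. NetGO's actual model and features are defined in the paper; the component-predictor names, scores, and fixed weights below are hypothetical stand-ins for a learned ranking model:

```python
# Illustrative sketch only: combines per-GO-term scores from several
# component predictors (e.g. sequence-based and network-based) into a
# single ranking score. A weighted sum stands in for NetGO's learned
# learning-to-rank model; all names and numbers here are made up.

def rank_go_terms(candidate_scores, weights):
    """Return candidate GO terms sorted by a combined score."""
    ranked = []
    for term, scores in candidate_scores.items():
        # A missing component score defaults to 0.0, mirroring the case
        # where a protein has no network information in STRING.
        combined = sum(weights[name] * scores.get(name, 0.0)
                       for name in weights)
        ranked.append((term, combined))
    ranked.sort(key=lambda pair: pair[1], reverse=True)
    return ranked

# Hypothetical component scores for one query protein.
candidates = {
    "GO:0005515": {"sequence": 0.9, "network": 0.7},
    "GO:0003677": {"sequence": 0.4, "network": 0.8},
    "GO:0008150": {"sequence": 0.2},  # no network evidence
}
weights = {"sequence": 0.6, "network": 0.4}  # learned in practice

for term, score in rank_go_terms(candidates, weights):
    print(term, round(score, 2))
```

In the real system the combination is learned from time-delayed CAFA-style training data rather than fixed by hand, and the network component can fall back to homology transfer when the query protein itself is absent from STRING.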
Original language: English
Pages (from-to): 379-387
Number of pages: 9
Journal: Nucleic Acids Research
Volume: 47
Issue number: 1
Early online date: 20 May 2019
DOIs
Publication status: Published - 02 Jul 2019
