Owing to its superior performance, the Transformer model, built on the encoder-decoder paradigm, has become the mainstream model in natural language processing. Meanwhile, bioinformatics has embraced machine learning, leading to remarkable progress in drug design and protein property prediction. Cell-penetrating peptides (CPPs) are short membrane-permeable peptides that serve as a convenient 'postman' in drug delivery: they enable the uptake of only the intended macromolecules into cells, without other potentially harmful materials found in the drug. However, only a few CPPs have been discovered to date, limiting their practical application in drug delivery.
Most previous studies have relied on conventional machine-learning techniques and hand-crafted features to build simple classifiers. We therefore constructed CPPFormer, which adopts the attention structure of the Transformer, redesigns the network for the characteristically short length of CPPs, and combines an automatic feature extractor with a small set of manually engineered features to jointly direct the prediction. Empirical results show that, compared with all previous methods and several classic text-classification models, the proposed deep model achieves the best performance, with an accuracy of 92.16% on the CPP924 dataset, and performs consistently across a range of evaluation metrics.
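The core idea of combining attention-based automatic feature extraction with a few hand-crafted peptide descriptors can be sketched as follows. This is a minimal illustrative example, not the CPPFormer implementation: the single-head, weight-free self-attention, the choice of descriptors (normalized length and cationic-residue fraction), and the mean pooling are all simplifying assumptions made here for clarity.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot(seq):
    """Encode a peptide as a (len, 20) one-hot matrix."""
    m = np.zeros((len(seq), 20))
    for i, aa in enumerate(seq):
        m[i, AA_INDEX[aa]] = 1.0
    return m

def self_attention(x):
    """Single-head scaled dot-product self-attention with no learned
    projections (a simplification of the Transformer attention block)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    # Row-wise softmax over attention scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

def hand_crafted(seq):
    """Two toy hand-crafted descriptors (assumed for illustration):
    normalized length and fraction of cationic residues (K/R)."""
    cationic = sum(seq.count(aa) for aa in "KR") / len(seq)
    return np.array([len(seq) / 30.0, cationic])

def feature_vector(seq):
    """Concatenate attention-pooled sequence features with the
    hand-crafted descriptors, yielding one joint feature vector."""
    attended = self_attention(one_hot(seq))
    pooled = attended.mean(axis=0)                       # shape (20,)
    return np.concatenate([pooled, hand_crafted(seq)])   # shape (22,)

vec = feature_vector("GRKKRRQRRRPPQ")  # the HIV-1 TAT peptide, a classic CPP
```

In a full model, the resulting joint vector would feed a classification head trained to distinguish CPPs from non-CPPs; the sketch only shows how learned and engineered features can be fused in one representation.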