Fig. 1 | Journal of Cheminformatics

From: InflamNat: web-based database and predictor of anti-inflammatory natural products

The framework of the multi-tokenization transformer model (MTT), which employs multiple sequence-tokenization approaches and multiple transformers to obtain a high-quality representation of sequential data. MTT is composed of three modules: multi-tokenization and pre-training, a multiple-transformer-based encoder, and tokenization-level self-attention. As a feature encoder, MTT combines with the downstream prediction model to form a unified end-to-end neural network learning framework.
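The tokenization-level self-attention step can be illustrated with a minimal sketch: given one fixed-size representation per tokenization view (e.g. character-, BPE-, and word-level encoder outputs), attention weights over the views are computed and used to fuse them into a single vector. All names, shapes, and the scoring scheme below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def tokenization_level_attention(reps, w):
    """Fuse per-tokenization representations into one vector.

    reps: (K, d) array, one d-dim representation per tokenization view
          (hypothetical stand-in for the K transformer encoder outputs).
    w:    (d,) scoring vector (hypothetical learnable parameter).
    Returns the fused (d,) representation and the (K,) attention weights.
    """
    scores = reps @ w            # one scalar score per tokenization view
    alpha = softmax(scores)      # attention weights over the K views
    fused = alpha @ reps         # weighted sum of view representations
    return fused, alpha

rng = np.random.default_rng(0)
reps = rng.normal(size=(3, 8))   # e.g. char-, BPE-, word-level outputs
w = rng.normal(size=8)
fused, alpha = tokenization_level_attention(reps, w)
```

In a trained model, `w` (or a small scoring network) would be learned jointly with the encoders and the downstream prediction head, so the attention can emphasize whichever tokenization view is most informative for a given input.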
