Fig. 5 | Journal of Cheminformatics

From: Small molecule autoencoders: architecture engineering to optimize latent space utility and sustainability


Effect of additive optimization. Shown on the left and right as a reference are the best-performing architectures identified during the systematic adjustment of single architectural parameters, trained either on the 50 k subset (left) or on the full set (right). The center shows the performance of the additively optimized models, for both SMILES and SELFIES, trained on the 50 k subset for 50 epochs as well as for 200 epochs (the latter only for the best-performing seed from the 50-epoch experiments). All architectures are GRUs, and model abbreviations follow the scheme "latent size—hidden size—number of layers—use of attention". Shown are Mean Similarity (gray) and Full Reconstruction (blue) on the test split.
