Table 3 Pairwise comparison of performance between deep neural networks with rectified linear units (ReLU) and those with Sigmoid (Sigm) and Tanh activation functions, based on the Wilcoxon paired signed-rank test at a 99% confidence level

From: Deep-learning: investigating deep neural networks hyper-parameters and comparison of performance to shallow methods for modeling bioactivity data

Activation functions   Mean of MCC diff.   SD of MCC diff.   p value
ReLU vs. Sigm          0.018               0.022             3.922e−11
ReLU vs. Tanh          0.029               0.033             3.417e−14
  1. DNNs combined with the ReLU activation function were found to statistically significantly outperform those using the Sigm and Tanh functions
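The comparison above can be sketched with a minimal pure-Python implementation of the paired Wilcoxon signed-rank test. The MCC values below are hypothetical placeholders, not the paper's data; for small samples the sketch computes an exact two-sided p value by enumerating all sign assignments, whereas standard tools (e.g. scipy.stats.wilcoxon) would normally be used in practice.

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    """Exact two-sided paired Wilcoxon signed-rank test (small-sample sketch).

    Returns (W+, p), where W+ is the sum of ranks of positive differences.
    """
    # Paired differences; rounding guards against float noise so that
    # equal-magnitude differences are ranked as ties. Zero differences
    # are dropped, as is standard practice.
    d = [round(a - b, 12) for a, b in zip(x, y)]
    d = [v for v in d if v != 0]
    n = len(d)
    # Rank |differences|, assigning average ranks to ties.
    order = sorted((abs(v), i) for i, v in enumerate(d))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and order[j + 1][0] == order[i][0]:
            j += 1
        avg = (i + j + 2) / 2  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k][1]] = avg
        i = j + 1
    total_rank = sum(ranks)
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    w_obs = min(w_plus, total_rank - w_plus)
    # Exact p: fraction of the 2^n sign assignments whose test statistic
    # is at least as extreme as the observed one (feasible only for small n).
    count = 0
    for signs in product((0, 1), repeat=n):
        w = sum(r for s, r in zip(signs, ranks) if s)
        if min(w, total_rank - w) <= w_obs:
            count += 1
    return w_plus, count / 2 ** n

# Hypothetical per-dataset MCC scores for two activation functions.
relu_mcc = [0.82, 0.78, 0.85, 0.80, 0.77, 0.83]
sigm_mcc = [0.80, 0.75, 0.84, 0.78, 0.76, 0.80]
w, p = wilcoxon_signed_rank(relu_mcc, sigm_mcc)
print(f"W+ = {w}, p = {p}")  # → W+ = 21.0, p = 0.03125
```

Because the test ranks the magnitudes of paired differences rather than the raw values, it makes no normality assumption, which suits performance metrics such as MCC compared across the same set of datasets.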