Fig. 1 | Journal of Cheminformatics

From: Deep-learning: investigating deep neural networks hyper-parameters and comparison of performance to shallow methods for modeling bioactivity data

a A feed-forward deep neural network with two hidden layers; each layer consists of multiple neurons, which are fully connected to the neurons of the previous and following layers. b Each artificial neuron receives one or more input signals x1, x2, …, xm and outputs a value y to the neurons of the next layer. The output y is a nonlinear weighted sum of the input signals: nonlinearity is achieved by passing the weighted sum through a non-linear function known as an activation function. c Popular neuron activation functions: the rectified linear unit (ReLU) (red), sigmoid (Sigm) (green), and tanh (blue)
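The neuron described in panel b can be sketched as follows. This is a minimal illustration (not the paper's code), with arbitrary example weights and inputs, showing the weighted sum passed through each of the three activation functions from panel c:

```python
import math

def relu(z):
    # Rectified linear unit: max(0, z)
    return max(0.0, z)

def sigmoid(z):
    # Logistic sigmoid: squashes z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def tanh(z):
    # Hyperbolic tangent: squashes z into (-1, 1)
    return math.tanh(z)

def neuron(x, w, b, activation):
    # Weighted sum of input signals plus a bias term,
    # passed through a non-linear activation function.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return activation(z)

# Illustrative inputs x1..x3, weights, and bias (hypothetical values)
x = [0.5, -1.0, 2.0]
w = [0.4, 0.3, -0.2]
b = 0.1

print(neuron(x, w, b, relu))     # ReLU clamps the negative sum to 0.0
print(neuron(x, w, b, sigmoid))  # Sigmoid output in (0, 1)
print(neuron(x, w, b, tanh))     # Tanh output in (-1, 1)
```

In a full feed-forward network as in panel a, each hidden-layer neuron applies this computation to the outputs of all neurons in the previous layer.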
