Table 1 The layers used in the traditional network

From: Chemlistem: chemical named entity recognition using recurrent neural networks

| Layer | Type | Input(s) | No. of output neurons | Notes |
|-------|------|----------|-----------------------|-------|
| te1 | Embedding | ti1 | 300 | |
| tc1 | Conv1D | ti2 | 256 | Width = 3, activation = relu, dropout of 0.5 |
| tm1 | Concatenate | te1, tc1 | 556 | |
| tb1 | Bidirectional LSTM | tm1 | 64 per direction, 128 total | Dropout of 0.5 |
| td1 | TimeDistributed Dense | tb1 | 5 | Activation = softmax |
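The layer listing above can be sketched as a Keras functional model. This is a minimal reconstruction from the table alone, not the authors' code: the sequence length, vocabulary size, and the width of the ti2 feature input are not given in the table and are assumed here, as is `padding="same"` on the convolution (needed so te1 and tc1 stay aligned for the Concatenate).

```python
from tensorflow.keras import layers, Model

SEQ_LEN = 50    # assumed sequence length (not stated in the table)
VOCAB = 10000   # assumed vocabulary size for the embedding
FEAT = 100      # assumed width of the ti2 per-token feature vectors

# Two inputs: ti1 = token indices, ti2 = per-token feature vectors
ti1 = layers.Input(shape=(SEQ_LEN,), name="ti1")
ti2 = layers.Input(shape=(SEQ_LEN, FEAT), name="ti2")

# te1: 300-dimensional token embedding
te1 = layers.Embedding(VOCAB, 300, name="te1")(ti1)

# tc1: width-3 Conv1D with relu and dropout of 0.5
tc1 = layers.Conv1D(256, 3, padding="same", activation="relu", name="tc1")(ti2)
tc1 = layers.Dropout(0.5)(tc1)

# tm1: concatenate embedding and conv features (300 + 256 = 556)
tm1 = layers.Concatenate(name="tm1")([te1, tc1])

# tb1: bidirectional LSTM, 64 units per direction (128 total), dropout 0.5
tb1 = layers.Bidirectional(
    layers.LSTM(64, return_sequences=True, dropout=0.5), name="tb1")(tm1)

# td1: per-token softmax over the 5 output classes
td1 = layers.TimeDistributed(
    layers.Dense(5, activation="softmax"), name="td1")(tb1)

model = Model(inputs=[ti1, ti2], outputs=td1)
```

Building the model and checking its shapes confirms the neuron counts in the table: the concatenation yields 556 features per token, and the final output is a 5-way distribution at each time step.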