# GRU Source Code Implementation

2018-02-06 | updated 2018-04-23 | deep learning, model

This post walks through the source code of a GRU cell, first in TensorFlow, then in PyTorch.

## GRU-TensorFlow

The `call` method of TensorFlow's `GRUCell` (`math_ops`, `array_ops`, and `nn_ops` are TensorFlow's internal op modules):

```python
def call(self, inputs, state):
  """Gated recurrent unit (GRU) with num_units cells."""
  # Compute both gates with a single matmul over the concatenated
  # input and previous state.
  gate_inputs = math_ops.matmul(
      array_ops.concat([inputs, state], 1), self._gate_kernel)
  gate_inputs = nn_ops.bias_add(gate_inputs, self._gate_bias)

  value = math_ops.sigmoid(gate_inputs)
  # Split into the reset gate r and the update gate u.
  r, u = array_ops.split(value=value, num_or_size_splits=2, axis=1)

  # The reset gate scales the previous state before it enters the candidate.
  r_state = r * state
  candidate = math_ops.matmul(
      array_ops.concat([inputs, r_state], 1), self._candidate_kernel)
  candidate = nn_ops.bias_add(candidate, self._candidate_bias)
  c = self._activation(candidate)

  # Interpolate between the old state and the candidate: u close to 1
  # keeps the old state, u close to 0 adopts the new candidate.
  new_h = u * state + (1 - u) * c
  return new_h, new_h
```

In equation form this computes r = σ(W_r[x, h] + b_r), u = σ(W_u[x, h] + b_u), c = tanh(W_c[x, r ⊙ h] + b_c), and new_h = u ⊙ h + (1 − u) ⊙ c. Unlike an LSTM, the GRU keeps no separate cell state and has no output gate: the new hidden state serves as both the output and the recurrent state, hence `return new_h, new_h`.

## PyTorch
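PyTorch offers no comparably short Python body to quote: `torch.nn.GRUCell` forwards to kernels implemented in C++. As a stand-in, here is a minimal sketch that re-expresses the TensorFlow equations above with basic PyTorch ops; the class name `NaiveGRUCell` and its parameter layout are illustrative assumptions, not PyTorch's actual implementation.

```python
import torch
import torch.nn as nn


class NaiveGRUCell(nn.Module):
    """Illustrative GRU cell mirroring the TensorFlow `call` above.

    A readability sketch, not PyTorch's real implementation.
    Initialization details (e.g., gate bias) differ from TensorFlow's defaults.
    """

    def __init__(self, input_size, num_units):
        super().__init__()
        # One kernel for both gates (r and u) and one for the candidate,
        # matching the TensorFlow parameter layout above.
        self.gate_kernel = nn.Linear(input_size + num_units, 2 * num_units)
        self.candidate_kernel = nn.Linear(input_size + num_units, num_units)

    def forward(self, inputs, state):
        # Reset gate r and update gate u from one matmul over [inputs, state].
        value = torch.sigmoid(self.gate_kernel(torch.cat([inputs, state], dim=1)))
        r, u = value.chunk(2, dim=1)

        # Candidate state, with the previous state scaled by the reset gate.
        c = torch.tanh(self.candidate_kernel(torch.cat([inputs, r * state], dim=1)))

        # Same interpolation as the TensorFlow code: u keeps the old state.
        return u * state + (1 - u) * c
```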
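A quick usage check with hypothetical sizes (batch of 4, input size 10, 20 hidden units):

```python
cell = NaiveGRUCell(input_size=10, num_units=20)
x = torch.randn(4, 10)   # batch of 4 input vectors
h = torch.zeros(4, 20)   # initial hidden state
h = cell(x, h)           # new hidden state, shape (4, 20)
```

Packing r and u into the single `gate_kernel` mirrors the TensorFlow layout and halves the number of matmuls per step compared with computing each gate separately.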