class RNN:
  # ...
  def step(self, x):
    # update the hidden state
    self.h = np.tanh(np.dot(self.W_hh, self.h) + np.dot(self.W_xh, x))
    # compute the output vector
    y = np.dot(self.W_hy, self.h)
    return y
The np.tanh (hyperbolic tangent) function implements a nonlinearity
that squashes the activations to the range [-1, 1]. The input, x, is combined
with the W_xh matrix via the numpy dot product (matrix-vector) operation. The result is added
to the dot product of the hidden state and the W_hh matrix, and the sum is squashed
by np.tanh to produce the new hidden state. Finally, the output is computed by multiplying
the hidden state by the W_hy matrix and is returned.
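As a sketch of how this class might be set up and called, the following fills in a constructor around the step method above. The layer sizes, the zero initial hidden state, and the small random weight initialization are illustrative assumptions, not part of the original:

```python
import numpy as np

class RNN:
    def __init__(self, input_size, hidden_size, output_size):
        # small random weights; the 0.01 scale is a common heuristic (assumption)
        self.W_hh = np.random.randn(hidden_size, hidden_size) * 0.01
        self.W_xh = np.random.randn(hidden_size, input_size) * 0.01
        self.W_hy = np.random.randn(output_size, hidden_size) * 0.01
        self.h = np.zeros(hidden_size)  # hidden state starts at zero

    def step(self, x):
        # update the hidden state
        self.h = np.tanh(np.dot(self.W_hh, self.h) + np.dot(self.W_xh, x))
        # compute the output vector
        y = np.dot(self.W_hy, self.h)
        return y

rnn = RNN(input_size=5, hidden_size=16, output_size=5)
for t in range(3):          # feed a short sequence of inputs
    x = np.zeros(5)
    x[t] = 1.0              # one-hot input vector
    y = rnn.step(x)         # hidden state carries over between calls
```

Because self.h persists between calls, each output depends on the whole sequence of inputs seen so far, which is what distinguishes a recurrent network from a feed-forward one.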
file: /Techref/method/ai/natural_language.htm, 7KB, updated: 2019/8/27 12:08, local time: 2020/10/1 13:02,
owner: JMNEFP786,
