The forward propagation in an RNN makes a few assumptions: 1) the hidden layer uses the hyperbolic tangent activation function; 2) the output is discrete, as when the RNN is used to predict words or characters. This…
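Under those two assumptions, one forward step is a linear combination of the current input and the previous hidden state plus a bias term, passed through tanh, with a softmax over the weight Wya producing the discrete output distribution. The following is a minimal NumPy sketch of that computation; the function name rnn_forward and the shapes and names Wax, Waa, ba, by are illustrative assumptions (only Wya appears in the tag list below), not the video's exact code.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a column vector
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(xs, Wax, Waa, Wya, ba, by):
    """Forward propagation over a sequence of input column vectors xs.

    Assumed shapes: Wax (n_a, n_x), Waa (n_a, n_a), Wya (n_y, n_a),
    ba (n_a, 1), by (n_y, 1); each x in xs has shape (n_x, 1).
    """
    a = np.zeros((Waa.shape[0], 1))  # hidden state starts as a vector of zeros
    ys = []
    for x in xs:
        # linear combination of current input, previous hidden state, and bias,
        # squashed by the hyperbolic tangent activation
        a = np.tanh(Wax @ x + Waa @ a + ba)
        # discrete output (e.g. a distribution over words or characters)
        ys.append(softmax(Wya @ a + by))
    return ys
```

For example, with n_x = 3 inputs, n_a = 4 hidden units, and n_y = 5 output classes, randomly initialized weight matrices of the shapes noted above and a list of (3, 1) input vectors yield one (5, 1) probability vector per timestamp.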
Resource Details
Packt - Data
Concepts
linear regression, weight, length, architecture, vector, zeros
Additional Tags
artificial neural networks, recurrent neural networks, convolutional neural networks, ann, cnn, rnn, logistic regression, long short-term memory, tensorflow, python, keras, linear combination, activation function, data set, bias term, mathematical formulation, time series, previous timestamp, current timestamp, weight Wya, timestamp, neuron
Classroom Considerations
Best For: Explaining a topic
Video is ad-free