```python
def calculate_total_loss(self, x, y):
    L = 0
    # For each sentence...
    for i in np.arange(len(y)):
        o, s = self.forward_propagation(x[i])
        # We only care about our predictions of the "correct" words
        correct_word_predictions = o[np.arange(len(y[i])), y[i]]
        # Add to the loss based on how off we were
        L += -1 * np.sum(np.log(correct_word_predictions))
    return L
```
The line `L += -1 * np.sum(np.log(correct_word_predictions))` doesn't seem to take the true labels $y_n$ into consideration. I'm not sure I'm right — maybe with one-hot encoding $y_n$ is just 1, so the multiplication can be dropped? Thank you!
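That intuition can be checked directly: with one-hot labels every $y_n$ is 0 except at the correct word's index, where it is 1, so $-\sum_n y_n \log o_n$ collapses to $-\log o_c$ for the correct index $c$. A minimal sketch with made-up toy values (not from the notebook) comparing the two forms:

```python
import numpy as np

# Toy softmax outputs for a 3-word sentence over a vocabulary of 4 words.
# Each row is one time step's prediction o_t; rows sum to 1.
o = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.2, 0.5, 0.2, 0.1],
              [0.1, 0.1, 0.2, 0.6]])
y = np.array([0, 1, 3])  # index of the correct word at each time step

# One-hot form of the formula: -sum over all n of y_n * log(o_n)
y_onehot = np.eye(4)[y]
loss_onehot = -np.sum(y_onehot * np.log(o))

# Indexed form used in the notebook: pick out only the
# correct-word probabilities with integer (fancy) indexing
correct_word_predictions = o[np.arange(len(y)), y]
loss_indexed = -np.sum(np.log(correct_word_predictions))

# The two are identical, since the zero entries of y_onehot
# kill every term except the correct word's.
assert np.isclose(loss_onehot, loss_indexed)
```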
Hi, the loss function formula is
$
\begin{aligned}
L(y,o) = - \frac{1}{N} \sum_{n \in N} y_{n} \log o_{n}
\end{aligned}
$
and the code in RNNLM.ipynb is
`L += -1 * np.sum(np.log(correct_word_predictions))`,
which didn't take $y_n$ into account.
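One further difference between the formula and `calculate_total_loss`: the formula carries a $\frac{1}{N}$ factor, which the snippet above never applies. In tutorials of this shape the division by the total word count usually lives in a small wrapper; a hedged sketch (the wrapper name and structure are assumed here, not verified against the notebook):

```python
import numpy as np

def calculate_loss(total_loss, y):
    # Hypothetical wrapper: average the summed cross-entropy over
    # the total number of training words N, matching the 1/N
    # factor in L(y, o). `y` is a list of per-sentence label arrays.
    N = np.sum([len(y_i) for y_i in y])
    return total_loss / N
```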