Fix an error #69

Open
wants to merge 6 commits into master
Binary file modified Deeplearning深度学习笔记v5.71.pdf
Binary file not shown.
2 changes: 1 addition & 1 deletion html/lesson1-week2.html

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion markdown/lesson1-week2.md
@@ -5,7 +5,7 @@

This week we will learn the basics of neural networks. One important point to note: when implementing a neural network with a training set of $m$ examples, you are probably used to writing a **for** loop that iterates over every example. When implementing a neural network, however, we usually avoid looping over the whole training set directly, so in this week's course you will learn how to process the training set without one.
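To make the point above concrete, here is a minimal sketch of replacing an explicit per-example loop with one matrix operation. The shapes are an assumption: `X` is `(n_x, m)` with the $m$ examples as columns and `w` is `(n_x, 1)`; the function names are illustrative, not from the course code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_loop(w, b, X):
    # Explicit for loop over the m examples -- the habit the text warns against.
    m = X.shape[1]
    a = np.zeros((1, m))
    for i in range(m):
        a[0, i] = sigmoid(np.dot(w.T, X[:, i]) + b)
    return a

def predict_vectorized(w, b, X):
    # One matrix product replaces the whole loop over the training set.
    return sigmoid(np.dot(w.T, X) + b)

# Both versions produce the same activations on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))   # 3 features, 5 examples
w = rng.standard_normal((3, 1))
assert np.allclose(predict_loop(w, 0.1, X), predict_vectorized(w, 0.1, X))
```

Beyond brevity, the vectorized form lets NumPy dispatch the computation to optimized linear-algebra routines, which is why the course discourages the per-example loop.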

Also, the computation of a neural network usually begins with a step called the forward pause (**forward pause**), also known as forward propagation (**forward propagation**), followed by a step called the backward pause (**backward pause**), also known as backward propagation (**backward propagation**). So this week I will also explain why the training process of a neural network can be split into two separate parts, forward propagation and backward propagation.
Also, the computation of a neural network usually begins with a step called the forward pass (**forward pass**), also known as forward propagation (**forward propagation**), followed by a step called the backward pass (**backward pass**), also known as backward propagation (**backward propagation**). So this week I will also explain why the training process of a neural network can be split into two separate parts, forward propagation and backward propagation.

In this course I will use **logistic regression** to convey these ideas, so that the concepts are easier to grasp. Even if you have seen logistic regression before, I think there is still something new and interesting here for you to discover, so let's get started.
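The forward pass / backward pass split mentioned above can be sketched on logistic regression itself. This is an illustrative sketch, not the course's code: it assumes `X` of shape `(n_x, m)`, labels `Y` of shape `(1, m)`, and the standard cross-entropy cost; the helper name `propagate` is hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    m = X.shape[1]
    # Forward pass: compute activations and the cross-entropy cost.
    A = sigmoid(np.dot(w.T, X) + b)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    # Backward pass: gradients of the cost with respect to w and b.
    dw = np.dot(X, (A - Y).T) / m
    db = np.sum(A - Y) / m
    return cost, dw, db

# Tiny example: 2 features, 2 training examples.
w = np.zeros((2, 1))
X = np.array([[1.0, -2.0], [3.0, 0.5]])
Y = np.array([[1, 0]])
cost, dw, db = propagate(w, 0.0, X, Y)
# With w = 0 and b = 0, every activation is 0.5, so cost = -log(0.5) ~= 0.693.
```

The two phases are genuinely separate: the forward pass only needs the parameters and inputs, while the backward pass reuses the activations `A` computed on the way forward, which is the structure the course goes on to generalize to deeper networks.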
