---
layout: default
---
<div class="row">
<div class="col-sm-12 col-md-6">
<div class="well-sm">
<h3>Introduction</h3>
<p>This page is a set of study notes on PRML (Pattern Recognition and Machine Learning, Bishop). If you would like details about the textbook itself, click the button below.</p>
<p>
<a href="http://research.microsoft.com/en-us/um/people/cmbishop/#prml-book" target="_blank">
<button type="button" class="btn btn-success">Book Information</button>
</a>
</p>
</div>
<div class="well-sm">
<h3>New Posts</h3>
<ul class="post-list">
{% for post in site.posts limit:3 %}
<li>
<h4>
<a href="{{ post.url | prepend: site.baseurl }}">{{ post.title }}</a>
</h4>
<p><span class="post-date">{{ post.date | date: "%b %-d, %Y" }}</span></p>
{{ post.excerpt }}
<hr/>
</li>
{% endfor %}
</ul>
</div>
</div>
<div class="col-sm-12 col-md-6">
<div class="well-sm">
<h3>Chapter 1. Introduction</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter01/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/1">1. Example: Polynomial Curve Fitting</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/2">2. Probability Theory</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/3">3. Model Selection</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/4">4. The Curse of Dimensionality</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/5">5. Decision Theory</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter01/6">6. Information Theory</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 2. Probability Distributions</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter02/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/1">1. Binary Variables</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/2">2. Multinomial Variables</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/3_1">3. The Gaussian Distribution - Part I</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/3_2">3. The Gaussian Distribution - Part II</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/3_3">3. The Gaussian Distribution - Part III</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/4">4. The Exponential Family</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter02/5">5. Nonparametric Methods</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 3. Linear Models for Regression</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter03/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/1">1. Linear Basis Function Models</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/2">2. The Bias-Variance Decomposition</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/3">3. Bayesian Linear Regression</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/4">4. Bayesian Model Comparison</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/5">5. The Evidence Approximation</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter03/6">6. Limitations of Fixed Basis Functions</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 4. Linear Models for Classification</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter04/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter04/1">1. Discriminant Functions</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter04/2">2. Probabilistic Generative Models</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter04/3">3. Probabilistic Discriminative Models</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter04/4">4. The Laplace Approximation</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter04/5">5. Bayesian Logistic Regression</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 5. Neural Networks</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter05/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/1">1. Feed-forward Network Functions</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/2">2. Network Training</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/3">3. Error Backpropagation</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/4">4. The Hessian Matrix</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/5">5. [In progress] Regularization in Neural Networks</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/6">6. [In progress] Mixture Density Networks</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter05/7">7. [In progress] Bayesian Neural Networks</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 6. Kernel Methods (skipped for now)</h3>
</div>
<div class="well-sm">
<h3>Chapter 7. Sparse Kernel Machines (skipped for now)</h3>
</div>
<div class="well-sm">
<h3>Chapter 8. Graphical Models</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter08/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter08/1">1. Bayesian Networks</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter08/2">2. Conditional Independence</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter08/3">3. [In progress] Markov Random Fields</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter08/4">4. [Planned] Inference in Graphical Models</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 9. Mixture Models and EM</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter09/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter09/1">1. K-means Clustering</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter09/2">2. Mixtures of Gaussians</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter09/3">3. An Alternative View of EM</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter09/4">4. The EM Algorithm in General</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 10. Variational Inference</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter10/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/1">1. Variational Inference</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/2">2. [Planned] Variational Mixture of Gaussians</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/3">3. [Planned] Variational Linear Regression</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/4">4. [Planned] Exponential Family Distributions</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/5">5. [Planned] Local Variational Methods</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/6">6. [Planned] Variational Logistic Regression</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter10/7">7. [Planned] Expectation Propagation</a></li>
</ul>
</div>
<div class="well-sm">
<h3>Chapter 11. Sampling Methods</h3>
<ul>
<li><a href="{{ site.baseurl }}/docs/chapter11/0">0. Prologue</a></li>
<li><a href="{{ site.baseurl }}/docs/chapter11/1">1. Basic Sampling Algorithms</a></li>
</ul>
</div>
</div>
</div>