Commit 825b858: Update Book
1 parent 20f9d5a, commit 825b858

120 files changed: +254 additions, -295 deletions


404.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Page not found | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Page not found | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

acknowledgements.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Chapter 15 Acknowledgements | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Chapter 15 Acknowledgements | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

adversarial.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>10.4 Adversarial Examples | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="10.4 Adversarial Examples | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

agnostic.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Chapter 6 Model-Agnostic Methods | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Chapter 6 Model-Agnostic Methods | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

ale.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>8.2 Accumulated Local Effects (ALE) Plot | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="8.2 Accumulated Local Effects (ALE) Plot | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

anchors.html (+4 -4)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>9.4 Scoped Rules (Anchors) | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="9.4 Scoped Rules (Anchors) | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />
@@ -498,7 +498,7 @@ <h2><span class="header-section-number">9.4</span> Scoped Rules (Anchors)<a href
 Given the same perturbation space, the anchors approach constructs explanations whose coverage is adapted to the model’s behavior and the approach clearly expresses their boundaries.
 Thus, they are faithful by design and state exactly for which instances they are valid.
 This property makes anchors particularly intuitive and easy to comprehend.</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-30"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-28"></span>
 <img src="images/anchors-visualization.png" alt="LIME vs. Anchors -- A Toy Visualization. Figure from Ribeiro, Singh, and Guestrin (2018)." width="\textwidth" />
 <p class="caption">
 FIGURE 9.11: LIME vs. Anchors – A Toy Visualization. Figure from Ribeiro, Singh, and Guestrin (2018).
@@ -575,7 +575,7 @@ <h3><span class="header-section-number">9.4.1</span> Finding Anchors<a href="anc
 <p><strong>Candidate Precision Validation</strong>: Takes more samples in case there is no statistical confidence yet that the candidate exceeds the <span class="math inline">\(\tau\)</span> threshold.</p>
 <p><strong>Modified Beam Search</strong>: All of the above components are assembled in a beam search, which is a graph search algorithm and a variant of the breadth-first algorithm. It carries the <span class="math inline">\(B\)</span> best candidates of each round over to the next one (where <span class="math inline">\(B\)</span> is called the <em>Beam Width</em>). These <span class="math inline">\(B\)</span> best rules are then used to create new rules. The beam search conducts at most <span class="math inline">\(featureCount(x)\)</span> rounds, as each feature can only be included in a rule at most once. Thus, at every round <span class="math inline">\(i\)</span>, it generates candidates with exactly <span class="math inline">\(i\)</span> predicates and selects the B best thereof. Therefore, by setting <span class="math inline">\(B\)</span> high, the algorithm is more likely to avoid local optima. In turn, this requires a high number of model calls and thereby increases the computational load.</p>
 <p>These four components are shown in the figure below.</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-31"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-29"></span>
 <img src="images/anchors-process.jpg" alt="The anchors algorithm’s components and their interrelations (simplified)" width="\textwidth" />
 <p class="caption">
 FIGURE 9.12: The anchors algorithm’s components and their interrelations (simplified)

bike-data.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>4.1 Bike Rentals (Regression) | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="4.1 Bike Rentals (Regression) | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

cervical.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>4.3 Risk Factors for Cervical Cancer (Classification) | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="4.3 Risk Factors for Cervical Cancer (Classification) | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

cite.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Chapter 13 Citing this Book | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Chapter 13 Citing this Book | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

cnn-features.html (+6 -6)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>10.1 Learned Features | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="10.1 Learned Features | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />
@@ -493,7 +493,7 @@ <h2><span class="header-section-number">10.1</span> Learned Features<a href="cnn
 First, the image goes through many convolutional layers.
 In those convolutional layers, the network learns new and increasingly complex features in its layers.
 Then the transformed image information goes through the fully connected layers and turns into a classification or prediction.</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-51"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-49"></span>
 <img src="images/cnn-features.png" alt="Features learned by a convolutional neural network (Inception V1) trained on the ImageNet data. The features range from simple features in the lower convolutional layers (left) to more abstract features in the higher convolutional layers (right). Figure from Olah, et al. (2017, CC-BY 4.0) https://distill.pub/2017/feature-visualization/appendix/." width="\textwidth" />
 <p class="caption">
 FIGURE 10.1: Features learned by a convolutional neural network (Inception V1) trained on the ImageNet data. The features range from simple features in the lower convolutional layers (left) to more abstract features in the higher convolutional layers (right). Figure from Olah, et al. (2017, CC-BY 4.0) <a href="https://distill.pub/2017/feature-visualization/appendix/" class="uri">https://distill.pub/2017/feature-visualization/appendix/</a>.
@@ -633,7 +633,7 @@ <h4><span class="header-section-number">10.1.2.1</span> Network Dissection Algor
 <li>Quantify the alignment of activations and labeled concepts.</li>
 </ol>
 <p>The following figure visualizes how an image is forwarded to a channel and matched with the labeled concepts.</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-52"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-50"></span>
 <img src="images/dissection-network.png" alt="For a given input image and a trained network (fixed weights), we propagate the image forward to the target layer, upscale the activations to match the original image size and compare the maximum activations with the ground truth pixel-wise segmentation. Figure originally from http://netdissect.csail.mit.edu/." width="\textwidth" />
 <p class="caption">
 FIGURE 10.5: For a given input image and a trained network (fixed weights), we propagate the image forward to the target layer, upscale the activations to match the original image size and compare the maximum activations with the ground truth pixel-wise segmentation. Figure originally from <a href="http://netdissect.csail.mit.edu/" class="uri">http://netdissect.csail.mit.edu/</a>.
@@ -679,14 +679,14 @@ <h4><span class="header-section-number">10.1.2.1</span> Network Dissection Algor
 We call unit k a detector of concept c when <span class="math inline">\(IoU_{k,c}&gt;0.04\)</span>.
 This threshold was chosen by Bau &amp; Zhou et al (2017).</p>
 <p>The following figure illustrates intersection and union of activation mask and concept mask for a single image:</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-53"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-51"></span>
 <img src="images/dissection-dog-exemplary.jpg" alt="The Intersection over Union (IoU) is computed by comparing the human ground truth annotation and the top activated pixels." width="\textwidth" />
 <p class="caption">
 FIGURE 10.6: The Intersection over Union (IoU) is computed by comparing the human ground truth annotation and the top activated pixels.
 </p>
 </div>
 <p>The following figure shows a unit that detects dogs:</p>
-<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-54"></span>
+<div class="figure" style="text-align: center"><span style="display:block;" id="fig:unnamed-chunk-52"></span>
 <img src="images/dissection-dogs.jpeg" alt="Activation mask for inception\_4e channel 750 which detects dogs with $IoU=0.203$. Figure originally from http://netdissect.csail.mit.edu/" width="\textwidth" />
 <p class="caption">
 FIGURE 10.7: Activation mask for inception_4e channel 750 which detects dogs with <span class="math inline">\(IoU=0.203\)</span>. Figure originally from <a href="http://netdissect.csail.mit.edu/" class="uri">http://netdissect.csail.mit.edu/</a>
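The IoU comparison in the context lines of this hunk reduces to a small computation once the activation mask is binarized and both masks are flattened. A minimal sketch, assuming equal-length 0/1 sequences for the two masks (the real pipeline works on upscaled 2-D activation maps):

```python
def iou(activation_mask, concept_mask):
    """Intersection over Union of a binarized activation mask and a
    ground-truth concept segmentation mask (equal-length 0/1 sequences)."""
    inter = sum(a & c for a, c in zip(activation_mask, concept_mask))
    union = sum(a | c for a, c in zip(activation_mask, concept_mask))
    return inter / union if union else 0.0

def is_detector(activation_mask, concept_mask, threshold=0.04):
    # Unit k counts as a detector of concept c when IoU_{k,c} > 0.04,
    # the threshold chosen by Bau & Zhou et al. (2017).
    return iou(activation_mask, concept_mask) > threshold
```

For example, masks [1, 1, 0, 0] and [0, 1, 1, 0] overlap in one pixel out of three covered, giving IoU = 1/3, well above the 0.04 detector threshold.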

contribute.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Chapter 12 Contribute to the Book | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Chapter 12 Contribute to the Book | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

counterfactual.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>9.3 Counterfactual Explanations | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="9.3 Counterfactual Explanations | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />

data.html (+2 -2)

@@ -6,7 +6,7 @@
 <meta http-equiv="X-UA-Compatible" content="IE=edge" />
 <title>Chapter 4 Datasets | Interpretable Machine Learning</title>
 <meta name="description" content="Machine learning algorithms usually operate as black boxes and it is unclear how they derived a certain decision. This book is a guide for practitioners to make machine learning decisions interpretable." />
-<meta name="generator" content="bookdown 0.35 and GitBook 2.6.7" />
+<meta name="generator" content="bookdown 0.39 and GitBook 2.6.7" />

 <meta property="og:title" content="Chapter 4 Datasets | Interpretable Machine Learning" />
 <meta property="og:type" content="book" />
@@ -23,7 +23,7 @@
 <meta name="author" content="Christoph Molnar" />


-<meta name="date" content="2023-08-21" />
+<meta name="date" content="2024-05-22" />

 <meta name="viewport" content="width=device-width, initial-scale=1" />
 <meta name="apple-mobile-web-app-capable" content="yes" />
