Unit 3 Exercise 1 #37
Hey @lorenzo-delsignore, using PCA here is actually good thinking! That would be one way to reduce the dataset to 2D for plotting. Here, though, I would look at the original feature values, because that's what goes into the classifier (you can of course also train a classifier on the PCA features, which can sometimes work well too, especially to mitigate the curse of dimensionality for certain models). Above, I am recommending plotting the features, but what you say is also right: we can't show all of them in a single 2D scatterplot. Personally, what I like to do is look at histograms and pairwise scatterplots. For that, I often use a scatterplot matrix. It's not necessary, but I implemented one here if it helps: https://rasbt.github.io/mlxtend/user_guide/plotting/scatterplotmatrix/
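For reference, a minimal sketch of the PCA-to-2D idea mentioned above (assuming scikit-learn and the Iris dataset as a stand-in for the exercise data; the mlxtend scatterplot matrix linked above is the alternative for looking at the raw features):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# Example dataset with 4 original features (150 samples).
X, y = load_iris(return_X_y=True)

# Project onto the first 2 principal components so the
# dataset can be shown in a single 2D scatterplot.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)

print(X_2d.shape)  # (150, 2)
```

The resulting `X_2d` columns can then go straight into a 2D scatterplot, colored by class label.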
That's a good point. This becomes impossible as well. One could use filler values for all features except a selection of 2, but that is not ideal (I have an implementation of that here: https://rasbt.github.io/mlxtend/user_guide/plotting/plot_decision_regions/#example-7-decision-regions-with-more-than-two-training-features). Tbh, I would not recommend it. In that case, we have to accept that it's unfortunately not possible to visualize the decision boundaries.
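For completeness, a rough sketch of the filler-value idea (vary 2 features over a grid and fix the remaining ones at their training means), assuming scikit-learn and Iris as placeholder data, just to show what the linked mlxtend example does under the hood:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)  # 4 training features
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Vary features 0 and 1 over a grid; fill features 2 and 3
# with constant "filler" values (their training means).
f0 = np.linspace(X[:, 0].min(), X[:, 0].max(), 50)
f1 = np.linspace(X[:, 1].min(), X[:, 1].max(), 50)
g0, g1 = np.meshgrid(f0, f1)
grid = np.column_stack([
    g0.ravel(),
    g1.ravel(),
    np.full(g0.size, X[:, 2].mean()),  # filler value for feature 2
    np.full(g0.size, X[:, 3].mean()),  # filler value for feature 3
])

# Class predictions over the 2D slice; plotting this with
# plt.contourf(g0, g1, Z) would draw the decision regions.
Z = clf.predict(grid).reshape(g0.shape)
print(Z.shape)  # (50, 50)
```

The caveat from above still applies: this only shows one 2D slice through the 4D feature space, so it can be misleading about the actual decision boundaries.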
Hello, I have these questions:
Thanks!