beginner_source/introyt/tensorboardyt_tutorial.py translation (#923)
Add translation of beginner_source/introyt/tensorboardyt_tutorial.py
Angela-Park-JE authored Oct 14, 2024
1 parent 0c521f9 commit bd31792
Showing 1 changed file with 99 additions and 97 deletions.
196 changes: 99 additions & 97 deletions beginner_source/introyt/tensorboardyt_tutorial.py
`Training Models <trainingyt.html>`_ ||
`Model Understanding <captumyt.html>`_
PyTorch TensorBoard Support
===========================
**Translation**: `Park Jeong-eun <https://github.com/Angela-Park-JE/>`_

Follow along with the video below or on `youtube <https://www.youtube.com/watch?v=6CEld3hZgqc>`__.
.. raw:: html
<div style="margin-top:10px; margin-bottom:10px;">
<iframe width="560" height="315" src="https://www.youtube.com/embed/6CEld3hZgqc" frameborder="0" allow="accelerometer; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
</div>
Before You Start
----------------
To run this tutorial, you'll need to install PyTorch, TorchVision,
Matplotlib, and TensorBoard.
With ``conda``:
.. code-block:: sh
conda install pytorch torchvision -c pytorch
conda install matplotlib tensorboard
With ``pip``:
.. code-block:: sh
pip install torch torchvision matplotlib tensorboard
Once the dependencies are installed, restart this notebook in the Python
environment where you installed them.
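As a quick sanity check (an aside, not part of the original tutorial), you can
confirm that the packages import cleanly from the restarted kernel:

.. code-block:: python

   import torch, torchvision, matplotlib, tensorboard
   print(torch.__version__, torchvision.__version__)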
Introduction
------------
In this notebook, we'll be training a variant of LeNet-5 against the
Fashion-MNIST dataset. Fashion-MNIST is a set of image tiles depicting
various garments, with ten class labels indicating the type of garment
depicted.
"""

# PyTorch model and training necessities
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

# Image datasets and image manipulation
import torchvision
import torchvision.transforms as transforms

# Image display
import matplotlib.pyplot as plt
import numpy as np

# PyTorch TensorBoard support
from torch.utils.tensorboard import SummaryWriter

# In case you are using an environment that has TensorFlow installed,
# such as Google Colab, uncomment the following code to avoid
# a bug with saving embeddings to your TensorBoard directory

# import tensorflow as tf
# import tensorboard as tb
# tf.io.gfile = tb.compat.tensorflow_stub.io.gfile

######################################################################
# Showing Images in TensorBoard
# -----------------------------
#
# Let's start by adding sample images from our dataset to TensorBoard:
#

# Gather datasets and prepare them for consumption
transform = transforms.Compose(
[transforms.ToTensor(),
transforms.Normalize((0.5,), (0.5,))])

# Store separate training and validation splits in ./data
training_set = torchvision.datasets.FashionMNIST('./data',
download=True,
train=True,
shuffle=False,
num_workers=2)

# Class labels
classes = ('T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle Boot')

# Helper function for inline image display
def matplotlib_imshow(img, one_channel=False):
if one_channel:
img = img.mean(dim=0)
img = img / 2 + 0.5     # unnormalize
npimg = img.numpy()
if one_channel:
plt.imshow(npimg, cmap="Greys")
else:
plt.imshow(np.transpose(npimg, (1, 2, 0)))

# Extract a batch of 4 images
dataiter = iter(training_loader)
images, labels = next(dataiter)

# Create a grid from the images and show them
img_grid = torchvision.utils.make_grid(images)
matplotlib_imshow(img_grid, one_channel=True)


########################################################################
# Above, we used TorchVision and Matplotlib to create a visual grid of a
# minibatch of our input data. Below, we use the ``add_image()`` call on
# ``SummaryWriter`` to log the image for consumption by TensorBoard, and
# we also call ``flush()`` to make sure it's written to disk right away.
#

# Default log_dir argument is "runs" - but it's good to be specific
# torch.utils.tensorboard.SummaryWriter is imported above
writer = SummaryWriter('runs/fashion_mnist_experiment_1')

# Write image data to TensorBoard log dir
writer.add_image('Four Fashion-MNIST Images', img_grid)
writer.flush()

# To view, start TensorBoard on the command line with:
#   tensorboard --logdir=runs
# ...and open a browser tab to http://localhost:6006/
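# If you are working in a Jupyter or Colab notebook rather than a plain script,
# the TensorBoard notebook extension can render the dashboard inline instead
# (an aside, not part of the original tutorial; uncomment to try):

# %load_ext tensorboard
# %tensorboard --logdir runs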


##########################################################################
# If you start TensorBoard at the command line and open it in a new
# browser tab (usually at `localhost:6006 <localhost:6006>`__), you should
# see the image grid under the IMAGES tab.
#
# Graphing Scalars to Visualize Training
# --------------------------------------
#
# TensorBoard is useful for tracking the progress and efficacy of your
# training. Below, we'll run a training loop, track some metrics, and save
# the data for TensorBoard's consumption.
#
# Let's define a model to categorize our image tiles, and an optimizer and
# loss function for training:
#

class Net(nn.Module):
Expand Down Expand Up @@ -195,16 +197,16 @@ def forward(self, x):


##########################################################################
# Now let's train a single epoch, and evaluate the training vs. validation
# set losses every 1000 batches:
#

print(len(validation_loader))
for epoch in range(1):  # loop over the dataset multiple times (increase the range as needed)
running_loss = 0.0

for i, data in enumerate(training_loader, 0):
# basic training loop
inputs, labels = data
optimizer.zero_grad()
outputs = net(inputs)
optimizer.step()

running_loss += loss.item()
if i % 1000 == 999:    # Every 1000 mini-batches...
print('Batch {}'.format(i + 1))
# Check against the validation set
running_vloss = 0.0

# In evaluation mode some model-specific operations can be omitted, e.g. dropout layers
net.train(False) # Switching to evaluation mode, e.g. turning off regularisation
for j, vdata in enumerate(validation_loader, 0):
vinputs, vlabels = vdata
voutputs = net(vinputs)
vloss = criterion(voutputs, vlabels)
running_vloss += vloss.item()
net.train(True) # Switching back to training mode, e.g. turning on regularisation
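# (Side note, not in the original: ``net.eval()`` and ``net.train()`` are the
# more common spellings of the same train/eval mode switch.)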

avg_loss = running_loss / 1000
avg_vloss = running_vloss / len(validation_loader)

# Log the running loss averaged per batch
writer.add_scalars('Training vs. Validation Loss',
{ 'Training' : avg_loss, 'Validation' : avg_vloss },
epoch * len(training_loader) + i)


#########################################################################
# Switch to your open TensorBoard and have a look at the SCALARS tab.
#
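# (Aside, not part of the original tutorial: the singular ``add_scalar()`` logs
# one tag per call, e.g. ``writer.add_scalar('Loss/train', avg_loss, step)``
# for an illustrative ``step`` counter, giving each series its own chart rather
# than one grouped chart.)
#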
# Visualizing Your Model
# ----------------------
#
# TensorBoard can also be used to examine the data flow within your model.
# To do this, call the ``add_graph()`` method with a model and sample
# input:
#

# Again, grab a single mini-batch of images
dataiter = iter(training_loader)
images, labels = next(dataiter)

# add_graph() will trace the sample input through your model,
# and render it as a graph.
writer.add_graph(net, images)
writer.flush()


#########################################################################
# When you switch over to TensorBoard, you should see a GRAPHS tab.
# Double-click the "NET" node to see the layers and data flow within your
# model.
#
# Visualizing Your Dataset with Embeddings
# ----------------------------------------
#
# The 28-by-28 image tiles we're using can be modeled as 784-dimensional
# vectors (28 \* 28 = 784). It can be instructive to project this to a
# lower-dimensional representation. The ``add_embedding()`` method does this
# automatically: it projects a set of data onto the three dimensions with
# highest variance and displays them as an interactive 3D chart.
#
# Below, we'll take a sample of our data, and generate such an embedding:
#

# Select a random subset of data and corresponding labels
def select_n_random(data, labels, n=100):
assert len(data) == len(labels)

perm = torch.randperm(len(data))
return data[perm][:n], labels[perm][:n]

# Extract a random subset of data
images, labels = select_n_random(training_set.data, training_set.targets)

# get the class labels for each image
class_labels = [classes[label] for label in labels]

# log embeddings
features = images.view(-1, 28 * 28)
writer.add_embedding(features,
metadata=class_labels,


#######################################################################
# Now if you switch to TensorBoard and select the PROJECTOR tab, you
# should see a 3D representation of the projection. You can rotate and
# zoom the model. Examine it at large and small scales, and see whether
# you can spot patterns in the projected data and the clustering of
# labels.
#
# For better visibility, it's recommended to:
#
# - Select "label" from the "Color by" drop-down on the left.
# - Toggle the Night Mode icon along the top to place the
#   light-colored images on a dark background.
#
# Other Resources
# ---------------
#
# For more information, have a look at:
#
# - PyTorch documentation on `torch.utils.tensorboard.SummaryWriter <https://pytorch.org/docs/stable/tensorboard.html?highlight=summarywriter>`__
# - Tensorboard tutorial content in the `PyTorch.org Tutorials <https://tutorials.pytorch.kr/>`__
# - For more information about TensorBoard, see the `TensorBoard
#   documentation <https://www.tensorflow.org/tensorboard>`__
