
ICE Server


Instructional Cluster Environment (ICE)

The Instructional Cluster Environment (ICE) is a computing cluster established in 2018 to support instructional use in response to high campus-wide demand. More information about the system is available in the official PACE documentation.

Computing Resources

Currently, ICE offers the following resources; a sketch of how to request one of the GPU nodes appears after the list.

  • 60 Intel CPU nodes
  • 4 AMD CPU nodes
  • 98 GPUs
    • 19 GPU nodes with Nvidia Tesla V100 (1-4 GPUs/node)
    • 10 GPU nodes with Nvidia RTX 6000 (4 GPUs/node)
    • 4 GPU nodes with Nvidia A100 (2 GPUs/node)
    • 2 GPU nodes with Nvidia A40 (2 GPUs/node)
    • 2 GPU nodes with AMD MI210 (2 GPUs/node)
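
Compute on ICE is allocated through a job scheduler rather than used directly. The sketch below shows one way to request an interactive GPU session, assuming ICE's Slurm scheduler and a V100 gres label; the exact label names on ICE may differ, so verify them against the PACE documentation before relying on this.

```bash
# Request an interactive session with 1 V100 GPU for one hour.
# The gres label "V100" is an assumption; check the PACE docs or
# `sinfo` output for the labels actually configured on ICE.
salloc -N1 --gres=gpu:V100:1 -t 01:00:00
```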

Storage

Each user has 15 GB of storage in their home directory and 100 GB in their scratch directory.

A project directory shared among all users is also available at /storage/ice-shared/vip-vxp/
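
To check your usage against these quotas, the sketch below assumes PACE's pace-quota utility is available on ICE; if it is not, du gives a rough picture.

```bash
# Check home and scratch usage against the 15 GB / 100 GB quotas.
pace-quota                         # PACE quota utility (assumed available on ICE)

# Fallback: measure usage directly.
du -sh ~                           # home directory usage
ls /storage/ice-shared/vip-vxp/    # shared project directory
```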

Connecting to ICE

SSH

The most common way to connect to ICE is via SSH, which establishes a secure connection from your local computer (the client) to the ICE system (the server).

Suppose our GT email is gburdell23@gatech.edu; our username to log onto ICE is then gburdell23.
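A minimal sketch of the connection command is shown below, assuming the login hostname login-ice.pace.gatech.edu; verify the hostname against the current PACE documentation before relying on it.

```bash
# Connect to ICE as gburdell23 (replace with your own GT username).
# The hostname below is an assumption; check the PACE docs if it has changed.
ssh gburdell23@login-ice.pace.gatech.edu
```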

The ssh client will prompt you to enter your password. This should be your GT account password.

Web-based User Interface

ICE also provides several web-based services, such as Jupyter (Python, Julia, and R), MATLAB, and RStudio.

These on-demand web services are accessible through ICE's Open OnDemand portal. GT VPN is required.
