The tables below are updated every few minutes to show the current status of our supercomputing, data analysis, and visualization resources, as well as other specialized nodes and systems. If a table indicates that a system is unavailable, check your email for timely updates from our Notifier service. Scheduled maintenance downtimes are announced in advance in the Daily Bulletin and in Notifier emails.
Login Node | Status | Users | Load |
---|---|---|---|
derecho1 | UP | 91 | 10.09 % |
derecho2 | UP | 63 | 6.36 % |
derecho3 | UP | 74 | 8.33 % |
derecho4 | UP | 86 | 4.70 % |
derecho5 | UP | 57 | 4.49 % |
derecho6 | UP | 81 | 9.20 % |
derecho7 | UP | 70 | 6.95 % |
derecho8 | UP | 0 | 0.03 % |

Node Type | State | Nodes | % |
---|---|---|---|
CPU | Reserved | 3 | 0.1 % |
CPU | Offline | 4 | 0.2 % |
CPU | Running Jobs | 2178 | 87.5 % |
CPU | Free | 303 | 12.2 % |
GPU | Running Jobs | 25 | 30.5 % |
GPU | Free | 57 | 69.5 % |
Updated 9:15 am MDT Thu May 1 2025
Queue | Jobs Running | Jobs Queued | Jobs Held | Nodes Used | Users |
---|---|---|---|---|---|
cpu | 300 | 1 | 122 | 2178 | 99 |
gpu | 3 | 0 | 0 | 25 | 3 |
system | 0 | 0 | 0 | 0 | - |
hybrid | 0 | 0 | 0 | 0 | - |
pcpu | 1 | 0 | 0 | 1 | 1 |
pgpu | 0 | 0 | 0 | 0 | - |
gpudev | 2 | 0 | 0 | 2 | 1 |
cpudev | 3 | 1 | 25 | 3 | 15 |
repair | 0 | 0 | 0 | 0 | - |
jhub | 0 | 0 | 0 | 0 | - |
R9234219 | 1 | 0 | 0 | 3 | 1 |
S9239109 | 0 | 0 | 0 | 0 | - |
S9239112 | 0 | 0 | 0 | 0 | - |
Login Node | Status | Users | Load |
---|---|---|---|
casper-login1 | UP | 135 | 26.7 % |
casper-login2 | UP | - | 0.0 % |

Node Type | Units in Use | Total Units | % Used |
---|---|---|---|
HTC | 715 CPUs | 6204 | 11.5 % |
GP100 Visualization | 5 GPU sessions | 48 | 10.4 % |
L40 Visualization | 0 GPU sessions | 42 | 0.0 % |
V100 GPU | 13 GPUs | 52 | 25.0 % |
A100 GPU | 5 GPUs | 34 | 14.7 % |
H100 GPU | 3 GPUs | 8 | 37.5 % |

Node Type | State | Nodes | % |
---|---|---|---|
HTC | Partially Allocated | 27 | 21.4 % |
HTC | Down | 1 | 0.8 % |
HTC | Offline | 3 | 2.4 % |
HTC | Free | 95 | 75.4 % |
Large Memory | Partially Allocated | 3 | 37.5 % |
Large Memory | Free | 5 | 62.5 % |
GP100 Visualization | Partially Allocated | 3 | 37.5 % |
GP100 Visualization | Free | 5 | 62.5 % |
L40 Visualization | Free | 6 | 100.0 % |
V100 GPU | Partially Allocated | 3 | 37.5 % |
V100 GPU | Free | 5 | 62.5 % |
A100 GPU | Partially Allocated | 2 | 20.0 % |
A100 GPU | Offline | 3 | 30.0 % |
A100 GPU | Free | 5 | 50.0 % |
H100 GPU | Partially Allocated | 1 | 50.0 % |
H100 GPU | Free | 1 | 50.0 % |
RDA | Partially Allocated | 2 | 40.0 % |
RDA | Fully Allocated | 1 | 20.0 % |
RDA | Free | 2 | 40.0 % |
JupyterHub Login | Partially Allocated | 3 | 33.3 % |
JupyterHub Login | Fully Allocated | 5 | 55.6 % |
JupyterHub Login | Offline | 1 | 11.1 % |
Queue | Jobs Running | Jobs Queued | Jobs Held | CPUs Used | Users |
---|---|---|---|---|---|
htc | 460 | 4 | 8 | 711 | 85 |
vis | 5 | 0 | 0 | 44 | 5 |
largemem | 3 | 0 | 0 | 15 | 3 |
gpgpu | 2 | 3 | 0 | 24 | 3 |
rda | 32 | 0 | 44 | 88 | 4 |
tdd | 0 | 0 | 0 | 0 | - |
jhublogin | 229 | 0 | 0 | 497 | 229 |
system | 0 | 0 | 0 | 0 | - |
h100 | 3 | 0 | 1 | 32 | 2 |
l40 | 0 | 0 | 0 | 0 | - |
S3011784 | 0 | 0 | 0 | 0 | - |
S4456256 | 0 | 0 | 0 | 0 | - |
V100 Node | State | # GPUs | # Used | # Avail |
---|---|---|---|---|
casper36 | free | 4 | 1 | 3 |
casper08 | free | 8 | 0 | 8 |
casper29 | free | 4 | 0 | 4 |
casper30 | free | 8 | 0 | 8 |
casper31 | free | 8 | 0 | 8 |
casper24 | full | 8 | 8 | 0 |
casper25 | full | 4 | 4 | 0 |
casper28 | free | 8 | 0 | 8 |
A100 Node | State | # GPUs | # Used | # Avail |
---|---|---|---|---|
casper18 | free | 1 | 0 | 1 |
casper21 | free | 1 | 0 | 1 |
casper38 | free | 4 | 1 | 3 |
casper39 | offline | 4 | 0 | 4 |
casper40 | full | 4 | 4 | 0 |
casper41 | offline | 4 | 0 | 4 |
casper42 | free | 4 | 0 | 4 |
casper43 | free | 4 | 0 | 4 |
casper44 | offline | 4 | 0 | 4 |
casper37 | free | 4 | 0 | 4 |
H100 Node | State | # GPUs | # Used | # Avail |
---|---|---|---|---|
casper57 | free | 4 | 3 | 1 |
casper58 | free | 4 | 0 | 4 |
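The summary figures for each GPU pool can be recovered from the per-node tables. A minimal sketch (not an official tool) using the V100 rows copied from the table above; offline nodes would be excluded from the available count:

```python
# Per-node V100 data copied from the status table: (node, state, gpus, used).
v100_nodes = [
    ("casper36", "free", 4, 1),
    ("casper08", "free", 8, 0),
    ("casper29", "free", 4, 0),
    ("casper30", "free", 8, 0),
    ("casper31", "free", 8, 0),
    ("casper24", "full", 8, 8),
    ("casper25", "full", 4, 4),
    ("casper28", "free", 8, 0),
]

total = sum(gpus for _, _, gpus, _ in v100_nodes)
used = sum(used for _, _, _, used in v100_nodes)
# Count only GPUs on nodes that are not offline as available.
avail = sum(gpus - used for _, state, gpus, used in v100_nodes if state != "offline")

print(f"{used} of {total} GPUs in use ({used / total:.1%})")
# -> 13 of 52 GPUs in use (25.0%), matching the summary line above
```

The same aggregation applies to the A100 and H100 tables.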
Individual files are removed automatically from the /glade/scratch space if they have not been accessed (for example, modified, read, or copied) in more than 120 days.
File Space | TiB Used | TiB Capacity | % Used |
---|---|---|---|
/glade/u/home | 96 | 150 | 64% |
/glade/u/apps | 2 | 10 | 18% |
/glade/work | 1,406 | 4,096 | 35% |
/glade/derecho/scratch | 25,043 | 55,814 | 46% |
/glade/campaign | 117,741 | 125,240 | 95% |
Updated 9:00 am MDT Thu May 1 2025
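To see which of your scratch files are at risk under the 120-day purge policy described above, you can compare each file's last-access time against the threshold. A minimal sketch, assuming the purge is driven by atime (note that some file systems mount with relatime or noatime, so atime may be imprecise); the script path and invocation are illustrative:

```python
import os
import sys
import time

def files_not_accessed_in(root: str, days: int = 120):
    """List files under root whose last-access time is older than `days` days."""
    cutoff = time.time() - days * 86400
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    stale.append(path)
            except OSError:
                pass  # file vanished or is unreadable; skip it
    return stale

if __name__ == "__main__":
    # Example (hypothetical invocation): python stale_files.py /glade/derecho/scratch/$USER
    for path in files_not_accessed_in(sys.argv[1]):
        print(path)
```

Files the script reports have gone more than 120 days without access and may be candidates for the automatic purge; copy anything you need to keep to a longer-lived space such as /glade/work or /glade/campaign.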