Cluster 9
Nodes Summary
Total Number of CPUs: 768
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 33 | 528 | 0 | 0.00 |
down,offline | 15 | 240 | 0 | 0.00 |
Free CPUs (nodewise)
There is no free CPU available now.
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
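For concreteness, the two aggregates defined above can be reproduced from per-job accounting data. A minimal Python sketch, using hypothetical (cpu_time, walltime, ncpus) tuples rather than the monitor's actual code:

```python
# Minimal sketch of the two efficiency aggregates defined above.
# Each running job contributes (cpu_time, walltime, ncpus); the
# numbers below are hypothetical, not taken from this report.
jobs = [
    (360.0, 48.0, 8),    # cpu_time and walltime in hours
    (100.0, 110.0, 1),
]

# † Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
avg_eff_per_cpu = sum(c / w for c, w, _ in jobs) / sum(n for _, _, n in jobs)

# †† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
overall_eff = sum(c for c, _, _ in jobs) / sum(w * n for _, w, n in jobs)

print(f"Avg. efficiency per CPU: {avg_eff_per_cpu:.2%}")  # 93.43%
print(f"Overall efficiency:      {overall_eff:.2%}")      # 93.12%
```

The two measures weight jobs differently: the per-CPU average weights each job's efficiency by the number of CPUs it holds, whereas the overall figure weights it by CPUs × walltime, so long-running wide jobs dominate the latter.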
There is no running job now.
Cluster 9
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute-0-0 | 16 | down | 0 | 0 |
compute-0-1 | 16 | down,offline | 0 | 0 |
compute-0-2 | 16 | down | 0 | 0 |
compute-0-3 | 16 | down,offline | 0 | 0 |
compute-0-4 | 16 | down | 0 | 0 |
compute-0-5 | 16 | down,offline | 0 | 0 |
compute-0-6 | 16 | down | 0 | 0 |
compute-0-7 | 16 | down | 0 | 0 |
compute-0-8 | 16 | down | 0 | 0 |
compute-0-9 | 16 | down | 0 | 0 |
compute-0-10 | 16 | down | 0 | 0 |
compute-0-11 | 16 | down | 0 | 0 |
compute-0-12 | 16 | down | 0 | 0 |
compute-0-13 | 16 | down | 0 | 0 |
compute-0-14 | 16 | down | 0 | 0 |
compute-0-15 | 16 | down,offline | 0 | 0 |
compute-0-16 | 16 | down | 0 | 0 |
compute-0-17 | 16 | down | 0 | 0 |
compute-0-18 | 16 | down | 0 | 0 |
compute-0-19 | 16 | down | 0 | 0 |
compute-0-20 | 16 | down | 0 | 0 |
compute-0-21 | 16 | down,offline | 0 | 0 |
compute-0-22 | 16 | down,offline | 0 | 0 |
compute-0-23 | 16 | down,offline | 0 | 0 |
compute-0-24 | 16 | down,offline | 0 | 0 |
compute-0-25 | 16 | down | 0 | 0 |
compute-0-26 | 16 | down | 0 | 0 |
compute-0-27 | 16 | down | 0 | 0 |
compute-0-28 | 16 | down | 0 | 0 |
compute-0-29 | 16 | down,offline | 0 | 0 |
compute-0-30 | 16 | down,offline | 0 | 0 |
compute-0-31 | 16 | down | 0 | 0 |
compute-0-32 | 16 | down | 0 | 0 |
compute-0-33 | 16 | down | 0 | 0 |
compute-0-34 | 16 | down | 0 | 0 |
compute-0-35 | 16 | down | 0 | 0 |
compute-0-36 | 16 | down,offline | 0 | 0 |
compute-0-37 | 16 | down | 0 | 0 |
compute-0-38 | 16 | down | 0 | 0 |
compute-0-39 | 16 | down,offline | 0 | 0 |
compute-0-40 | 16 | down | 0 | 0 |
compute-0-41 | 16 | down | 0 | 0 |
compute-0-42 | 16 | down | 0 | 0 |
compute-0-43 | 16 | down | 0 | 0 |
compute-0-44 | 16 | down,offline | 0 | 0 |
compute-0-45 | 16 | down,offline | 0 | 0 |
compute-0-46 | 16 | down | 0 | 0 |
compute-0-47 | 16 | down,offline | 0 | 0 |
Cluster 9
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
There is no running job now.
Cluster 10
Nodes Summary
Total Number of CPUs: 280
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
down | 12 | 240 | 0 | 0.00 |
down,offline | 1 | 20 | 0 | 0.00 |
free | 1 | 0 | 20 | 7.14 |
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute-0-8 | 20 |
Total | 20 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
There is no running job now.
Cluster 10
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute-0-0 | 20 | down | 0 | 0 |
compute-0-1 | 20 | down | 0 | 0 |
compute-0-2 | 20 | down | 0 | 0 |
compute-0-4 | 20 | down | 0 | 0 |
compute-0-5 | 20 | down | 0 | 0 |
compute-0-6 | 20 | down | 0 | 0 |
compute-0-7 | 20 | down,offline | 0 | 0 |
compute-0-8 | 20 | free | 0 | 20 |
compute-0-10 | 20 | down | 0 | 0 |
compute-0-11 | 20 | down | 0 | 0 |
compute-0-12 | 20 | down | 0 | 0 |
compute-0-13 | 20 | down | 0 | 0 |
compute-0-3 | 20 | down | 0 | 0 |
compute-0-9 | 20 | down | 0 | 0 |
Cluster 10
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
There is no running job now.
Cluster 11
Nodes Summary
Total Number of CPUs: 960
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
free | 9 | 88 | 128 | 13.33 |
job-exclusive | 18 | 432 | 0 | 0.00 |
down,job-exclusive | 1 | 24 | 0 | 0.00 |
down | 8 | 192 | 0 | 0.00 |
down,offline | 3 | 72 | 0 | 0.00 |
offline | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Node name | No. of free CPUs |
---|---|
compute000 | 24 |
compute005 | 9 |
compute015 | 10 |
compute017 | 24 |
compute022 | 24 |
compute031 | 11 |
compute034 | 21 |
compute035 | 4 |
compute036 | 1 |
Total | 128 |
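As a cross-check against the Nodes Summary above: 128 free CPUs out of 960 total gives 128/960 ≈ 13.33%, matching the free row's "% of total CPUs free".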
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | User | No. of Jobs | No. of CPU using | % of total CPU using | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|
R | ganeshchandra | 1 | 12 | 1.25% | 15 days 21:54:28 hrs | 50.01% | 50.01% |
R | tanoykanti | 3 | 41 | 4.27% | 3 days 07:39:48 hrs | 14.65% | 72.91% |
R | aparajita | 22 | 22 | 2.29% | 58 days 16:37:33 hrs | 100.07% | 100.07% |
R | debs | 1 | 144 | 15.00% | 24 days 19:16:03 hrs | 100.06% | 100.06% |
R | shilendra | 1 | 144 | 15.00% | 15 days 14:04:24 hrs | 100.08% | 100.08% |
R | sudip | 1 | 48 | 5.00% | 1 day 22:10:14 hrs | 100.07% | 100.07% |
R | priyaghosh | 10 | 10 | 1.04% | 1 day 05:47:31 hrs | 100.08% | 100.08% |
R | debrajbose | 87 | 87 | 9.06% | 0:27:32 hrs | 99.96% | 99.96% |
R | tisita | 2 | 48 | 5.00% | 0:31:11 hrs | 100.00% | 100.00% |
Cluster 11
Nodes Status
Node | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|
compute000 | 24 | free | 0 | 24 |
compute001 | 24 | job-exclusive | 24 | 0 |
compute002 | 24 | job-exclusive | 24 | 0 |
compute003 | 24 | job-exclusive | 24 | 0 |
compute004 | 24 | down,job-exclusive | 24 | 0 |
compute005 | 24 | free | 15 | 9 |
compute006 | 24 | down | 0 | 0 |
compute007 | 24 | down | 0 | 0 |
compute008 | 24 | job-exclusive | 24 | 0 |
compute009 | 24 | job-exclusive | 24 | 0 |
compute010 | 24 | job-exclusive | 24 | 0 |
compute011 | 24 | job-exclusive | 24 | 0 |
compute012 | 24 | job-exclusive | 24 | 0 |
compute013 | 24 | job-exclusive | 24 | 0 |
compute014 | 24 | job-exclusive | 24 | 0 |
compute015 | 24 | free | 14 | 10 |
compute016 | 24 | down,offline | 0 | 0 |
compute017 | 24 | free | 0 | 24 |
compute018 | 24 | down | 0 | 0 |
compute019 | 24 | down | 0 | 0 |
compute020 | 24 | job-exclusive | 24 | 0 |
compute021 | 24 | down | 12 | 0 |
compute022 | 24 | free | 0 | 24 |
compute023 | 24 | down,offline | 0 | 0 |
compute024 | 24 | job-exclusive | 24 | 0 |
compute025 | 24 | down,offline | 0 | 0 |
compute026 | 24 | job-exclusive | 24 | 0 |
compute027 | 24 | job-exclusive | 24 | 0 |
compute028 | 24 | job-exclusive | 24 | 0 |
compute029 | 24 | job-exclusive | 24 | 0 |
compute030 | 24 | job-exclusive | 24 | 0 |
compute031 | 24 | free | 13 | 11 |
compute032 | 24 | job-exclusive | 24 | 0 |
compute033 | 24 | offline | 0 | 0 |
compute034 | 24 | free | 3 | 21 |
compute035 | 24 | free | 20 | 4 |
compute036 | 24 | free | 23 | 1 |
compute037 | 24 | down | 0 | 0 |
compute038 | 24 | down | 0 | 0 |
compute039 | 24 | down | 0 | 0 |
Cluster 11
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
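For example, job 560857 in the first row below has run on 12 CPUs for 15 days 21:54:29 hrs of walltime; its 50.01% efficiency therefore corresponds to about 0.5001 × 12 × 15.9 days ≈ 95.5 CPU-days of accumulated CPU time, i.e. on average only half of each assigned CPU was kept busy.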
Job ID | User | Job Name | Job State | Walltime Used | No. of CPU using | Memory Using | Efficiency† |
---|---|---|---|---|---|---|---|
560857 | ganeshchandra | submit_p.sh | R | 15 days 21:54:29 hrs | 12 | 25.15 GB | 50.01% |
575781 | tanoykanti | sr_2 | R | 87 days 17:38:48 hrs | 1 | 5.96 MB | 100.04% |
577934 | aparajita | k1=10000_N=60 | R | 62 days 16:12:18 hrs | 1 | 837.95 MB | 100.08% |
577935 | aparajita | k1=10000_N=70 | R | 62 days 16:10:11 hrs | 1 | 1.43 GB | 100.08% |
577936 | aparajita | k1=10000_N=80 | R | 62 days 16:08:55 hrs | 1 | 2.51 GB | 100.08% |
577937 | aparajita | k1=10000_N=90 | R | 62 days 16:07:13 hrs | 1 | 2.89 GB | 100.08% |
577938 | aparajita | k1=10000_N=100 | R | 62 days 16:05:29 hrs | 1 | 4.92 GB | 100.08% |
577939 | aparajita | k1=10000_N=110 | R | 62 days 16:04:10 hrs | 1 | 5.49 GB | 100.08% |
577940 | aparajita | k1=10000_N=120 | R | 62 days 16:02:59 hrs | 1 | 6.17 GB | 100.08% |
577941 | aparajita | k1=10000_N=130 | R | 62 days 16:01:39 hrs | 1 | 10.10 GB | 100.08% |
577942 | aparajita | k1=10000_N=140 | R | 62 days 16:00:15 hrs | 1 | 10.97 GB | 100.08% |
577943 | aparajita | k1=10000_N=150 | R | 62 days 15:59:06 hrs | 1 | 12.12 GB | 100.08% |
577944 | aparajita | k1=10000_N=160 | R | 62 days 15:58:05 hrs | 1 | 19.59 GB | 100.09% |
577945 | aparajita | k1=10000_N=170 | R | 62 days 15:56:55 hrs | 1 | 20.98 GB | 100.08% |
577946 | aparajita | k1=10000_N=180 | R | 62 days 15:55:58 hrs | 1 | 22.55 GB | 100.08% |
577947 | aparajita | k1=10000_N=190 | R | 62 days 15:54:50 hrs | 1 | 23.78 GB | 100.09% |
577948 | aparajita | k1=10000_N=200 | R | 62 days 15:53:52 hrs | 1 | 26.16 GB | 100.08% |
578069 | aparajita | k1=10000_N=128 | R | 60 days 18:20:36 hrs | 1 | 9.93 GB | 100.00% |
579100 | aparajita | k1=50000_N=64 | R | 54 days 20:36:56 hrs | 1 | 1.29 GB | 100.08% |
579101 | aparajita | k1=30000_N=64 | R | 54 days 20:35:59 hrs | 1 | 1.29 GB | 100.08% |
579112 | aparajita | k1=20000_N=64 | R | 54 days 20:25:48 hrs | 1 | 1.29 GB | 100.08% |
580904 | aparajita | k1=10000_N=64_sg | R | 51 days 21:02:25 hrs | 1 | 1.29 GB | 100.06% |
582065 | aparajita | k1=5000_N=128_p=2 | R | 50 days 23:31:03 hrs | 1 | 9.93 GB | 100.05% |
584168 | debs | CsPb_441ph | R | 24 days 19:16:03 hrs | 144 | 62.64 GB | 100.06% |
584795 | aparajita | k1=10000_N=60_tmax=1 | R | 23 days 00:41:41 hrs | 1 | 834.30 MB | 100.05% |
589467 | shilendra | CsPb | R | 15 days 14:04:24 hrs | 144 | 5.39 GB | 100.08% |
595668 | tanoykanti | QFI_vst_bose_55 | R | 2 days 19:52:33 hrs | 16 | 1.52 GB | 25.02% |
596691 | sudip | mos2_abs | R | 1 day 22:10:15 hrs | 48 | 26.66 GB | 100.07% |
597104 | priyaghosh | u0.01_6 | R | 1 day 17:01:42 hrs | 1 | 20.35 MB | 100.09% |
597105 | priyaghosh | u0.03_7 | R | 1 day 17:00:00 hrs | 1 | 20.36 MB | 100.09% |
597106 | priyaghosh | u0.05_8 | R | 1 day 16:58:29 hrs | 1 | 20.35 MB | 100.09% |
597110 | priyaghosh | u0.08_9 | R | 1 day 16:56:59 hrs | 1 | 20.34 MB | 100.09% |
597139 | priyaghosh | u0.1_10 | R | 1 day 16:55:13 hrs | 1 | 20.34 MB | 100.08% |
602267 | priyaghosh | g0.01_1 | R | 18:45:31 hrs | 1 | 22.93 MB | 100.05% |
602268 | priyaghosh | g0.03_2 | R | 18:36:47 hrs | 1 | 22.94 MB | 100.08% |
602269 | priyaghosh | g0.05_3 | R | 18:35:42 hrs | 1 | 23.79 MB | 100.08% |
602270 | priyaghosh | g0.08_4 | R | 18:32:49 hrs | 1 | 23.77 MB | 100.07% |
602271 | priyaghosh | g0.1_5 | R | 18:32:05 hrs | 1 | 23.78 MB | 100.08% |
603022 | tanoykanti | 22_NN | R | 03:06:21 hrs | 24 | 31.62 MB | 4.17% |
603339 | debrajbose | debu | R | 00:37:52 hrs | 1 | 32.75 MB | 100.00% |
603340 | debrajbose | debu | R | 00:37:45 hrs | 1 | 32.75 MB | 100.04% |
603341 | debrajbose | debu | R | 00:37:32 hrs | 1 | 32.75 MB | 100.00% |
603342 | debrajbose | debu | R | 00:37:25 hrs | 1 | 32.75 MB | 100.00% |
603343 | debrajbose | debu | R | 00:37:22 hrs | 1 | 32.75 MB | 99.96% |
603344 | debrajbose | debu | R | 00:37:15 hrs | 1 | 32.77 MB | 100.04% |
603345 | debrajbose | debu | R | 00:37:12 hrs | 1 | 34.75 MB | 100.00% |
603346 | debrajbose | debu | R | 00:37:00 hrs | 1 | 34.75 MB | 100.00% |
603347 | debrajbose | debu | R | 00:36:34 hrs | 1 | 34.79 MB | 100.00% |
603348 | debrajbose | debu | R | 00:36:10 hrs | 1 | 32.77 MB | 100.00% |
603349 | debrajbose | debu | R | 00:35:55 hrs | 1 | 32.76 MB | 99.91% |
603350 | debrajbose | debu | R | 00:35:48 hrs | 1 | 32.75 MB | 99.77% |
603351 | debrajbose | debu | R | 00:35:24 hrs | 1 | 32.75 MB | 99.86% |
603352 | debrajbose | debu | R | 00:35:21 hrs | 1 | 34.75 MB | 99.76% |
603353 | debrajbose | debu | R | 00:34:21 hrs | 1 | 32.75 MB | 100.00% |
603354 | debrajbose | debu | R | 00:33:47 hrs | 1 | 32.75 MB | 100.05% |
603355 | debrajbose | debu | R | 00:33:30 hrs | 1 | 32.76 MB | 100.00% |
603356 | debrajbose | debu | R | 00:33:17 hrs | 1 | 34.75 MB | 100.00% |
603357 | debrajbose | debu | R | 00:33:04 hrs | 1 | 34.75 MB | 100.05% |
603358 | debrajbose | debu | R | 00:32:46 hrs | 1 | 32.75 MB | 100.00% |
603359 | debrajbose | debu | R | 00:32:25 hrs | 1 | 32.88 MB | 100.00% |
603360 | debrajbose | debu | R | 00:32:08 hrs | 1 | 34.75 MB | 100.00% |
603361 | debrajbose | debu | R | 00:31:44 hrs | 1 | 34.75 MB | 100.00% |
603362 | debrajbose | debu | R | 00:31:38 hrs | 1 | 32.75 MB | 100.00% |
603363 | debrajbose | debu | R | 00:31:27 hrs | 1 | 32.75 MB | 100.00% |
603364 | tisita | BiAs | R | 00:31:01 hrs | 24 | 34.46 GB | 99.98% |
603365 | debrajbose | debu | R | 00:31:04 hrs | 1 | 34.75 MB | 100.00% |
603366 | tisita | BiAs | R | 00:31:22 hrs | 24 | 40.17 GB | 100.02% |
603367 | debrajbose | debu | R | 00:30:47 hrs | 1 | 32.76 MB | 100.00% |
603368 | debrajbose | debu | R | 00:30:27 hrs | 1 | 34.88 MB | 100.00% |
603369 | debrajbose | debu | R | 00:29:59 hrs | 1 | 32.75 MB | 100.00% |
603370 | debrajbose | debu | R | 00:29:52 hrs | 1 | 32.75 MB | 100.00% |
603371 | debrajbose | debu | R | 00:29:42 hrs | 1 | 34.75 MB | 100.00% |
603372 | debrajbose | debu | R | 00:29:22 hrs | 1 | 34.75 MB | 100.00% |
603373 | debrajbose | debu | R | 00:29:15 hrs | 1 | 32.85 MB | 100.00% |
603374 | debrajbose | debu | R | 00:29:09 hrs | 1 | 32.75 MB | 100.00% |
603375 | debrajbose | debu | R | 00:29:06 hrs | 1 | 32.75 MB | 100.00% |
603376 | debrajbose | debu | R | 00:29:03 hrs | 1 | 32.75 MB | 100.00% |
603377 | debrajbose | debu | R | 00:28:46 hrs | 1 | 32.75 MB | 100.00% |
603378 | debrajbose | debu | R | 00:28:36 hrs | 1 | 32.75 MB | 100.00% |
603379 | debrajbose | debu | R | 00:29:07 hrs | 1 | 34.66 MB | 100.00% |
603380 | debrajbose | debu | R | 00:28:50 hrs | 1 | 32.81 MB | 99.60% |
603381 | debrajbose | debu | R | 00:28:29 hrs | 1 | 32.75 MB | 100.06% |
603382 | debrajbose | debu | R | 00:28:22 hrs | 1 | 34.75 MB | 100.06% |
603383 | debrajbose | debu | R | 00:27:58 hrs | 1 | 32.78 MB | 100.06% |
603384 | debrajbose | debu | R | 00:27:37 hrs | 1 | 32.75 MB | 100.06% |
603385 | debrajbose | debu | R | 00:27:15 hrs | 1 | 32.84 MB | 100.06% |
603386 | debrajbose | debu | R | 00:26:59 hrs | 1 | 32.79 MB | 100.00% |
603387 | debrajbose | debu | R | 00:26:46 hrs | 1 | 34.75 MB | 99.88% |
603388 | debrajbose | debu | R | 00:26:44 hrs | 1 | 32.76 MB | 99.69% |
603389 | debrajbose | debu | R | 00:26:23 hrs | 1 | 32.75 MB | 100.06% |
603390 | debrajbose | debu | R | 00:26:03 hrs | 1 | 32.82 MB | 100.06% |
603391 | debrajbose | debu | R | 00:25:45 hrs | 1 | 34.75 MB | 99.87% |
603392 | debrajbose | debu | R | 00:25:26 hrs | 1 | 34.82 MB | 99.87% |
603393 | debrajbose | debu | R | 00:25:08 hrs | 1 | 32.75 MB | 100.07% |
603394 | debrajbose | debu | R | 00:24:43 hrs | 1 | 32.75 MB | 99.93% |
603395 | debrajbose | debu | R | 00:24:36 hrs | 1 | 32.75 MB | 100.07% |
603396 | debrajbose | debu | R | 00:24:29 hrs | 1 | 34.75 MB | 99.66% |
603397 | debrajbose | debu | R | 00:24:26 hrs | 1 | 32.84 MB | 99.59% |
603398 | debrajbose | debu | R | 00:24:22 hrs | 1 | 32.76 MB | 99.59% |
603399 | debrajbose | debu | R | 00:23:54 hrs | 1 | 32.75 MB | 99.72% |
603400 | debrajbose | debu | R | 00:23:46 hrs | 1 | 32.75 MB | 99.58% |
603401 | debrajbose | debu | R | 00:23:44 hrs | 1 | 34.75 MB | 99.93% |
603402 | debrajbose | debu | R | 00:23:38 hrs | 1 | 32.75 MB | 99.79% |
603403 | debrajbose | debu | R | 00:23:08 hrs | 1 | 32.75 MB | 100.00% |
603404 | debrajbose | debu | R | 00:22:57 hrs | 1 | 32.75 MB | 99.93% |
603405 | debrajbose | debu | R | 00:22:54 hrs | 1 | 34.75 MB | 99.93% |
603406 | debrajbose | debu | R | 00:22:51 hrs | 1 | 32.74 MB | 100.00% |
603407 | debrajbose | debu | R | 00:22:01 hrs | 1 | 32.74 MB | 100.00% |
603408 | debrajbose | debu | R | 00:21:40 hrs | 1 | 34.74 MB | 100.00% |
603409 | debrajbose | debu | R | 00:22:01 hrs | 1 | 32.74 MB | 99.92% |
603410 | debrajbose | debu | R | 00:21:22 hrs | 1 | 32.82 MB | 100.00% |
603411 | debrajbose | debu | R | 00:21:01 hrs | 1 | 32.74 MB | 99.92% |
603412 | debrajbose | debu | R | 00:20:54 hrs | 1 | 34.74 MB | 100.00% |
603413 | debrajbose | debu | R | 00:20:38 hrs | 1 | 32.79 MB | 100.00% |
603414 | debrajbose | debu | R | 00:20:28 hrs | 1 | 32.74 MB | 100.00% |
603415 | debrajbose | debu | R | 00:20:44 hrs | 1 | 32.75 MB | 99.92% |
603416 | debrajbose | debu | R | 00:20:02 hrs | 1 | 34.74 MB | 100.00% |
603417 | debrajbose | debu | R | 00:19:39 hrs | 1 | 32.74 MB | 100.00% |
603418 | debrajbose | debu | R | 00:19:34 hrs | 1 | 34.79 MB | 100.00% |
603419 | debrajbose | debu | R | 00:19:58 hrs | 1 | 32.75 MB | 99.92% |
603420 | debrajbose | debu | R | 00:19:19 hrs | 1 | 34.73 MB | 100.00% |
603421 | debrajbose | debu | R | 00:18:54 hrs | 1 | 34.74 MB | 100.00% |
603422 | debrajbose | debu | R | 00:18:54 hrs | 1 | 32.74 MB | 99.91% |
603423 | debrajbose | debu | R | 00:18:37 hrs | 1 | 34.74 MB | 100.00% |
603424 | debrajbose | debu | R | 00:18:33 hrs | 1 | 32.82 MB | 100.00% |
603425 | debrajbose | debu | R | 00:18:27 hrs | 1 | 34.74 MB | 100.00% |
603426 | debrajbose | debu | R | 00:18:24 hrs | 1 | 34.74 MB | 100.00% |
603427 | debrajbose | debu | R | 00:18:10 hrs | 1 | 32.74 MB | 100.00% |
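Re-deriving the Efficiency column of a table like the one above requires turning the walltime strings into numbers first. A small sketch, assuming the "N days HH:MM:SS hrs" format used throughout this report (the helper name is illustrative, not part of the monitoring tool):

```python
import re

def walltime_to_hours(s: str) -> float:
    """Parse walltime strings such as '15 days 21:54:29 hrs' or '00:37:52 hrs'."""
    m = re.match(r"(?:(\d+)\s+days?\s+)?(\d+):(\d{2}):(\d{2})\s*hrs", s)
    if not m:
        raise ValueError(f"unrecognized walltime: {s!r}")
    days, hours, minutes, seconds = (int(g or 0) for g in m.groups())
    return days * 24 + hours + minutes / 60 + seconds / 3600

# Job 560857 above: 12 CPUs for '15 days 21:54:29 hrs' of walltime.
walltime = walltime_to_hours("15 days 21:54:29 hrs")  # ≈ 381.9 hours
cpu_hours_assigned = walltime * 12                    # ≈ 4583 CPU-hours
```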
Cluster 12
Nodes Summary
Total Number of CPUs: 1056
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
free | 7 | 31 | 137 | 12.97 |
job-busy | 36 | 864 | 0 | 0.00 |
down | 1 | 24 | 0 | 0.00 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | node2 | 5 |
workq | node16 | 16 |
workq | node17 | 20 |
workq | node38 | 24 |
workq | node39 | 24 |
workq | node42 | 24 |
workq | node33 | 24 |
 | Total | 137 |
Jobs Summary
† Avg. Efficiency per CPU = ∑(CPU time / Walltime) / ∑(No. of CPUs assigned)
†† Overall Efficiency = ∑ CPU time / ∑(Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPU using | % of total CPU using | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | debarupa11 | 3 | 3 | 0.28% | 21 days 00:34:04 hrs | 93.57% | 93.57% |
R | workq | jagjitkaur | 1 | 144 | 13.64% | 17 days 18:28:13 hrs | 100.02% | 100.02% |
R | workq | souravmal | 4 | 96 | 9.09% | 2 days 04:03:12 hrs | 100.02% | 100.02% |
R | workq | shilendra | 1 | 144 | 13.64% | 2 days 05:38:53 hrs | 100.03% | 100.03% |
R | workq | shuvam | 13 | 52 | 4.92% | 1 day 17:15:50 hrs | 25.02% | 25.02% |
R | workq | bikashvbu | 2 | 48 | 4.55% | 11:59:40 hrs | 100.00% | 100.01% |
R | workq | pradhi | 2 | 408 | 38.64% | 06:59:22 hrs | 99.98% | 100.00% |
Job State | Queue | User | No. of Jobs | No. of CPU Requested |
---|---|---|---|---|
Q | workq | vanshreep | 1 | 144 |
Cluster 12
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
node2 | workq | 24 | free | 19 | 5 |
node3 | workq | 24 | job-busy | 24 | 0 |
node4 | workq | 24 | job-busy | 24 | 0 |
node5 | workq | 24 | job-busy | 24 | 0 |
node8 | workq | 24 | job-busy | 24 | 0 |
node9 | workq | 24 | job-busy | 24 | 0 |
node10 | workq | 24 | job-busy | 24 | 0 |
node11 | workq | 24 | job-busy | 24 | 0 |
node12 | workq | 24 | job-busy | 24 | 0 |
node14 | workq | 24 | job-busy | 24 | 0 |
node15 | workq | 24 | job-busy | 24 | 0 |
node16 | workq | 24 | free | 8 | 16 |
node17 | workq | 24 | free | 4 | 20 |
node18 | workq | 24 | job-busy | 24 | 0 |
node19 | workq | 24 | job-busy | 24 | 0 |
node20 | workq | 24 | job-busy | 24 | 0 |
node21 | workq | 24 | job-busy | 24 | 0 |
node22 | workq | 24 | job-busy | 24 | 0 |
node1 | workq | 24 | job-busy | 24 | 0 |
node23 | workq | 24 | job-busy | 24 | 0 |
node24 | workq | 24 | job-busy | 24 | 0 |
node25 | workq | 24 | job-busy | 24 | 0 |
node6 | workq | 24 | job-busy | 24 | 0 |
node7 | workq | 24 | job-busy | 24 | 0 |
node26 | workq | 24 | down | 0 | 0 |
node27 | workq | 24 | job-busy | 24 | 0 |
node13 | workq | 24 | job-busy | 24 | 0 |
node28 | workq | 24 | job-busy | 24 | 0 |
node29 | workq | 24 | job-busy | 24 | 0 |
node30 | workq | 24 | job-busy | 24 | 0 |
node31 | workq | 24 | job-busy | 24 | 0 |
node32 | workq | 24 | job-busy | 24 | 0 |
node34 | workq | 24 | job-busy | 24 | 0 |
node35 | workq | 24 | job-busy | 24 | 0 |
node36 | workq | 24 | job-busy | 24 | 0 |
node37 | workq | 24 | job-busy | 24 | 0 |
node38 | workq | 24 | free | 0 | 24 |
node39 | workq | 24 | free | 0 | 24 |
node40 | workq | 24 | job-busy | 24 | 0 |
node41 | workq | 24 | job-busy | 24 | 0 |
node42 | workq | 24 | free | 0 | 24 |
node33 | workq | 24 | free | 0 | 24 |
node43 | workq | 24 | job-busy | 24 | 0 |
node44 | workq | 24 | job-busy | 24 | 0 |
Cluster 12
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPU using | Memory Using | Efficiency† |
---|---|---|---|---|---|---|---|---|
93724 | debarupa11 | workq | GMBF10000000 | R | 21 days 00:34:14 hrs | 1 | 6.76 MB | 93.57% |
93725 | debarupa11 | workq | GMBF100000000 | R | 21 days 00:34:04 hrs | 1 | 6.76 MB | 93.57% |
93726 | debarupa11 | workq | GMBF1000000000 | R | 21 days 00:33:55 hrs | 1 | 6.75 MB | 93.57% |
93791 | jagjitkaur | workq | cafcw | R | 17 days 18:28:14 hrs | 144 | 28.34 GB | 100.02% |
94538 | souravmal | workq | smps-914568 | R | 5 days 11:34:10 hrs | 24 | 10.07 GB | 100.02% |
94637 | shilendra | workq | opt_FeCo | R | 2 days 05:38:53 hrs | 144 | 18.59 GB | 100.03% |
94834 | shuvam | workq | zpt75 | R | 1 day 20:27:09 hrs | 4 | 14.23 MB | 25.02% |
94835 | shuvam | workq | zpt80 | R | 1 day 20:25:34 hrs | 4 | 12.45 MB | 25.02% |
94836 | shuvam | workq | zpt85 | R | 1 day 20:25:23 hrs | 4 | 14.23 MB | 25.02% |
94934 | shuvam | workq | zpt10 | R | 1 day 18:31:54 hrs | 4 | 12.44 MB | 25.02% |
94935 | shuvam | workq | zpt15 | R | 1 day 18:31:54 hrs | 4 | 12.44 MB | 25.02% |
94936 | shuvam | workq | zpt20 | R | 1 day 18:31:11 hrs | 4 | 12.45 MB | 25.02% |
94937 | shuvam | workq | zpt25 | R | 1 day 18:31:11 hrs | 4 | 12.45 MB | 25.02% |
94938 | shuvam | workq | zpt30 | R | 1 day 18:31:11 hrs | 4 | 12.45 MB | 25.02% |
94939 | shuvam | workq | zpt35 | R | 1 day 18:31:11 hrs | 4 | 12.45 MB | 25.02% |
94940 | shuvam | workq | zpt40 | R | 1 day 18:31:11 hrs | 4 | 12.45 MB | 25.02% |
94942 | shuvam | workq | zpt50 | R | 1 day 11:09:24 hrs | 4 | 337.46 MB | 25.02% |
94944 | shuvam | workq | zpt60 | R | 1 day 11:09:24 hrs | 4 | 338.61 MB | 25.02% |
94949 | shuvam | workq | zpt85 | R | 1 day 11:09:14 hrs | 4 | 339.67 MB | 25.02% |
95011 | souravmal | workq | smps-792162-C | R | 1 day 01:40:14 hrs | 24 | 14.58 GB | 100.02% |
95012 | souravmal | workq | smps-792162-N | R | 1 day 01:32:13 hrs | 24 | 14.59 GB | 100.02% |
95013 | souravmal | workq | smps-792162-A | R | 1 day 01:26:13 hrs | 24 | 14.57 GB | 100.02% |
95082 | bikashvbu | workq | Tuya | R | 21:48:38 hrs | 24 | 8.89 GB | 100.02% |
95101 | pradhi | workq | 3d_al2o3_rlx | R | 10:38:18 hrs | 240 | 18.83 GB | 100.01% |
95109 | bikashvbu | workq | Tuya | R | 02:10:42 hrs | 24 | 30.63 GB | 99.99% |
95110 | pradhi | workq | fai_ivac_rlx | R | 01:46:37 hrs | 168 | 4.20 GB | 99.95% |
95111 | vanshreep | workq | Pdhs_uc | Q | | 144 | | 0.00% |
Cluster 13
Nodes Summary
Total Number of CPUs: 1024
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 21 | 672 | 0 | 0.00 |
down | 2 | 64 | 0 | 0.00 |
free | 9 | 0 | 288 | 28.13 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | c13node20 | 32 |
workq | c13node13 | 32 |
neutrino | c13node24 | 32 |
neutrino | c13node25 | 32 |
neutrino | c13node26 | 32 |
neutrino | c13node27 | 32 |
neutrino | c13node28 | 32 |
neutrino | c13node30 | 32 |
neutrino | c13node31 | 32 |
 | Total | 288 |
Jobs Summary
† Avg. Efficiency per CPU =
∑
CPU time / Walltime
/∑ No. of CPUs assigned†† Overall Efficiency =
∑ CPU time ∑ (Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPU using | % of total CPU using | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | tanoykanti | 4 | 128 | 12.50% | 64 days 18:20:26 hrs | 97.12% | 97.12% |
R | workq | sankalpa | 1 | 128 | 12.50% | 20 days 01:21:55 hrs | 99.66% | 99.66% |
R | workq | jagjitkaur | 1 | 128 | 12.50% | 4 days 15:19:04 hrs | 99.64% | 99.64% |
R | workq | pradhi | 1 | 128 | 12.50% | 1 day 03:58:32 hrs | 97.67% | 97.67% |
R | workq | prajna | 1 | 64 | 6.25% | 1 day 02:18:50 hrs | 99.66% | 99.66% |
R | workq | shilendra | 1 | 96 | 9.38% | 21:55:19 hrs | 99.67% | 99.67% |
Job State | Queue | User | No. of Jobs | No. of CPU Requested |
---|---|---|---|---|
Q | workq | dhirendra | 1 | 128 |
Q | workq | ponnappa | 1 | 120 |
Q | workq | pradhi | 2 | 256 |
Q | workq | ayushitripathi | 1 | 128 |
Cluster 13
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
c13node1 | workq | 32 | job-busy | 32 | 0 |
c13node2 | workq | 32 | job-busy | 32 | 0 |
c13node3 | workq | 32 | job-busy | 32 | 0 |
c13node4 | workq | 32 | job-busy | 32 | 0 |
c13node5 | workq | 32 | job-busy | 32 | 0 |
c13node7 | workq | 32 | job-busy | 32 | 0 |
c13node8 | workq | 32 | job-busy | 32 | 0 |
c13node9 | workq | 32 | job-busy | 32 | 0 |
c13node10 | workq | 32 | job-busy | 32 | 0 |
c13node11 | workq | 32 | job-busy | 32 | 0 |
c13node12 | workq | 32 | job-busy | 32 | 0 |
c13node14 | workq | 32 | job-busy | 32 | 0 |
c13node15 | workq | 32 | job-busy | 32 | 0 |
c13node0 | workq | 32 | job-busy | 32 | 0 |
c13node16 | workq | 32 | job-busy | 32 | 0 |
c13node17 | workq | 32 | job-busy | 32 | 0 |
c13node18 | workq | 32 | job-busy | 32 | 0 |
c13node6 | workq | 32 | down | 0 | 0 |
c13node19 | workq | 32 | job-busy | 32 | 0 |
c13node20 | workq | 32 | free | 0 | 32 |
c13node22 | workq | 32 | job-busy | 32 | 0 |
c13node13 | workq | 32 | free | 0 | 32 |
c13node23 | workq | 32 | job-busy | 32 | 0 |
c13node21 | workq | 32 | job-busy | 32 | 0 |
c13node24 | neutrino | 32 | free | 0 | 32 |
c13node25 | neutrino | 32 | free | 0 | 32 |
c13node26 | neutrino | 32 | free | 0 | 32 |
c13node27 | neutrino | 32 | free | 0 | 32 |
c13node28 | neutrino | 32 | free | 0 | 32 |
c13node29 | neutrino | 32 | down | 0 | 0 |
c13node30 | neutrino | 32 | free | 0 | 32 |
c13node31 | neutrino | 32 | free | 0 | 32 |
Cluster 13
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPU using | Memory Using | Efficiency† |
---|---|---|---|---|---|---|---|---|
396991 | tanoykanti | workq | xyz4 | R | 64 days 18:21:01 hrs | 32 | 85.85 GB | 97.10% |
396992 | tanoykanti | workq | xyz2 | R | 64 days 18:20:37 hrs | 32 | 84.54 GB | 97.13% |
396993 | tanoykanti | workq | xyz3 | R | 64 days 18:19:17 hrs | 32 | 86.17 GB | 97.10% |
396994 | tanoykanti | workq | xyz1 | R | 64 days 18:20:50 hrs | 32 | 84.54 GB | 97.17% |
398079 | sankalpa | workq | CeSbTe_scfh | R | 20 days 01:21:55 hrs | 128 | 39.43 GB | 99.66% |
398331 | jagjitkaur | workq | nafe | R | 4 days 15:19:04 hrs | 128 | 56.48 GB | 99.64% |
398358 | pradhi | workq | lytcf_221_r | R | 1 day 03:58:32 hrs | 128 | 7.30 GB | 97.67% |
398372 | prajna | workq | Ni_Pt | R | 1 day 02:18:50 hrs | 64 | 12.95 GB | 99.66% |
398373 | dhirendra | workq | Se | Q | | 128 | | 0.00% |
398374 | ponnappa | workq | c2 | Q | | 120 | | 0.00% |
398375 | pradhi | workq | lytcf_221_r | Q | | 128 | | 0.00% |
398381 | shilendra | workq | GGCO_hbb | R | 21:55:19 hrs | 96 | 13.03 GB | 99.67% |
398382 | pradhi | workq | lytcf_bnd | Q | | 128 | | 0.00% |
398383 | ayushitripathi | workq | ErOCl12 | Q | | 128 | | 0.00% |
Cluster 14
Nodes Summary
Total Number of CPUs: 1040
State | No. of Nodes | No. of CPUs occupied/down | No. of CPUs free | % of total CPUs free |
---|---|---|---|---|
job-busy | 16 | 896 | 0 | 0.00 |
free | 3 | 70 | 74 | 7.12 |
Free CPUs (nodewise)
Queue | Node name | No. of free CPUs |
---|---|---|
workq | node8 | 28 |
workq | node10 | 14 |
neutrino | gpu1 | 32 |
 | Total | 74 |
Jobs Summary
† Avg. Efficiency per CPU =
∑
CPU time / Walltime
/∑ No. of CPUs assigned†† Overall Efficiency =
∑ CPU time ∑ (Walltime × No. of CPUs assigned)
[Sums are over all the running jobs.]
Job State | Queue | User | No. of Jobs | No. of CPU using | % of total CPU using | Avg. Walltime per CPU | Avg. Efficiency per CPU† | Overall Efficiency†† |
---|---|---|---|---|---|---|---|---|
R | workq | slgupta | 1 | 56 | 5.38% | 17 days 19:12:55 hrs | 99.56% | 99.56% |
R | workq | arijeetsarangi | 1 | 112 | 10.77% | 11 days 11:23:27 hrs | 99.47% | 99.47% |
R | workq | manasagb | 1 | 112 | 10.77% | 10 days 03:29:25 hrs | 99.45% | 99.45% |
R | workq | souravmal | 1 | 56 | 5.38% | 11 days 00:22:38 hrs | 99.51% | 99.51% |
R | workq | tisita | 1 | 108 | 10.38% | 8 days 09:08:47 hrs | 99.35% | 99.35% |
R | workq | pradhi | 1 | 112 | 10.77% | 02:08:41 hrs | 99.21% | 99.21% |
R | workq | tanmoymondal | 45 | 90 | 8.65% | 4 days 21:07:31 hrs | 99.05% | 99.04% |
R | workq | swapnild | 1 | 112 | 10.77% | 06:41:33 hrs | 99.59% | 99.59% |
R | workq | vanshreep | 1 | 112 | 10.77% | 10:46:14 hrs | 99.32% | 99.32% |
R | workq | mab5 | 1 | 56 | 5.38% | 0:14:41 hrs | 91.31% | 91.31% |
R | workq | tanoykanti | 1 | 40 | 3.85% | 0:01:49 hrs | 80.73% | 80.73% |
Job State | Queue | User | No. of Jobs | No. of CPU Requested |
---|---|---|---|---|
Q | workq | swapnild | 1 | 112 |
Q | workq | mab5 | 1 | 112 |
Q | workq | jagjitkaur | 2 | 224 |
Q | workq | pradhi | 2 | 224 |
Q | workq | shilendra | 1 | 112 |
Q | workq | ponnappa | 2 | 224 |
Cluster 14
Nodes Status
Node | Queue | np | state | No. of CPUs occupied | No. of free CPUs |
---|---|---|---|---|---|
node1 | workq | 56 | job-busy | 56 | 0 |
node2 | workq | 56 | job-busy | 56 | 0 |
node3 | workq | 56 | job-busy | 56 | 0 |
node4 | workq | 56 | job-busy | 56 | 0 |
node5 | workq | 56 | job-busy | 56 | 0 |
node6 | workq | 56 | job-busy | 56 | 0 |
node7 | workq | 56 | job-busy | 56 | 0 |
node8 | workq | 56 | free | 28 | 28 |
node9 | workq | 56 | job-busy | 56 | 0 |
node10 | workq | 56 | free | 42 | 14 |
node11 | workq | 56 | job-busy | 56 | 0 |
node12 | workq | 56 | job-busy | 56 | 0 |
node13 | workq | 56 | job-busy | 56 | 0 |
node14 | workq | 56 | job-busy | 56 | 0 |
node15 | workq | 56 | job-busy | 56 | 0 |
node16 | workq | 56 | job-busy | 56 | 0 |
node17 | workq | 56 | job-busy | 56 | 0 |
node18 | workq | 56 | job-busy | 56 | 0 |
gpu1 | neutrino | 32 | free | 0 | 32 |
Cluster 14
Jobs Status
† Efficiency (of parallelization) = CPU time / (Walltime × No. of CPUs assigned)
Job ID | User | Queue | Job Name | Job State | Walltime Used | No. of CPU using | Memory Using | Efficiency† |
---|---|---|---|---|---|---|---|---|
12847.c14m1.clusternet | slgupta@c14m2.clusternet | workq | 0104_LMTO | R | 17 days 19:12:55 hrs | 56 | 106.38 GB | 99.56% |
13062.c14m1.clusternet | arijeetsarangi@c14m2.clusternet | workq | sth | R | 11 days 11:23:27 hrs | 112 | 134.14 GB | 99.47% |
13098.c14m1.clusternet | manasagb@c14m2.clusternet | workq | sth | R | 10 days 03:29:25 hrs | 112 | 134.07 GB | 99.45% |
13100.c14m1.clusternet | souravmal@c14m2.clusternet | workq | FGT | R | 11 days 00:22:38 hrs | 56 | 451.96 GB | 99.51% |
13133.c14m1.clusternet | tisita@c14m2.clusternet | workq | on_P_neb | R | 8 days 09:08:48 hrs | 108 | 14.50 GB | 99.35% |
13226.c14m1.clusternet | pradhi@c14m2.clusternet | workq | lytcf_rlx | R | 02:08:41 hrs | 112 | 32.60 GB | 99.21% |
13364.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.0004 | R | 4 days 21:47:16 hrs | 2 | 122.72 MB | 73.53% |
13365.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.0008 | R | 4 days 21:46:59 hrs | 2 | 128.26 MB | 99.78% |
13366.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.00125 | R | 4 days 21:47:02 hrs | 2 | 121.24 MB | 99.60% |
13367.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.00175 | R | 4 days 21:46:59 hrs | 2 | 119.23 MB | 99.60% |
13368.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.0025 | R | 4 days 21:44:49 hrs | 2 | 130.31 MB | 99.59% |
13369.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.003 | R | 4 days 21:44:49 hrs | 2 | 130.21 MB | 99.61% |
13370.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.008 | R | 4 days 21:44:49 hrs | 2 | 127.61 MB | 99.60% |
13371.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.01 | R | 4 days 21:44:49 hrs | 2 | 130.01 MB | 99.59% |
13372.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_1_2.0_10_0.02 | R | 4 days 21:44:49 hrs | 2 | 122.86 MB | 99.61% |
13373.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.0004 | R | 4 days 21:44:49 hrs | 2 | 120.57 MB | 99.59% |
13374.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.0008 | R | 4 days 21:44:49 hrs | 2 | 120.61 MB | 99.59% |
13375.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.00125 | R | 4 days 21:44:49 hrs | 2 | 121.36 MB | 99.60% |
13376.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.00175 | R | 4 days 21:44:49 hrs | 2 | 122.87 MB | 99.60% |
13377.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.0025 | R | 4 days 21:44:49 hrs | 2 | 127.77 MB | 99.60% |
13378.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.003 | R | 4 days 21:44:49 hrs | 2 | 129.01 MB | 99.59% |
13379.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.008 | R | 4 days 21:44:38 hrs | 2 | 118.52 MB | 99.61% |
13380.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.01 | R | 4 days 20:46:58 hrs | 2 | 119.10 MB | 99.60% |
13381.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_2_2.0_10_0.02 | R | 4 days 20:46:58 hrs | 2 | 115.09 MB | 99.59% |
13382.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0004 | R | 4 days 20:46:58 hrs | 2 | 132.32 MB | 99.58% |
13383.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0008 | R | 4 days 20:46:58 hrs | 2 | 119.27 MB | 99.60% |
13384.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.00125 | R | 4 days 20:46:58 hrs | 2 | 127.18 MB | 99.59% |
13385.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.00175 | R | 4 days 20:46:58 hrs | 2 | 130.18 MB | 99.61% |
13386.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.0025 | R | 4 days 20:46:58 hrs | 2 | 117.44 MB | 99.61% |
13387.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.003 | R | 4 days 20:46:58 hrs | 2 | 127.75 MB | 99.59% |
13388.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.008 | R | 4 days 20:46:58 hrs | 2 | 128.05 MB | 99.60% |
13389.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.01 | R | 4 days 20:46:58 hrs | 2 | 122.22 MB | 99.59% |
13390.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_3_2.0_10_0.02 | R | 4 days 20:46:58 hrs | 2 | 118.42 MB | 99.61% |
13391.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.0004 | R | 4 days 20:46:58 hrs | 2 | 127.14 MB | 99.60% |
13392.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.0008 | R | 4 days 20:46:58 hrs | 2 | 130.74 MB | 99.60% |
13393.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.00125 | R | 4 days 20:46:58 hrs | 2 | 129.03 MB | 99.59% |
13394.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.00175 | R | 4 days 20:46:20 hrs | 2 | 123.65 MB | 99.67% |
13395.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.0025 | R | 4 days 20:46:20 hrs | 2 | 131.94 MB | 99.66% |
13396.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.003 | R | 4 days 20:46:20 hrs | 2 | 118.98 MB | 99.66% |
13397.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.008 | R | 4 days 20:46:20 hrs | 2 | 116.67 MB | 99.66% |
13398.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.01 | R | 4 days 20:46:20 hrs | 2 | 130.12 MB | 99.66% |
13399.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_4_2.0_10_0.02 | R | 4 days 20:46:20 hrs | 2 | 123.73 MB | 99.67% |
13400.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.0004 | R | 4 days 20:46:20 hrs | 2 | 121.75 MB | 99.66% |
13401.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.0008 | R | 4 days 20:46:20 hrs | 2 | 131.83 MB | 99.66% |
13402.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.00125 | R | 4 days 20:46:20 hrs | 2 | 120.20 MB | 99.65% |
13403.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.00175 | R | 4 days 20:46:20 hrs | 2 | 121.75 MB | 99.66% |
13404.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.0025 | R | 4 days 20:46:20 hrs | 2 | 132.89 MB | 99.66% |
13405.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.003 | R | 4 days 20:46:20 hrs | 2 | 119.11 MB | 99.67% |
13406.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.008 | R | 4 days 20:46:20 hrs | 2 | 131.01 MB | 99.65% |
13407.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.01 | R | 4 days 20:46:20 hrs | 2 | 121.73 MB | 99.66% |
13408.c14m1.clusternet | tanmoymondal@c14m2.clusternet | workq | sf_5_2.0_10_0.02 | R | 4 days 20:46:49 hrs | 2 | 135.73 MB | 99.74% |
13443.c14m1.clusternet | swapnild@c14m2.clusternet | workq | p0.001 | R | 06:41:33 hrs | 112 | 479.14 GB | 99.59% |
13445.c14m1.clusternet | swapnild@c14m2.clusternet | workq | p0.0001 | Q | | 112 | | 0.00% |
13465.c14m1.clusternet | vanshreep@c14m2.clusternet | workq | ohvac-nupd | R | 10:46:14 hrs | 112 | 30.89 GB | 99.32% |
13470.c14m1.clusternet | mab5@c14m2.clusternet | workq | _p4_SR_2_ | Q | | 112 | | 0.00% |
13472.c14m1.clusternet | jagjitkaur@c14m2.clusternet | workq | test | Q | | 112 | | 0.00% |
13477.c14m1.clusternet | pradhi@c14m2.clusternet | workq | 3d_al2o3_rlx | Q | | 112 | | 0.00% |
13478.c14m1.clusternet | pradhi@c14m2.clusternet | workq | 3d_alf3_rlx | Q | | 112 | | 0.00% |
13482.c14m1.clusternet | jagjitkaur@c14m2.clusternet | workq | test | Q | | 112 | | 0.00% |
13483.c14m1.clusternet | shilendra@c14m2.clusternet | workq | CGGO_hbBi | Q | | 112 | | 0.00% |
13492.c14m1.clusternet | mab5@c14m2.clusternet | workq | _p1_OR_1_ | R | 00:14:42 hrs | 56 | 37.95 GB | 91.31% |
13494.c14m1.clusternet | ponnappa@c14m2.clusternet | workq | Se | Q | | 112 | | 0.00% |
13496.c14m1.clusternet | ponnappa@c14m2.clusternet | workq | SSe | Q | | 112 | | 0.00% |
13497.c14m1.clusternet | tanoykanti@c14m2.clusternet | workq | l50_L1000_k0_T0.31 | R | 00:01:50 hrs | 40 | 868.05 MB | 80.73% |