Executive Summary
In an effort to promote continual improvement at the Oak Ridge Leadership Computing Facility (OLCF), users were sent a survey soliciting feedback on their experience with the facility and its support services. At the end of the two-month survey period, 252 of the 813 possible respondents had completed the survey, giving an overall response rate of 31%. Findings of the survey are outlined as follows:
User Demographics
- Of the OLCF users surveyed, 245 (97%) reported using one or more of the following systems: XT5 Jaguar (94%), XT4 (55%), HPSS (37%), Lens (25%), and Development – Smoky (5%).
- Survey respondents’ projects were supported by INCITE (69%), Director’s Discretion (25%), ALCC (14%), and Other sources such as Frost (6%).
Overall Evaluation
- Overall ratings for the Oak Ridge Leadership Computing Facility (OLCF) were positive, as 89% reported being “Satisfied” or “Very Satisfied” with OLCF overall. Only 1% reported being “Dissatisfied” and 3% reported being “Very Dissatisfied”. On the scale of 1 = Very Dissatisfied to 5 = Very Satisfied, the mean rating was 4.16, a slight decrease from 4.31 in 2010.
- With regard to overall satisfaction with OLCF, the percentage of satisfied (“Satisfied” and “Very Satisfied”) respondents has been relatively steady from 2007 (86%) to 2011 (89%).
- In response to an open-ended question about the best qualities of OLCF, thematic analysis of user responses identified user support and assistance (found in 32% of responses), computational capacity (found in 30% of responses), and powerful/fast machines (found in 22% of responses) as the top three themes.
- In addition to the best qualities of OLCF, respondents were asked what they felt OLCF could do to improve their computing experience. The most prevalent theme identified related to queuing policies and the speed of the queue (23%). The second and third most prevalent themes were better performance (21%) and more control over scratch-space purging (13%).
User Assistance Evaluation
- For support services used, 99% of the 221 respondents reported using the User Assistance Center (UAC), followed by 27% using the Scientific Computing/Liaison service, 10% using visualization, and 4% using End-to-End.
- When asked to rate their overall satisfaction with the user support services provided by the OLCF, the average response was 4.08 (SD = 1.07) on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied. Mean ratings to questions of overall satisfaction with user assistance ranged from 3.99 to 4.29.
- Respondents with at least one interaction with the UAC and its staff were asked about the speed of initial contact and quality of the response; a large majority of the users (86% and 80%, respectively) were “Satisfied” or “Very Satisfied.”
Training and Education
- Mean ratings to questions of overall satisfaction with various aspects of OLCF training ranged from 3.89 to 4.46 on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied.
- Overall satisfaction with the training was between “Satisfied” and “Very Satisfied” (M = 4.19, SD = 0.73).
- The majority of OLCF users said “Yes” (49%) or “Maybe” (44%) to the prospect of attending future OLCF training, based on their previous experience.
OLCF Communications
- Eighty-three percent of the 236 respondents rated their overall satisfaction with communications from the OLCF as satisfied or very satisfied, while only 4% indicated they were dissatisfied.
- Users were asked to rate communications methods on a scale from 1 = Not useful to 3 = Very useful. Respondents indicated the email message of the week was most useful (Mean = 2.41). On average, users found all four types of communication methods useful.
OLCF Web Sites
- Overall, respondents indicated they were moderately satisfied with the main OLCF Web site (M = 4.02, SD = 0.77) and the OLCF Users’ Web site (M = 3.97, SD = 0.75).
- Ninety-seven percent of respondents indicated that they had visited the https://olcf.ornl.gov web site. Of these users (237), 37% indicated that they visit the site once a week or more, including 3% who visit it every day. Only seven respondents indicated they had never visited the site.
- Seventy-eight percent of respondents indicated that they had visited the https://users.nccs.gov web site. Of these users (229), 12% indicated that they visit the site once a week or more, including 1% who visit it every day. Fifty respondents indicated they had never visited the site.
OLCF Systems
- The majority of XT5 Jaguar PF users (85%) rated their satisfaction with XT5’s overall system performance as “Satisfied” (63%) or “Very Satisfied” (22%) on the scale of 1 = Very Dissatisfied to 5 = Very Satisfied, with a mean rating of 4.02.
- Similarly, 61% of XT4 Jaguar users rated their satisfaction with overall system performance for the XT4 Jaguar as “Very Satisfied” (19%) or “Satisfied” (42%), with a mean rating of 3.77.
- When OLCF users were asked if they found the development platform, Smoky, useful to their work, 9 (23%) of the 39 users who indicated they have used Smoky said “Yes.”
- Regarding maintenance and outages, 88% indicated sufficient notice is given prior to scheduled maintenance. On average across the machines, the majority also indicated that the frequency of unanticipated outages (56%) and scheduled outages (59%) was acceptable.
Data Analysis, Visualization, and Workflow
- Users were asked several questions related to current and future productivity needs at OLCF. Sixteen percent of respondents indicated they need help applying workflow tools in their analyses/large scale simulations, 26% indicated they need help optimizing their I/O in their codes, and 33% indicated they would like to use OLCF’s end-to-end dashboard for displaying the results for their simulations in real time.
Introduction
A general survey of all users of the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory (ORNL) was launched online on October 21, 2011, and remained open for participation through December 16, 2011. Information was collected about the various users, the user experience with OLCF, and the OLCF support capabilities. Attitudes and opinions on the performance, availability, and possible improvements for OLCF and its staff were also solicited.
The survey was created with contributions from OLCF staff and the Oak Ridge Institute for Science and Education (ORISE). The survey was hosted online by ORISE.
ORISE sent e-mails to a distribution list provided by OLCF staff. Over the next eight weeks, the OLCF Project Director sent two e-mail reminders, the head of the NCCS User Group sent an e-mail reminder, and the User Council sent the final e-mail reminder to the general list. Each reminder appealed to users in a different way, explaining why the survey was being conducted, the importance of the feedback provided, and how responses would be used to support OLCF. In all, 252 users completed the survey out of 813 possible respondents (excluding the 27 invalid email addresses), giving an overall response rate of 31%.
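The response rate follows directly from the quoted counts; the arithmetic is sketched below (a minimal illustration, not part of the survey instrument):

```python
# Response-rate arithmetic from the counts quoted above.
completed = 252  # surveys completed
possible = 813   # possible respondents (27 invalid addresses already excluded)

rate = completed / possible
print(f"Response rate: {rate:.0%}")  # Response rate: 31%
```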
The resulting data are discussed in the next section.
User Demographics
While the response rate is 31%, the respondents form a representative sample, as shown below: each segment of users is represented (Tables 1-5). The majority of users reported using the XT5 Jaguar PF (94%, Table 1) and the User Assistance Center (99%, Table 2). OLCF has a relatively balanced distribution of users in terms of their length of time using the systems (Table 3).
Table 1. Systems Used (n = 245)
Systems | N | %
---|---|---
XT5 Jaguar PF | 231 | 94%
XT4 Jaguar | 134 | 55%
HPSS | 91 | 37%
Lens | 62 | 25%
Development (Smoky) | 12 | 5%
Table 2. Support Services Used (n = 221)
Services | N | %
---|---|---
User Assistance Center | 219 | 99%
Scientific Computing/Liaison | 59 | 27%
Visualization | 23 | 10%
End-to-End | 8 | 4%
Table 3. Length of Time as an OLCF User (n = 247)
Years as an OLCF user | N | %
---|---|---
Greater than 2 years | 96 | 39%
1 – 2 years | 74 | 30%
Less than 1 year | 77 | 31%
OLCF user project classifications include:
1) the Department of Energy’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, which aims to accelerate scientific discoveries and technological innovations by awarding, on a competitive basis, time on supercomputers to researchers with large-scale, computationally intensive projects that address “grand challenges” in science and engineering;
2) the National Center for Computational Sciences’ Director’s Discretion program, which is designed to give new researchers an opportunity to carry out a program of scalability and productivity enhancements to their scientific codes;
3) the Advanced Scientific Computing Research (ASCR) Leadership Computing Challenge (ALCC) program, which is open to scientists from the research community in national laboratories, academia, and industry, and allocates up to 30% of the computational resources at the National Energy Research Scientific Computing Center (NERSC) and the Leadership Computing Facilities at Argonne and Oak Ridge for special situations of interest to the Department’s energy mission, with an emphasis on high-risk, high-payoff simulations.
The most common project classification supporting respondents’ research is INCITE (69%), with 60% of respondents reporting INCITE as their only project type (Tables 4-5).
Table 4. User Classification, by Project Type (n = 226)
Project(s) classification | N | %
---|---|---
INCITE | 155 | 69%
Director’s Discretion | 56 | 25%
ALCC | 32 | 14%
Other | 14 | 6%
*Note. Users add up to more than 100% because some have more than one project type.
Table 5. User Classification, by Combination of Project Types (n = 226)
Project(s) classification | N | %
---|---|---
INCITE only | 136 | 60%
Director’s Discretion only | 37 | 16%
ALCC only | 14 | 6%
Other only | 12 | 5%
INCITE and Director’s Discretion | 9 | 4%
INCITE and ALCC | 6 | 3%
ALCC and Director’s Discretion | 7 | 3%
INCITE, ALCC, and Director’s Discretion | 3 | 1%
INCITE and Other | 1 | < 1%
INCITE, ALCC, Other | 1 | < 1%
Overall User Satisfaction with OLCF
Users were asked to rate their overall satisfaction with the OLCF. Table 6 contains descriptive statistics by project classification. Mean responses were between 4.00 and 4.29 showing a high degree of satisfaction with OLCF across project classifications (Table 6). Of the optional questions, this question had one of the highest numbers of responses, with 92% of respondents providing their opinions. Of these, a total of 89% (207 respondents) reported being “Satisfied” or “Very Satisfied” with OLCF overall, only three (1%) reported being “Dissatisfied,” and only seven (3%) reported being “Very Dissatisfied”.
Analysis of “Satisfied” Users
The group of respondents indicating they were “Satisfied” or “Very Satisfied” with OLCF overall comprised users of all OLCF systems and all support services (with XT5 Jaguar PF and the User Assistance Center being most popular, respectively). Categories of length of time as an OLCF user were all represented as well: less than one year (71 respondents), 1-2 years (69 respondents), and greater than two years (84 respondents). Of the satisfied users who have been OLCF users for more than one year (n = 157), 63 (49%) indicated they noticed an improvement in overall system performance over the previous year. Satisfaction is also evident through other questions about various aspects of OLCF. The majority of respondents indicating overall satisfaction with OLCF also indicated they were satisfied with the service of the User Assistance Center, the OLCF Web site, the data transfer abilities and bandwidth offered by OLCF, visualization services, and the performance of the XT5 and XT4 platforms.
Analysis of “Dissatisfied” Users
Only three people (1%) reported being “Dissatisfied” and seven (3%) reported being “Very Dissatisfied” with OLCF overall. Four of the dissatisfied users indicated they had used OLCF for greater than two years, while three indicated using the center for 1-2 years, and another two indicated they had used the center for less than one year (one did not respond to this question, but did indicate improvement since last year in the next question). Four of the eight returning users (50%) indicated that they had not seen an improvement in overall system performance over the previous year, while the other four indicated they felt the systems had gone down in performance.
Table 6. Overall OLCF Evaluation – Descriptive Statistics by Project Classification
Satisfaction with OLCF | N | Mean | Standard Deviation
---|---|---|---
Director’s Discretion | 49 | 4.29 | 0.74
Other | 14 | 4.29 | 1.07
INCITE | 146 | 4.14 | 0.86
ALCC | 31 | 4.00 | 1.15
All Users | 231 | 4.16 | 0.84
User Opinions of OLCF Services
In response to an open-ended question about the best qualities of OLCF, thematic analysis of user responses identified user support and assistance (found in 32% of responses), computational capacity (found in 30% of responses), and performance (found in 22% of responses) as the top three themes (Table 7). Some respondent comments about these qualities included:
User support and assistance
- “OLCF is run very professionally. All of the people I have interacted with have been very helpful and knowledgeable.”
- “Support and accounts people have been incredibly helpful to me as a new user.”
- “User assistance is great, friendly and knowledgeable.”
Capacity
- “The sheer scale of Jaguar allows for s/w scaling studies that are simply not possible elsewhere.”
- “It offers the largest net amount of processor-hours available at a single site.”
- “I’m most impressed by the OLCF’s ability to deliver timely access to very large processor counts on JaguarPF.”
Performance
- “The overall performance and reliability are the best that I have ever encountered with large-scale computing. The resulting calculations that we have been able to complete on Jaguar would not have been possible anywhere else.”
- “Internode communications is a lot faster than any other system I use. The people I’ve been in contact with are very competent. The nodes operate at consistent speeds (no nodes are lagging behind in a distributed run).”
- “The systems there have been very productive and interruptions in service handled mostly quickly and gracefully, the staff is informed and eager to help, outstanding leadership.”
Table 7. Best Qualities of OLCF (n = 74)
Theme | N | %
---|---|---
User support and assistance | 24 | 32%
Capacity | 22 | 30%
Performance | 16 | 22%
Systems/Facilities | 14 | 19%
Stability | 13 | 18%
Resources | 9 | 12%
Ease of use | 4 | 5%
The queuing system favoring large-scale jobs | 4 | 5%
N/A | 1 | 1%
In addition to the best qualities of OLCF, respondents were asked to select the areas in which they felt OLCF could use the most improvement (Table 8). Systems topped the list, selected by 94% of respondents. Thirty-five percent of respondents selected the response option “Other” when asked what area(s) need the most improvement to enhance their experience using the OLCF. Among these, the most common response theme was general satisfaction with the OLCF (21%), followed by system performance, queuing policy, and disk space, each mentioned by 13% of respondents.
Table 8. Areas in Need of Improvement (n = 68)
What area(s) needs the most improvement to enhance your experience using the OLCF? | N | %
---|---|---
Systems | 64 | 94% |
Web site | 30 | 44% |
Data analysis, visualization, and workflow | 28 | 41% |
Other | 24 | 35% |
User assistance | 21 | 31% |
Training and education | 21 | 31% |
Communications | 13 | 19% |
Note. Users add up to more than 100% because some provided more than one area for improvement.
User Assistance Center
Seventy-seven percent of the respondents had at least one interaction with the User Assistance Center (UAC) and its staff. The project classification with the highest percentage of users (92%) who had at least one interaction with the UAC was Director’s Discretion projects (Table 9).
Table 9. Number of User Assistance Center (UAC) Queries by Project Classification
Approximately how many total queries have you forwarded (via phone or e-mail) to the UAC this year? | INCITE (n = 150) | Director’s Discretion (n = 55) | ALCC (n = 32) | Other (n = 16) | All Users (n = 243)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
0 | 34 (23%) | 4 (7%) | 4 (13%) | 5 (31%) | 55 (23%) |
1 – 5 | 102 (68%) | 35 (64%) | 21 (66%) | 10 (63%) | 154 (63%) |
6 – 10 | 5 (3%) | 10 (18%) | 4 (13%) | 0 (0%) | 17 (7%) |
11 – 20 | 7 (5%) | 3 (5%) | 2 (6%) | 0 (0%) | 10 (4%) |
Greater than 20 | 2 (1%) | 3 (5%) | 1 (3%) | 1 (6%) | 7 (3%) |
When asked to rate their overall satisfaction with the user support services provided by the OLCF, the average response was 4.08 (SD = 1.07) on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied. Mean ratings to questions of overall satisfaction with various aspects of user assistance ranged from 3.99 to 4.29. Among project classifications, users with “Other” projects (M = 4.62, SD = 0.65) and Director’s Discretion projects (M = 4.36, SD = 0.76) were the most satisfied (Table 10). Overall, users reported a high level of satisfaction with OLCF service in providing support and responding to needs. When asked about the speed of initial response to queries, a large majority of the users (86%) were “Satisfied” or “Very satisfied” (Table 11).
Table 10. User Assistance Center (UAC) Evaluation by Project Classification
Overall, rate your satisfaction with the following aspects of User Assistance: | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Speed of initial response to queries | 4.33 (0.90) | 4.44 (0.73) | 4.37 (0.61) | 4.36 (0.81) | 4.29 (0.62)
Effectiveness of response to account management query | 4.21 (0.91) | 4.40 (0.70) | 4.20 (0.71) | 4.46 (0.78) | 4.22 (0.55)
Speed of final resolution to queries | 4.21 (0.93) | 4.38 (0.75) | 4.30 (0.70) | 4.36 (0.81) | 4.21 (0.55)
Speed of response to account management query | 4.26 (0.86) | 4.33 (0.82) | 4.07 (0.87) | 4.38 (0.77) | 4.20 (0.53)
Effectiveness of problem resolution | 4.17 (0.98) | 4.42 (0.72) | 4.10 (0.88) | 4.45 (0.69) | 4.18 (0.94)
Overall satisfaction | 4.09 (1.09) | 4.36 (0.76) | 4.00 (1.19) | 4.62 (0.65) | 4.08 (1.07)
Response to special requests (e.g. scheduling exceptions, software installation, etc.) | 3.99 (0.99) | 4.22 (0.77) | 3.97 (0.85) | 4.18 (1.08) | 3.99 (0.93)
Table 11. User Assistance Center (UAC) Evaluation –All Users
Overall, rate your satisfaction with the following aspects of User Assistance: | N | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---|---
Speed of initial response to queries | 208 | 4 (2%) | 3 (1%) | 24 (12%) | 74 (36%) | 103 (50%) | 4.29 (0.62)
Effectiveness of response to account management query | 206 | 4 (2%) | 0 (0%) | 37 (18%) | 71 (34%) | 94 (46%) | 4.22 (0.55)
Speed of final resolution to queries | 207 | 3 (1%) | 4 (2%) | 34 (16%) | 72 (35%) | 94 (45%) | 4.21 (0.55)
Speed of response to account management query | 210 | 3 (1%) | 2 (1%) | 40 (19%) | 69 (33%) | 96 (46%) | 4.20 (0.53)
Effectiveness of problem resolution | 206 | 3 (1%) | 9 (4%) | 30 (15%) | 70 (34%) | 94 (46%) | 4.18 (0.94)
Overall satisfaction | 225 | 14 (6%) | 5 (2%) | 22 (10%) | 93 (41%) | 91 (40%) | 4.08 (1.07)
Response to special requests (e.g. scheduling exceptions, software installation, etc.) | 190 | 3 (2%) | 3 (2%) | 56 (29%) | 59 (31%) | 69 (36%) | 3.99 (0.93)
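The tabulated means can be checked directly against the response counts; below is a minimal sketch (Python, not part of the original report), using the counts from the “Speed of initial response to queries” row:

```python
# Recompute a Likert-scale mean from the response counts in Table 11
# ("Speed of initial response to queries" row).
counts = {1: 4, 2: 3, 3: 24, 4: 74, 5: 103}  # rating -> number of respondents

n = sum(counts.values())                          # 208 respondents
mean = sum(r * c for r, c in counts.items()) / n  # weighted mean of the ratings

print(f"N = {n}, M = {mean:.2f}")  # N = 208, M = 4.29
```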
When asked to provide comments on ways in which OLCF can improve user assistance, 31% of respondents were satisfied with the assistance center’s services, while 21% offered complaints, and 10% suggested having technical staff available. The following quotes are representative of these themes:
Satisfied
- “Good job here. Consultants have been very helpful toward maximizing job throughput. Regarding account management, I think requiring notarization of the account form is a little paranoid. The secure id tags are cumbersome.”
- “They very promptly installed Hypre 2.7b with the configuration that I asked for. I was very impressed.”
- “Everything has been great. I’ve enjoyed working on Jaguar other than the fact that shared libraries are not supported.”
Complaints
- “User Assistance has not been able to help resolve problems with the LibSci library and did not refer me to someone who could — e.g., at Cray.”
- “Please notify us before changes are made. We should have been informed before SSH settings and queuing priority were changed.”
- “User assistance should be aware of all recent software and hardware changes, and include the possibility that a system change could be the reason why a code suddenly fails.”
Training and Education
Mean ratings to questions of overall satisfaction with various aspects of OLCF training ranged from 3.89 to 4.46 on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied. Users’ rating of their “Overall satisfaction with the training event” was between “Satisfied” and “Very Satisfied” (M = 4.19, SD = 0.73). The aspect of training users were most satisfied with was the ease of registration (M = 4.46, SD = 0.62), whereas the aspect they were least satisfied with was the functionality of the webinar software (if attended online) (M = 3.89, SD = 0.84). Among project classifications, users with Other projects (M = 4.40, SD = 0.89) and Director’s Discretion projects (M = 4.27, SD = 0.80) were the most satisfied on average (Table 12). Refer to Table 13 for a breakdown of users’ satisfaction with training by response option for all users.
Table 12. User Satisfaction with Training by Project Classification
Overall, rate your satisfaction with the following aspects of Training: | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Ease of registration | 4.52 (0.64) | 4.67 (0.49) | 4.29 (0.61) | 4.80 (0.45) | 4.46 (0.62)
Venue (if attended in person) | 4.36 (0.66) | 4.42 (0.67) | 4.15 (0.69) | 4.25 (0.96) | 4.29 (0.68)
Information provided before the event | 4.15 (0.67) | 4.27 (0.59) | 4.07 (0.62) | 4.40 (0.55) | 4.19 (0.65)
Overall satisfaction with the training event | 4.26 (0.71) | 4.27 (0.80) | 3.86 (0.77) | 4.40 (0.89) | 4.19 (0.73)
Workshop content | 4.15 (0.72) | 4.13 (0.92) | 3.86 (0.77) | 4.20 (0.84) | 4.02 (0.81)
Functionality of webinar software (if attended online) | 3.86 (0.83) | 4.09 (1.04) | 3.64 (0.81) | 3.50 (0.71) | 3.89 (0.84)
Table 13. User Satisfaction with Training – All Users
Overall, rate your satisfaction with the following aspects of Training: | N | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---|---
Ease of registration | 48 | 0 (0%) | 0 (0%) | 3 (6%) | 20 (42%) | 25 (52%) | 4.46 (0.62)
Venue (if attended in person) | 41 | 0 (0%) | 0 (0%) | 5 (12%) | 19 (46%) | 17 (41%) | 4.29 (0.68)
Information provided before the event | 47 | 0 (0%) | 0 (0%) | 6 (13%) | 26 (55%) | 15 (32%) | 4.19 (0.65)
Overall satisfaction with the training event | 48 | 0 (0%) | 0 (0%) | 9 (19%) | 21 (44%) | 18 (38%) | 4.19 (0.73)
Workshop content | 48 | 0 (0%) | 3 (6%) | 6 (13%) | 26 (54%) | 13 (27%) | 4.02 (0.81)
Functionality of webinar software (if attended online) | 37 | 0 (0%) | 1 (3%) | 12 (32%) | 14 (38%) | 10 (27%) | 3.89 (0.84)
The majority of OLCF users said “Yes” (49%) or “Maybe” (44%) to the prospect of attending future OLCF training, based on their previous experience (Table 14). Only 20% (48) of users reported that they participated in live OLCF training events. The number one reason users gave for not participating in any live training events was that they do not have the time to attend (56%), which was consistent across project classifications. The second and third most frequently cited reasons for not participating were “Do not require training” (37%) and “Do not have the budget to attend” (28%) (Table 15).
Table 14. Plans to Attend Future Training Events by Project Classification
Based on your previous experience, would you attend a future OLCF training event? | INCITE (n = 49) | Director’s Discretion (n = 25) | ALCC (n = 17) | Other (n = 7) | All Users (n = 82)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Yes | 23 (47%) | 13 (52%) | 9 (53%) | 4 (57%) | 40 (49%)
Maybe | 22 (45%) | 10 (40%) | 6 (35%) | 3 (43%) | 36 (44%)
No | 4 (8%) | 2 (8%) | 2 (12%) | 0 (0%) | 6 (7%)
Table 15. Reasons for Not Participating in Live Training Events by Project Classification
Reason for not participating | INCITE (n = 117) | Director’s Discretion (n = 32) | ALCC (n = 16) | Other (n = 9) | All Users (n = 178)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Do not have the time to attend | 70 (60%) | 17 (53%) | 8 (50%) | 5 (56%) | 99 (56%) |
Do not require training | 43 (37%) | 13 (41%) | 6 (38%) | 3 (33%) | 65 (37%) |
Do not have the budget to attend | 27 (23%) | 14 (44%) | 3 (19%) | 3 (33%) | 49 (28%) |
Prefer to learn on my own | 29 (25%) | 7 (22%) | 3 (19%) | 2 (22%) | 45 (25%) |
Training topics were not of interest | 18 (15%) | 5 (16%) | 1 (6%) | 0 (0%) | 24 (13%) |
The training was too basic | 7 (6%) | 4 (13%) | 1 (6%) | 0 (0%) | 13 (7%) |
The training was too advanced | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) |
Users were asked to elaborate on other reasons why they did not participate in training. The most common responses (31% each) were living too far away and being a new user or not knowing about the training, while 15% noted there was too much overlap with the previous year’s training (Table 16). The following quotes exemplify these themes:
Live too far away
- “I live and work in Europe and time zone differences are large.”
- “I live in Switzerland”
New user/didn’t know
- “Just recently became an OLCF user”
- “Have not been a user long enough”
Too much overlap with last year
- “Attended training previous year, intend to attend training coming year for titan”
- “Too much overlap with last year’s training.”
Table 16. Other Reasons for not Participating in Training – All Users (n = 13)
Theme | N | %
---|---|---
Live too far away | 4 | 31%
New user/didn’t know | 4 | 31%
Too much overlap with last year | 2 | 15%
Already know the content | 1 | 8%
A co-worker attended | 1 | 8%
Don’t think it would help | 1 | 8%
When presented with a list of training topics, respondents’ most frequently requested topic was GPGPU Programming (61%), followed by Tuning and Optimization (59%), and Hybrid Programming (MPI and OpenMP) (51%). The frequencies of requested topics were consistent across programs with GPGPU Programming (53-85%), followed by Tuning and Optimization (46-65%), and Hybrid Programming (MPI and OpenMP) (31-65%). However, respondents from the ALCC program indicated a slightly higher preference for Advanced MPI for their 3rd place selection (57% versus 50%). Also, respondents from the “Other” programs indicated Visualization and Data Analysis Tools was equally important as Tuning and Optimization (both 46%). Other less frequently requested topics included help with Advanced MPI, parallel debugging, visualization and data analysis tools, managing I/O, and MPI basics (Table 17).
Table 17. Training Desired by Project Classification
Training Topics | INCITE (n = 121) | Director’s Discretion (n = 48) | ALCC (n = 30) | Other (n = 13) | All Users (n = 201)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
GPGPU Programming | 77 (64%) | 34 (71%) | 16 (53%) | 11 (85%) | 122 (61%) |
Tuning and Optimization | 73 (60%) | 31 (65%) | 18 (60%) | 6 (46%) | 118 (59%) |
Hybrid Programming (MPI and OpenMP) | 65 (54%) | 31 (65%) | 15 (50%) | 4 (31%) | 103 (51%) |
Advanced MPI | 56 (46%) | 26 (54%) | 17 (57%) | 5 (38%) | 98 (49%) |
Visualization and Data Analysis Tools | 51 (42%) | 24 (50%) | 12 (40%) | 6 (46%) | 84 (42%) |
Debugging | 47 (39%) | 22 (46%) | 14 (47%) | 4 (31%) | 77 (38%) |
Managing I/O | 40 (33%) | 16 (33%) | 9 (30%) | 3 (23%) | 64 (32%) |
MPI Basics | 25 (21%) | 15 (31%) | 9 (30%) | 3 (23%) | 51 (25%) |
The majority of respondents selected documentation (69%) as their preferred method of training, followed by online training (55%) and live in-person training (32%). The three “other” training methods respondents suggested were: 1) “Self-guided lab exercises on the target system,” 2) “Searchable interface,” and 3) “Documentation– simple examples.” The order of preferred training methods is consistent across programs except for “Other” projects, whose respondents indicated a preference for online training (88%). Refer to Table 18 for users’ training preferences by project classification.
Table 18. Users’ Training Preferences by Project Classification
Training Method | INCITE (n = 146) | Director’s Discretion (n = 54) | ALCC (n = 32) | Other (n = 16) | All Users (n = 238)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Documentation | 104 (71%) | 38 (70%) | 22 (69%) | 13 (81%) | 165 (69%) |
Online training | 83 (57%) | 27 (50%) | 21 (66%) | 14 (88%) | 132 (55%) |
Live – in-person | 44 (30%) | 21 (39%) | 12 (38%) | 3 (19%) | 75 (32%) |
Live – via web | 39 (27%) | 14 (26%) | 5 (16%) | 6 (38%) | 60 (25%) |
Other, please specify | 2 (1%) | 0 (0%) | 0 (0%) | 1 (6%) | 3 (1%) |
Users were asked to provide comments on ways in which OLCF can improve its training and education curriculum. The most common suggestion (35% of respondents) was to offer additional topics, while 29% desired hands-on sessions and 24% asked for webinars and online documentation (Table 19). The following quotes exemplify these themes:
Additional topics offered
- “The things I want to learn are below MPI (Portals, uGNI, DMAPP). I do not blame you for not offering courses on these topics, but it would be nice if there was documentation to supplement what Cray publishes, which is not great. In short, OLCF is not a great place for users who want to develop state-of-the-art software that employs architecture-specific optimizations.”
- “During the hands on training session only basic logon was proposed. It would have been better if some simple exercises were provided for MPI programming or I/O management was provided. This would have also been the case for visualization where computer configures to run exercises could be provided and we could have a chance to work with the visualization tools.”
- “I would like to have workshop / training on the more advanced optimization stuffs, especially related to hardware characteristics (e.g. the new Gemini interconnect, some knowledge / information on the new Bulldozer CPU, GPU).”
Personal or in-class hands-on training
- “The Spring Training Session was extremely good, but a bit more hands-on with OpenMP, MPI, and DDT would have been good. It was a lot of information, so balancing it with a bit more hands on would be good.”
- “Hands on tuning and optimization training for users trying to catch up with the constant hardware improvements is probably the most critical training.”
Webinars and online documentation
- “Have shorter and more focused online presentations and tutorials instead of having one-hour presentations on a broader topic. These are better suited for on-site live training. However, they should still be available for later viewing on the NCCS/OLCF web sites.”
- “Getting slides put up immediately following training would be helpful.”
Table 19. Comments to Help the OLCF Improve Their Training and Education Curriculum (n = 17)
Theme | N | %
---|---|---
Additional topics offered | 6 | 35%
Personal or in-class hands-on training | 5 | 29%
Webinars and online documentation | 4 | 24%
Examples | 2 | 12%
Better publicity | 2 | 12%
Miscellaneous suggestions | 1 | 6%
OLCF Communications
Eighty-three percent of respondents (n = 236) rated their overall satisfaction (on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied) with communications from the OLCF as “satisfied” or “very satisfied”, while only 4% indicated they were dissatisfied (M = 4.06, SD = 0.90). Satisfaction with OLCF communications was highest among users with “Other” projects (M = 4.25, SD = 0.77). Refer to Table 20 for users’ satisfaction with OLCF communication by project classification.
Table 20. Users’ Overall Satisfaction with Communications from the OLCF by Project Classification
Satisfaction with OLCF communications | N | Mean | Standard Deviation
---|---|---|---
Other | 16 | 4.25 | 0.77
Director’s Discretion | 51 | 4.20 | 0.92
INCITE | 147 | 4.05 | 0.91
ALCC | 31 | 4.03 | 0.75
All Users | 236 | 4.06 | 0.90
Users were asked to rate communications methods on a scale from 1 = Not useful to 3 = Very useful. Respondents indicated the weekly email message was most useful (M = 2.41). On average, users found all four types of communication methods useful (Table 21); this is consistent across programs. See Table 22 for a more detailed breakdown of averages.
Table 21. Users’ Communication Methods by Project Classification
Please rate the following communications methods: | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Weekly Email Message | 2.36 (0.55) | 2.59 (0.54) | 2.24 (0.51) | 2.17 (0.58) | 2.41 (0.55)
General Email Announcements | 2.31 (0.52) | 2.28 (0.54) | 2.23 (0.50) | 2.27 (0.65) | 2.29 (0.53)
Opt-In Email Notification Lists | 2.29 (0.55) | 2.42 (0.67) | 2.05 (0.59) | 2.10 (0.57) | 2.27 (0.55)
Message of the Day (MOTD) | 2.19 (0.60) | 2.13 (0.67) | 2.11 (0.57) | 2.43 (0.53) | 2.14 (0.62)
Table 22. Communication Methods – All Users
Please rate the following communications methods: | N | Not aware of method | 1 = Not at all useful | 2 = Somewhat useful | 3 = Very useful | M (SD)
---|---|---|---|---|---|---
Weekly Email Message | 222 | 14 (6%) | 7 (3%) | 114 (51%) | 96 (43%) | 2.41 (0.55)
General Email Announcements | 227 | 11 (5%) | 8 (4%) | 135 (59%) | 71 (31%) | 2.29 (0.53)
Opt-In Email Notification Lists | 216 | 73 (34%) | 11 (5%) | 81 (38%) | 49 (23%) | 2.27 (0.55)
Message of the Day (MOTD) | 224 | 72 (32%) | 20 (9%) | 91 (41%) | 41 (18%) | 2.14 (0.62)
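The tabulated usefulness means appear to be computed over raters only, with “Not aware of method” responses excluded; this is an assumption, but it reproduces the Weekly Email Message mean, as the minimal sketch below shows:

```python
# Recompute the usefulness mean for the Weekly Email Message row of Table 22.
# Assumption: the 14 "Not aware of method" responses are excluded from the mean.
counts = {1: 7, 2: 114, 3: 96}  # usefulness rating -> number of respondents

n = sum(counts.values())                          # 217 raters
mean = sum(r * c for r, c in counts.items()) / n  # 2.41, matching the table

print(f"M = {mean:.2f} over {n} raters")
```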
Users were asked to rate their likeliness to download and use mobile phone apps on a rating scale of 1 = Not at all to 5 = Definitely. Averages ranged between 2.10 and 2.93, indicating a somewhat neutral response to this question. The highest rated was “See a snapshot of the queue,” with an average of 2.93 for all users. This app was consistently the highest rated across programs (2.08 to 3.23), with ALCC users rating it the highest (Table 23). Refer to Table 24 for a more detailed breakdown of responses. Eighty-eight percent of respondents (189) selected “yes” when asked whether they would download and use a mobile phone app that allowed them to do things not listed in Table 23.
Table 23. User Likeliness of Downloading Mobile Phone Apps by Project Classification
Please rate the likeliness that you would download and use a mobile phone app that allowed you to: | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
See a snapshot of the queue | 3.04 (1.56) | 2.88 (1.65) | 3.23 (1.52) | 2.08 (1.71) | 2.93 (1.57)
View real-time system status notifications | 2.88 (1.55) | 2.78 (1.58) | 3.10 (1.51) | 2.08 (1.71) | 2.80 (1.54)
View your usage | 2.80 (1.52) | 2.58 (1.53) | 3.13 (1.52) | 2.08 (1.71) | 2.72 (1.51)
Read important center communications | 2.56 (1.34) | 2.62 (1.48) | 2.66 (1.40) | 1.75 (1.36) | 2.52 (1.37)
View the status of open OLCF HelpDesk tickets | 2.35 (1.26) | 2.29 (1.34) | 2.40 (1.19) | 1.75 (1.36) | 2.31 (1.26)
Query the OLCF knowledgebase, https://www.olcf.ornl.gov/support/knowledgebase, for answers to common issues | 2.37 (1.27) | 2.26 (1.40) | 2.32 (1.25) | 1.75 (1.36) | 2.28 (1.27)
Read OLCF science, people, and technology highlights | 2.12 (1.15) | 2.08 (1.37) | 2.33 (1.30) | 1.75 (1.36) | 2.12 (1.21)
View the support tip of the day | 2.14 (1.17) | 2.04 (1.31) | 2.29 (1.27) | 1.67 (1.23) | 2.10 (1.19)
OLCF Web Site Evaluation
Ninety-seven percent of respondents indicated that they had visited the https://olcf.ornl.gov web site. Of these users (237), 37% indicated that they visit the site once a week or more, including 3% who visit it every day. Only seven respondents indicated they had never visited the site. Fourteen percent of INCITE users indicated they visited the OLCF web site at least twice a week. Director’s Discretion users had the highest visitation, with 54% visiting the web site at least once a week. See Table 25 for a more complete breakdown.
Table 24. User Likeliness of Downloading Mobile Phone Apps – All Users
Please rate the likeliness that you would download and use a mobile phone app that allowed you to: | N | 1 = Not at all | 2 = Probably not | 3 = Possibly | 4 = Probably | 5 = Definitely | M (SD)
---|---|---|---|---|---|---|---
See a snapshot of the queue | 228 | 72 (32%) | 19 (8%) | 40 (18%) | 45 (20%) | 51 (22%) | 2.93 (1.57)
View real-time system status notifications | 229 | 73 (32%) | 31 (14%) | 41 (18%) | 36 (16%) | 48 (21%) | 2.80 (1.54)
View your usage | 228 | 75 (33%) | 34 (15%) | 39 (17%) | 39 (17%) | 41 (18%) | 2.72 (1.51)
Read important center communications | 224 | 76 (34%) | 39 (17%) | 49 (22%) | 37 (17%) | 23 (10%) | 2.52 (1.37)
View the status of open OLCF HelpDesk tickets | 222 | 78 (35%) | 54 (24%) | 51 (23%) | 22 (10%) | 17 (8%) | 2.31 (1.26)
Query the OLCF knowledgebase, https://www.olcf.ornl.gov/support/knowledgebase, for answers to common issues | 223 | 83 (37%) | 54 (24%) | 40 (18%) | 32 (14%) | 14 (6%) | 2.28 (1.27)
Read OLCF science, people, and technology highlights | 223 | 92 (41%) | 57 (26%) | 42 (19%) | 19 (9%) | 13 (6%) | 2.12 (1.21)
View the support tip of the day | 220 | 90 (41%) | 59 (27%) | 41 (19%) | 18 (8%) | 12 (5%) | 2.10 (1.19)
Overall, respondents indicated they were moderately satisfied with the main OLCF Web site (M = 4.02, SD = 0.77) on a rating scale of 1 = Very Dissatisfied to 5 = Very Satisfied. OLCF system status information was the highest-rated aspect across programs (means of 4.08 to 4.19), indicating users were more than satisfied. See Table 26 for a detailed breakdown.
Table 25. Frequency of Visits to OLCF Web Site by Project Classification
How often do you visit the OLCF web site, https://olcf.ornl.gov? | INCITE (n = 146) | Director’s Discretion (n = 52) | ALCC (n = 31) | Other (n = 16) | All Users (n = 237)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Every day | 4 (3%) | 2 (4%) | 0 (0%) | 1 (6%) | 6 (3%) |
Twice a week | 16 (11%) | 10 (19%) | 6 (19%) | 1 (6%) | 29 (12%) |
Once a week | 32 (22%) | 16 (31%) | 10 (32%) | 2 (13%) | 52 (22%) |
Twice a month | 25 (17%) | 8 (15%) | 5 (16%) | 1 (6%) | 38 (16%) |
Once a month | 29 (20%) | 4 (8%) | 4 (13%) | 2 (13%) | 39 (16%) |
Less than once a month | 34 (23%) | 11 (21%) | 6 (19%) | 9 (56%) | 66 (28%) |
I have never visited an OLCF web site | 6 (4%) | 1 (2%) | 0 (0%) | 0 (0%) | 7 (3%) |
Respondents indicated being most satisfied with the timely information regarding system status, with 83% reporting they were either “Satisfied” or “Very Satisfied” with this aspect of the site (Table 26). The aspect which had the highest percentage of respondents indicating they were either “Dissatisfied” or “Very Dissatisfied” was ease of finding information (6%). For each of the other aspects of the web site addressed, approximately 2-3% of users reported being either “Dissatisfied” or “Very Dissatisfied.”
Table 26. Evaluation of OLCF Web Site – All Users
Aspects of the main OLCF Web site | N | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---|---
OLCF system status information | 216 | 2 (1%) | 3 (1%) | 32 (15%) | 99 (46%) | 80 (37%) | 4.17 (0.80)
Overall satisfaction with the OLCF web site | 224 | 2 (1%) | 4 (2%) | 53 (24%) | 103 (46%) | 54 (24%) | 4.02 (0.77)
Accuracy of information | 219 | 1 (0%) | 7 (3%) | 41 (19%) | 109 (50%) | 61 (28%) | 4.01 (0.80)
Timeliness of information | 216 | 2 (1%) | 4 (2%) | 53 (25%) | 103 (48%) | 54 (25%) | 3.94 (0.81)
Ease of finding information | 219 | 2 (1%) | 12 (5%) | 52 (24%) | 101 (46%) | 52 (24%) | 3.86 (0.87)
Seventy-eight percent of respondents indicated that they had visited the https://users.nccs.gov web site for project and/or allocation information. Of these users (229), 12% indicated that they visit the site once a week or more, including 1% who visit it every day. Fifty respondents indicated they had never visited the site (Table 27).
Table 27. Frequency of Visits to OLCF Users’ Web Site by Project Classification
How often do you visit the OLCF Users’ web site, https://users.nccs.gov, for either project and/or allocation information? | INCITE (n = 146) | Director’s Discretion (n = 52) | ALCC (n = 31) | Other (n = 16) | All Users (n = 229)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Every day | 0 (0%) | 1 (2%) | 0 (0%) | 1 (6%) | 2 (1%) |
Twice a week | 5 (4%) | 3 (6%) | 3 (9%) | 1 (6%) | 9 (4%) |
Once a week | 11 (8%) | 4 (8%) | 2 (6%) | 2 (13%) | 17 (7%) |
Twice a month | 12 (9%) | 7 (13%) | 4 (13%) | 1 (6%) | 17 (7%) |
Once a month | 28 (20%) | 5 (10%) | 8 (25%) | 2 (13%) | 38 (17%) |
Less than once a month | 58 (41%) | 18 (35%) | 8 (25%) | 9 (56%) | 96 (42%) |
I have never visited an OLCF web site | 27 (19%) | 14 (27%) | 7 (22%) | 0 (0%) | 50 (22%) |
Overall, respondents indicated they were moderately satisfied with the OLCF Users’ Web site (M = 3.97, SD = 0.75). The Other projects had the highest overall satisfaction with a mean response of 4.27. Respondent average responses to accuracy of project information, overall satisfaction, and ease of finding project information are described in detail by project in Table 28.
Table 28. Evaluation of OLCF Users’ Web Site by Project Classification
Aspects of the OLCF Users’ Web site | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Accuracy of project information | 4.06 (0.75) | 4.14 (0.77) | 3.88 (0.74) | 4.18 (0.75) | 4.01 (0.75)
Overall satisfaction | 3.97 (0.78) | 4.00 (0.72) | 3.92 (0.65) | 4.27 (0.79) | 3.97 (0.75)
Ease of finding project information | 3.93 (0.80) | 4.08 (0.68) | 3.71 (0.75) | 4.00 (1.00) | 3.90 (0.76)
Respondents indicated being most satisfied with the accuracy of project information, with 76% reporting they were either “Satisfied” or “Very Satisfied” with this aspect of the site (Table 29). The aspect which had the highest percentage of respondents indicating they were either “Dissatisfied” or “Very Dissatisfied” (3%) was ease of finding project information. For each of the other aspects of the web site addressed, approximately 1-3% of users reported being either “Dissatisfied” or “Very Dissatisfied.”
Table 29. Evaluation of OLCF Users’ Web Site – All Users
Aspects of the OLCF Users’ Web site | N | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---|---
Accuracy of project information | 169 | 1 (1%) | 0 (0%) | 40 (24%) | 83 (49%) | 45 (27%) | 4.01 (0.75)
Overall satisfaction | 177 | 1 (1%) | 3 (2%) | 37 (21%) | 95 (54%) | 41 (23%) | 3.97 (0.75)
Ease of finding project information | 173 | 0 (0%) | 5 (3%) | 45 (26%) | 86 (50%) | 37 (21%) | 3.90 (0.76)
Users were asked for suggestions for both web sites, including information and/or documentation they would like to have access to. The three main themes identified were: information is wrong or not available (24%), provide more documentation (18%), and make information easier to find (18%) (Table 30).
Table 30. Suggestions for both OLCF Web Sites (n = 17)
Theme | N | %
---|---|---
Information is wrong or not available | 4 | 24%
Documentation | 3 | 18%
Make information easier to find | 3 | 18%
Suggestions | 3 | 18%
Two separate web sites is confusing | 3 | 18%
Examples | 1 | 6%
Too slow | 1 | 6%
Satisfied | 1 | 6%
OLCF Systems Evaluation
Overall, respondents indicated they were satisfied with the OLCF systems (M = 4.00, SD = 0.82). Satisfaction with OLCF systems was highest among users with Director’s Discretion projects (M = 4.17, SD = 0.75) and lowest among users with ALCC projects (M = 3.90, SD = 0.79) (Table 31).
Table 31. Users’ Satisfaction with OLCF Systems by Project Classification
Aspects of the OLCF systems | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Sufficient notice given prior to scheduled maintenance | 4.21 (0.73) | 4.31 (0.65) | 4.00 (0.69) | 4.08 (0.79) | 4.21 (0.71)
Sufficient project disk space | 4.03 (0.84) | 4.22 (0.67) | 3.97 (0.81) | 4.00 (0.95) | 4.05 (0.80)
Bandwidth offered by OLCF | 3.85 (0.89) | 4.16 (0.75) | 3.90 (0.76) | 3.92 (0.64) | 3.93 (0.83)
Ease of transferring data to/from the OLCF | 3.78 (0.96) | 3.98 (0.93) | 3.73 (0.91) | 4.00 (0.58) | 3.82 (0.95)
Overall Mean (SD) | 3.97 (0.86) | 4.17 (0.75) | 3.90 (0.79) | 4.00 (0.74) | 4.00 (0.82)
Overall, respondents indicated they were “Satisfied” or “Very Satisfied” with the OLCF systems (82% of users on average, across the system aspects evaluated). Respondents indicated being most satisfied with the notice given prior to scheduled maintenance, with 88% reporting they were either “Satisfied” or “Very Satisfied” with this aspect of the systems (Table 32). The aspect which had the highest percentage of respondents indicating they were either “Dissatisfied” or “Very Dissatisfied” (12%) was the ease of transferring data to/from the OLCF. For each of the other aspects of the systems addressed, approximately 1-6% of users reported being either “Dissatisfied” or “Very Dissatisfied.”
Table 32. Users’ Satisfaction with OLCF Systems – All Users
Aspects of the OLCF Systems | N | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---|---
Sufficient notice given prior to scheduled maintenance | 229 | 1 (0%) | 3 (1%) | 23 (10%) | 121 (53%) | 81 (35%) | 4.21 (0.71)
Sufficient project disk space | 228 | 1 (0%) | 14 (6%) | 19 (8%) | 132 (58%) | 62 (27%) | 4.05 (0.80)
Bandwidth offered by OLCF | 226 | 3 (1%) | 12 (5%) | 33 (15%) | 128 (57%) | 50 (22%) | 3.93 (0.83)
Ease of transferring data to/from the OLCF | 225 | 4 (2%) | 22 (10%) | 35 (16%) | 114 (51%) | 50 (22%) | 3.82 (0.95)
Of the 180 respondents who answered the question “Compared to previous years, have you noticed a change in systems performance overall at the OLCF?”, 41% (74 respondents) said they noticed an overall improvement in systems performance. Director’s Discretion users were the most likely to report a change (56% selected yes), while users from Other projects were the least likely (75% said no). Details are provided in Table 33.
Table 33. Changes in Systems Performance Overall at the OLCF Compared to Previous Years
Compared to previous years, have you noticed a change in systems performance overall at the OLCF? | INCITE (n = 113) | Director’s Discretion (n = 39) | ALCC (n = 25) | Other (n = 12) | All Users (n = 180)
---|---|---|---|---|---
 | n (%) | n (%) | n (%) | n (%) | n (%)
Yes | 50 (44%) | 22 (56%) | 8 (32%) | 3 (25%) | 74 (41%) |
No | 63 (56%) | 17 (44%) | 17 (68%) | 9 (75%) | 106 (59%) |
When asked about various features of specific platforms, users reported moderate satisfaction with the various aspects of the XT4 and XT5 (Tables 34-37).
Table 34. Evaluation of XT4 Jaguar by Project Classification
Aspects of XT4 Jaguar | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Scratch disk size | 3.88 (0.81) | 3.84 (0.80) | 3.53 (0.77) | 4.17 (0.75) | 3.82 (0.78)
Usability of batch queue system | 3.84 (0.76) | 3.84 (0.75) | 3.53 (0.70) | 4.17 (0.75) | 3.78 (0.81)
Overall system performance | 3.84 (0.79) | 3.88 (0.78) | 3.58 (0.77) | 4.17 (0.75) | 3.77 (0.79)
Accessibility of batch queue system | 3.81 (0.72) | 3.84 (0.80) | 3.47 (0.70) | 4.17 (0.75) | 3.76 (0.79)
Job success rate | 3.74 (0.82) | 3.84 (0.80) | 3.47 (0.70) | 4.17 (0.75) | 3.74 (0.82)
Scratch disk performance | 3.74 (0.84) | 3.76 (0.78) | 3.47 (0.70) | 4.17 (0.75) | 3.73 (0.80)
Available 3rd party software, applications, and/or libraries | 3.76 (0.72) | 3.88 (0.83) | 3.47 (0.70) | 4.17 (0.75) | 3.72 (0.73)
Archival storage | 3.78 (0.87) | 3.88 (0.83) | 3.42 (0.69) | 4.33 (0.82) | 3.71 (0.83)
Interface with HPSS | 3.81 (0.89) | 3.80 (0.82) | 3.47 (0.70) | 4.33 (0.82) | 3.71 (0.85)
Debugging tools | 3.68 (0.76) | 3.83 (0.82) | 3.37 (0.68) | 4.00 (0.89) | 3.67 (0.75)
Job turnaround time | 3.64 (0.84) | 3.72 (0.84) | 3.47 (0.70) | 4.17 (0.75) | 3.64 (0.85)
Frequency of unscheduled (unanticipated) outages | 3.63 (0.80) | 3.88 (0.83) | 3.47 (0.70) | 4.00 (0.63) | 3.59 (0.80)
Frequency of scheduled outages | 3.64 (0.79) | 3.80 (0.76) | 3.37 (0.60) | 3.50 (1.05) | 3.58 (0.78)
Overall Mean (SD) | 3.75 (0.80) | 3.83 (0.80) | 3.47 (0.70) | 4.12 (0.79) | 3.71 (0.80)
Table 35. Evaluation of XT4 Jaguar – All Users
Aspects of XT4 Jaguar | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---
Scratch disk size | 1 (1%) | 0 (0%) | 40 (35%) | 49 (43%) | 23 (20%) | 3.82 (0.78)
Usability of batch queue system | 2 (2%) | 0 (0%) | 40 (36%) | 49 (44%) | 21 (19%) | 3.78 (0.81)
Overall system performance | 1 (1%) | 1 (1%) | 42 (37%) | 48 (42%) | 21 (19%) | 3.77 (0.79)
Accessibility of batch queue system | 1 (1%) | 1 (1%) | 42 (38%) | 48 (43%) | 20 (18%) | 3.76 (0.79)
Job success rate | 1 (1%) | 3 (3%) | 41 (37%) | 46 (41%) | 21 (19%) | 3.74 (0.82)
Scratch disk performance | 1 (1%) | 2 (2%) | 44 (39%) | 46 (41%) | 20 (18%) | 3.73 (0.80)
Available 3rd party software, applications, and/or libraries | 0 (0%) | 0 (0%) | 50 (44%) | 45 (40%) | 18 (16%) | 3.72 (0.73)
Archival storage | 1 (1%) | 1 (1%) | 23 (21%) | 37 (33%) | 23 (21%) | 3.71 (0.83)
Interface with HPSS | 1 (1%) | 3 (3%) | 47 (42%) | 39 (35%) | 23 (20%) | 3.71 (0.85)
Debugging tools | 0 (0%) | 0 (0%) | 56 (50%) | 36 (32%) | 19 (17%) | 3.67 (0.75)
Job turnaround time | 2 (2%) | 3 (3%) | 47 (42%) | 43 (38%) | 18 (16%) | 3.64 (0.85)
Frequency of unscheduled (unanticipated) outages | 1 (1%) | 2 (2%) | 55 (50%) | 35 (32%) | 17 (15%) | 3.59 (0.80)
Frequency of scheduled outages | 1 (1%) | 2 (2%) | 55 (50%) | 38 (34%) | 15 (14%) | 3.58 (0.78)
Table 36. Evaluation of XT5 Jaguar PF by Project Classification
Aspects of XT5 Jaguar PF | INCITE | Director’s Discretion | ALCC | Other | All Users
---|---|---|---|---|---
 | M (SD) | M (SD) | M (SD) | M (SD) | M (SD)
Scratch disk size | 4.11 (0.81) | 4.20 (0.72) | 3.96 (0.74) | 4.22 (0.83) | 4.11 (0.76)
Usability of batch queue system | 4.01 (0.74) | 4.02 (0.74) | 4.04 (0.79) | 4.11 (0.60) | 4.02 (0.73)
Overall system performance | 3.98 (0.77) | 4.07 (0.68) | 4.07 (0.78) | 4.22 (0.67) | 4.00 (0.79)
Accessibility of batch queue system | 3.96 (0.84) | 4.04 (0.75) | 4.00 (0.67) | 4.00 (0.71) | 3.96 (0.81)
Job success rate | 3.89 (0.81) | 3.96 (0.97) | 3.96 (0.85) | 4.22 (0.67) | 3.95 (0.84)
Scratch disk performance | 3.85 (0.90) | 4.04 (0.76) | 3.82 (0.77) | 4.22 (0.83) | 3.91 (0.85)
Available 3rd party software, applications, and/or libraries | 3.82 (0.82) | 3.95 (0.89) | 3.84 (0.80) | 4.00 (0.87) | 3.84 (0.81)
Archival storage | 3.81 (0.94) | 3.80 (0.92) | 3.58 (0.86) | 4.22 (0.83) | 3.80 (0.88)
Interface with HPSS | 3.80 (0.96) | 3.91 (0.74) | 3.67 (0.73) | 4.20 (0.79) | 3.79 (0.89)
Debugging tools | 3.55 (0.84) | 3.83 (0.79) | 3.70 (0.72) | 3.89 (1.05) | 3.67 (1.00)
Job turnaround time | 3.60 (1.03) | 3.65 (1.04) | 3.75 (0.93) | 4.00 (0.71) | 3.63 (0.82)
Frequency of unscheduled (unanticipated) outages | 3.50 (0.92) | 3.49 (0.97) | 3.37 (0.88) | 4.00 (0.53) | 3.61 (0.86)
Frequency of scheduled outages | 3.54 (0.91) | 3.72 (0.91) | 3.32 (0.90) | 3.88 (0.64) | 3.53 (0.89)
Overall Mean (SD) | 3.80 (0.87) | 3.90 (0.84) | 3.78 (0.80) | 4.09 (0.75) | 3.83 (0.84)
Table 37. Evaluation of XT5 Jaguar PF – All Users
Aspects of XT5 Jaguar PF | 1 = Very Dissatisfied | 2 = Dissatisfied | 3 = Neither Satisfied nor Dissatisfied | 4 = Satisfied | 5 = Very Satisfied | M (SD)
---|---|---|---|---|---|---
Scratch disk size | 3 (1%) | 2 (1%) | 26 (12%) | 118 (56%) | 62 (29%) | 4.11 (0.76)
Overall system performance | 3 (1%) | 3 (1%) | 26 (12%) | 132 (63%) | 46 (22%) | 4.02 (0.73)
Usability of batch queue system | 2 (1%) | 8 (4%) | 29 (14%) | 120 (57%) | 50 (24%) | 4.00 (0.79)
Job success rate | 2 (1%) | 11 (5%) | 28 (13%) | 120 (57%) | 48 (23%) | 3.96 (0.81)
Accessibility of batch queue system | 3 (1%) | 10 (5%) | 31 (15%) | 115 (55%) | 49 (24%) | 3.95 (0.84)
Scratch disk performance | 3 (1%) | 11 (5%) | 36 (17%) | 114 (54%) | 47 (22%) | 3.91 (0.85)
Available 3rd party software, applications, and/or libraries | 1 (1%) | 8 (4%) | 54 (27%) | 97 (49%) | 40 (20%) | 3.84 (0.81)
Interface with HPSS | 4 (2%) | 8 (4%) | 51 (26%) | 93 (47%) | 40 (20%) | 3.80 (0.88)
Archival storage | 4 (2%) | 6 (3%) | 43 (22%) | 83 (43%) | 43 (22%) | 3.79 (0.89)
Job turnaround time | 8 (4%) | 18 (9%) | 48 (23%) | 95 (46%) | 39 (19%) | 3.67 (1.00)
Debugging tools | 1 (1%) | 9 (5%) | 79 (41%) | 73 (38%) | 29 (15%) | 3.63 (0.82)
Frequency of scheduled outages | 5 (2%) | 11 (5%) | 70 (34%) | 94 (46%) | 26 (13%) | 3.61 (0.86)
Frequency of unscheduled (unanticipated) outages | 7 (3%) | 13 (6%) | 70 (34%) | 93 (46%) | 21 (10%) | 3.53 (0.89)
For the XT5 and XT4 platforms, user satisfaction was highest with the scratch disk size (mean ratings of 4.11 and 3.82 respectively). The lowest rated aspect of the platforms for XT5 and XT4 was frequency of scheduled outages (3.53 and 3.58; Table 38).
Table 38. Comparison of Users’ Average Satisfaction with Various Aspects of OLCF Systems
Aspects of systems | XT5 Jaguar PF platform | XT4 Jaguar platform
---|---|---
Scratch disk size | 4.11 (0.76) | 3.82 (0.78) |
Overall system performance | 4.02 (0.73) | 3.78 (0.81) |
Usability of batch queue system | 4.00 (0.79) | 3.77 (0.79) |
Job success rate | 3.96 (0.81) | 3.76 (0.79) |
Accessibility of batch queue system | 3.95 (0.84) | 3.74 (0.82) |
Scratch disk performance | 3.91 (0.85) | 3.73 (0.80) |
Available 3rd party software, applications, and/or libraries | 3.84 (0.81) | 3.72 (0.73) |
Interface with HPSS | 3.80 (0.88) | 3.71 (0.83) |
Archival storage | 3.79 (0.89) | 3.71 (0.85) |
Job turnaround time | 3.67 (1.00) | 3.67 (0.75) |
Debugging tools | 3.63 (0.82) | 3.64 (0.85) |
Frequency of unscheduled (unanticipated) outages | 3.61 (0.86) | 3.59 (0.80) |
Frequency of scheduled outages | 3.53 (0.89) | 3.58 (0.78) |
Overall Mean (SD) | 3.83 (0.84) | 3.71 (0.80)
When asked what software, libraries, or tools they would like to have installed, and whether each was necessary or desired, 30 users responded (Table 39). The software/libraries requested by more than one respondent were Matlab, hdf5, DDT, and SLEPc; many other miscellaneous requests were also made.
Table 39. Software/ Libraries, or Tools that Users Would Like or Need (n = 30)
Application software/libraries, tools | N | %
---|---|---
Other Desired | 8 | 27%
Other NOS | 7 | 23%
None | 4 | 13%
Other Necessary | 3 | 10%
Matlab | 3 | 10%
hdf5 | 2 | 7%
DDT | 2 | 7%
SLEPc | 2 | 7%
Data Analysis, Visualization, and Workflow
When asked how large their datasets for data analysis and visualization are, 211 users responded; the largest group (40%) said their datasets were less than 10 GB (Table 40).
Table 40. How large are your data sets in data analysis and visualization? (n = 211)
Data set size | N | %
---|---|---
Less than 10 GB | 85 | 40%
10 to 50 GB | 42 | 20%
100 GB to 1 TB | 38 | 18%
Larger than 1 TB | 29 | 14%
50 to 100 GB | 17 | 8%
With regard to data analysis and visualization tools currently used, the largest share of respondents (30%) reported using VisIt, followed by miscellaneous tools not reported by other users (19%) and users’ own tools or offsite custom apps (16%). Each of the remaining tools was reported by two or more users (Table 41).
Table 41. Data Analysis and Visualization Tools Currently Utilized (n = 122)
Theme | N | %
---|---|---
VisIt | 36 | 30%
Miscellaneous | 23 | 19%
My own tools/offsite custom apps | 20 | 16%
IDL | 17 | 14%
Matlab | 15 | 12%
NCL/gvncl | 14 | 11%
ParaView | 14 | 11%
VMD | 11 | 9%
Python | 10 | 8%
N/A | 7 | 6%
NCO | 7 | 6%
Gnuplot | 6 | 5%
ncview, netCDF climate tools | 6 | 5%
Tecplot | 5 | 4%
Ferret | 4 | 3%
Line plotting tools | 3 | 2%
R | 3 | 2%
CFView | 3 | 2%
OpenDX | 2 | 2%
Fortran | 2 | 2%
Grads | 2 | 2%
Mathematica | 2 | 2%
Perl scripts | 2 | 2%
In response to three questions regarding users’ visualization needs, the majority of respondents did not indicate a need or interest. Only 16% of respondents reported that they needed assistance applying workflow tools for their analysis or large-scale simulations, while 26% wanted help optimizing the I/O in their codes. A larger share, 33% (62 respondents), indicated they would be interested in applying the OLCF end-to-end dashboard to display results from their simulations in real time (Table 42).
Table 42. OLCF User Visualization Needs (n = 193)
Needs/Interests | Yes n (%) | No n (%)
---|---|---
Do you need assistance applying workflow tools for your analysis or large-scale simulations? | 30 (16%) | 160 (84%)
Do you need help in optimizing your I/O in your codes? | 50 (26%) | 143 (74%)
Would you be interested in applying our end-to-end dashboard, which is a web-based application, for displaying results (images/textual information) from your simulations in real-time? | 62 (33%) | 126 (67%)