The principal mission of ORNL’s Visualization Task Group is to help researchers gain a better understanding of their data through visualization techniques. We seek out and engage with projects at ORNL, and with collaborators, that might benefit from applying visual data understanding techniques to scientific data, and we find ways of doing visualization that are different, more effective, and better integrated with other research activities at ORNL. Contact Dave Pugmire, visualization task lead, at email@example.com.
OLCF users and the visualization team have a number of tools at their disposal to help analyze data. Lens is a cluster used for parallel analysis and remote visualization. The EVEREST PowerWall is also available at the OLCF, providing on-site users an extremely high-definition resource to view their scientific visualizations.
The visualization team also provides a portable stereo 3D projector for use within ORNL, giving customers the ability to view data in true stereoscopic 3D in conference-room venues.
We use and support a wide variety of scientific visualization software, including VisIt, EnSight, POV-Ray, AVS/Express, ParaView, and IDL. Maya is used for advanced modeling. For wall delivery, Chromium is used to distribute OpenGL, while DMX is used to enable large-scale two-dimensional X11 graphics. We have custom solutions for delivering image and movie content to the EVEREST facility. Production tools such as After Effects, Photoshop, and Premiere round out the tool kit, providing the tools for end-to-end delivery of high-quality visualizations.
We write and deploy custom visualization solutions as needed to meet specialized requirements. These solutions are tailored to each customer and may be deployed in users’ offices as well as in EVEREST.
ORNL has a dedicated visualization team to assist customers in solving visualization and data-interpretation issues. Support can range from assisting users in displaying data on the EVEREST PowerWall to writing custom visualization tools for specialized needs to producing production-quality images and movies for publications and public relations.
As scientific simulations scale up exponentially, so does the pressure on processes and tools for making use of the data. The OLCF is committed to developing tools that enable researchers to efficiently manage, analyze, and visualize simulation results. A data management process dubbed “end-to-end” is being developed to free OLCF users to produce data and interpret it, rather than puzzle over how to move, store, and manage it. For more information on the end-to-end project, contact task lead Scott Klasky at firstname.lastname@example.org.
Many routine data management tasks have been either automated or taken on by OLCF staff so that scientists can concentrate on their research. One of the goals of end-to-end is to ensure that calculations produce all the information the researchers need. One way of doing that is to generate more metadata, or self-describing information, along with the scientific data. For example, a researcher might produce scores of binary files containing thousands of data points from calculations of different variables. If the researcher doesn’t keep detailed notes, it could be difficult later to match data sets with the calculations that produced them. To avoid such situations, OLCF staff can help researchers set up calculations so that they generate metadata to provide a context and description in every data file stored. Because generating metadata uses computer time, it is important to make the process as efficient as possible.
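The idea of self-describing data can be sketched with a small hypothetical example: each stored file carries descriptive metadata (which variable, which run, how many points) alongside the data itself, so a later reader needs no external notes. The file name, field names, and JSON format below are illustrative assumptions, not an actual OLCF format.

```python
import json

# Hypothetical sketch: bundle descriptive metadata with the data itself
# so every stored file provides its own context.
def write_self_describing(path, variable, run_id, values):
    record = {
        "metadata": {
            "variable": variable,      # which quantity these numbers represent
            "run_id": run_id,          # which simulation run produced them
            "num_points": len(values),
        },
        "data": values,
    }
    with open(path, "w") as f:
        json.dump(record, f)

def read_self_describing(path):
    with open(path) as f:
        return json.load(f)

# A reader can later match the data set to the calculation that produced it.
write_self_describing("temperature_run42.json", "temperature", "run42", [1.5, 2.0, 2.5])
record = read_self_describing("temperature_run42.json")
print(record["metadata"]["variable"])  # temperature
```

In practice, scientific formats such as HDF5 or NetCDF serve this role by storing attributes inside the data file itself; the sketch above only illustrates the principle.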
The workflow process automatically archives data to tape as simulations run to avoid filling up the disk space on the supercomputers. The provenance of the data (what is being archived, where the data are, and which run they are from) is also preserved to provide a context for all data files.
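Such provenance capture can be sketched as follows: whenever a file is moved to the archive, an append-only log records what was archived, where it went, and which run produced it. The function name, paths, and log fields are illustrative assumptions; the copy stands in for a transfer to the tape archive.

```python
import hashlib
import json
import shutil
import time
from pathlib import Path

# Hypothetical sketch of provenance capture during archiving. Each archived
# file gets a log entry recording what it is, where it was stored, and
# which run it came from.
def archive_with_provenance(src, archive_dir, run_id, log_path):
    archive_dir = Path(archive_dir)
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / Path(src).name
    shutil.copy2(src, dest)  # stand-in for the actual transfer to tape
    entry = {
        "file": Path(src).name,
        "archived_to": str(dest),
        "run_id": run_id,
        "checksum": hashlib.sha256(Path(src).read_bytes()).hexdigest(),
        "timestamp": time.time(),
    }
    with open(log_path, "a") as log:  # append-only provenance record
        log.write(json.dumps(entry) + "\n")
    return entry

# Example: archive one output file from a hypothetical run.
Path("out.dat").write_text("simulation output")
entry = archive_with_provenance("out.dat", "archive", "run42", "provenance.jsonl")
print(entry["run_id"])  # run42
```

The checksum lets a later consumer verify that the archived copy matches what the simulation wrote, and the log provides the "which run is this from" context described above.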
The fast networks at the OLCF make it possible for off-site users to keep tabs on their simulations. Researchers can watch their work in progress via a Web browser anywhere they have Internet access. Faster input/output allows quick delivery and constant updates. The capability to monitor simulations in progress using the Internet enables users to catch mistakes and correct them early in a run before valuable processing time has been wasted. Faster networks also make it possible to move simulation results from the OLCF supercomputers to computers at a user’s own institution for processing. The increased portability of data allows users to get a head start on reviewing their results and react to interesting trends they observe as they analyze them.
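The browser-based monitoring described above can be illustrated with a minimal sketch: the simulation periodically writes a status page, and a lightweight HTTP server exposes it so a remote user can check progress from any browser. This is an assumed, simplified stand-in, not the OLCF's actual monitoring infrastructure; the file name and status text are hypothetical.

```python
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path
from urllib.request import urlopen

# A simulation would periodically rewrite this status page as it runs.
Path("status.html").write_text("<html><body>step 1200 of 5000</body></html>")

# Serve the current directory on an ephemeral local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A remote user (here, ourselves) fetches the page as a browser would.
url = f"http://127.0.0.1:{server.server_port}/status.html"
page = urlopen(url).read().decode()
server.shutdown()
print(page)
```

Checking such a page early in a run is how a user could spot a misconfigured simulation before significant processing time is spent.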