Title: Department of Computer Science & Australian Partnership for Advanced Computing Seminar
Date: Monday, May 29, 2000
Time: 11:00 am to 12:00 pm
Venue: Room N101, CSIT Building [108]
Speaker: Dr Bernard A Pailthorpe (Associate Director, Scientific Visualisation, San Diego Supercomputer Center & NPACI, University of California, San Diego)
Description: "Scalable volume visualisation, spanning the scales: from brain mapping through oceanography to astrophysics"

Abstract

The frontiers of Scientific Visualisation now include problems arising with data that scales in size or in complexity. This talk addresses the challenges of devising algorithms and constructing software libraries suitable for the large-scale data emerging from tera-scale simulations and instruments. With datasets becoming larger, moving into the 100 GB-1 TB realm, and more complex, scalable methodologies and tools are required. The efforts to address these challenges, currently under way at SDSC and within the National Partnership for Advanced Computational Infrastructure (NPACI), will be described. The ultimate aim of this research and development programme is to optimise visualisation queries across multiple, large data sets derived from motivating applications in astrophysics, planetary-scale oceanographic simulations and human brain mapping. The NPACI project is "end-to-end" in scope, with research spanning data handling, graphics, visualisation and scientific application domains. Research themes include: efficient access to large archives, and data orchestration; performance optimisation of the algorithms on parallel architectures; scalable, direct volume rendering with perspective viewing; and feature recognition.
Components of the research focus on the following areas: intelligent data storage, layout and handling, using associated "FloorPlan" metadata; compiled expression trees, encapsulating visualisation queries, which yield efficient data movement and computation on advanced computing architectures; extension of SDSC's scalable, parallel, direct volume renderers (MPIRE & VISTA) to allow perspective viewing; the interactive rendering of fractional images (imagelets), computed on-the-fly from data-block streams; and intelligent volume decimation, facilitating interactive examination of the largest datasets on a range of graphics platforms. These concepts are coordinated within a data-visualisation pipeline which operates on component data blocks sized to fit within available computing resources. A key feature of the scheme is that the metadata which tags the data blocks can be propagated and applied consistently: at the disk-striping level; in managing the distribution of computations across parallel processors; in imagelet composition; and in feature tagging. The system operates in a client-server model and is designed to utilise data and compute resources in the "Grid" environment. Compute-intensive tasks, such as rendering, are executed on available supercomputers, while interacting with desktop GUIs. Components of the volume tools have already been implemented, with a Java3D API, and tested on volumes comprising 1000^3 voxels. Production renderings of astronomical objects have been executed on the pre-delivery NPACI Teraflops computer as a feasibility test of our scheme. Parallel efforts, under way in related groups, focus on scalable, rapid iso-contouring algorithms which extend the more usual paradigms. We aim to coordinate with those efforts to combine the surface and volume visualisation tools. Research challenges in the science application domains provide the motivation to develop such tools.
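The block-decomposed pipeline described above can be sketched in miniature. In this toy version, the dictionary layout, the "FloorPlan"-style metadata fields, and the maximum-intensity "render" step are all illustrative assumptions, not the actual MPIRE/VISTA interfaces:

```python
# Toy sketch of the data-visualisation pipeline: the volume is decomposed
# into blocks sized to fit available resources, each block carries a
# FloorPlan-style metadata tag, blocks are rendered independently into
# imagelets, and the imagelets are composited. All names here are
# hypothetical stand-ins for illustration.

def decompose(volume, block_size):
    """Split a flat voxel array into metadata-tagged blocks."""
    return [
        {
            "meta": {"offset": off, "size": block_size},  # FloorPlan-style tag
            "data": volume[off:off + block_size],
        }
        for off in range(0, len(volume), block_size)
    ]

def render_block(block):
    """Render one block to an imagelet; the metadata tag propagates with it."""
    return {"meta": block["meta"], "value": max(block["data"])}

def composite(imagelets):
    """Combine imagelets. For a maximum-intensity projection the compositing
    operator is just max, so blocks can be rendered in any order and on any
    processor."""
    return max(im["value"] for im in imagelets)

volume = [3, 7, 2, 9, 4, 1, 8, 5]   # stand-in for a 1000^3-voxel dataset
imagelets = [render_block(b) for b in decompose(volume, block_size=4)]
print(composite(imagelets))          # -> 9
```

Because the metadata tag travels with every block and every imagelet, the same scheme could in principle drive disk striping, the distribution of render jobs across parallel processors, and feature tagging, as the paragraph above describes.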
Current planetary-scale oceanographic simulations have resolutions limited to 2 deg. latitude by 2 deg. longitude, yet still require 40 hrs of computation per simulation run on the full SDSC Cray T90. With Teraflop computing resources coming on line, such simulations will be conducted at 10x and 100x resolution, soon yielding multiple sets of 100 GByte data output. In mapping the human brain, up to four distinct imaging modalities are used, with datasets already at tens of GBytes. The immediate research challenge is to composite these datasets. Then there is the need to find visualisation metaphors and tools for navigation, feature finding and analysis within these dense, rich volumes. As is characteristic of much of modern science, this research requires a multi-disciplinary investigative team, comprising computer and computational scientists, visualisation researchers and applications-domain researchers. The work reflects the emerging challenges and opportunities presented by the ongoing revolution in the scientific method which is being stimulated by high performance computing.

URL: http://cs.anu.edu.au/lib/seminars/seminars00/dept20000529

BIO

Pailthorpe founded the Sydney VisLab, with Australian Research Council funding, to support computational and visualisation research. That lab underpins research and teaching innovations in a broad spectrum of disciplines. Its recent work has provided visualisation support for high-resolution weather modelling, specifically in the context of the Sydney 2000 Olympics. Currently he directs the scientific visualisation program for NPACI and for SDSC, at UCSD. The group's efforts there are focused on scalable volume visualisation. He established and led a research effort (at the University of Sydney) in computational physics, specifically simulating advanced materials by classical and quantum molecular dynamics. This led to an understanding of the mechanism of diamond formation in carbon thin films.
He has wide experience in physics education, including developing new classes in Computational Physics. He has advised government at senior levels on HPC, including: work with the NSF in the USA; advice to the Australian Prime Minister's Science & Engineering Council, which resulted in a new funding program establishing the Australian Partnership for Advanced Computing in 1998; and advice to the New South Wales Premier, which also resulted in a new HPC funding program in 1998.