Visualization of Function Fields

John C. Anderson, Luke Gosink, Mark A. Duchaineau, and Ken Joy



With the increase in computing power and our ability to gather ever more data via increasingly powerful imaging and sensor technology, the size and complexity of scientific datasets continue to grow. Datasets that represent physical phenomena now contain billions of multi-valued, multi-dimensional, time-varying elements, and we are no longer able to analyze them fully.

Despite these advances in complex data generation and acquisition, the majority of the research in scientific visualization has concentrated on the scalar field. Several mature visualization techniques exist for scalar fields, including isosurfaces, slicing, and volume rendering.

In this project, we address function fields, which map points in n-dimensional Euclidean space to one-dimensional scalar functions. Features in function fields are spatial regions where the functions are similar. We use probes to derive scalar fields that reveal the similarity structure of a function field.

Projections and mappings of complex data to traditional visualization domains (i.e., scalar fields) often produce "collisions", which can hinder end users' ability to interpret and explore the data. In this paper, we derive scalar fields that capture similarity structures in function field datasets by using multiple probes. Visualizations of these scalar fields support exploration and interpretation. These methods are a major step forward in function field visualization, as they allow end users to explore and interpret similarity structures that collision-prone projections obscure.
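As a minimal sketch of the probe idea (the function and variable names, and the choice of Euclidean distance between sampled functions as the similarity measure, are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def similarity_field(field, probe):
    """Derive a scalar field from a function field using one probe.

    field: array of shape (nx, ny, n_samples), one sampled function
           per spatial point.
    probe: array of shape (n_samples,), a reference function.
    Returns an (nx, ny) scalar field of distances to the probe;
    small values mark regions whose functions resemble the probe.
    """
    return np.linalg.norm(field - probe, axis=-1)

# Tiny synthetic function field: a 4x4 grid of functions, each
# sampled at 8 points.
rng = np.random.default_rng(0)
field = rng.random((4, 4, 8))
probe = field[0, 0]            # probe with the function at one point
dist = similarity_field(field, probe)
assert dist.shape == (4, 4)
assert dist[0, 0] == 0.0       # a point matches its own probe exactly
```

With multiple probes, one such scalar field can be derived per probe and the results combined or compared, which is the role the probes play in the discussion above.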

Function fields themselves are just one of the many data types that can populate the variables in a multi-dimensional dataset. Interactive exploration methods developed for function fields can be extended to multi-variate data, which exhibits similar problems, including feature detection and segmentation, collisions, and so on. The ability to visualize these complex fields gives us insight into datasets that were previously beyond our reach.


Function fields arise in many application domains. Hyperspectral imaging systems are used in remote sensing for a broad range of applications, including environmental studies and military preparation. Each pixel in a hyperspectral image contains data for multiple spectral channels (instead of only grayscale or RGB), allowing more in-depth image analysis. The Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) is aircraft-mounted and acquires calibrated 614x512 images of upwelling spectral radiance. In AVIRIS images, each pixel consists of 224 radiance (or reflectance) samples over visible and short-wave infrared wavelengths. The image used in this paper, approximately 270 megabytes, is of Moffett Field and the San Francisco Bay.

The structure of hyperspectral images is shown in the following image:
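The layout described above can be sketched as a three-dimensional array in which each pixel holds a sampled spectral function. The array names and the 32-bit precision are assumptions for illustration; reading real AVIRIS files requires a dedicated reader.

```python
import numpy as np

# Hypothetical sketch of the AVIRIS image layout: 614x512 pixels,
# each holding a function of 224 radiance samples over wavelength.
rows, cols, bands = 512, 614, 224
cube = np.zeros((rows, cols, bands), dtype=np.float32)  # assumed 32-bit

# Each pixel is a one-dimensional sampled function of wavelength:
spectrum = cube[100, 200, :]
assert spectrum.shape == (bands,)

# At 32-bit precision the cube is close to the stated ~270 megabytes:
print(round(cube.nbytes / 2**20))  # ≈ 269 MiB
```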

Particulate pollution datasets used for air quality research are also naturally represented as function fields. These time-varying, three-dimensional function fields track particulate concentration. The size of aerosol particles, however, is an important factor in their toxicity; thus, each cell contains a sampled function of particle concentration versus diameter. The datasets we exhibit contain functions sampled at 9 diameters over 25 timesteps; they differ in the type of aerosol particle and in spatial extent. The first dataset (CRPAQS), from the California Regional Particulate Air Quality Study, is a 185x185x15 grid of particulate SO4 concentration throughout the San Joaquin Valley, California, U.S.A. The second dataset (National) is a 148x112x19 grid of particulate H2O concentration over much of the United States. Despite the low spatial resolutions, the CRPAQS dataset is approximately 450 megabytes, and the National dataset is approximately 260 megabytes.

The structure of the particulate pollution datasets is shown in the following image:
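The quoted dataset sizes can be sanity-checked with a back-of-the-envelope computation over the stated grid dimensions, timestep count, and samples per function. The 32-bit sample size is an assumption; the storage format is not stated in the text.

```python
# Back-of-the-envelope check of the quoted dataset sizes, assuming
# 32-bit samples (an assumption; the actual format is unspecified).
def footprint_mib(nx, ny, nz, timesteps=25, samples=9, bytes_per=4):
    """Size in MiB of a time-varying 3D function field."""
    return nx * ny * nz * timesteps * samples * bytes_per / 2**20

crpaqs = footprint_mib(185, 185, 15)    # SO4, San Joaquin Valley
national = footprint_mib(148, 112, 19)  # H2O, much of the U.S.
print(round(crpaqs), round(national))   # ≈ 441 and 270 MiB
```

Both figures land within roughly 10% of the sizes stated above (approximately 450 and 260 megabytes), consistent with a dense 32-bit layout plus or minus file-format overhead.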