AkuVis: Exploring Visual Noise

Katy Börner and Ipke Wachsmuth

Faculty of Technology, University of Bielefeld,
PF 10 01 31, D-33501 Bielefeld, Germany

ABSTRACT

The AkuVis (Interactive Visualization of Acoustic Data) project is being developed by researchers at the University of Bielefeld together with governmental institutions. It seeks to create a highly interactive virtual environment of modelled acoustic data in order to sensitize users and improve human decision-making.

KEYWORDS:

Visualisation, decision support.

INTRODUCTION

The AkuVis (Interactive Visualization of Acoustic Data) project is being developed by researchers at the University of Bielefeld together with governmental institutions. It seeks to create a highly interactive virtual environment of modelled acoustic data in order to sensitize users and improve human decision-making. In particular, it attempts to enhance the understanding of noise emission data as a basis for governmental decisions about noise protection regulations for streets or industrial areas.

AKUVIS

A well-established method of visualizing data from noise pollution simulations is the two-dimensional plot. However, decision makers are often uncomfortable with this kind of presentation and with the complexity inherent in such plots. In AkuVis, acoustic data are mapped into a three-dimensional visual and acoustic space as visual noise that can be sensed by eyes and ears and explored interactively.

Input data provided by the German TÜV are used to extract three-dimensional models of road maps and houses. Furthermore, numerical noise pollution data at discrete points, modelled for night and day conditions, are mapped onto the three spatial dimensions: x/y for position, z for decibel level. Regions with the same decibel level receive the same colour in the resulting acoustic landscape. To simulate the noise conditions in a given region, three general kinds of sound are employed. Permanent background noise provides an impression of the general dB value. Transitory noise corresponds to temporary sounds such as a passing truck. Random events, such as a bicycle bell or a step on the accelerator pedal, are very short. The sounds are merged depending on the decibel value of the selected region and the time of day.
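
To make this mapping concrete, the following sketch shows how a grid of modelled dB values could be turned into a coloured height field and how the three sound categories could be weighted. It is a minimal illustration in Python; all names, colour bands, and weights are assumptions rather than parts of the actual AkuVis implementation.

    # Minimal sketch of the dB-to-landscape mapping; illustrative only.

    def db_to_colour(db):
        """Equal decibel levels receive equal colours."""
        bands = [(45, (0.0, 0.6, 0.0)),            # quiet: green
                 (55, (0.8, 0.8, 0.0)),            # moderate: yellow
                 (65, (0.9, 0.5, 0.0)),            # loud: orange
                 (float("inf"), (0.8, 0.0, 0.0))]  # very loud: red
        for limit, rgb in bands:
            if db < limit:
                return rgb

    def landscape_vertices(db_grid, cell_size=1.0, z_scale=0.05):
        """Map a 2-D grid of dB values to coloured 3-D points:
        x/y encode position, z encodes the decibel level."""
        return [((i * cell_size, j * cell_size, db * z_scale), db_to_colour(db))
                for i, row in enumerate(db_grid)
                for j, db in enumerate(row)]

    def sound_weights(db, night):
        """Weight permanent, transitory, and random sounds by dB and time of day."""
        background = min(db / 80.0, 1.0)       # permanent background noise
        transitory = 0.2 if night else 0.6     # e.g. a passing truck
        random_events = 0.1 if night else 0.3  # e.g. a bicycle bell
        return {"background": background,
                "transitory": transitory * background,
                "random": random_events * background}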

The Responsive Workbench, developed at the German National Research Center for Information Technology (GMD), is used as the virtual reality output device. It projects stereoscopic graphics onto the surface of a translucent 6' by 4' tabletop. Users wear LCD shutter glasses that time-multiplex different images to the two eyes and are synchronized by an infrared signal from an emitter located near the scene. A stylus and glove are used to pick virtual objects as well as to manipulate interaction elements. Electromagnetic position sensors keep track of the users' eye and hand positions. One user acts as the active viewer, controlling the stereo projection reference point; the other participants see stereo as well, but from the tracked person's perspective. A stereo audio system provides acoustic feedback. Several computers are used to process the tracker data, run the application and rendering, and simulate the sound in real time.
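
The head-tracking principle can be illustrated by a short sketch: the tracked head position is offset along the head's right vector by half an assumed interocular distance to obtain the two projection reference points for the time-multiplexed images. The names and the 6.5 cm value are illustrative assumptions, not Responsive Workbench code.

    # Minimal sketch of head-tracked stereo reference points; illustrative only.
    import numpy as np

    IOD = 0.065  # assumed interocular distance in metres

    def eye_positions(head_pos, right_dir):
        """Offset the tracked head position along the head's unit right
        vector to obtain left- and right-eye projection reference points."""
        right = np.asarray(right_dir, dtype=float)
        right /= np.linalg.norm(right)
        head = np.asarray(head_pos, dtype=float)
        return head - right * IOD / 2, head + right * IOD / 2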

When running the application, the user is free to select among several data sets. The Houses View presents only the houses and streets of the modelled region. The Night View and Day View project the decibel values for night-time and daytime pollution, respectively. Moreover, the user may activate certain features: displaying the street names, turning on the sound, replacing active head tracking by a standard untracked view, or inserting a virtual sensor in the shape of a human ear into the acoustic landscape.
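
These data sets and features amount to a small piece of interface state, which the following sketch summarizes; all names are chosen for illustration rather than taken from AkuVis.

    # Minimal sketch of the selectable views and feature toggles; names assumed.
    from dataclasses import dataclass
    from enum import Enum

    class View(Enum):
        HOUSES = "houses"  # houses and streets only
        NIGHT = "night"    # night-time decibel landscape
        DAY = "day"        # daytime decibel landscape

    @dataclass
    class AkuVisState:
        view: View = View.HOUSES
        street_names: bool = False  # overlay street names
        sound_on: bool = False      # acoustic feedback
        head_tracking: bool = True  # False -> standard untracked view
        virtual_ear: bool = False   # ear-shaped sensor in the landscape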

Visually, users experience a richly detailed, interactively changing landscape that illustrates the noise conditions in a city district. Different positions of the tracked glasses result in different perspectives on the scene, giving a free view of previously hidden objects. Acoustically, the landscape can be explored by means of the virtual sensor: the ear's position relative to the acoustic landscape determines the sound level, frequency, and kind of sound samples played to simulate the noise conditions at the selected point. Several persons can discuss what they see in the visual noise. Our project partners, who have also tested the setting, found it very helpful for understanding complex noise emission data.
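
How the ear's position might drive the sound simulation can be sketched as follows; the grid look-up, the attenuation with height, and the dB thresholds are placeholder assumptions, not the actual AkuVis sound model.

    # Minimal sketch of virtual-ear sampling; illustrative only.

    def sample_db(db_grid, x, y, cell_size=1.0):
        """Look up the modelled dB value at the ear's x/y position."""
        i = max(0, min(int(x / cell_size), len(db_grid) - 1))
        j = max(0, min(int(y / cell_size), len(db_grid[0]) - 1))
        return db_grid[i][j]

    def ear_playback(db_grid, ear_pos, night):
        """Choose playback level and sound categories for the ear's position."""
        x, y, z = ear_pos
        db = sample_db(db_grid, x, y)
        level = db / (1.0 + max(z, 0.0))  # crude attenuation with height
        categories = ["background"]       # permanent background noise
        if db > 55:
            categories.append("transitory")  # e.g. a passing truck
        if not night and db > 45:
            categories.append("random")      # e.g. a bicycle bell
        return level, categories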

In a next step, the project aims to implement visual and acoustic zoom functionality so that different regions of a city can be selected and explored in detail. For the acoustic zoom, the height of the ear determines the diameter of the observed region: placing the ear at street level, users can explore the local street noise at that position, while moving the ear up enlarges the diameter and thus yields a global mixture of the sounds of a certain region.
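
One possible reading of this zoom is sketched below: the ear's height sets the radius of the region whose levels are averaged energetically. The linear height-to-radius mapping and the constant k are assumptions for illustration.

    # Minimal sketch of the planned acoustic zoom; illustrative only.
    import math

    def zoom_level(db_grid, ear_pos, cell_size=1.0, k=2.0):
        """Energetically average the dB values of all cells within a radius
        that grows linearly with the ear's height above street level."""
        x, y, z = ear_pos
        radius = max(k * z, cell_size)  # street level -> a single local cell
        energies = []
        for i, row in enumerate(db_grid):
            for j, db in enumerate(row):
                dx, dy = i * cell_size - x, j * cell_size - y
                if dx * dx + dy * dy <= radius * radius:
                    energies.append(10.0 ** (db / 10.0))
        if not energies:
            return None  # ear outside the modelled region
        return 10.0 * math.log10(sum(energies) / len(energies))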

The accompanying video sketches the real-world problem, the setting used, and the interactive application.

ACKNOWLEDGEMENTS

The authors are grateful to Heiko Rommel and Timo Thomas, who did substantial work on the implementation of the system. Additionally, we thank Elke Bernauer for providing data material as well as Peter Serocka and Marc Latoschik for giving technical advice.

REFERENCES

Börner, K., Fehr, R., Wachsmuth, I. (1998) 'AkuVis: Interactive Visualization of Acoustic Data', 12. Internationales Symposium Informatik für den Umweltschutz, Universität Bremen, Germany.

Börner, K. (1998) 'AkuVis: Interactive Visualization of Acoustic Data', http://www.TechFak.Uni-Bielefeld.DE/techfak/ags/wbski/akuvis/