Erik Altman and Perry S. Cheng

Harmonicon - Demonstration

This is a demonstration of the real-time capabilities of IBM's WebSphere RealTime (WRT) Java virtual machine product, with its unique real-time garbage collection technology, code-named Metronome, developed at IBM's T.J. Watson Research Center.

The application being demonstrated is a MIDI music synthesizer we wrote entirely in Java, running with a 2 millisecond buffer size (making it competitive with hardware synthesizers and suitable for live performance). This is an extreme hard real-time system, since any interference by the hardware, operating system, virtual machine, or daemon processes would cause an audible "glitch" in the music.
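To get a sense of how tight a 2 millisecond deadline is, the arithmetic below works out how many audio samples must be ready per buffer. The 44.1 kHz rate is the standard CD-quality sample rate mentioned later in this page; the class and method names are just for illustration.

```java
// Illustrative arithmetic only: 44.1 kHz is the standard CD-quality
// sample rate; the 2 ms buffer matches the demo's setup.
public class BufferMath {
    static double samplesPerBuffer(double sampleRateHz, double bufferMs) {
        return sampleRateHz * bufferMs / 1000.0;
    }

    public static void main(String[] args) {
        // Samples that must be computed before each buffer deadline
        System.out.printf("Samples per 2 ms buffer at 44.1 kHz: %.1f%n",
                          samplesPerBuffer(44_100.0, 2.0)); // 88.2
        // Any pause longer than the 2 ms buffer period is an audible glitch
    }
}
```

In other words, the synthesizer must deliver roughly 88 fresh samples every 2 milliseconds, without fail, for as long as the music plays.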

Music synthesis is, in fact, more stringent in its real-time needs than many other hard real-time systems. For instance, avionics typically operate at a period of 20 milliseconds, or about 10 times longer than the synthesizer.

Visualization of the all-Java Synthesizer running on top of a non-real-time Java virtual machine.

Above is a screenshot of the video we'll be showing you. The synthesizer takes a stream of MIDI notes (shown in the top panel) and turns them into digital audio data (shown in the middle panel) while the Java virtual machine runs and garbage collects memory allocated by the program (shown in the bottom panel).

The screenshot and video were made with our TuningFork real-time visualization and trace analysis tool.

An incoming MIDI note comes from an instrument like a keyboard, and represents a command like "play middle-C on a piano" (but also includes information about how hard the key was pressed, for how long, etc). The synthesizer takes the note and looks it up in a soundbank, applies sound-shaping filters, mixes it together with any other currently playing notes, and then sends the resulting CD-quality audio to a sound card to be played on a speaker.
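The stages described above can be sketched as a simple render-and-mix loop. Everything here is a hypothetical sketch, not the demo's actual code: a real soundbank stores sampled instruments, whereas this sketch substitutes a plain sine wave at the note's pitch.

```java
import java.util.List;

// Hypothetical sketch of the note-to-audio pipeline described above;
// none of these names come from the actual synthesizer source.
class NoteEvent {
    final int midiNote;   // e.g. 60 = middle C
    final int velocity;   // how hard the key was pressed (0-127)
    NoteEvent(int midiNote, int velocity) {
        this.midiNote = midiNote;
        this.velocity = velocity;
    }
}

class SynthSketch {
    // "Look the note up in a soundbank": here, just a sine wave at the
    // note's frequency, scaled by key velocity.
    static float[] renderNote(NoteEvent e, int samples, float sampleRate) {
        double freq = 440.0 * Math.pow(2.0, (e.midiNote - 69) / 12.0);
        float gain = e.velocity / 127.0f;
        float[] out = new float[samples];
        for (int i = 0; i < samples; i++)
            out[i] = gain * (float) Math.sin(2 * Math.PI * freq * i / sampleRate);
        return out;
    }

    // Mix all currently playing notes into one output buffer, which
    // would then pass through filters and on to the sound card.
    static float[] mix(List<NoteEvent> active, int samples, float sampleRate) {
        float[] mixed = new float[samples];
        for (NoteEvent e : active) {
            float[] voice = renderNote(e, samples, sampleRate);
            for (int i = 0; i < samples; i++) mixed[i] += voice[i];
        }
        return mixed;
    }
}
```

Note that `renderNote` and `mix` allocate fresh arrays on every buffer: exactly the kind of steady dynamic allocation that forces the garbage collector to run while the music plays.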

The synthesizer is written in completely standard Java and makes extensive use of object-oriented features and dynamic memory allocation. In addition, to stress the system, additional threads run alongside it, allocating about 8 megabytes per second of object data.
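An allocation-stress thread like the ones described above might look like the sketch below. The chunk size and period are assumptions chosen to reach roughly 8 megabytes per second; they are not the demo's actual parameters.

```java
// Illustrative allocation-stress thread: the chunk size and period are
// assumptions picked to total roughly 8 MB/s, not the demo's settings.
public class AllocStress implements Runnable {
    static final int CHUNK_BYTES = 80 * 1024; // 80 KB per tick
    static final long PERIOD_MS = 10;         // 100 ticks/s -> ~8 MB/s

    private volatile Object sink;             // keep each chunk briefly live

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                sink = new byte[CHUNK_BYTES]; // fresh garbage for the collector
                Thread.sleep(PERIOD_MS);
            }
        } catch (InterruptedException e) {
            // interrupted: fall through and exit
        }
    }

    public static void main(String[] args) throws Exception {
        Thread t = new Thread(new AllocStress());
        t.start();
        Thread.sleep(100); // run briefly for demonstration
        t.interrupt();
        t.join();
        System.out.println("stress thread finished");
    }
}
```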

In the screenshot above, you can clearly see gaps in the audio data (the green waveform in the middle panel). These gaps are caused by Java's automatic memory management or garbage collection. Garbage collection is a crucial component of Java's ease of programming, safety, security, and portability. However, up until now this has also meant unpredictable interruptions of the program, which are shown by the red bars. You can clearly see that every time there is a major garbage collection it takes about 100 milliseconds and there is a large gap in the sound.

There are also smaller garbage collections (the thin red lines) which collect a memory area devoted to newly allocated objects, also called the nursery. Nursery collections take about 5 milliseconds and their effects are not obviously visible in the audio waveform, but if you listen carefully you will hear them in the audio -- between the big gaps caused by full garbage collections, the audio has a kind of "static" caused by the nursery collections.
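A common way to observe pauses like these, independent of any audio, is a heartbeat loop that timestamps as fast as it can and reports any gap between consecutive readings. This is a minimal sketch, not part of the demo; the 2 ms threshold is an illustrative choice matching the synthesizer's buffer size.

```java
// Minimal pause-detector sketch: gaps of ~100 ms would correspond to the
// full collections described above, ~5 ms gaps to nursery collections.
public class PauseDetector {
    static boolean isPause(long gapNanos, long thresholdNanos) {
        return gapNanos > thresholdNanos;
    }

    public static void main(String[] args) {
        final long thresholdNanos = 2_000_000;    // report gaps > 2 ms
        long last = System.nanoTime();
        long end = last + 1_000_000_000L;         // observe for one second
        while (last < end) {
            long now = System.nanoTime();
            if (isPause(now - last, thresholdNanos))
                System.out.printf("pause of %.1f ms%n", (now - last) / 1e6);
            last = now;
        }
    }
}
```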

Now, play the video and see how standard Java garbage collectors interfere with real-time programs (16MB Quicktime, 4MB Windows Media Video).

Visualization of the all-Java Synthesizer running on top of IBM's WebSphere RealTime Java virtual machine with Metronome Garbage Collection technology.

In the screenshot above, you can see the same synthesizer code running with the exact same input on IBM's WebSphere RealTime Java virtual machine, on top of an open-source real-time Linux kernel co-developed by the IBM Linux Technology Center.

As you can see, there are no gaps in the audio. That's because the Metronome garbage collector does its work in tiny slices of about 500 microseconds each. That's also why the garbage collection activity appears pink: the visualizer is actually showing hundreds of tiny pieces of garbage collection work packed closely together. The slices are so short that they don't cause any perturbation of the synthesizer application, even with a 2 millisecond latency.

Now, play the video and hear for yourself how much better music sounds with Metronome (16MB Quicktime, 4MB Windows Media Video).

Zoomed-in View of 140 milliseconds of Metronome Garbage Collection

Finally, the screenshot above shows a zoomed-in view of 140 milliseconds of time during which the Metronome garbage collector is active. Each time it runs, it interrupts the application for a time slice of about 500 microseconds and does a little bit of the work needed to reclaim memory.

These slices are spaced and grouped regularly, with about 6 slices every 10 milliseconds. The regular spacing of the slices combined with their predictable size is a key part of why Metronome works so well for so many kinds of real-time applications.

WebSphere RealTime (WRT) with Metronome is being used by Raytheon to develop the software for the Navy's new DDG 1000 destroyer and is being integrated into telecommunication systems. Pilot projects are underway with customers in other industries, especially by financial institutions for real-time stock and derivatives trading.