Bill Alves
Static Cling

www2.hmc.edu/~alves

My piece was created with Povray and Premiere on a Mac; the soundtrack was made with Csound. More information about it is available at http://www2.hmc.edu/~alves/static.html.

James Ellis
Whisper

www.secret-sauce.com

Aaron Ross
Trance Mission
www.dr-yo.com

Trance Mission is an experimental work of visual music. It echoes practices found throughout the world's religions that are designed to evoke awe, terror, and euphoria. Stained-glass cathedral windows, mandalas, and polyphonic chanting are a few of the inspirations for this work. It is a brief glimpse into a transcendent psychic space, and an invitation to lose oneself in the singularity of the infinite.

Created using very expensive toys such as the Scitex DVEous, Softimage, and XAOS Tools Pandemonium. To my chagrin, there is a band named Trance Mission. It's a pretty good band, although I think it may now be defunct. Since I thought of the title way back in 1991, I decided to use it anyway.

Trance Mission is 95% video feedback using a super-advanced realtime digital effects device called the DVEous. (The DVEous is worth about US$250,000 retail, and I was very lucky to be able to use it.) The output of the device was directly connected to its own input, and four simultaneous channels of effects (crop, resize, colorize, perspective) were applied. I created keyframes which I then scrubbed through in real time using the joystick. I listened to some of my college radio shows while improvising with the visuals.
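For the curious, here is a minimal software sketch of the same principle in Python with NumPy: each pass, the previous output frame is zoomed toward the center, color-rotated, and attenuated before becoming the next input. The zoom and channel rotation merely stand in for the DVEous crop/resize/colorize/perspective channels; this illustrates video feedback in general, not the actual hardware.

    import numpy as np

    def feedback_pass(frame, zoom=1.02, persistence=0.92):
        """One pass of a simulated feedback loop: zoom, colorize, attenuate."""
        h, w, _ = frame.shape
        # crude center zoom by index remapping (stand-in for resize/perspective)
        ys = np.clip(((np.arange(h) - h / 2) / zoom + h / 2).astype(int), 0, h - 1)
        xs = np.clip(((np.arange(w) - w / 2) / zoom + w / 2).astype(int), 0, w - 1)
        zoomed = frame[ys][:, xs]
        colorized = np.roll(zoomed, 1, axis=2)   # stand-in for colorize: rotate RGB
        return (colorized * persistence).astype(frame.dtype)

    # seed with noise, then let the output feed its own input
    frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
    for _ in range(100):
        frame = feedback_pass(frame)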

I created the images in real time in two overnight sessions. Final output went to Betacam SP.

After watching the window dubs from my video improv session, I thought about what audio would work well with the visuals. I recorded the audio at home using a PC and various software such as Digidesign Session, Cakewalk, Cool Edit, and Audio Compositor. Sound sources included my voice, analog synthesizer, drum machine, and samples generated by exploiting a malfunctioning noise reduction algorithm in Cool Edit.

Once the audio was done, I dumped it to Beta SP and laid snippets of images over the audio, so the piece was edited to the music. I also added a little bit of procedural animation I had lying around, unused from a previous project. I created this animation using the Marble texture in Softimage and some embossment FX in XAOS Pandemonium.

Aaron Ross
Cruise the Circuit

www.dr-yo.com

My latest computer-animated video began its life as a musical piece, which you can hear in MP3 or RealAudio.

I took the audio recordings of Cruise the Circuit to the Experimental Television Center in Owego, New York, where I was offered a five-day artist's residency. With their custom-built video synthesizer system, I transduced the audio signals into video imagery.

The next step was to digitize over 20 minutes of video for subsequent manipulation in 3D Studio MAX. I applied the video as texture maps to geometry in the scene. The audio controller in 3D Studio MAX provided additional geometric transformations, locked to the music tracks.

The result is a tightly orchestrated piece of visual music. I rendered two camera views for a stereoscopic 3D version of the animation. The standard monoscopic version is available on the VHS video, Opus Alchymicum.

Here is the descriptive statement I wrote about the piece.

For nearly a century, filmmakers and other visual artists have attempted to emulate the ineffable qualities of music, and to create visual analogies to specific musical pieces. From Kandinsky’s music-inspired paintings, through the abstract animation of Oskar Fischinger and others, to the lightshows of the 1960s, many artists have struggled with the inherent paradoxes of music visualization. The holy grail of synaesthetic art seems elusive, but electronic tools open up many paths toward perceptual unity.

Cruise the Circuit is a synthesis of many artistic disciplines and techniques. In contrast to the usual collaboration between musician and filmmaker, this piece is the result of a singular aesthetic vision. It is a musical composition I wrote, performed, and recorded with the intent of subsequent visualization.

Previous experiments with audio/video synthesis, virtual reality, and computer animation have given me the experience to utilize those aspects of disparate technologies which most effectively meet my artistic goals. Analog synthesis, often considered obsolete in the world of video art, presents many opportunities for realtime creation of visual music. A custom-built analog video synthesizer is like a fine Stradivarius, and will always have unique aesthetic properties, regardless of technological progress or the whims of fashion.

These modular devices enable, among other things, the instantaneous transduction of audio signals into abstract video art. During a five-day residency at the Experimental Television Center in upstate New York, I transformed the individual audio tracks of Cruise into video imagery using many different synthesizers, including Nam June Paik’s famous "Wobulator." The disadvantage of this technique is the limited visual palette which video synthesis offers, so I recorded the results on videotape for subsequent manipulation in a virtual 3D space.

Ideally, the work would be presented in a realtime immersive environment such as the music-driven virtual reality created by Fakespace Inc. However, limitations of today’s hardware present technical barriers, so the only viable alternative for me is prerendered 3D animation. It is a clear trade-off: computer animation sacrifices the immediate feedback of VR in favor of an extended range of visual instrumentation. And in this context, form and content are nearly synonymous, so the advantages of greater control cannot be overstated.

Since pure mimicry of audio events (the "Mickey Mouse" effect) is usually less than captivating, it is necessary to provide visual counterpoint to the music. In Cruise the Circuit, I accomplish this via the environments through which the visual instruments move. Subtle, chaotic algorithmic control over object position and rotation further enhances the unpredictability which sustains audience interest. Finally, direct soundfile control over object parameters such as polyhedral symmetry adds another element of fascination to the experience.

The result is a polyphonic, multitimbral synthesis of light, color, and geometry. The piece can be considered as a time-based synaesthetic experience stored on a magnetic medium, but I prefer to view it as a document of a one-of-a-kind "color organ" which operates in extreme slow motion.

The music was a labor of love involving a Korg control voltage hardware sequencer and various analog synths. I had to program the music using a numeric keypad!!! It took forever.

Finally, I recorded the sequence to 4-track tape (thanks to Christian Greuel). Later I dumped the four audio tracks to individual tracks of an audio CD-R so I could isolate the instruments to drive the visuals. I added sync pops to allow me to align the visual tracks in the animation phase.

At the Experimental Television Center, it was a relatively simple matter of feeding the audio into the Korg MS-20 synth, which converts pitch and volume into control voltages. These signals in turn controlled various analog devices, primarily the unique Jones modular video synth and the Paik Wobulator (a primitive raster manipulation device). There was some direct synthesis, some rescanning of the Wobulator, and some modulated video feedback with a camera and monitor.
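As a rough software analogue of that conversion stage, the sketch below (Python with NumPy) tracks volume with an RMS envelope follower and estimates pitch from the zero-crossing rate, emitting one control pair per video frame. It is an assumption-laden stand-in for what the MS-20's external signal processor does in analog circuitry.

    import numpy as np

    def envelope_and_pitch(audio, sr, frame_rate=30):
        """Per video frame: RMS volume plus a crude zero-crossing pitch estimate."""
        hop = sr // frame_rate
        controls = []
        for start in range(0, len(audio) - hop, hop):
            block = audio[start:start + hop]
            volume = np.sqrt(np.mean(block ** 2))               # amplitude control
            crossings = np.count_nonzero(np.diff(np.sign(block)))
            pitch = crossings * frame_rate / 2.0                # rough Hz estimate
            controls.append((volume, pitch))
        return controls

    # e.g. a one-second 440 Hz test tone yields ~30 (volume, pitch) pairs
    tone = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
    print(envelope_and_pitch(tone, 44100)[:3])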

Video output was recorded onto Hi-8 and later dubbed to Beta SP.

I digitized the Beta sources and used them as animated texture maps in the final 3D animation. I also used the volume dynamics of one original audio track to modulate animation parameters (the P and Q factors of the icosahedral object).
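A hedged sketch of that kind of modulation: scale each per-frame volume value into the ranges of two animation parameters. The P/Q ranges below are invented for illustration, not the values used in the piece.

    def pq_keys(envelope, p_range=(1.0, 5.0), q_range=(1.0, 3.0)):
        """Map a per-frame volume envelope onto (frame, P, Q) animation keys."""
        lo, hi = min(envelope), max(envelope)
        span = (hi - lo) or 1.0
        keys = []
        for frame, v in enumerate(envelope):
            t = (v - lo) / span                     # normalize volume to 0..1
            keys.append((frame,
                         p_range[0] + t * (p_range[1] - p_range[0]),
                         q_range[0] + t * (q_range[1] - q_range[0])))
        return keys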

The animation process took about two or three weeks working in my spare time. Then the rendering process began. To get the look I wanted, I had to render in several passes and composite the results. For the five minute animation, my little 400 MHz Pentium II rendered constantly for three weeks. There are about 500,000 octahedra in the final segment, laid out in a cubic array.
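For scale: roughly 500,000 octahedra corresponds almost exactly to an 80 x 80 x 80 lattice (512,000 points). A few lines of NumPy generate such a cubic array of positions; the spacing here is arbitrary.

    import numpy as np

    side, spacing = 80, 10.0
    axis = np.arange(side) * spacing
    # one (x, y, z) position per octahedron, laid out on a cubic lattice
    positions = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"),
                         axis=-1).reshape(-1, 3)
    print(positions.shape)   # (512000, 3)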

I showed the renderings to my friend Michael Starks, who demanded that I create a stereoscopic version. So I rendered another camera view, which took another three weeks.

The piece looks spectacular in stereoscopy, so Starks was right. Cruise the Circuit isn't that impressive without the stereo effect, so I prefer not to screen the flat version.

By the way, I will be showing it on a 35" flickerless stereoscopic color monitor for an upcoming show at the Exploratorium in San Francisco. More on that as the time approaches.

Christian Greuel
Still Life
www.xian.com

Fakespace Music presents "Still Life", a unique virtual reality experience. Unlike conventional VR, in which sounds are triggered by actions within the virtual environment, Fakespace Music worlds are activated by sounds coming from the real world.

In "Still Life", music is the driving force behind a continuous stream of stereoscopic 3D graphics. All visual elements move in temporal harmony with the music. The visitor is completely immersed, and free to explore a world generated by music.

The Soundsculpt Toolkit, developed by Fakespace Music, maps incoming musical data onto 3D objects within the virtual world. Behaviors of these objects are directly controlled by the music, responding in real time to create a rich, multi-sensory experience.

The concept for "Still Life" was developed by Christian Greuel, [then] a visual designer with Fakespace Music. In this updated version of the traditional still life motif, all of the geometry is original, crafted by hand from real world objects.

An original music composition by Aaron Ross brought this eerie scene to life. Greuel and Ross worked together to orchestrate the interaction of music and visuals, creating one of the most tightly choreographed works to be seen in a virtual environment.

This dynamic piece is presented using a Fakespace PUSH Desktop Display, a highly intuitive display and control device. The viewpoint of this high-resolution stereoscopic display is controlled by a simple "push" or "pull" from the user. The Soundsculpt Toolkit runs on a Silicon Graphics Onyx Reality Engine 2.

A second user can play on an electronic drumpad to control the translation, rotation, and scaling of one of the virtual objects. This provides a live, interactive accompaniment to the recorded score, in addition to emphasizing the direct relationship between the music and visuals.

The still life has been the subject of artistic study since ancient times. Painters have long worked directly from purposeful arrangements of selected objects, grouped in such a way as to provide a visual structure and poetic cohesion. Ordinary objects become infused with meaning by their association with everyday uses.

The inspiration for this work originally came from a paragraph in Nietzsche's "Birth of Tragedy", in which he relays an interpretation of the story of Socrates. The Greek philosopher, long a champion of rational thought, is haunted by a spirit beseeching him to take up the instrument of music. Socrates views music as beneath him, a lowly form of art fit only for the commoner. But after great hesitation, he finally acquiesces on the chance that there might be some essential of life that had somehow escaped the trappings of his logic.

The central figure in "Still Life" is a human skull, representing the head of the philosopher. Dead and gone, his entire world was once kept in this container. The outer world is reflected back to him, visualized by the expansion of a globe, inverted to encapsulate the subject.

The flame symbolizes the internal light of the mind, as the lightning does the workings of the external world. The anthropomorphic behaviors of the objects, most notably the dancing violin, describe how music may bring to life an otherwise stagnant world, heavy with the weight of existence.

The Soundsculpt Toolkit was developed as a demonstration of how VR can fulfill its promise as a tool for artistic expression and entertainment. The intention was to develop an alternative tool for aesthetic VR manipulation that allows for visual interpretation of musical themes. The solution required an interface that was both logical and precise, yet flexible enough to accommodate the visual design process.

A Soundscape is defined as a complete virtual reality experience in which the contents are controlled by the music. A scene is the visual staging area (i.e. virtual environment) within which a Soundscape takes place. A visual instrument is a computer generated geometry set within a scene that is to be visually affected by incoming control signals. Each visual instrument is associated with its own set of traits, or potential behaviors that can be applied (i.e. translate left, scale down, turn blue).
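To make that vocabulary concrete, here is a minimal sketch in Python of how the concepts might be modeled; the class names, trait keys, and method signatures are hypothetical, not Fakespace's actual API.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class VisualInstrument:
        geometry: str                                  # geometry set within the scene
        # traits: control number -> behavior (translate left, scale down, turn blue...)
        traits: Dict[int, Callable[[float], None]] = field(default_factory=dict)

        def apply(self, control: int, value: float) -> None:
            if control in self.traits:
                self.traits[control](value)

    @dataclass
    class Soundscape:
        scene: str                                     # the visual staging area
        instruments: List[VisualInstrument] = field(default_factory=list)

        def handle(self, control: int, value: float) -> None:
            for instrument in self.instruments:
                instrument.apply(control, value)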

Successful visual instruments can be described as acoustically driven objects capable of exploiting 3D space in a temporal manner. Ideally, instruments should be capable of provoking emotional responses from the viewer, as an audio instrument may, or representing images relevant to a specific song.

The Soundsculpt Toolkit offers a high degree of artistic direction and control of 3D environments, facilitating the generation of world-class virtual environments that appear to be alive. The system can be configured to react to any kind of signal that can be sent over a standard serial port, allowing various types of external data to manipulate these virtual worlds. Because of its versatility and wide availability, Fakespace Music has adopted the Musical Instrument Digital Interface (MIDI) as a standard way of controlling visual instruments.

The Soundsculpt Toolkit consists of three basic parts that work together to turn MIDI data into the desired visual responses within the scene.

Although visual instruments can be played live and in real time, a choreographed Soundscape can be scripted using a MIDI sequencer. A sequencer, whether it is a dedicated piece of hardware or software, is a device that allows users to record, edit, and play back layered compositions of MIDI data in real time. Familiar to musicians as a tool for writing songs, a sequencer also becomes very useful for the choreography of visual music.
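Building on the hypothetical classes sketched above, the choreographed case reduces to replaying a score of timestamped control events against a Soundscape; the events below are made up.

    import time

    # (seconds, control number, normalized value) - a stand-in for sequenced MIDI
    score = [(0.0, 1, 0.2), (0.5, 1, 0.9), (1.0, 2, 0.5)]

    def play(soundscape, score):
        """Dispatch each sequenced event to the Soundscape at its scheduled time."""
        start = time.monotonic()
        for when, control, value in score:
            time.sleep(max(0.0, when - (time.monotonic() - start)))
            soundscape.handle(control, value)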

This music-driven virtual reality opens up many possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D "music videos" and interactive landscapes for live performance. Artists can create new worlds which transform in direct relationship with their music. By calling up various Patch Bay configurations on stage, performers can play the visual instruments just as they play their audible counterparts. The same information can be captured with sequencing devices in order to be presented as kinetic sculpture in a virtual environment.

__________________________________________________________

THE BIRTH OF TRAGEDY (excerpt)

For that despotic logician had now and then with respect to art the feeling of a gap, a void, a feeling of misgiving, of a possibly neglected duty. As he tells his friends in prison, there often came to him one and the same dream-apparition, which kept constantly repeating to him: "Socrates, practice music." Up to his very last days he comforts himself with the statement that his philosophizing is the highest form of art; he finds it hard to believe that a deity should remind him of the "common, popular music." Finally, when in prison and in order that he may thoroughly unburden his conscience, he consents to practice also this music for which he has but little respect. And in this mood he composes a poem on Apollo and turns a few Aesopian fables into verse. It was something akin to the demonic warning voice which urged him to these practices; it was due to his Apollonian insight that, like a barbaric king, he did not understand the noble image of a god and was in danger of sinning against a deity - through his lack of understanding. The voice of the Socratic dream-vision is the only sign of doubt as to the limits of logic. "Perhaps" - thus he must have asked himself - "what is not intelligible to me is not therefore unintelligible? Perhaps there is a realm of wisdom from which the logician is shut out? Perhaps art is even a necessary correlative of, and supplement to, science?"

FRIEDRICH NIETZSCHE, 1872

__________________________________________________________

REFERENCES

P. Borsook, "The Art of Virtual Reality", IRIS Universe, no. 36, pp. 39-40, Summer 1996.

B. Blau and C. Dodsworth, eds., "Soundscapes Entertainment", Visual Proceedings: The Art and Interdisciplinary Programs of SIGGRAPH 96, p. 82, Association for Computing Machinery, Inc., 1996.

M.T. Bolas, "Design and Virtual Environments," M.S. Thesis, Design Division, Stanford University, 1989.

H. Gardner, Art through the Ages, Harcourt Brace Jovanovich, New York, 1980.

C. Greuel, M.T. Bolas, N. Bolas and I.E. McDowall, "Sculpting 3D Worlds with Music," IS&T/SPIE Symposium on Electronic Imaging: Science & Technology, San Jose Convention Center, California, 1996.

F. Nietzsche, The Birth of Tragedy, Random House, New York, 1954.

R.A. Penfold, Computers and Music, pp. 109-122, 142, PC Publishing, Kent, 1989.

H. Rheingold, Virtual Reality, Simon & Schuster, New York, 1991.

D. Trubitt, "Into New Worlds: Virtual Reality and the Electronic Musician", Electronic Musician, vol. 6 no. 7, pp. 30-40, July 1990.

__________________________________________________________

LINKS

"Fakespace Music: Digital Bayou at SIGGRAPH 96" http://www.fakespacemusic.com/bayou96.html

"Visualizing Music in the Virtual Environment" http://www.operatotale.org/OT3/Percorsiingc.html

"Sculpting 3D Worlds with Music" http://www.xian.com/fsm/spie96_2653b-45.doc

"Online Multimedia Clips" http://www.dr-yo.com/multi.html

Ying Tan
mi vida (My Life)
un aldor (Dawn)
Elements in Transformation #1
Elements in Transformation #2

darkwing.uoregon.edu/~tanying

In my case, Belson's work and many hours of conversation with him have influenced me a great deal since 1996, for which I am ever grateful. His spirit, vision, and inventive use of materials and techniques are inspiring to me, no matter how different the tools and processes I use in my own work.

Bret Battey
Writing on the Surface

www.BatHatMedia.com

Someone asked about meaning/intent:

The work arose in part out of my contemplation of time and our motivations for striving, what that striving means in the context of the immensity of time. 'All of our striving is like writing on the surface of water...'

When my work has themes like this, I tend to focus on expressing something about the lived emotions/experience I have around those issues rather than a specific message. Those feelings vary widely -- I have ambivalence -- so it is appropriate that the piece is somewhat "cryptic"!

Some quotes from Lao Tzu and Denise Levertov were focus points for me. Those quotes are at: http://students.washington.edu/bbattey/Gallery/wots-index.html

SOUND

Sound tools: Common Music (a LISP-based algorithmic music environment), Csound (for sound synthesis), plus a bit of the synthesis language SuperCollider. On Linux and Macintosh platforms.

Primary techniques:

* Csound phase vocoder manipulations of the opening of a Renaissance polyphonic choir piece. [Occurs lots of places, but most audible as such after the climax, where there is a sequence of close-ups of the tree dissolving away to water.]

* "Compressed Feedback Synthesis" - something I cooked up that places amplitude compression and pitch shifting inside a comb filter loop. [Ex: Drone and massive build at beginning of piece.]

* "Feedback Networks" - where I have networks of chaotic oscillators controlling each other. That algorithmic means was used to create the Csound note lists for the dense percussive material. It created 20 to 40 second chunks of material which I either threw away or edited and linked with other chunks to make the longer term forms. The percussion is lots of samples from a wood and metal working shop. The algorithm also distributes the events gesturally in quadraphonic space.

VIDEO

Tools: Canon XL1 digital video camera. Adobe AfterEffects for each segment, segments linked with Final Cut Pro. Astarte DVD Director for DVD preparation, and Astarte A.Pack for Dolby Digital surround sound encoding.

Interesting techniques:

* cymatics - low frequency vibrations of water and oil. Visible in lots of forms throughout the piece! Acknowledgements to Hans Jenny.

* Animation Extension to Common Music (AECM) -- software I have developed that allows Common Music to write MaxScripts for the 3D Studio Max 2.x animation environment. Using this, the loud, percussive center section was created mostly by using the same algorithms that generated the sound to also generate the images. I had 3D Studio Max create grey-scale renderings which I then luma-composited with video footage -- the intent here was to get textures that didn't have so much of the clichéd 'glossiness' of computer animation. [A schematic sketch of this idea follows the list.]

* In the case of the twirling blue images that go with the audio "ticks", that also uses AECM, with cymatics footage in a spotlight in 3D Studio Max being cast obliquely on twirling cone primitives.

* Used TIGER Geographic Information Systems data to generate a street and rail map of Chicago and environs, exported to EPS and manipulated in AfterEffects. Pretty obvious where that is in the piece. [Not a recommended technique if you like to work fast :(]

* Videotaped at night, a huge piece of ripped canvas on the Seattle Space Needle flapping in gale-force winds made for an easily chroma-keyed flapping gesture! And so on.
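As promised above, a schematic sketch of the AECM idea, in Python for brevity: one generative pass emits both a Csound score line and a matching MaxScript-style keyframe, so the sound and the image share a single algorithmic source. The object name and the exact MaxScript syntax are placeholders, not AECM's actual output.

    import random

    def generate(events=16, fps=30, seed=7):
        """Emit paired Csound score lines and MaxScript-style scale keyframes."""
        random.seed(seed)
        score, maxscript = [], ["animate on ("]
        t = 0.0
        for _ in range(events):
            t += random.uniform(0.1, 0.5)
            amp = random.uniform(0.2, 1.0)
            score.append(f"i 1 {t:.2f} 0.1 {amp:.2f}")          # sound event
            frame = int(t * fps)                                # same event, visual side
            maxscript.append(
                f"  at time {frame} ($obj.scale = [{amp:.2f},{amp:.2f},{amp:.2f}])")
        maxscript.append(")")
        return "\n".join(score), "\n".join(maxscript)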

The ultimate form of delivery is a four-channel surround-sound DVD.

Given that this all took two years, the next project is going to be shorter and simpler. I hope.

Copyright 2000 Fred Collopy. This document was last updated on 3/6/01; it is located at RhythmicLight.com.