Virtual Theater

Dateline: 2/9/98


Courtesy NYU Media Research Lab

The use of VRML for theatrical-style performance, both live and virtual, is beginning to take off. Beginning with the now-classic Floops from the folks at Protozoa, who used live motion capture to make that little bugger move, characters are being brought to life in a variety of ways. The illustration for this article is from a SIGGRAPH 96 paper by Ken Perlin and Athomas Goldberg about Improv. The Improv system being developed at NYU's Media Research Lab is truly remarkable. The system "provides tools to create actors that respond to users and to each other in real-time, with personalities and moods consistent with the author's goals and intentions." To be honest, I was very skeptical about the system; I'd heard of it but never got around to checking out what it can do. This is (with best California-girl voice) like way cool, dude! The characters have an uncanny ability to appear truly lifelike: they react to the camera's position, to other actors, and to various parameters one can play with. Check out the Improv site and play with them yourself. The PC demos are a little sparse but quite amazing. The use of VRML as the rendering engine is terrific because of the ubiquity VRML provides. I'll cover Improv in more detail in a future feature article...but in the meantime, check it out.
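Improv's authoring layer is its own system, but camera-reactive geometry of the simplest kind is already built into standard VRML 2.0. As a minimal sketch (not Improv itself, and the cylinder "actor" here is purely a stand-in), the Billboard node rotates its children to face the viewer as the camera moves:

```
#VRML V2.0 utf8
# Minimal sketch: the standard Billboard node turns its children
# toward the viewer, a crude cousin of the camera-aware behavior
# Improv's actors exhibit.
Billboard {
  axisOfRotation 0 0 0   # 0 0 0 means always face the viewer head-on
  children [
    Shape {
      appearance Appearance {
        material Material { diffuseColor 0.8 0.4 0.2 }
      }
      geometry Cylinder { height 2 radius 0.3 }  # placeholder "actor"
    }
  ]
}
```

Drop this into any VRML 2.0 browser and orbit the camera; the shape keeps facing you. Real Improv actors, of course, layer scripted personality and mood on top of behavior far richer than this.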

One big (really big) theatrical project is VRML Dream. The project describes its aims in an Artistic Statement.

The two primary leaders of the project are Stephan Matsuba and Bernie Roehl, both major contributors to the VRML standard. To describe the project as ambitious is an understatement: they want to present a "live" VRML performance lasting approximately 30 minutes. Actors will control avatars, like puppets, and the performers will be in several different physical locations but in a shared virtual space, with the data streamed to the audience. The first technology demo will take place at VRML98, with a full performance scheduled for sometime in April. We'll be sure to bring you a full feature on this project and keep you informed of developments.

Another major theater project is the Virtual Theater project at Stanford, run by Dr. Barbara Hayes-Roth, a major artificial intelligence researcher.

Still one more AI-based research project is Carnegie Mellon's Oz project. The project is led by Joseph Bates, and CMU, a hotbed of AI activity for years, is sure to produce some amazing results. Oz has characters called Woggles, who appear in a performance called Edge of Intention. Both the Virtual Theater and Oz projects are focused on the intelligence aspects of theatrical characters, the hard part. Once an intelligent agent is created, the graphics can be handled by existing graphics engines and formats such as VRML. In fact, the Virtual Theater project has animated some sequences using the Improv system.

Finally, if you want to read more about the theoretical and philosophical issues surrounding user interaction and storytelling in cyberspace, check out Computers as Theatre by Brenda Laurel and Hamlet on the Holodeck: The Future of Narrative in Cyberspace by Janet Murray. Computers as Theatre provides a novel perspective on human-computer interfaces and offers historical reasons for the emergence of narrative virtual environments. Hamlet on the Holodeck is an innovative look at storytelling in the context of the on-line world. Interactive tales, stories as games, and games as stories are all explored in a fascinating look at the evolving world of cyberspace.
 
Let's remember to stop by the chat room...and the bboards open up on Tuesday.
Hope you've enjoyed the show.
