David Saltz (current head of University of Georgia’s Theater and Film Studies Department) has identified 12 possible functions that media can serve as components of the live theater event:
Virtual Scenery. The media provide a backdrop depicting the environment within which the staged action takes place. This virtual scenery can be either static or animated.
Diegetic Media. Diegetic media exist within the world of the narrative--when, for example, a character onstage turns on a radio or television set.
Instrumental Media. Interactive technology is used to create new kinds of instruments. For example, one could cover the stage floor with pressure-sensitive tiles and program each tile to produce a different sound or different image when a performer steps on it.
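The pressure-sensitive floor described above can be sketched as a simple mapping from tile positions to cues. This is an illustrative sketch only; the tile coordinates and cue filenames are hypothetical, not from any actual production.

```python
# Minimal sketch of an instrumental-media mapping: each pressure-sensitive
# floor tile is bound to its own sound or image cue, so the performer's
# steps "play" the stage. All tile IDs and cue names are hypothetical.

TILE_CUES = {
    (0, 0): "low_drone.wav",
    (0, 1): "bell_c4.wav",
    (1, 0): "bell_e4.wav",
    (1, 1): "projection_ripple.mov",
}

def on_tile_pressed(row, col):
    """Return the cue bound to a tile, or None for an unmapped tile."""
    return TILE_CUES.get((row, col))
```

A stage-management layer would call `on_tile_pressed` whenever the tile hardware reports a step and route the returned cue to the sound or media system.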
Interactive Costumes. Interactive costumes invert the relationship established by virtual scenery: while virtual scenery provides a backdrop against which the live actors perform, interactive costumes use the body of the live performer as a canvas for the media.
Dramatic Media. This type of media representation functions dramatically by interacting with the performers as a character in the narrative.
Virtual Puppetry. The media create a performer's double, functioning as a virtual performer in its own right under a performer's control.
Affective Media. The media produce an emotional effect on an audience. Affective media are nondiegetic; they do not exist within the character's world. The most familiar form is the background music that gave melodrama its name, now ubiquitous in film.
Subjective Perspective. The media depict the thoughts, fantasies, dreams, or sensations of some or all of the characters onstage.
Alternate Perspective. The media depict the events enacted onstage from another visual perspective.
Synesthesia. Synesthetic media are similar to affective media, but serve not so much to tell the audience how to feel about the events onstage as to mirror the performance in a different sense modality. Synesthesia is a neurological condition in which stimulating one sense organ triggers the experience of another sense; for example, a person might "hear" colors or "see" temperature.
Although this is not an exhaustive list, the taxonomy was helpful for suggesting possible media uses and for identifying places where media might serve multiple purposes at once. When we knew, for example, that media were functioning as both synesthesia and illumination, we adjusted that specific cue with both elements in mind. In the third scene, when the Jeweller approaches an open window to acknowledge the incoming sunlight, color saturation levels were higher closer to the nexus of the media and the actor's hands as he explored the temperature of the glow (figure 8).
When the MDS was used as a virtual backdrop, the images were always in motion, never static. This was done to evoke motion perception and maintain the kinetic illusion of the ride (figure 4).
Dark Ride consists of 24 scenes set in 18 locations. I expanded the number of scenes to 103 in order to identify places in the text where possibilities for kinetic alteration were strong, and to create a master document identifying cues for light, sound, and media. That document could then be used to generate separate cue sheets for each department.
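The step of deriving per-department cue sheets from the master document can be sketched as a simple filter over a combined cue list. The record layout and the sample cues below are hypothetical; the actual document was assembled by hand.

```python
# Sketch of splitting a master cue document into per-department cue
# sheets. Each record carries the expanded scene number, the department
# it belongs to, and the cue text. All entries here are illustrative.

MASTER_CUES = [
    {"scene": 4, "dept": "sound", "cue": "SQ 5: radio static"},
    {"scene": 3, "dept": "light", "cue": "LX 12: sunrise wash"},
    {"scene": 3, "dept": "media", "cue": "M 7: window glow, high saturation"},
]

def cue_sheet(department):
    """Pull one department's cues, in scene order, from the master list."""
    return sorted(
        (c for c in MASTER_CUES if c["dept"] == department),
        key=lambda c: c["scene"],
    )
```

Keeping a single master list and filtering per department means a change to one cue only has to be made in one place.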
A dark blue moving image with animated smoke or fog, created with particle animation in Autodesk® Maya® 3D animation software, served as a virtual backdrop. It prevented the CPU from going to 'sleep' during periods of inactivity and also provided a 'point of stasis' (POS) for the media delivery system: a state in which the monitors displayed imagery or light that complemented the production without being distracting. The POS image ran constantly in the background, and the system defaulted to POS whenever nothing specific to a particular action of the play was running in the foreground.
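The foreground/POS fallback described above amounts to a small piece of state logic: play the scene-specific cue when one is active, otherwise show the always-running POS loop. The sketch below assumes hypothetical class and clip names, not the production's actual software.

```python
# Sketch of the point-of-stasis (POS) fallback: the media system shows a
# foreground cue when one is active and otherwise defaults to the POS
# loop, so the monitors never go dark (or let the CPU sleep).

POS_CLIP = "blue_fog_loop.mov"  # the Maya particle-animation loop

class MediaDeliverySystem:
    def __init__(self):
        self.foreground_cue = None  # no scene-specific cue yet

    def trigger(self, cue):
        """Start a scene-specific foreground cue."""
        self.foreground_cue = cue

    def clear(self):
        """End the foreground cue; output falls back to POS."""
        self.foreground_cue = None

    def current_output(self):
        """Foreground cue if one is running; otherwise the POS loop."""
        return self.foreground_cue or POS_CLIP
```

Because `current_output` falls through to `POS_CLIP` whenever no cue is set, the default state requires no explicit cueing from the operator.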