Aaron Lieb
02.26.09
ProZeuxis Paper Prototype and Use-Case Script
This document serves two purposes. The first is to act as the script of the ProZeuxis Paper Prototype video. The second is to provide a more detailed description of what is going on during this interaction. Combining this information provides a better explanation of what is being seen in the less technical Paper Prototype video and how it relates to the inner workings of the system. The following conventions are followed throughout for consistency and ease of understanding:

Document Conventions
• Use Case Name (indicates that the previous use case has been completed, and a new one of this title is beginning)
“Title or Caption Card”
Interaction Step Heading (if not described by a Caption Card)
1. User: Actions taken by the user are listed here in sequential order
2. System: Actions performed by the system are listed here in sequential order (some system steps are performed while the user waits to respond, e.g. the prompt “Please confirm selection”)
3. User: Some user steps are taken while the system waits for input
Notes:
- Any extra notes about this block of events, including clarification of terminology and technical considerations, are listed here as bulleted points.
Colors – The following colors are used when pertaining to:
- Audio Input Nodes
- Video Tracking Nodes
- Generative Visual Nodes
“ProZeuxis Paper Prototype”

• System Setup
“First, find a space to set up (any typical venue would do)”
Begin setup of Venue Space
1. User: Finds a venue space with a stage to set up the system
“Setup Audio and Video Hardware”
1. User: Sets up the Tracking Camera and Projector, mounted to the ceiling or stage trusses
2. User: Attaches video cables to these components
3. User: Strategically places the wireless microphones
“Connect Microphones to System Dedicated Mixer”
1. User: Sets up the Wireless Mic Receiver and connects four unbalanced 1/4” cables to the 4 inputs of the system mixer
“Connect Components to ProZeuxis Server”
1. User: Attaches the system mixer to the ProZeuxis Server’s¹ sound card via a 1/4” stereo cable converted to a 1/8” jack
   System: Is connected to the mixer via the machine running the ProZeuxis Server application
2. User: Attaches the tracking camera via a video cable to the ProZeuxis Server’s video capture card
   System: Is connected to the Tracking Camera via the PZS
3. User: Connects the projector to the DVI-out of the ProZeuxis Server’s secondary video card
   System: Is connected to the Projector via the PZS
Notes:
- ¹ProZeuxis Server (PZS) is a dedicated machine physically located close to the projector, running the ProZeuxis Server application as well as the Presentation Client.
- The ProZeuxis Server is equipped with a wireless network card for communication with a remote laptop running the Console Client.
“Setup ProZeuxis Console Client”
1. User: Sets up the laptop running the Console Client application and the ReactiVision software
   System: Gains a Console Client Machine¹ that can interact with the system
2. User: Connects the ReacTable² Console to the laptop; the ReacTable Projector connects to the laptop’s secondary DVI-out port
   System: Is connected to the ReacTable Projector via the CCM
3. User: Connects the ReacTable Tracking Camera to the laptop’s USB/FireWire port
   System: Is connected to the ReacTable Tracking Camera via the CCM
4. User: Connects the MIDI keyboard controller to the laptop’s MIDI-in port using a MIDI cable
   System: Is connected to the MIDI keyboard via the CCM
5. User: Connects the Wireless Router to the laptop’s Ethernet port for communication with the remote PZS
   System: Is connected to the CCM over the wireless network
Notes:
- ¹Console Client Machine (CCM) is any machine running the ProZeuxis Console Client
- ²The ReacTable consists of a tracking camera and a user interface projector mounted within a custom designed table housing
• Log into system
“Laptop”
Configure Input/Output Connections on PZS
1. User: Logs into the remote ProZeuxis Server
   System: PZS application is launched
2. User: Configures input for Video-In and Audio-In
   System: Communication is established on the PZS with the connected hardware components
3. User: Starts the Presentation Client on the PZS
   System: Presentation Client starts and communicates with the PZS
Starting Client and Connecting to the PZS
1. User: Starts the Console Client on the laptop
2. User: Selects the machine running the Server and Presentation Client
3. System: ReacTable comes alive
4. System: Presents the user with a choice between loading a New or Old session
5. User: Selects “New Session” via finger cursor selection
6. System: Waits for the user to begin configuring tracking markers
• Configure Video Tracking Node
“Add Markers to Begin”
1. System: Presents the user with the instruction “Add Markers to Begin”
2. User: Adds a single tracking marker to the ReacTable tracking surface
3. System: Presents the user with the Main VMenu¹ to select between configuring the current marker as an Input Node of the type “Generative Visual”, or as an Output Node of the type “Audio” or “Video”
4. User: Selects to configure the marker as a “Video” Node via finger cursor selection
5. System: Links the tracking marker to the node “Video-In #1”
6. System: A preview box of the Video-In tracking camera input appears on the ReacTable projection surface². The Video-In #1 VMenu displays two options, “Color Tracker” and “Planar Tracker”
7. User: Selects to configure a Color Tracker via finger cursor selection
Notes:
- ¹The Venn Menu Class will be a custom user interface component of ProZeuxis. Parameters will branch out from interlocking, semitransparent, elliptical nodes (looking much like a Venn diagram). This feature is abbreviated as VMenu.
- ²If other Video-In cameras were connected to the system, the user would now be asked to select which Video-In to configure this node with. Because we only have the one Video-In, this step is skipped.
“Pick Color of singer’s shirt to track”
1. System: Waits while the user selects a tracking color from the selection box or by using an eye dropper tool on the video preview frame (“Select Color to track”)
2. User: Makes a color selection via finger cursor, then confirms the selection by touching the Color Tracking node
3. System: Displays the two resulting Parameter-Output Nodes, “Point”¹ and “Shape”²
4. System: Continuously updates the Color Tracking data for use as live input for other nodes
Notes:
- ¹Point outputs a point data-type (x, y) based on the center of the tracked color glob
- ²Shape outputs a Polygon Shape data-type (point[]) based on the approximated glob edges
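The Point and Shape outputs described in these notes can be sketched as a single pass over a video frame. The following Python sketch is illustrative only, not the ProZeuxis implementation: the function names, the per-channel color tolerance, and the bounding-box stand-in for the approximated glob edges are all assumptions.

```python
# Hypothetical sketch of the Color Tracker's two outputs. A frame is a
# 2D grid of RGB tuples; "matching" means within a per-channel tolerance
# of the selected color. Names here are illustrative, not the real API.

def matches(pixel, target, tol=30):
    """True if every RGB channel is within `tol` of the target color."""
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def track_color(frame, target):
    """Return (point, shape) for the matching color glob.

    point -- (x, y) center of the matching pixels (the 'Point' output)
    shape -- list of (x, y) corners (a crude stand-in for 'Shape')
    """
    hits = [(x, y) for y, row in enumerate(frame)
                   for x, pixel in enumerate(row)
                   if matches(pixel, target)]
    if not hits:
        return None, []
    cx = sum(x for x, _ in hits) / len(hits)
    cy = sum(y for _, y in hits) / len(hits)
    # Crude polygon approximation: axis-aligned bounding box corners.
    xs = [x for x, _ in hits]
    ys = [y for _, y in hits]
    shape = [(min(xs), min(ys)), (max(xs), min(ys)),
             (max(xs), max(ys)), (min(xs), max(ys))]
    return (cx, cy), shape
```

A real tracker would work on live camera frames and trace actual glob edges; this sketch only shows how a centroid and polygon could fall out of a color match.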
• Create Generative Visual Node
Add second tracking marker to create first Generative Visual
1. User: Adds a second tracking marker to the ReacTable tracking surface
2. System: Presents the user with the Main VMenu
3. User: Selects to configure the marker as a “Generative Visual” Node via finger cursor selection
Navigate through available visuals by rotating the tracking marker
“Select a Visual”
1. User: Rotates the tracking marker to view the available Visuals to choose from
2. User: Selects “Emitter” via finger cursor selection
• Link Generative Visual To Color Tracking Node
Emitter “Center” can be linked to Color Tracker “Point”
1. User: Slides the “Emitter” Visual toward the Video Node to reveal compatible parameters
   System: The Emitter parameter “Center” gravitates toward the Color Tracker “Point” parameter¹
2. User: Uses two fingers on the ReacTable to “link” these two parameters
3. System: Creates a link between the two parameters via the double finger cursor selection
Notes:
- ¹Reveals compatible parameters by having them gravitate toward one another (incompatible parameters repel one another)
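The gravitate/repel behavior in the note above amounts to a signed force keyed on data-type compatibility. A minimal sketch, assuming parameter types are plain strings and a fixed compatibility set; the real on-table physics would be richer.

```python
# Illustrative sketch of the "gravitation" hint: parameter nodes with
# matching data types attract, mismatched ones repel. The type names
# and the force model are assumptions for demonstration only.

COMPATIBLE = {("point", "point"), ("int", "int"), ("polygon", "polygon")}

def gravity(type_a, type_b, strength=1.0):
    """Signed on-table force between two parameter nodes:
    positive = attract (compatible), negative = repel (incompatible)."""
    if (type_a, type_b) in COMPATIBLE or (type_b, type_a) in COMPATIBLE:
        return strength
    return -strength
```

Under this model, the Emitter's "Center" (a point) attracts the Color Tracker's "Point" output, while "Particle Size" (an int) repels it.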
“Effect will now follow the center point based on where the performer moves on stage”
1. User: The performer, wearing the tracked color shirt, walks back and forth across the stage
2. System: Updates the Emitter parameters, causing the visual’s Center point to visibly follow the performer’s movements
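Driving the Emitter's Center from the tracked Point implies a mapping from tracking-camera coordinates into projector coordinates. A minimal sketch under assumed resolutions (the document specifies neither):

```python
# Hypothetical coordinate mapping from camera space to projector space.
# Both resolutions are assumptions, as is the emitter's dict structure.

CAM_W, CAM_H = 640, 480      # tracking camera resolution (assumed)
PROJ_W, PROJ_H = 1024, 768   # projector resolution (assumed)

def cam_to_projector(point):
    """Map a camera-space (x, y) point into projector space."""
    x, y = point
    return (x / CAM_W * PROJ_W, y / CAM_H * PROJ_H)

def update_emitter(emitter, tracked_point):
    """Push the mapped point into the Emitter's 'Center' parameter."""
    emitter["Center"] = cam_to_projector(tracked_point)
    return emitter
```

A deployed system would likely need a calibrated homography rather than this straight linear scaling, since camera and projector rarely share a viewpoint.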
• Link Generative Visual To MIDI Controller
“All other parameters will remain at a default value until linked to a data source such as a tracker point.”
“Parameters can also be linked to a MIDI Controller”
1. User: Selects the Emitter Visual parameter “Particle Size” via finger cursor selection
2. System: Displays an overlay of the system MIDI keyboard on the ReacTable projection surface
3. User: Selects the keyboard overlay with a finger cursor selection
4. System: Indicates that a link between the parameter value and a MIDI Keyboard Controller can be made (“Which controller do you want to use? (tweak to select)”)
5. User: Moves Slider 1 to indicate that they wish this controller to affect “Particle Size”
6. System: “This Control is not already in use.¹ Touch node to confirm link.”
7. User: Confirms the link via finger cursor selection of the parameter node
Notes:
- ¹This indicates that the selected control is not already linked to some other parameter. It will be possible to link the same control multiple times, but each time this occurs, the user will be notified with a visual list of the existing linkages
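The linking flow above (pick a control, report any existing linkages, then scale its value into the parameter's range) can be sketched as follows. The class, method names, controller names, and the 0.0 to 10.0 parameter range are illustrative assumptions; only the 0 to 127 value range comes from the MIDI Control Change specification.

```python
# Hypothetical sketch of MIDI control-to-parameter linking, including the
# "control already in use" list described in the notes.

class MidiLinks:
    def __init__(self):
        self.links = {}  # controller id -> list of linked parameter names

    def link(self, controller, parameter):
        """Link a controller to a parameter.

        Returns the list of parameters the controller was already linked
        to, so the UI can show the user the existing linkages (empty if
        the control was not in use)."""
        existing = self.links.setdefault(controller, [])
        already = list(existing)
        existing.append(parameter)
        return already

    def apply(self, controller, cc_value, lo=0.0, hi=10.0):
        """Scale a 0-127 MIDI CC value into each linked parameter's range."""
        scaled = lo + (cc_value / 127.0) * (hi - lo)
        return {p: scaled for p in self.links.get(controller, [])}
```

With this model, moving Slider 1 to full deflection (CC value 127) would drive every parameter linked to it to the top of its range.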
“Now sliding this control will affect the Emitter ‘Particle Size’”
1. User: Moves the Slider 1 controller
2. System: Shows the Emitter Visual as the particle size changes in response to the newly linked controller
• Create Audio Detection Node
“Add another tracking marker to control Audio-In”
1. User: Adds a third tracking marker to the ReacTable tracking surface
2. System: Presents the user with the Main VMenu
3. User: Selects “Audio” via finger cursor selection
4. System: Links the tracking marker to the “Master Audio” node
“Touch node to configure a Beat Detector”
1. User: Selects the Master Audio node via finger cursor selection
2. System: Displays a Beat Detector sub-node connected to the Master Audio node VMenu
“By default, the Beat Detector will analyze the audio signal with a flat equalizer”
1. System: Presents the user with a graphic menu of a linear equalizer used to analyze the audio¹
2. User: Gestures on the table to change the equalizer
3. User: Draws a curved line that gives emphasis to the lower band of the equalizer
4. User: Confirms the change via finger cursor selection of the Beat Detector sub-node
Notes:
- ¹By default the equalizer is set flat
“Select MIDI control to affect the detection ‘Threshold’”
1. System: “Select Threshold Controller”¹
2. User: Selects a controller to affect this parameter²
3. System: Displays an EQ preview box including a line indicating the threshold
4. User: Confirms the settings via finger cursor selection of the Beat Detector node
Notes:
- ¹By default, the user will need to configure a control that affects how much of the audio signal is analyzed by the Beat Detection
- ²This selection both sets the starting value for the Threshold and determines the control that will be used to affect this parameter from now on
“The Audio Node can now output the parameters Pulse¹, Velocity², Pitch³, and Tempo⁴”
Notes:
- ¹Pulse (boolean) – triggers a boolean value (beatOn = true) each time a beat is detected
- ²Velocity (int) – passes an integer value based on the strength of a detected beat
- ³Pitch (int) – passes an integer value based on the MIDI pitch value of a detected beat
- ⁴Tempo (int) – is set to a tempo (bpm) value based on the detection of beats over time
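The outputs named above suggest a simple energy-threshold detector. A hedged sketch follows, assuming per-frame signal energies are already available after the EQ weighting; Pitch is omitted because it would require frequency analysis beyond this sketch, and the frame length and threshold values are illustrative.

```python
# Minimal energy-threshold beat detector producing three of the four
# outputs above: Pulse, Velocity, and Tempo. The threshold plays the
# role of the MIDI-controlled 'Threshold' parameter.

def detect_beats(energies, threshold, frame_ms=50):
    """Scan per-frame energies; return (pulses, velocities, tempo_bpm).

    pulses     -- Pulse: True for each frame where energy crosses the
                  threshold on a rising edge
    velocities -- Velocity: energy above threshold at each detected beat
    tempo_bpm  -- Tempo: bpm from the mean inter-beat interval
    """
    pulses, velocities, beat_frames = [], [], []
    above = False
    for i, e in enumerate(energies):
        beat = e >= threshold and not above
        above = e >= threshold
        pulses.append(beat)
        if beat:
            velocities.append(int(e - threshold))
            beat_frames.append(i)
    if len(beat_frames) < 2:
        return pulses, velocities, 0  # tempo needs at least two beats
    gaps = [b - a for a, b in zip(beat_frames, beat_frames[1:])]
    mean_gap_ms = (sum(gaps) / len(gaps)) * frame_ms
    return pulses, velocities, round(60000 / mean_gap_ms)
```

The rising-edge check keeps a sustained loud passage from registering as many beats; lowering the threshold (via the linked MIDI control) makes the detector more sensitive.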
• Link Generative Visual To Audio Detection Node
“Compatible nodes will gravitate toward one another”
1. User: Slides the tracking marker linked to the Emitter Visual node toward the Beat Detector node
   System: Displays compatible parameters in the VMenu of each node, gravitating toward one another
“Let’s link the Audio Velocity to the Emitter Velocity with two fingers”
1. User: Links these two parameters to one another via a double finger cursor selection
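The repeated two-finger linking steps imply a link graph maintained on the PZS: when a source parameter updates (the Beat Detector's "Velocity", the Color Tracker's "Point"), its value is pushed to every linked destination. A sketch, with class and method names assumed rather than taken from ProZeuxis:

```python
# Hypothetical parameter link graph. Keys are (node, parameter) pairs;
# updating a source propagates its value to all linked destinations.

class LinkGraph:
    def __init__(self):
        self.edges = {}   # (node, param) -> list of (node, param)
        self.values = {}  # (node, param) -> current value

    def link(self, src, dst):
        """Record a link created by a double finger cursor selection."""
        self.edges.setdefault(src, []).append(dst)

    def update(self, src, value):
        """Set a source parameter and push it to every linked parameter."""
        self.values[src] = value
        for dst in self.edges.get(src, []):
            self.values[dst] = value
```

For example, linking `("BeatDetector", "Velocity")` to `("Emitter", "Velocity")` and then updating the source would carry each detected beat's strength straight into the visual.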
“Now the Emitter is linked to the performance through Audio and Video Tracking”

• Effect Master Audio Node using Mixer hardware
“To better link the Audio to the singer’s voice, adjust the microphone input on the mixer”
1. User: Looks at the configuration of the system dedicated audio mixer
“If the singer is on ‘Input 1’, mute or turn down the other inputs”
1. User: Mutes Inputs 2–4 to isolate the incoming audio signal of the singer’s microphone
2. System: Receives the alternatively mixed Master Audio input
“Now this one Generative Visual, Emitter, will react to the performer’s movements and voice simultaneously”
1. User: The performer moves on stage while also speaking or singing into their microphone
2. User: The VJ moves MIDI Slider 1 to affect the Emitter Visual’s “Particle Size” parameter
3. System: Shows the results of the data-linked Emitter Visual reacting to movement, singing, and MIDI input changes occurring simultaneously