Dance in the Virtual Space
New York University
Tisch School of the Arts
Yasmin Schönmann
MFA Candidate 2018
Track: Dance & Technology
Master's Thesis
Concept Yasmin Schönmann
Advisor Cari Ann Shim Sham
Choreography & Dance Yasmin Schönmann & Yuriko Hiroura
Lighting Design Keegan Butler
Music Zoe Yang
Kangding Ray
Sound Operator & Crew Sean Nederlof
Post Production Yasmin Schönmann
Location Tisch School of the Arts, New York, NY
Premiere May 12, 2018
Link to work https://youtu.be/Hs1Nyl73UFQ
Have you ever wondered what it feels like to be on stage with performers?
Dancing on stage is exciting, yet can be nerve-wracking. You are exposing yourself to an unknown audience made invisible by blinding lights. It makes you feel naked and vulnerable, but at the same time powerful and special because everybody is here to see you and your art.
What if you had the possibility to change your perception of the performance, from enchanted observer to active participant?
I have created a Dance in the Virtual Space by filming my choreography Worship in 360° in order to give an audience the opportunity to encounter a dance piece through virtual reality (VR) goggles. This is an intimate and personal experience in which the viewers find themselves in the midst of the piece that happens around them. The motion and the story of the dance are spun in a way that makes each viewer move their head to stay in touch with the choreography.
What interested me the most about this project was the act of reversing the typical audience-performer configuration of a theater-in-the-round by putting the audience in the center of the choreography, surrounded by the performers.
There are numerous types of theaters: black box theaters, theaters in the round, immersive theaters, proscenium theaters, etc. But none of these theaters can do what VR can.
Virtual reality simulates a real life experience. The immersive environment can be similar to the real world or it can be fantastical, creating an adventure not possible in our physical reality. In real life one would not be able to see an intimate, live performance from inside the piece. Putting the audience in the middle of the stage is rare, because this limits the size of the audience and revenue.
What I like about virtual reality is that it enables me to put the audience in whatever place I like, as well as to share my work with a lot of people without sacrificing intimacy and space. In terms of choreographing for 360°, I was intrigued by the challenge of devising a new concept for presenting my piece.
The reason why I came to grad school was that I wanted to learn and practice how to translate choreography from the stage to the digital medium. I chose the word 'translate' because that is exactly what one has to do. One has to keep the concept of the performance alive, capture it and portray it through a digital medium without losing its essence. That is why I decided to work with my choreography Worship, which was designed for the proscenium theater, in order to learn how to translate it from one performance venue to the other.
Whenever I start creating a dance for a regular theatrical space, I have a clear sense of direction and orientation; my canvas is defined by the upstage, side, and downstage areas. In the case of envisioning choreography for a 360° space, I have to take into account that there are no longer clear directions, since the choreography is spinning in a circular motion around the camera/viewer. To experience the dramatic narrative of this choreography to the fullest extent, I wanted the audience to be in the middle of the piece in order to encounter the story up close and in an intimate way. That is why I chose this choreography to be presented in virtual reality.
This thesis concept gave me the opportunity to acquaint myself with the 360° space in the role of the choreographer as well as the filmmaker.
The Inspiration for the Piece
What happens when people have too much power over us? Do we follow them regardless, or do we start to revolt at some point?
To learn more about the relationship of oppressors and the oppressed, I started to do online research about powerful leaders and their impact on their followers. The case that struck me the most was the 1978 mass suicide led by Jim Jones, the leader of the Peoples Temple in Jonestown, located in the South American nation of Guyana. On November 18, 1978, Jones instructed his followers—the majority of whom were American—to commit mass suicide to show the world that it was not worth living in this cruel and corrupt environment. 909 worshipers living in this remote settlement voluntarily died because Jones told them to. This deliberate mass suicide at Jonestown resulted in the greatest single loss of American civilian life prior to the events of 9/11.
In my choreography Worship, I aim to demonstrate how worshipful admiration can affect someone, and gradually take over his or her life. The lead character finds herself in a web that is spun tighter and tighter by her idol. Although her admiration overtakes her power and her life, she has short moments where she is overwhelmed by the domination and tries to find her way back to an independent life. The audience will experience the struggles within the lead character induced by her idol, who eventually takes over her life. In the end she is no more than a puppet.
In the beginning of this piece we see two female dancers in a spotlight on stage left. One dancer is sitting up; the other one is lying on the floor in front of her. In silence we see how one is gently helping up the other. As they start to move together, their relationship is reminiscent of a mother taking care of her daughter. The mother caresses her child's hair, holds her in her arms and never lets her out of her sight. The daughter-like character slowly starts to trust her caretaker, and together they travel through the space, always connected with one another. The music, initially a slow heartbeat, has turned into a muffled bass that is gradually getting stronger. While both characters are getting bolder in their movement and partnering, we slowly see a shift happening. The gentle care has changed to a strong dictation, and even though the gestures are not aggressive, the friendly atmosphere from the beginning is gone. Over the course of the next minutes the dancers transition from a mother-child relationship to a leader-follower one. The grip and hold of the former mother is strict and dominant, whilst the follower tries to get out of the net that is spun around her. And while we see the former daughter revolting from time to time, it is clear that the leader has the upper hand, chasing after her prey and deciding where they travel together. In a last attempt to set herself free, the follower pushes her leader down to the floor and runs off stage. The latter sprints after her, and for a moment the audience is left with an empty stage. After a couple of stretched seconds the leader returns, dragging her follower with her. She throws her onto the ground, where the follower stays motionless. While the lights fade out, we see the dominant dancer bent over the submissive one, trying to suffocate her. Both fight for the upper hand while the stage turns dark.
Research - VR in Dance
Several dance companies have created virtual reality pieces in order to let audiences experience their works in a different way, making them more accessible online without losing a theatrical atmosphere. One good example is the virtual reality video Waltz of the Snowflakes by the Mark Morris Dance Group. In December 2017, this excerpt of Mark Morris' choreography from The Hard Nut was filmed in 360° at BAM Fisher. The Hard Nut is a modern take on The Nutcracker and a Christmas favourite of the New York audience. In the Waltz of the Snowflakes, Mr. Morris stays true to Tchaikovsky's classic score and creates a twirling, vibrant snowfall. The male and female dancers embodying snowflakes wear short, glistening skirts, sparkling tops and hats reminiscent of snow piles. The ensemble takes over the whole stage, gradually coming in from all entrances in small groups, leaving flakes of snow on stage, and then exiting gracefully with saut de chat leaps. The number of snowflakes on stage increases as the piece goes on, and the dancers meet in bigger formations, threading through gaps between them and spreading more snow. At the culmination of the piece the whole ensemble is jumping, twirling and dancing together.
The VR experience of the choreography is kept simple. The viewer is positioned at the edge of the downstage area, with the orchestra pit at their back, so as to be able to see the whole stage. When starting to play the VR video, one can see the whole stage, and all the movement happens only in front of the viewer. Knowing that this is a 360-degree video, one will turn their head when wearing VR goggles, or scroll the screen around, to see what else is going on. Unfortunately there is just the orchestra in its pit, directed by a conductor, and a theater that is empty apart from a few audience members. This position places the viewer at the edge of the stage instead of inside the dance. While this position is something that you cannot experience in a staged version, it does not fully embrace the power of what 360-degree VR filming has to offer.
That kind of concept makes me wonder what the goal and intended experience were. Why was this piece captured in 360 degrees and not as a conventional video? The intriguing possibility of virtual reality is that you can put your viewers anywhere. The canvas capturing your art is not rectangular. Instead it wraps all around the viewer, and you decide from which angle one should see your work. It allows for immersion and a controlled placement of perspective within a performance - you can make the audience move with your dance and story to engage them more than in a theater seat. Using 360-degree filming to portray a dance piece the same way it is experienced in reality is a missed opportunity of the medium of virtual reality. The very term virtual reality describes its biggest asset: this medium is close to reality, but not quite. It seems to be realistic, but it is not. So instead of portraying reality as faithfully as possible while missing out on creating an experience suited for VR, I am interested in how 360-degree filming can be employed to support my vision of a dance piece. How do I have to alter my choreography, spacing and concept to create an interesting experience?
A good example of changing the concept of a piece to make it fit the 360-degree medium is Wait for It, performed by the Broadway cast of Hamilton for the 70th Annual Tony Awards Promo. This 360 video was created by the New York virtual reality company KonceptVR. When starting the video, the viewer is surrounded by the cast of Hamilton gathered on stage. The house lights of the theater are on and the seats are empty. The lead character stands directly in front and starts to perform the song as different cast members, positioned in a circle around the viewer, join in. During this experience one is prompted to turn one's head to follow the singers. Different ensemble members are grouped together to lead different parts of the song; in order to follow the sound of their voices, the viewer has to move either their head or the screen, which makes this experience immersive and engaging. The viewer feels invited into the space and like the center of attention, since the whole cast is looking directly at the camera/them.
In this way one is motivated and curious to explore the 360-degree range of motion made possible by the VR medium. Even though this VR take on Wait for It is successful, I think that the plainness of the concept lacks a certain kind of mystery and falls short of exploiting the possibilities offered. While the performance on stage in the Waltz of the Snowflakes was well translated, the experience lacked curiosity and viewer-induced motion. While Wait for It is an immersive experience that engages one's interest to look around, it lacks a strong physical performance as well as lighting, set design, choreography, costumes, and overall theatricality.
With these two examples I would like to make clear that I aspire to devote attention both to developing strong performance aesthetics and to creating an immersive and intimate experience for my audience through placement of the camera, set design, lighting, choreography and sound design. Only if you understand how important it is to adapt your piece to the presenting medium will you successfully translate the essence of the choreography from the non-digital to the digital stage.
The Concept
How can I engage and touch my audience with my piece? How can I keep its essence in VR? How much do I have to change it?
Those were the initial questions whose answers built the core of my concept. I wanted to make sure that I reach the viewer with the storyline and context of the choreography. That's why I decided to position the camera in the center of the space, so that the spectator is surrounded by the dance. Making the experience of my audience my priority meant that I had to change the pathways and spacing of my piece to guarantee that no gesture or exchange between the dancers went unseen. The original piece had a lot of straight pathways and only a few curves in it. Now I wanted to focus on spinning the movement in a circular motion around the camera, adding only a couple of straight lines to change it up. Naturally the duet partners stay together during almost the whole piece, but the few times they separate I wanted them to be on opposite sides. This way I was hoping to entice the viewer to stay engaged and curious about the dance, as well as to explore the range of motion possible in VR. Further, I wanted the dancers to acknowledge the camera/viewer by looking at it whenever it made sense. In one part the suppressed character reaches her arms out for help; that was a good moment to direct toward the camera/audience.
The essence of the piece was going to be maintained by keeping the original movement and partnering of the dancers, the original music and the original lighting idea.
The initial lighting on stage supported the two-faced character of the leader, who at first comes off as caring and supportive, only to then reveal her real, controlling and dominant nature. There was a constant interplay of light and darkness which accompanied the performers. In the VR version of Worship we were more limited in our lighting choices because we had to make sure to keep the position of the camera in the dark to avoid any kind of shadow. We kept the original spotlight for the beginning and end of the piece, but in the middle part we refrained from making many lighting changes. Nevertheless my lighting designer, Keegan Butler, found a way to preserve the essence of his concept by blurring the edges of the blackbox. In the video one doesn't see where the stage ends and the curtains start; the line where light turns into darkness is blurry and still supports the somber atmosphere of the choreography. By recording the sound of the dancers and the music in a spatial way, I wanted to add another layer to the VR video that would let the viewer dive deeper into the experience.
Research - Trial and Error
I started my research by investigating different camera types that create 360° videos and their workflows. There were several criteria that I took into account: consumer friendliness, workflow, quality of footage, post production and the price, given my budget constraints. After my first meeting with my advisor, Cari Ann Shim Sham, at Tisch School of the Arts, I decided to start working with the Ricoh Theta V. This camera is very consumer friendly: I can control its two 220-degree lenses simultaneously; it does the stitching inside the camera, which means that my workflow will be fast; it records in 4K, which is a great quality for watching a 360-degree video; and it costs around $700, which was within my budget.
My equipment included the Ricoh Theta V, the Ricoh Theta app and a Benro A48FD Series 4 Aluminum Monopod with 3-Leg Locking Base.
When I started working with the Ricoh Theta V, I watched several tutorials on how to set up and use the camera. The app that came with the camera made it easy to watch the filmed content on my phone and to upload it directly to a VR content website. In order to do some colour or white balance correction I could use Adobe Premiere, since it supports 360-degree footage. When doing my first test shoots in the studio during rehearsal I was satisfied with the results, and decided to go further and film my piece Worship on stage. My initial concept was to put the viewer on stage during a live dance performance. This idea was intriguing for me because it would give my audience a chance to be in a place from which no one can usually watch a performance. I presented this choreography on October 16th and 17th at the Jack Crystal Theater in the East Village, and decided to do several test shoots during our tech and dress rehearsals.
I positioned the camera in the center of the stage and captured a first impression of what it would look like to be on stage with the dancers in virtual reality. When I checked the footage shortly after, I became aware of several significant problems. First, I realized that the automatic setting that takes over as soon as one chooses to film rather than photograph with the Ricoh Theta V didn't perform well in low light. In order to achieve a better quality I would have needed to be able to adjust the white balance and ISO, which was not possible with this camera in video mode. It was so dark on stage that the camera automatically raised the exposure, which made the side lights on stage blow out the whole frame. My duet partner and I were barely visible because the lights from the side were blinding. Secondly, it looked like the lens that faced upstage added a slight red layer over the whole frame, which made the transition from one lens to the other very uneven. I assume that the red shimmer came from the recording light that was blinking during the shoot, even though I had covered it with tape. Lastly, the quality was less than optimal; it didn't look like a 4K video, and the automatic setting was clearly the wrong one to work with. After that first try I decided to put the camera in the first row of the audience to avoid the blinding lights and to see if the quality would increase. Unfortunately I encountered the same problems; the red shimmer was still in the frame and I could barely follow the dance because of the poor quality.
Based on this, I decided to change the location for my shoot and switched from a classic stage to a blackbox studio/lab. This way, the viewer would really be the only person in the room besides the dancers, which would increase the intimacy of the experience. My collaborators and I prepared Studio 1 for a 360-degree shoot and tried out several lighting settings. We decided to go with a brighter lighting atmosphere to deal with the ISO and shutter issues of the camera. After I checked the footage and saw that the blinding lights were no longer problematic, but the red shimmer and poor quality persisted, I wanted to do some more research to find a different camera. This time I was looking for cameras that would let me change their settings while in video mode.
After a second round of research I chose to work with the Freedom 360 rig with six GoPros. The GoPros can record up to 4K individually, and using six of them would enhance the quality of the footage. What convinced me the most was that I could use a GoPro app that let me see the quality of the footage in live mode, as well as adjust all settings like exposure, white balance, colour mode, and ISO through the app. In order to change the settings I had to download the GoPro app to my phone and then connect one GoPro to my phone via WiFi. Once connected, I was able to see the live image of the camera as well as change its settings. These were the settings I worked with for the shoot: normal colour mode, a video resolution of 1440p (which adds up to 4K across all six cameras), a frame rate of 30 frames per second, an exposure of -1 and a wide field of view (4:3).
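To convince myself that six 1440p streams really carry enough pixels for a 4K spherical output, I did a rough back-of-the-envelope check. The sketch below is my own illustration, with an assumed 2:1 equirectangular 4K output size rather than figures from GoPro or kolor:

```python
# Rough pixel budget: six GoPro frames versus one 4K equirectangular frame.
per_camera = (1920, 1440)      # one GoPro frame at 1440p, 4:3
equirect_4k = (3840, 1920)     # a common 2:1 "4K" 360 output size (assumed)

source_pixels = 6 * per_camera[0] * per_camera[1]   # ~16.6 million pixels
output_pixels = equirect_4k[0] * equirect_4k[1]     # ~7.4 million pixels

print(f"source: {source_pixels / 1e6:.1f} MP, output: {output_pixels / 1e6:.1f} MP")
print(f"surplus factor: {source_pixels / output_pixels:.1f}x")
# Roughly a 2x surplus, which leaves room for the overlap that the stitcher
# discards while still filling a 4K frame.
```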
Being able to see the live image while recording was important for this VR project, because the lighting designer and sound operator had to hide while we were filming. By watching a live image of the dance, they would be able to cue the lights and sound at the right moments. To learn more about the use of the rig, the cameras, and the workflow, I watched tutorials by Jeremy Sciarappa on YouTube (see bibliography for links). Jeremy was using editing software by kolor, and I realized that having the right program to work with after the shoot would be as crucial as using the right camera and settings. Since I had decided to work with six cameras in a cube that would record the piece simultaneously, I had to use stitching software to combine all six angles into one fluid shot. The French company kolor collaborates closely with GoPro and is known for its user friendliness. Their software Autopano Giga 4.4 and Autopano Video 3.0 enabled me to stitch the different camera angles together, to adjust the colour and horizon, to hide unwanted objects in the recording and to select the sound source. My new equipment included the Freedom 360 rig, six Hero4 GoPros, a lighting stand, the H2n spatial audio recorder, Autopano Giga 4.4 and Autopano Video 3.0.
The Shoot
The shoot took place in NYU/Tisch Dance's Studio 1 at 111 2nd Avenue, New York, NY 10003. My crew included my dance and choreography partner Yuriko Hiroura, Lighting Designer Keegan Butler and Sound Operator Sean Nederlof.
We started the day by clearing Studio 1 of its ballet barres and unnecessary tech equipment. To transform it into a blackbox space, we used black curtains that we could pull out to cover the walls, mirror and windows. Once the location was set, Keegan started to set the lights and I put the six GoPros into the cube. I decided to place the camera in the middle of the studio because I wanted the choreography to spin in a circular motion around the viewer. Putting the GoPro rig in the center of the stage challenged the lighting designer to adjust the angle of the hanging lights so that the light would not fall directly across the cameras. During an earlier test shoot we had found out that a centered placement of the rig, with the overhead lights in their normal position, would cast several shadows of the camera rig on the floor. The solution to this problem was to reposition the hanging lights and to leave a dark spot in the middle of the blackbox for the camera position.
After installing the GoPro control app on three phones, we did a tech run to get an idea of the interplay of lighting and choreography. After clarifying some light and tech cues, we were all set for our first run. This camera system is set up so that it films six different angles of a space. The six cameras are fastened into a rig in the shape of a cube; each side of the cube houses one camera. One corner of the cube is then screwed onto the tripod, so that three GoPros face diagonally down and three face diagonally up. Each GoPro is set to film in a wide field of view (FOV), capturing a wide slice of the sphere with its lens. Since there are six cameras, each edge of one camera's frame overlaps with the frames of four other cameras. In post production one has to find the overlapping lines to stitch all six GoPros together. To do this, the software first has to synchronize the six recordings, which it does by looking for either a sound or a motion cue. After I started the recording on each GoPro manually, I had to clap several times right next to the rig; this was necessary to provide a sound cue for stitching in post production. I also turned the tripod gently from side to side to provide a motion cue for stitching. Then I hit record on my spatial audio recorder, which was standing right underneath the tripod, and we started our first run.
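The claps work because the same sharp spike appears in every camera's audio track, so the time offsets between the six recordings can be estimated by cross-correlating those tracks. Below is a minimal sketch of that idea in Python with NumPy, purely my own illustration of the principle rather than what Autopano actually runs internally:

```python
import numpy as np

def clap_offset_seconds(reference: np.ndarray, other: np.ndarray, sample_rate: int) -> float:
    """Estimate how many seconds `other` starts after `reference`, using the clap spike.

    Both inputs are mono waveforms (1-D float arrays) covering the slate claps.
    """
    # Cross-correlate the two tracks; the peak marks the shift at which the
    # clap in one recording lines up with the clap in the other.
    correlation = np.correlate(other, reference, mode="full")
    shift_samples = int(np.argmax(correlation)) - (len(reference) - 1)
    return shift_samples / sample_rate
```

With an offset like this for every camera (and for the audio recorder), all the recordings can be trimmed to a common start point before the stitching happens.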
Two minutes into the piece, my dance partner and I heard a beeping sound from one of the cameras that made us stop the run. After checking the cube I saw that one of the GoPros had run out of battery. Hooking up each camera to our phones via WiFi, changing the settings and putting them in a tight rig made the GoPros use up their batteries faster than expected.
We took an early lunch break to give each battery pack time to recharge. One hour later we were able to record a full run of the choreography without technical complications. The only thing that I was worried about was the quality of the sound: loud piano music from the studio next door had started during our recording session. Another take was definitely required to aim for flawless quality of sound and video. Our third time was going to be our charm, and we finished filming the piece shortly before two GoPros had to shut down due to insufficient battery. Out of five hours of prep work and filming we came out with one good run that was 6:30 minutes long.
The Workflow
Working with the software from kolor came with big advantages and saved me a lot of time. I was able to import the footage directly into Autopano Video, where the software did the stitching. The only thing I had to tell the program was to look for a motion cue to define the stitching point, and when in the footage to find it. After the software had successfully stitched my footage, I could adjust the horizon, the colour and the length of the piece. When I started to look for stitching errors in the video, I became aware of obvious lines that would slice through the dancers' body parts and reattach them at a wrong angle. After doing some research on how to solve that problem, I found out that kolor offers a stitching option in which the software goes through every frame of the footage to find out where in space movement is happening. Once the analysis is done, the program creates a stitching pattern based on the movement and not on the static cameras. Once the second round of stitching was finished, I could add the spatial audio and export the file. Spatial audio is important for the viewer's experience in virtual reality because when they move their head, the sound that they hear changes in space accordingly. Working with an audio recorder that was set up to capture spatial audio would enhance the quality of my project, since it adds the layer of 360 sound to a 360 video. The next step was to add some titles to the video, which I could do in Adobe Premiere's VR setting.
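The stitching errors are easier to understand once you picture what any 360 stitcher has to do: every pixel of the flat equirectangular output corresponds to a direction on a sphere, and for each direction the program has to decide which camera's image to use; the visible slices appear where that decision jumps from one camera to its neighbour across a moving body. The snippet below shows the standard pixel-to-direction mapping as a generic illustration - textbook geometry, not kolor's implementation:

```python
import math

def equirect_pixel_to_direction(x: int, y: int, width: int, height: int):
    """Map a pixel of an equirectangular frame to a unit direction vector."""
    # Longitude spans -180..180 degrees across the width, latitude 90..-90 down the height.
    lon = (x / width - 0.5) * 2.0 * math.pi
    lat = (0.5 - y / height) * math.pi
    return (
        math.cos(lat) * math.cos(lon),   # forward
        math.cos(lat) * math.sin(lon),   # right
        math.sin(lat),                   # up
    )

# Example: the exact center of the frame looks straight ahead along the forward axis.
print(equirect_pixel_to_direction(1920, 960, 3840, 1920))  # ≈ (1.0, 0.0, 0.0)
```

Presumably the motion-based stitching option shifts those decision boundaries away from the regions where movement is detected, which is why it removed the slicing through the dancers' bodies.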
After exporting the video from Adobe Premiere, I started working with the software Reaper (64-bit) to include the spatial audio. But since I am still figuring out how to edit and add the audio track to the video, I have not been able to export my VR experience with spatial audio yet - this piece is still a work in progress.
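For context on why head movement changes what you hear: a spatial (first-order ambisonic) recording like the one the H2n produces stores the sound field as four channels - an omnidirectional W plus three directional components - and a player rotates that field mathematically to match the head orientation before decoding it for the headphones. The sketch below shows a simple yaw (left-right head turn) rotation; channel order and sign conventions differ between tools, so treat it only as an illustration of the principle:

```python
import math

def rotate_yaw(w: float, y: float, z: float, x: float, yaw_radians: float):
    """Rotate one first-order ambisonic sample about the vertical axis.

    A yaw rotation leaves the omnidirectional W and the height channel Z untouched
    and mixes the horizontal X (front-back) and Y (left-right) components, which is
    why a sound appears to move around the listener when the head turns.
    """
    cos_a, sin_a = math.cos(yaw_radians), math.sin(yaw_radians)
    x_rot = cos_a * x - sin_a * y
    y_rot = sin_a * x + cos_a * y
    return w, y_rot, z, x_rot
```

In practice a plug-in inside Reaper, or the VR player itself, applies a rotation like this for every head orientation reported by the goggles.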
If you would like to take a look at my choreography, you can find it on YouTube for now, since YouTube supports VR and is easy to distribute and link to from a website. In the future I am going to search for a better platform, one made specifically for virtual reality experiences.
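One practical note about the YouTube route: YouTube only treats an upload as a 360 video if the file carries spherical metadata. The exporters I used can write it, and Google also publishes an open-source Spatial Media Metadata Injector that can add it to a finished file. As far as I recall, it can be called from a small script roughly as shown below; the file names are placeholders and the exact invocation and flags may differ between versions of the tool:

```python
import subprocess

# Assumption: Google's open-source "spatial-media" injector is installed so that
# its "spatialmedia" package can be run as a module; file names are placeholders.
subprocess.run(
    ["python", "-m", "spatialmedia", "-i", "worship_360.mp4", "worship_360_injected.mp4"],
    check=True,
)
```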
Bibliography
Websites
Live Action VR Production
Hamilton 360 – The Tony Awards Promo
http://www.konceptvr.com/portfolio-item/hamilton-360-the-tony-awards-promo/
Illy – One for the City a 360 Music Video
http://www.konceptvr.com/portfolio-item/illy-one-for-the-city-a-360-music-video/
Jump Start
https://vr.google.com/jump/start/
Using VR Headsets
https://developers.theta360.com/en/forums/viewtopic.php?f=5&t=249
Viar360 - Most Intuitive Solution for Virtual Reality
360/VR Workflow and After Effects in Premiere Pro
Blog
How to stitch GoPro footage
https://wistia.com/blog/producing-360-video
How to shoot a 360 Video
https://www.wired.com/2017/02/shoot-360-video/
How to work with spatial audio
https://medium.com/@webjournalist/spatial-audio-how-to-hear-in-vr-10914a41f4ca
YouTube
The Premiere Pro 360 Workflow
https://www.youtube.com/watch?v=iOhWVFq7jKs
Immersive Video Effects in Premiere
https://www.youtube.com/watch?v=9dBcdYossaI
Video Tutorials by Jeremy Sciarappa - Working with the Freedom 360 rig
https://www.youtube.com/playlist?list=PLTFMW47bD0a85Gm1Bc9KePFfm21BPQmd0
The full 360 video workflow with kolor software
https://www.youtube.com/playlist?list=PLqPZ7zvLePxfysSp9C-dTZFvIQiIt8EzV
BAM Virtual Reality: Mark Morris' The Hard Nut in 360°
https://www.youtube.com/watch?v=a2OA1Ch3sJI