Lessons on the joy of movement and why that's pertinent to dance... and to continue dancing as you get older...
Posted this to YouTube a coupla weeks ago and forgot to add it here... so doing that now.
Here's a shot of a typical audience:
Here's the (admittedly prompted) cheering that performers receive at the end of the routine:
Mobile VR is a socially awkward technology.
If you're ever in a group of people and one person pulls out a Google Cardboard, brace for an awkward social situation. Based on my experience, what'll happen is:
The whole exercise is self-defeating. Immersion seems impossible to achieve when you're tethered to a collective objective eye.
Is it possible to create a mobile VR experience that's more socially compatible?
In trying to crack this nut, my brain jumped to Heads Up! -- a simple (non-VR) mobile game where you "guess the word on the card that's on your head from your friends' clues before the timer runs out!" Here's a clip of the gameplay:
What's so useful about Heads Up! is that it relies heavily on information blindness. Participants have to contend with the fact that certain people are privy to certain information. This knowledge gap is bridged via performance, creating a cohesive and shared social experience.
This type of overt performance seemed like the right move for making mobile VR more social. My theory was that if you gave specific, performative roles to the VR user and to the people watching, the awkwardness would wash away.
But can VR be a performance? I mean, when you're in VR you can't make eye contact with the people watching you, never mind the fact that you're supposedly in an entirely alternate reality.
Well, while performing almost always involves eye contact, the primary function of a performance is for one person to communicate an experience to another. In the case of mobile VR, the only thing really worth performing / sharing is how a user engages with the tech. Mobile VR only has rotational tracking, meaning that a user's agency is confined to face orientation. (I may write a blog post later as to why I prefer the term "face" over "head" when it comes to VR tracking...)
So, given that we're working with performance and body shape, my mind jumped to some sort of face-controlled VR rhythm game. Everybody would hear a song, and the performer would execute a sequence of face orientations to the beat, with the audience watching. Hopefully, this would set clear enough roles to overcome social VR awkwardness.
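To make that core mechanic concrete, here's a rough Python sketch of the scoring idea: given a target yaw angle per beat, check whether the performer's face orientation at each beat lands within some tolerance. The function names, the fixed-BPM beat grid, and the tolerance value are all my own illustration, not the actual implementation:

```python
def beat_times(bpm, num_beats, offset=0.0):
    """Timestamps (seconds) of each beat for a song at a fixed tempo."""
    return [offset + i * 60.0 / bpm for i in range(num_beats)]

def angular_error(target_deg, actual_deg):
    """Smallest difference between two angles, in degrees."""
    diff = (actual_deg - target_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

def score_performance(targets, sample_yaw, bpm, tolerance_deg=30.0):
    """targets: one target yaw angle per beat.
    sample_yaw: function t -> performer's yaw at time t
    (the only input mobile VR's rotational tracking gives you).
    Returns the fraction of beats hit within tolerance."""
    times = beat_times(bpm, len(targets))
    hits = sum(
        1 for target, t in zip(targets, times)
        if angular_error(target, sample_yaw(t)) <= tolerance_deg
    )
    return hits / len(targets)
```

With a generous tolerance, everybody lands most beats and the group reads as choreographed; that slack is part of what keeps it social rather than punishing.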
And lastly, perhaps to make it even more social, what if multiple VR headsets could network together? This could create a choreographed dance among multiple participants.
So one day I casually mention this sketch of an idea to Matt over lunch. After spitballing some more thoughts on it, the conversation moves on and I quickly forget about it. A few weeks later we're having lunch again and Matt whips out a VR headset and tells me he's built a prototype.
I put it on and start the demo. I try to forget that I'm in a crowded diner with a hunk of plastic on my face that's loudly blaring "Poison" by Bell Biv DeVoe. As I rock my head around, it feels great. It's just so weird and fun -- just enough directed activity to keep the user busy, but porous enough to feel fundamentally social. We decide at this point that this thing has legs and we should continue.
Shortly after this, my life became crazy busy with other things, but Matt continued to crush it on the development, doing pretty much the entire build. The hardest part, it turned out, was getting multiple devices to sync to exactly the same timestamp, but eventually we found a workable solution.
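For illustration, the standard trick for this kind of sync problem is an NTP-style clock-offset estimate: ping a reference device, assume the network delay is symmetric, and map your local clock onto the shared one. A minimal Python sketch of the math (this is the generic approach, not necessarily what our build actually does):

```python
def estimate_offset(send_time, server_time, recv_time):
    """NTP-style clock offset. Assumes network latency is symmetric.
    send_time / recv_time are read from the local clock; server_time is
    the reference clock, read roughly midway through the round trip."""
    round_trip = recv_time - send_time
    return server_time - (send_time + round_trip / 2.0)

def synced_song_position(local_time, offset, song_start_server_time):
    """Where in the song this device should be, given an agreed
    start time expressed on the reference clock."""
    return (local_time + offset) - song_start_server_time
```

Once every headset knows its offset, "start the song at reference time T" lands every device on the same beat, which is what makes the group choreography read as one dance.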
We aimed to debut the game at Come Out and Play, and started play testing at the Game Center where he works. Here's an early play test:
After seeing it on networked performers, we felt even better. It validated that the game not only felt good from the performer's perspective but was also very entertaining to watch.
One unexpected thing we discovered while testing was that in addition to feeling great, it also felt totally stupid. Perhaps it felt great because it felt stupid. It felt stupid in a good way, like Old Spice commercials.
So when seeking play testers, we'd sell the game as an "intentionally stupid VR experience." This wasn't to hedge criticism. It was because we wanted to share a discovery we made about VR that was indeed totally stupid and awesome at the same time.
I mean, if you look at the mechanics of the game, it's a perfect crockpot of stupid. First off, it's VR, and let's face it: VR on its own terms looks stupid. You put your face in a brick of plastic and enter a suspended state of stupor. You're so wrapped up in your own magical experience that you lose touch with reality. Back in meatspace, you're completely unable to respond to things that are obvious to everybody else in the room. VR = Textbook Stupid.
Take that, and add dancing in public (while unaware of how public you are), and you have the level of stupid that #WeAreDanceFace can provide.
Celebrating the notion of being labeled "stupid" isn't something that's unique to this project or even VR. I'd argue that all cultural movements have varying degrees of appearing "stupid" to those outside the culture.
For example, if you look at hippies -- here's a culture that, from the conventional perspective of their era, valued sexual deviance and drug abuse over owning up to personal responsibility. They were a group of unwashed kids who lost touch with reality.
The inability of conventional folks to understand the new culture's value system gets celebrated by these countercultures. If countercultural actors feel confident in their value system, it only makes sense that they'd want to play up the boogeyman appearances, as if to say: "Screw your labels, we all agree that you simply don't and won't get me, and honestly, that's not my problem anymore."
I bring this up because #WeAreDanceFace takes the form of a countercultural statement. Instead of treating the we-look-stupid issue as a VR thing we'll someday outgrow, this project directly addresses it by declaring: "This grotesque face-appendage of plastic is totally awesome. So is my dancing and so is this ridiculous 90s song that we're piping in from an alternate reality. Eat me."
To sum it up, our countercultural statement soon became the design thesis for the project: Celebrate VR Stupidity.
The name "#WeAreDanceFace" encapsulates this assertive stupidness. It's bombastic and self-involved. It's also social, declarative, and performative (the kind of thing a band yells at a show), all of which gets further digitally amplified by a gratingly annoying hashtag.
The hashtag in the name prompted us to get a Twitter handle, which we used during the event to publish animated gifs of the performances. I tried my best to accompany each gif with equally bombastic and stupid text (which, by the way, was an exhausting exercise for someone as typically chill as me).
This thesis of celebrating VR stupidity also provided direction on the UX at the event. While training the on-deck performers, the line that consistently got the most laughs was "Remember: you probably look cooler than you feel." This instruction partially points to a game mechanic (your choreography may feel boring, but only because you can't see the group as a whole), but it also reinforces that we're here to not give a damn.
Having a cultural statement like this elevated the project above "tech demo" status and into something else, which proved to be hugely useful.
This project would have been much less successful if everybody viewed it as a demo. Demos succeed and fail based on how useful the underlying tech appears to be, and utility is defined by what value it adds and how reliable and convenient it is.
By these metrics, at this stage, #WeAreDanceFace is a pretty, uhm... not great technology... lol. At runtime, the software required the core developer to babysit it. The VR visual interface was so confusing that it required a lecture (by me) and an in-VR training session with Matt. It took a long time to reset between demos, and the line to enter the experience was dauntingly long.
But despite these challenges, we were able to pull together a convincingly good experience because...
Early on we discovered a different way to frame the project: as a live performance. This framing magically made everything better.
First off, it sets clear expectations for performers and audience. Focus gets directed to how much fun people are having, not how performant or useful the tech is. Nobody minds waiting in line or waiting for the tech to resolve. From this framing, watching tech people do stuff almost feels like peeking under the hood of a magic trick. You wait not because the tech is bad, but because the experience is worth it.
In addition to clarifying expectations for the audience, it helped direct my and Matt's behavior during the run. It was a lot of fun to assume the role of crowd control manager -- hamming it up on all things performance-related. I (annoyingly) started referring to the performers as "the talent", and coached them on how to get the best reactions from the crowd. Between games Matt and I would assume the role of stage hands / roadies, handling equipment and ushering people around.
Lastly, by seeing this as a "live performance", everybody was better equipped to process the experience. Performers were 110% behind the work they were tasked to do and took their roles seriously enough to instigate a fun time. Audiences embraced the ephemeral nature of the performance, cheering, laughing, snapping photos, and generally just enjoying themselves.
The event was a success. People who already knew about VR got a fresh look, and the unacquainted were introduced to a form of VR that extolled its awesomeness while poking fun at its shortcomings.
To be clear: mobile VR is still socially aberrant and still looks stupid. And running an event based on mobile VR technology sucks because you're juggling an immature tech with crowd management -- a logistical nightmare.
But despite all this, people keep flocking to VR experiences anyway. Why? Because VR is just that awesome.
And this tension between stupid and awesome undergirds how VR fits into the bigger picture. Designing along this tension is definitely a challenge, but it's phenomenally fun and rewarding to do so.
A while back I toyed with the idea of VR dance education, and even made a Tilt Brush sketch. At the time, though, I couldn't figure out how to make the content more compelling than just a YouTube tutorial, so I abandoned it.
Then just recently I did a video on how great VR is for visualizing 3d data as well as meta-3d data (like 4d data). I shared the work on /r/Vive, and earlier today /u/Sir-Viver suggested that I try dancing in Tilt Brush to visualize the work. He connected the dots for me to give it another try, now armed with more insight on what meta-space means. Thanks!
Came up with a Tilt Brush creation that I'm quite proud of. It visualizes two fundamental liquid dance moves, the Figure 8 and the Rail. When I got to building the rail visualization, I found that it served as both a visualization and a tutorial, so I put in some explanatory text on how you can step through it to learn the structure of a rail.
There's definitely a lot more to explore here, but if anybody has any other ideas on how to use VR for dance education, post a comment or reach out or something!
Download the Hypercube demo (HTC Vive required to run it)
Only one of the two controllers works. You'll need to get both controllers to register and then figure out which one it is.
Lastly, here's the Reddit post where there's some additional discussion.
In my last blog post, I shared some AR teleportation concepts that I designed. That work piqued my interest in the teleportation problem set, so I built out a few new sketches:
If you want to try out the geometry-screen-wipe teleportation, download and run this (if you have a Vive). To note, this demo was built for the purpose of illustration on a YouTube video, so more would need to be added to make the UX fully production-ready.
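As a rough illustration of what a geometry screen wipe boils down to: composite the destination scene over the source scene using a mask whose shape grows with the transition's progress. In the real demo this lives in a Unity shader; the following Python / numpy sketch (circle shape and all names are my own illustration) shows the same idea on image buffers:

```python
import numpy as np

def wipe_mask(width, height, progress, center=(0.5, 0.5)):
    """Boolean mask: True where the destination scene shows through.
    A circular wipe growing from `center`; progress runs 0 -> 1."""
    ys, xs = np.mgrid[0:height, 0:width]
    u = xs / (width - 1) - center[0]
    v = ys / (height - 1) - center[1]
    max_radius = np.hypot(0.5, 0.5)  # farthest corner from the center
    return np.hypot(u, v) <= progress * max_radius + 1e-9

def wipe_frame(old, new, progress):
    """Composite: new scene where the wipe has swept, old scene elsewhere."""
    mask = wipe_mask(old.shape[1], old.shape[0], progress)
    return np.where(mask[..., None], new, old)
```

Swapping the distance function changes the wipe geometry: a linear ramp gives a curtain wipe, and a star-shaped distance field would give you that star-wipe I keep asking for.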
I have to admit that the idea of VR teleportation (which I'll call "TP") always weirded me out. In real life things don't just pop in and out of existence. Entire environments even less so.
Then, after trying out a bunch of VR TP examples across a variety of applications, I felt proven right and wrong at the same time. There's no denying how useful and convenient it is, but something about it is still unsettling. Not rip-off-my-headset unsettling, but unsettling enough to be distracting.
Diving a bit deeper into that feeling, I notice that I'm trying to rationalize a 4th dimensional experience with a 3rd dimensional brain. Teleportation becomes a wellspring of nagging existential questions: How'd I get from there to here? Did I lose time in the process? What happened to the old me? What does "me" even mean in this context -- like did I momentarily stop existing during the transition? Was I in an alternative universe?
Now, I know the argument: we'll get used to it, just give it time. But that argument always felt wrong to me. Firstly, it just ignores my dissatisfaction by sweeping it under the rug. More importantly, though, it misses out on the huge design opportunities to exploit my reptilian desire to cling to conventional 3d wisdom.
TP violates the laws of physics, and there's no designing your way out of that. Perhaps, though, clever design can provide a POV about this violation. By hinting at answers, TP can be more than just a utility for getting around. It can also serve as a story-telling device that drives the type of engagement you want. For example, if your game is about being an assassin of the night, perhaps your TP mechanic leaves a small puff of smoke behind you, and NPCs who saw it audibly gasp and freak out a bit.
That's not to say that the default TP mechanic (fade to black and back again) has no place, but I do wonder if it will someday feel like the default Unity material under the default Unity light -- just enough to help scaffold new ideas, but too awkwardly devoid of opinion to be production-worthy.
I have some more teleportation concepts that I'd like to figure out how to implement. The two main ones are "data moshing" between positions to create trippy transitions and animating with crazy cross-section glides. But perhaps I'll build those sketches another day.
If you happen to be interested in my process stuff, here goes:
First was lots of sketches / scribbling in my notepad. After I got them all down, I reviewed them to see which ones I thought I'd be able to build and communicate in a reasonable manner.
Some of them were built in post w/ Premiere (dip to various colors / patterns, representational 2d screen). The geometric wipe one was done in Unity because I thought that would be easier.
I chose to do the live action one IRL because I didn't want to learn how to paint a Unity camera's image onto a fake photograph that wafted in the wind. That one involved a lot of hand dexterity, handling the phone, the photos, and the controller while dealing with the wind, passersby, etc...
A few weeks ago I had the pleasure of building some design concept work for Object Theory as a get-to-know-each-other sort of deal.
The idea was to explore teleportation IXD for Hololens, and I came up with a bunch of design sketches that I really liked and thought could be applicable to the AR / VR community at large. So here are some of those designs, both as gifs and as the presentation. I've also included some thoughts below.
Two things to point out: (1) this is not in production, just concept sketches and (2) this work is built on top of Object Theory's really solid investigation on avatar design for enterprise clients. Definitely worth a watch.
And now, some thoughts on the work...
Back in college when I was studying theater, I was taught to focus on two things: goes-intas and goes-outas. In other words, focus on the design choices that transition the audience into and out of scenes. Goes-intas set the stage and set expectations. Goes-outas wrap things up into a singular takeaway.
Most theatrical design is transition work. Awesome transitions not only direct the audience's attention to the right subject, but also coerce a point of view about that subject.
A quick example: a scene ends with a character shooting another with a gun and the director wants to do an abrupt blackout. Blacking out...
These different experiences are just from moving one light cue back and forth 2 seconds. Now apply that to sound cues, set transitions, etc... into and out of all scenes, and you'll soon see that transitions tell story.
One big advantage that AR / VR has over theater is how "cheap" transitions are to execute. Theatrical transitions are inherently resource-intensive. Type-A stage managers and their huge teams of board operators and crew members swarm around, waiting on actors to actuate certain effects (and nobody becomes an actor because they have a reliable personality). Physical labor is draining.
In AR / VR, like in all digital media, resource bottlenecks are either computational or attentional. Good media designers relish working within computational constraints, so that's not particularly new for this tech. The upper limit of how humans pay attention to 3d space, though, is totally new, weird, and exciting.
There's a lot to learn and explore here. Here are a few fun questions to consider:
It seems that the best way to answer these questions is to just start clobbering VR experiences with a bunch of random design approaches to see what sticks.
AR / VR teleportation is super weird in that you're taking two 3d experiences and splicing them right up against one another. Unlike film cuts, I'm not certain that we'll just get used to it over time. Cinema, after all, is a 2d medium, and it's trivially easy for our 3d brains to contrast multiple 2d spaces in quick succession. But since we aren't 4d brains, 3d experiences without transitions can be disorienting.
The transitions in the above teleportation designs intend to address this by providing story and helping the user construct a more holistic 3d experience. The goes-inta effect converts the user's former point of focus (Hololens gaze point) into an avatar outline, telling the story that the user has instantiated the next experience. The goes-outa effect is the catapulting of the user's body towards that new location, which tells the story of how the user would have to move through space to get to this new spot.
As noted before, this particular design was built for enterprise work. As a design, it is a little more pedantic and dry than I generally prefer building, but the overtness is intended to assert a sense of place-ness and ownership that the user has over the model, which I think it does. A game implementation of these principles would probably involve more inventive figures that contributed to the action of the gameplay. Maybe something more along the lines of these:
In any case, my main takeaway is that the baseline quick fade-in / fade-out is merely utilitarian, and there are huge story-telling opportunities to do something more meaningful with those transitions.
After doing this work, one thing bothered me: what's up with this fade to black? The perpetual non-choice of black only provides the same non-story over and over again. It's so weird. Like, it seems that nobody even bothers to fade to blue or white or static, or even simply hard cut or anything (please somebody make a star-wipe VR transition).
Now what bothers me about all of this is that my design work above didn't address this at all -- it just added a bunch of flair to distract from it.
That said, I'm now working on a handful of design sketches of transitions that sidestep the fade-to-black thing. Previews of that work are below. I'd invite others to start thinking about what those transitions could look like too, cuz I'd like to see how others think about this problem set.