Watch AyaBambi Distort Reality In This Unreal Face-Hacking Video

“Nice face. I’d like to wear it sometime.” No, it’s not a Valentine’s Day card from Buffalo Bill. But it might be an icebreaker in 2020.

If you’ve had a bad face day, you can usually keep up appearances on social media thanks to the magical filters of Snapchat and Instagram. If you’re having a really bad case of the fugs, you can just face-swap yourself with your favorite movie star and call it quits. That’s all well and good from the privacy and solitude of your own bed, but when it comes to heading out into the real world, the closest thing to a real-life filter has been a good concealer.

But now, thanks to an innovative dream team from Tokyo, IRL Snapchat filters are real. Sort of.

From Haunted Rides To Kat Von D

Projection mapping at Disneyland, 1969 | Source

The practice has moonlighted under a few aliases, including “spatial augmented reality” and the gruesome-sounding “face-hacking.” But it’s commonly known as projection mapping. In short, it involves projecting an image onto a non-flat surface. The earliest adopter was none other than Disney, which first used the technique on the Haunted Mansion ride at Disneyland way back in 1969. It’s been used on and off since then with limited success. But the capabilities of projection mapping have always been pretty basic … up until now.

WOW labs, helmed by CGI expert Nobumichi Asai, has been working on an advanced method of projection mapping for the last few years.

So, what is it?

Asai’s projection mapping uses specialized software to track a surface in motion, while a custom-built projector alters the image to adjust to the subject’s movements. The result is a seamless, high-definition projection that moves in tandem with the face. For instance, say you wanted to project a zombie mask onto a model. If the model moved, the mask would move with them, without distorting or losing quality. It gives the appearance of a CGI effect in real life.
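Conceptually, the track-and-retarget loop can be sketched in a few lines. The sketch below is a deliberate simplification (the real pipeline does full 3D face tracking and lens-accurate warping; the function names here are hypothetical, not from WOW’s software): each frame, estimate how the tracked landmarks moved, then shift the projected overlay by the same amount.

```python
# Minimal 2D sketch of a projection-mapping frame loop
# (hypothetical simplification of the real 3D tracking system).

def estimate_shift(prev_pts, curr_pts):
    """Average 2D displacement of the tracked landmarks between frames."""
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    return dx, dy

def retarget_overlay(overlay_pts, shift):
    """Move every vertex of the projected mask by the tracked shift."""
    dx, dy = shift
    return [(x + dx, y + dy) for x, y in overlay_pts]

# One iteration: the subject's landmarks moved 5 px right and 2 px down,
# so the projected "zombie mask" follows without distorting.
prev = [(100, 100), (140, 100), (120, 130)]   # landmarks, frame t-1
curr = [(105, 102), (145, 102), (125, 132)]   # landmarks, frame t
mask = retarget_overlay([(110, 95), (130, 95)], estimate_shift(prev, curr))
```

In the real system, a loop like this has to complete within a fraction of a frame, which is why the tracker’s latency and the projector’s refresh rate matter so much.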

In 2015, Kat Von D used projection mapping to promote her new makeup collection at Sephora. That system ran at 240 frames per second (fps).

The latest production from Asai uses a projector that runs at 1,000 fps, the fastest in the world. The results speak for themselves:

WOW studios partnered with the University of Tokyo and TOKYO production studios to create the stunning performance video, which clocks in at just over a minute.

To accommodate Asai’s CGI technology, the DynaFlash projector was custom-built by the Ishikawa Watanabe Laboratory at the University of Tokyo. This 8-bit, 1,000 fps projector uses state-of-the-art technology to keep up with the movements of the human body. Check out this side-by-side comparison between a standard projector and the DynaFlash.
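Some rough, back-of-the-envelope numbers show why the jump from 240 fps to 1,000 fps matters: the projection lags the face by at least one frame period, and the face keeps moving during that lag. (The 0.5 m/s head speed below is an assumption for illustration, not a figure from WOW.)

```python
# How far does a moving face drift during one frame period?
# Assumes a hypothetical head speed of 0.5 m/s (brisk, dance-like motion).

def drift_mm(fps, speed_m_per_s=0.5):
    """Millimetres the face moves during a single frame period."""
    frame_period_s = 1.0 / fps
    return speed_m_per_s * frame_period_s * 1000  # metres -> millimetres

sephora = drift_mm(240)     # 240 fps setup: ~2.1 mm of drift per frame
dynaflash = drift_mm(1000)  # DynaFlash at 1,000 fps: 0.5 mm per frame
```

A couple of millimetres is enough to make a projected mask visibly “swim” on a fast-moving face; half a millimetre is close to imperceptible, which is what makes the illusion hold.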

WOW, TOKYO, and the creators of DynaFlash collaborated with the iconic dance duo AyaBambi to create this stunning video performance, which shows off the capabilities of the face hacking software. It’s unreal.

The Uncertain Future Of Face Hacking

Morphing your face into something straight out of Samara Morgan’s home movie collection might be fun for now, but the possibilities of face hacking are electrifying.

As tech continues to get smaller and more portable, it’s easy to imagine projection mapping software being added to a wearable tech device. Magic Leap might fill up your local watering hole with mythical creatures, but face hacking could turn every habitué into Harrison Ford or Marilyn Monroe.

We’re already beginning to explore the ethical implications of “face borrowing.”

Rogue One presented us with Grand Moff Tarkin, even though actor Peter Cushing died in 1994. It also gave us a fresh-faced Carrie Fisher, although CGI isn’t yet advanced enough to escape the “uncanny valley”: the tendency for the human eye to reject almost-human replicas.

But if we could switch up our faces as easily as we change our Snapchat filters, would we have to license our own looks to stop copycats from borrowing our features? And is it okay to borrow someone else’s face, especially if they’re dead?

Non-celebrities don’t often think about licensing their looks, but our faces are being collected every day through media like Snapchat and Facebook. Snapchat’s terms of service state that it has a:

“Worldwide, perpetual, royalty-free, sublicensable, and transferable license to host, store, use, display, reproduce, modify, adapt, edit, publish, create derivative works from, publicly perform, broadcast, distribute, syndicate, promote, exhibit, and publicly display” any content you upload to the app, “in any form and in any and all media or distribution methods (now known or later developed).”

American right-of-publicity law generally holds that your likeness can’t be used commercially without your consent. But thanks to the murky “tap here” terms and conditions of companies like Snapchat, you’ve probably already given that consent. Imagine your surprise if one day you walked into a bar to find someone else wearing your face. Or, if you want to take the Black Mirror approach, the face of your dead relative.

Snapchat’s faceswap filter | Source

By sticking to shadowy constellations and embracing eerie distortions, AyaBambi’s performance sidesteps deep philosophical questions like these while providing a mesmerizing example of what projection mapping can do. For now, the technology has both feet firmly planted in the art world. But it raises myriad interesting questions about what will happen when technology lets us change our appearance with the ease of a Snapchat filter.