Five Questions: Marc Senasac

July 11, 2017

Marc Senasac is in a unique position when it comes to integrating recording and interactive media. On the studio side, he’s a Grammy-nominated engineer whose work has spanned film, TV, and music recording, with credits that include Chris Isaak, Digital Underground, 2Pac, En Vogue, and Blue Öyster Cult.

Nearly two decades ago, Senasac turned his efforts toward interactive media. Today, he’s music engineering manager of Sony PlayStation Worldwide Studios’ PlayStation Music Product Development and Service Group, which provides music production and implementation services to developers and studios that make games for the PlayStation platform. He’s also part of the Sony Worldwide Studios Audio Standards Working Group, which recommends best practices for the development of audio content in interactive entertainment products.

I talked to Senasac last May, following the release of Sony’s long-awaited Farpoint first-person shooter title for the PlayStation VR platform, to learn ways traditional audio production is transitioning to this adaptive medium.

In your role at Sony PlayStation, you support technology pipelines in game development. Right now, it seems like we’re exploring production for this medium at the same time that new tools are being developed. How does this pace inform your infrastructure decisions?

A big part of our group’s role at PlayStation is starting with music production in traditional recording pipelines and shepherding it into the adaptive music systems that run in video games, supporting the player experience. Those AAA games are really massive software applications.

I like to think of the team I work with as bilingual: speaking the language of composers and music creatives, and also speaking software development. The language of software development can be way over on the technical side, and often far away from the creative lexicon.
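For readers on the software side of that bilingual divide, here is a minimal, purely illustrative sketch of what an adaptive music system does at its core: instead of playing a fixed piece, the game mixes pre-produced stems up and down as the state of play changes. None of this reflects PlayStation’s actual tooling; the class names, thresholds, and layer names are invented for the example.

```python
# Hypothetical sketch of adaptive music layering: stems fade in and out
# based on a game-state "intensity" value, so the score follows the player
# rather than a fixed timeline. Not tied to any real engine or middleware.

from dataclasses import dataclass

@dataclass
class MusicLayer:
    name: str
    threshold: float   # intensity at which this stem should be audible
    gain: float = 0.0  # current linear gain, 0.0 to 1.0

class AdaptiveMusicMixer:
    def __init__(self, layers, fade_per_second=0.5):
        self.layers = layers
        self.fade_per_second = fade_per_second

    def update(self, intensity, dt):
        """Called once per game tick: move each layer's gain toward its target."""
        for layer in self.layers:
            target = 1.0 if intensity >= layer.threshold else 0.0
            step = self.fade_per_second * dt
            if layer.gain < target:
                layer.gain = min(target, layer.gain + step)
            else:
                layer.gain = max(target, layer.gain - step)

# Example: a calm pad is always present; percussion and brass enter as combat ramps up.
mixer = AdaptiveMusicMixer([
    MusicLayer("ambient_pad", threshold=0.0),
    MusicLayer("percussion", threshold=0.4),
    MusicLayer("brass_stabs", threshold=0.8),
])
mixer.update(intensity=0.6, dt=0.016)  # one 60 fps frame during a skirmish
print([(layer.name, round(layer.gain, 3)) for layer in mixer.layers])
```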

What are some of the ways creative production roles and workflow are evolving as VR evolves?

I can say that audio workflows for VR—immersive audio—are evolving very quickly in countless directions, in part because the visual and sonic development tools for VR are still in their infancy and changing rapidly to support countless ideas and products—all in search of the VR “killer app.” Immersive audio for VR is still the wild west, in a good way, in the sense that everyone doing it is still experimenting to find the best paths toward a moving target. Most of the good VR audio I know about so far has been arrived at through a hybrid of production techniques that together convey a good experience for the listener.

From a game development standpoint, this new level of immersion and participation means sound figures even more prominently in the narrative—but audio cues need to be more convincing than ever. What kinds of audio production challenges does this pose?

It’s super cool that good immersive audio with head tracking in a 360-degree field now goes an incredibly long way toward “selling” the VR experience to the player’s brain and creating a sense of “presence.” I’d go so far as to say that VR barely works, if at all, without good immersive audio. That puts a great deal of weight on the shoulders of audio professionals, and I think most of them are up for the challenge!
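To make the head-tracking point concrete: the renderer has to recompute each sound source’s direction relative to the listener’s head every frame, so sounds stay anchored in the world as the head turns. The short sketch below is a generic illustration of that one step, not any engine’s or middleware’s actual API; the function name and coordinate conventions are assumptions for the example.

```python
# Illustrative only: compute a source's head-relative azimuth on the horizontal
# plane by removing the listener's head yaw from the world-space direction.

import math

def source_direction_relative_to_head(source_pos, listener_pos, head_yaw_rad):
    """Return the head-relative azimuth (radians) of a source on the horizontal plane."""
    dx = source_pos[0] - listener_pos[0]   # x: listener's right in world space
    dz = source_pos[1] - listener_pos[1]   # z: listener's forward in world space
    world_azimuth = math.atan2(dx, dz)     # 0 = straight ahead, positive = to the right
    return world_azimuth - head_yaw_rad    # subtract head yaw so the source stays world-anchored

# A source directly ahead of the listener: facing it, it sits at 0 degrees;
# turn the head 90 degrees to the right and it now sits 90 degrees to the left.
print(math.degrees(source_direction_relative_to_head((0, 1), (0, 0), 0.0)))                 # 0.0
print(math.degrees(source_direction_relative_to_head((0, 1), (0, 0), math.radians(90.0))))  # -90.0
```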

The Farpoint game is pretty groundbreaking. What would you say are some of the biggest challenges of developing a full-length VR game?

On Farpoint, a lot of folks on our team had a hand in the music development, in conjunction with Impulse Gear and the composer, Stephen Cox. Two guys on our team in particular, Anthony Caruso and Rob Goodson, were in deep on the music implementation for a couple of years.

I’m sure everyone involved has their own big list of challenges during the development of a VR game. I know one of the big ones that comes up a lot with music is, “In what space does music play in the game?” Beyond being diegetic, I mean. In the real world we live in, there is no music playing to emphasize the emotions we are experiencing.

Another challenge that comes up is mixing and producing a score for a larger game in headphones. We can use speakers, but in almost all cases the realized VR experience is in headphones. Audio production used to be done with the assumption that most of our audience would be experiencing the work in speakers. Now we’re almost 180 degrees the other way. Even beyond VR, a ton of music produced today will be enjoyed and experienced in headphones first, with speakers being a secondary experience.

What kinds of audio tools on the horizon have you excited, from an audio capture perspective or from a mixing perspective?

Spatial audio in headphones is really exciting in that it is working better and better all the time. It’s great to be able to create something for an audience listening in headphones and know that they will be able to experience not only left and right, but also front and behind, as well as above and below. I can’t wait to see where it goes from here.
