Psyop’s lead Creative Technologist, Brian Kehrer, doesn’t fit many molds. He studied Film at NYU (with a minor in Math), but he’s spent most of his professional life building games and interactive experiences using Unity, a powerful real-time 3D engine and development platform.
Brian’s a left-brained right-brainer — or a right-brained left-brainer — and he’s making all sorts of interesting things happen at Psyop. We caught up with him to find out a little more about Unity and his role at Psyop.
Psyop is doing more interactive work, and Unity has become one of the primary tools we’re using. Why is Unity a good fit for Psyop?
At a high level, Unity is a 3D game engine that integrates well with our existing pipeline, is intuitive for our artists, and is capable of very high quality output while keeping development costs reasonable.
More specifically, though, Unity is an amazing tool for rapidly prototyping interactive experiences, which is critical for Psyop. It allows us to experiment easily – either building demos for a pitch, or quickly vetting creative ideas. Developing high quality interactive work requires an iterative development cycle, and Unity allows us to focus on the creative aspects of development, rather than spending time reinventing technology.
What is your personal experience with Unity?
I’ve been using Unity since 2006, when I was working on a 3D, browser-based virtual world. At the time, we were pushing Unity to the absolute limit, and their dev team went out of their way to help us out. You could tell they really cared about their product.
In 2008, I co-founded Muse Games, an independent game development studio focused entirely on Unity-based development. Most recently I directed “Guns of Icarus Online,” a 32 player competitive airship combat game, which launched on Steam in October of 2012, after about 2.5 years of development.
For “Guns of Icarus Online,” I jumped between my roles as lead game designer, product manager, and Unity programmer. Some of the systems I developed in Unity included a completely custom UI system, a dynamic music system, character movement systems, and the character animation system.
The character movement and animation systems were particularly complicated, due to the need to synchronize and verify state between clients and an authoritative server while still offering immediate responsiveness to the client – all on a moving and rotating airship that was itself being synchronized across the network.
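The general pattern behind that kind of movement system is client-side prediction with server reconciliation. As an illustrative sketch only (the actual Guns of Icarus Online code is C#/Unity and far more involved; all names here are hypothetical), the idea looks like this:

```python
# Sketch of client-side prediction with authoritative-server reconciliation.
# The client applies inputs immediately for responsiveness, keeps the
# unconfirmed ones, and replays them on top of each authoritative update.

def apply_input(position, move):
    """Deterministic movement step, shared by client and server."""
    return position + move

class PredictingClient:
    def __init__(self):
        self.position = 0.0
        self.pending = []     # inputs not yet confirmed by the server
        self.next_seq = 0

    def move(self, delta):
        # Apply the input locally right away...
        self.position = apply_input(self.position, delta)
        # ...and remember it so it can be replayed after a correction.
        self.pending.append((self.next_seq, delta))
        self.next_seq += 1
        return (self.next_seq - 1, delta)   # sent to the server

    def on_server_state(self, last_acked_seq, server_position):
        # Discard inputs the authoritative server has already processed.
        self.pending = [(s, d) for s, d in self.pending
                        if s > last_acked_seq]
        # Rewind to the server's authoritative position, then replay
        # the still-unconfirmed inputs on top of it.
        self.position = server_position
        for _, delta in self.pending:
            self.position = apply_input(self.position, delta)

client = PredictingClient()
client.move(1.0)
client.move(1.0)                 # predicted position is now 2.0
client.on_server_state(0, 1.0)   # server has confirmed only input #0
print(client.position)           # 2.0 — second input replayed on top
```

The key property is that a server correction never stalls the player: the client snaps to the authoritative state and instantly re-applies whatever inputs the server hasn't seen yet.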
At Psyop I’ve been working in Unity non-stop. Although most of the projects I’m working on here are still under wraps, I’ve been focused on mobile development.
One of the advantages of Unity is that it can publish to multiple platforms. Is that something Psyop is pursuing?
Definitely. Frequently, the term ‘mobile’ gets thrown around as a single platform, but it isn’t. It’s iOS and Android. And even then, it’s iPhone, iPad, and 31 different flavors of Android, each with its own peculiarities. Unity doesn’t solve all of the cross-platform problems for you, but it can solve about 95% of them, if you’re careful.
Of course, Mac, Windows, and Linux standalones are also something I think we’ll pursue once the right project comes along. Building a standalone version of a mobile app is only a question of UI / UX differences – it’s very straightforward. Even for our mobile projects that aren’t targeting desktop, we build desktop versions anyway simply for the purpose of capturing video. It’s that easy.
When publishing to iOS, there are some limitations when it comes to real-time 3D. Which of these has proved relevant to Psyop and how are you addressing them?
The biggest limitation is fill rate. The latest iOS devices have competent processors, lots of memory, and a whole lot of GPU vertex transformation power – but when it comes to drawing transparent pixels, we really hit a wall. This limitation has all sorts of side effects artists don’t expect.
Lighting, shadows, complex pixel shaders — all of these suffer. So we find ways to cheat to achieve the effect we want.
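To see why fill rate bites so hard, some back-of-envelope overdraw math helps (this is an illustrative calculation, not from the interview, and the numbers are rough assumptions): every overlapping transparent layer forces the GPU to shade each covered pixel again.

```python
# Back-of-envelope fill-rate estimate. The screen size matches a
# Retina iPad, but the layer counts are illustrative assumptions.
def pixels_shaded_per_frame(width, height, overdraw_layers):
    """Total pixel-shading work when transparent layers stack up."""
    return width * height * overdraw_layers

opaque = pixels_shaded_per_frame(2048, 1536, 1)       # one opaque pass
particles = pixels_shaded_per_frame(2048, 1536, 4)    # four stacked layers

# Four full-screen transparent layers quadruple the shading work,
# and that multiplier applies every single frame.
print(particles / opaque)  # 4.0
```

This is why effects that are free on desktop GPUs – layered smoke, full-screen glows, soft shadows – are the first things to get cheated or faked on iOS.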
Having creative directors with a strong vision is really helpful. By finding a few reference frames early, we can focus our attention on only the iOS limitations we actually need to work around. Sometimes the boards I see have such a unique look that we literally have to discard all the assumptions graphics hardware makes about lighting and rendering. That usually means we need to write all the shaders from scratch, but it also liberates techniques that would ordinarily be too slow on iOS.
What else do you want to tackle in the realm of Unity? What’s next?
I’m into interactive storytelling opportunities in real-time spaces. To me, the beauty of an interactive narrative experience is the part the user controls. You can unlock a whole new set of emotions with an interactive medium, like pride and guilt, which can lead to a sense of participant responsibility.
There are places traditional media can’t reach, and that is exactly where interactive media should be. I think there is a lot of great interactive narrative work being done in video games (as well as a lot of bad), but it isn’t getting the attention it deserves, probably because most video game plots are based on space marines.