Background & Professional Journey

The path from live performance through research, industry leadership, and the founding of Lifelike & Believable

Download my professional CV

A comprehensive overview of my career spanning three decades

Download CV

My entire career has been driven by a single pursuit: creating believable presence in interactive systems. Whether on a theater stage, in a game engine, or in a shared immersive space, I've worked to understand how authentic, responsive, emotionally resonant presence can emerge from the marriage of artistry and technology.

This journey began in live performance in the late 1980s and evolved through research at NYU's Media Research Lab, founding Improv Technologies, and leadership roles at Electronic Arts, Relic Entertainment, and Microsoft. Each chapter taught me something essential about how to build systems that empower artists, how to synthesize creativity across disciplines, and how technique exists to serve vision. Today, I apply those lessons through Lifelike & Believable, advising studios worldwide on real-time animation architecture, and through Shocap Entertainment, creating live XR experiences that prove technology can be a bridge for authentic human connection.

The unifying thread across all of this work is simple: I believe that when artistry and engineering are properly aligned, they can expand what it means to tell stories and create presence in the modern age. The stage is no longer limited to a theater building. It can be a game, a virtual space, or a shared immersive experience. But the principle remains the same—create something that feels true, something the audience believes, something that connects.

1989–1995

Live Performance

Lighting & Scenic Designer, NYC

After completing my studies in Film & Television production at NYU in 1989, I began my career in the NYC live performance industry. I started that summer as a production electrician and light board operator for the Big Apple Circus, then transitioned into design work, serving as lighting and scenic designer for theater, music, and dance productions. I was the resident designer for the King's County Shakespeare Company and resident lighting designer for The Balinese-American Fusion Dance Company. These years were foundational. Working in live performance taught me something essential that would shape my entire career: the difference between believable presence and artifice. On stage, every moment is live. The audience watches for reactions that feel true, for movements and light that respond authentically to the moment. There is no room for lag, no tolerance for the mechanical, only connection. That understanding of real-time responsiveness, of how to create believable presence in a shared space, became the guiding principle for everything I would pursue in interactive systems.

1994–1999

New York University

Research Scientist, Media Research Lab

My career at the NYU Media Research Lab began with a chance encounter with a high school friend that led to an introduction to Dr. Ken Perlin, already a luminary in computer graphics and animation for his work in procedural texture synthesis. I initially joined the MRL as an artist-in-residence on the 'Immersive Environments' project, integrating interactive sensors and devices into physical spaces to create immersive narrative experiences. Simultaneously, Ken had begun applying his procedural synthesis work to movement, creating an interactive 3D ballet dancer that could be choreographed in real time through keyboard input. This piece would be showcased at the SIGGRAPH Electronic Theater in 1994. As a former film student, I was fascinated by the cinematic potential of computer animation; as an artist working in live performance, I found the opportunity to combine animation's visual diversity with live performance's emotional immediacy irresistible. I began work on what became the Improv Animation System, and by the end of 1995 I was working full-time as a Research Scientist at the MRL. Over five years, I presented this work at conferences and roundtable discussions worldwide, publishing articles in industry and academic publications, including the 1996 edition of Computer Graphics, the field's principal academic journal. Ken and I were awarded a patent for 'A Method and System for Scripting Interactive Animated Actors'. This work would become the foundation for everything I would pursue in the years to come.

Key Highlights

  • Developed the Improv Animation System for real-time character choreography
  • Presented work at international conferences and academic forums
  • Published in Computer Graphics and other peer-reviewed publications
  • Co-invented patented method and system for scripting interactive animated actors

1999–2003

Improv Technologies

Co-Founder & CTO

In 1999, I formed Improv Technologies with Ken Perlin to pursue commercial applications for the technology we had developed at NYU. Over four years, we raised $8 million and grew the company to approximately 40 full-time employees working on three flagship products. Orchestrate 3D was a non-linear animation editing system integrated with existing commercial software including Maya, SoftImage, and 3D Studio Max. Catalyst was a real-time animation layering and blending system, one of the first commercially available middleware packages for the PlayStation 2. Fusebox was a peer-to-peer framework for linking artist tools and workflows across multiple disciplines and locations. The company was the rare startup that stayed true to its core mission of enabling artists, not just chasing valuations. We won a Year 2000 Innovation Award from Computer Graphics World magazine for our work. Improv shut its doors in early 2003 due to the cooling venture market following the dot-com bust. The technology and vision had not failed; rather, capital for technology ventures had become scarce. The experience taught me invaluable lessons about building products for creative professionals, managing teams through uncertainty, and holding to artistic principles even in challenging business climates.

Key Highlights

  • Co-founded Improv Technologies and raised $8M in venture funding
  • Grew company to approximately 40 employees developing three commercial products
  • Orchestrate 3D: non-linear animation editing for Maya, SoftImage, 3D Studio Max
  • Catalyst: real-time animation middleware for PlayStation 2
  • Fusebox: peer-to-peer artist tools and workflow framework
  • Won Computer Graphics World Year 2000 Innovation Award

2003–2005

Sun Microsystems

Games Technology Group

In 2003, my family and I relocated to the Bay Area to join Sun Microsystems' newly formed Games Technology Group. At Sun, I was responsible for developing Java-based APIs and SDKs for game developers, working at the frontier of mobile gaming, a frontier that barely existed in most people's minds at the time. I spent considerable time traveling the world, meeting with mobile operators and device manufacturers to explore the prospect of single-player and multiplayer games on handheld devices when most people still carried flip-phones and PDAs were almost exclusively business tools. The iPhone wouldn't launch until 2007. The experience gave me deep insight into the technical constraints and creative possibilities of real-time graphics on resource-limited devices, and exposed me to the ecosystem thinking required to bring new platforms to market. In many ways, this period was about learning to think systemically: how technology, business, and creativity intersect at the emergence of new platforms. I remained at Sun until 2005, when I left to join Electronic Arts.

Key Highlights

  • Developed Java-based gaming APIs and SDKs for mobile devices
  • Consulted with mobile operators and device manufacturers globally
  • Pioneered early mobile gaming during pre-smartphone era
  • Built expertise in resource-constrained real-time graphics
  • Developed platform ecosystem thinking

2005–2009

Electronic Arts

Product Manager, Animation & Physics, EA Tech

In 2005, I joined Electronic Arts' central technology group, EA Tech, as Product Manager for Animation & Physics. Over four years, I led the design and development of ANT (Animation Toolkit), transforming it from an engine custom-built for the FIFA franchise into a flexible, modular interactive animation framework that became the animation technology core of Frostbite, EA's internal game engine, now used across all of EA's studios and franchises. My role required me to travel to EA's studios around the world, meeting with teams of diverse backgrounds working on games across genres with wildly divergent mechanics. I had to synthesize the needs and desires of all these stakeholders into a unified vision for how AAA real-time interactive animation could be produced and presented. During this period, I championed quarterly release schedules combined with an internal 'open source' strategy that enabled every team to test their games against the latest code in advance of official SDK releases. I also organized an annual animation conference bringing animators, technical animators, and animation engineers from across EA's studios together to share innovations and concerns. Following ANT's integration into Frostbite 3, I put the tools I had built to use myself, serving as Technical Animation Director on an unannounced third-person action title. When the project was cancelled after about a year, I left to join Relic Entertainment.

Key Highlights

  • Led design and development of ANT (Animation Toolkit) animation framework
  • Transformed ANT from FIFA-specific engine into modular, studio-wide solution
  • Integrated ANT as animation core of Frostbite game engine
  • Synthesized requirements from diverse game teams across all EA studios
  • Championed quarterly release schedules and internal open-source strategy
  • Founded annual EA animation conference bringing studios together
  • Served as Technical Animation Director on unannounced AAA project

2010–2011

Relic Entertainment

Technical Animation Director

In 2010, I joined Relic Entertainment as Technical Animation Director on Warhammer 40,000: Space Marine, a visceral third-person action game set in Games Workshop's grimdark 41st millennium universe. I was responsible for defining the animation architecture and pipeline for the game, working closely with the animation team, gameplay designers, and engine programmers to ensure that every movement felt physically grounded and emotionally authentic, from the heavy, purposeful stride of a Space Marine in powered armor to the frantic scramble of terrified cultists. The challenge of making 40K's massive, armored warriors feel human and vulnerable despite their superhuman capabilities required a deep understanding of weight, momentum, and the subtle signals that communicate intent and emotion. Space Marine shipped in 2011 to critical acclaim, praised for its tight controls and visceral combat feel. The game became a commercial success and launched a franchise that continues to this day. The experience reinforced a core belief: technical mastery in service of creative vision is what separates good games from great ones. Whether human or system-based, the animator's role is to make the impossible feel inevitable.

Key Highlights

  • Technical Animation Director on Warhammer 40K: Space Marine
  • Defined animation architecture and pipeline for action-focused combat game
  • Collaborated with animation, gameplay, and engineering teams
  • Created believable motion for armored warriors and diverse enemy types
  • Shipped to critical acclaim in 2011

2011–2014

Microsoft / The Coalition

Technical Animation Director

I was brought into Microsoft as Technical Animation Director for the studio that would later become The Coalition. During my time there, in addition to leading gameplay animation development on an unannounced third-person action title, I worked on the early pre-releases of Unreal Engine 4. I provided input on key features that would become part of the UE4 foundation, including State Machine Conduits, which we implemented at Microsoft and which were later integrated into the core engine. This period of technical innovation also led to work with Jerry Edsall on player avatar movement systems, resulting in a patent for Player Avatar Movement Assistance in a Virtual Environment. I also served on an advisory board for the Xbox One prior to its release, ensuring the hardware was properly optimized for real-time animation. I remained at Microsoft until 2014, when it was announced that the studio would take over the Gears of War franchise. At that point, I decided I didn't need to work on another action-shooter with huge guys in armor fighting aliens in armor with chainsaws. After all, the franchise I had just shipped was Space Marine. It was time to chart a new course.

Key Highlights

  • Technical Animation Director at Microsoft studio (The Coalition)
  • Led gameplay animation development on unannounced AAA title
  • Contributed to Unreal Engine 4 foundational features and architecture
  • Implemented State Machine Conduits, later integrated into UE4 core
  • Advised Xbox One hardware optimization for real-time animation
  • Shaped early UE4 animation systems and best practices
  • Awarded patent for player avatar movement assistance in virtual environments

2014–Present

Lifelike & Believable

Founder

In February 2014, I founded Lifelike & Believable as a way to reach and engage with a wider variety of developers and projects. The role was not unlike the one I had played with EA's internal studios during my tenure there, but at an industry-wide level. The goal was to democratize access to deep expertise in real-time animation systems, working with studios of all sizes to help them build believable, responsive characters and bring their creative visions to life. Founding Lifelike & Believable also afforded me the flexibility to start Pepper's Ghost New Media & Performing Arts Collective, producing live performances featuring real-time animation. In 2020, I co-founded Shocap Entertainment as a complementary creative venture, exploring the convergence of immersive XR technology with live performance and storytelling. This was the original vision that had led me to leave theater and join NYU's Media Research Lab nearly thirty years before. In many ways, this represents the full circle of my career: the principles I learned in live performance, the tools I helped create through research and industry work, and the collaborations I've built across disciplines, all now channeled into a single mission. I want to prove that technology and artistry aren't separate domains, but complementary forces that, when properly aligned, can expand what it means to tell stories and create presence in the modern age.

Key Highlights

  • Founded Lifelike & Believable Animation Design in 2014
  • Provide consulting and technical direction to studios worldwide
  • Advise on real-time animation architecture and systems design
  • Co-founded Pepper's Ghost New Media & Performing Arts Collective
  • Co-founded Shocap Entertainment in 2020 for immersive XR experiences
  • Pioneered live performances using real-time motion capture and virtual production
  • Synthesized 25+ years of expertise across performance, research, and industry

2020–Present

Shocap Entertainment

Co-Founder & Creative Director

In 2020, I co-founded Shocap Entertainment as Creative Director to create and launch commercial live XR events and performances that merge cutting-edge immersive technology with compelling storytelling. The flagship project, Carry Me Home: A Live & Virtual Journey Into the Creative Mind of Singer-Songwriter Didier Stowe, represented an ambitious fusion of live performance, motion capture, and extended reality. The project was funded through an Epic MegaGrant, recognizing its innovative approach to immersive entertainment. In 2024, Carry Me Home won the Numix Prize for best XR Experience, validating our vision that technology can create profound emotional connections through shared, immersive storytelling. Shocap Entertainment embodies the culmination of my career's core mission: using real-time systems and interactive technology not as ends in themselves, but as vessels for authentic human connection and artistic expression. Whether through live XR events or future immersive experiences, Shocap explores how advanced technology can expand the boundaries of what live performance can be.

Key Highlights

  • Co-founded Shocap Entertainment in 2020
  • Served as Creative Director for commercial live XR events and performances
  • Created Carry Me Home: A Live & Virtual Journey Into the Creative Mind of Didier Stowe
  • Secured Epic MegaGrant funding for innovative immersive entertainment
  • Won Numix Prize in 2024 for best XR Experience
  • Pioneered fusion of live performance, motion capture, and extended reality

Unifying Thread

Looking back across three decades, I see a continuous thread connecting each chapter: the pursuit of believable presence in interactive systems.

Live performance taught me that presence is earned through authenticity: real-time responsiveness to the moment, subtle communication of intention, and the discipline of making the audience believe they're witnessing something genuine. When I transitioned to research at NYU, it wasn't a departure from that principle. It was a translation of it. The Improv Animation System was an attempt to bring live performance's emotional immediacy into digital form, to create characters that didn't just animate but reacted, adapted, and felt present.

Founding Improv Technologies taught me to think as a systems architect. How do you build tools that empower artists rather than constrain them? What interfaces unlock creativity across disciplines? At EA, I applied those lessons to a much larger scale: how do you create a shared animation language across dozens of studios with different games, different aesthetics, different technical constraints, all united under a single framework that was both powerful and accessible?

My time at Relic and Microsoft grounded all of that systems thinking in the reality of shipped products. Games had to perform, had to feel responsive, had to ship to millions of players. It reinforced what I'd learned in theater: technique exists to serve vision. The most elegant animation architecture means nothing if it doesn't support the emotional and mechanical requirements of the game.

Founding Lifelike & Believable was about returning to first principles, but at scale. I wanted to work with teams who understood that real-time interactive systems are a craft. The care taken in architecture, the thoughtfulness in design, the clarity in communication all translate directly to what players experience. Launching Pepper's Ghost closed a loop that began in the 1980s: the chance to merge everything I've learned about interactive systems with the live performance roots that started it all. In 2020, co-founding Shocap Entertainment took this mission further: creating live XR events and immersive experiences that prove technology can be a bridge for authentic human connection. Carry Me Home demonstrated this principle at scale, as a live, networked experience that used real-time systems and motion capture not as spectacle, but as a gateway to genuine emotional resonance. Creating believable presence, whether on stage, in a game engine, or in a shared immersive space, is the unifying mission of my entire career.

Ready to Work Together?

Let's discuss how we can bring your vision to life.

Get in Touch