Interview with VFX Pipeline Developer Carlos Anguiano


Pipeline developer Carlos Anguiano offers insight into his role, how he got there, and what it takes to be successful, not only as a pipeline developer but anywhere in the VFX industry as a whole. Here is what he had to say:

G: Can you explain in your own words what pipeline development in film is?

CA: A pipeline developer tends to work mostly at the studio level. While I work with folks to implement solutions on a particular show, those solutions also need to be accessible to every other show at the studio. The biggest difference between a pipeline TD and a show TD is that the show TD is allowed to work more within the bubble of his or her show. Show TDs are free to quickly develop solutions for problems at the show level without having to worry about how they affect the rest of the studio.

Pipeline development is about making tools, scripts, and APIs that all artists and TDs can use to get work done more effectively, while setting a standard the whole studio can follow. That way, pipeline becomes a set of tools and standards that are constantly being improved rather than reinvented on every show. Constantly reinventing the pipeline makes it stagnant and makes sharing resources across shows difficult, if not impossible.

G: As a pipeline developer, what programs or processes do you use on a regular basis?

CA: As a pipeline developer you need to know a little of everything. We are primarily programmers; therefore, a strong knowledge of Python, Qt, MaxScript, .NET, MEL, and the like is a must. In addition, a deep understanding of object-oriented programming and databases is mandatory. At the end of the day, our job is to make other people's jobs easier by writing code. As a pipeline TD you always have to pick up something new, whether it is a new programming language, library, or software package. Beyond the technical side, there is also the challenge of knowing and understanding how the artists you're developing for actually work. A strong knowledge of modeling, shading, rigging, animation, lighting, and compositing, along with an understanding of how work is handed off from one department to the next, greatly influences how successfully the tools you write are adopted. This knowledge base is probably what makes good pipeline developers hard to find.
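To make the idea of studio-wide tools a little more concrete, here is a minimal Python sketch of the kind of utility a pipeline developer might publish for every show to share. The root path, folder convention, and names are purely hypothetical for illustration, not Pixomondo's actual pipeline:

```python
# Hypothetical studio-level pipeline utility: one shared path-resolution
# function that every show calls, instead of each show inventing its own
# folder conventions. All names and paths here are illustrative only.

STUDIO_ROOT = "/mnt/projects"  # assumed network mount


def asset_path(show: str, asset: str, department: str, version: int) -> str:
    """Resolve the published location of an asset version.

    Because every show resolves paths through the same function, a tool
    written for one show can find another show's published work with no
    per-show special cases.
    """
    return f"{STUDIO_ROOT}/{show}/assets/{asset}/{department}/v{version:03d}"


print(asset_path("star_trek", "enterprise", "rigging", 7))
# /mnt/projects/star_trek/assets/enterprise/rigging/v007
```

The payoff of centralizing even something this small is exactly what Carlos describes: the convention is improved in one place rather than reinvented on every show.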

G: How did you decide, or fall into the role, of becoming a pipeline developer? What kind of background knowledge/education is necessary to handle this role successfully?

CA: Becoming involved in pipeline was an organic process for me. I started as a generalist with a focus on animation setup (rigging). Early in my career, I worked in smaller studios where one had to do a bit of everything. I think this experience, along with my background in art (a B.A. in media arts and animation), gave me a real interest in and insight into the big picture of creating rendered images. At the same time, my focus on animation setup had me learning about expressions and scripting, which eventually evolved into tool development outside of rigging.

It was because of these experiences that I went from being a generalist at smaller studios to being a full-time rigging and setup artist at larger studios. During this time I was lucky enough to work at companies like Disney, Digital Domain, Rhythm & Hues, and ILM, which gave me great insight into the pipelines behind some of the most cutting-edge work in our industry. These experiences motivated me to bring some of the ideas I learned to smaller studios, and I began consulting for commercial shops and smaller boutique-style VFX shops.

One of these studios was Scanline VFX, where I eventually took the role of character supervisor and developed their asset pipeline along with their rigging and simulation pipeline for assets. At Scanline I spent most of my time working on pipeline, a responsibility I shared with a fellow artist, Lukas Lepicovsky. Two years into working at Scanline, Pixomondo reached out to me about joining their pipeline development team, which led to my first official position as a full-time pipeline developer.

G: What kind of person do you recommend for the role of a pipeline developer? 

CA: Pipeline developers need to be critical thinkers, have good communication skills, love programming, and have an interest in and a solid understanding of the process of creating visual effects. Moreover, they should excel at working in a team and be willing to do the work for the pride and hardly ever for the praise. The most important quality a pipeline developer should have is that they always work for the end user. The moment developers elevate themselves over the end user, they are no longer working for the betterment of the studio. If I may be honest, you have to be a little messed up to be a good pipeline developer, or show TD for that matter 🙂

G: Does pipeline development take place before post production begins, so that once the pipeline is in place your job is done, or is it an ongoing process that evolves throughout the life of the project?

CA: A studio that truly respects pipeline is always developing, optimizing, and evolving its pipeline. Pipeline is a very broad topic; it deals with the everyday artist tools, which is what most people limit it to, but it also extends to IT, production, and HR. Pipeline is about the big picture of running a business. It helps ensure the maximum output with the least amount of input necessary. These days it is becoming a deciding factor in whether a studio stays open or goes under.

G: How did you end up working on Star Trek Into Darkness? How long were you working on the film?

CA: My involvement with Star Trek Into Darkness came through the development of several tools I created for worldwide adoption at Pixomondo. These tools were part of a larger effort to unify the studio under what I refer to as a modular, non-linear asset-flow pipeline. The tools used to achieve this included a proprietary hierarchical referencing system for 3ds Max and an easy-to-use render pass manager with full Deadline and Shotgun integration for versioning and batch submission of elements to the farm. The nature of this type of development is that it's not usually adopted by the entire studio in one swift move. Instead, a single show that is early into production is used as the test bed. In this case, the crew and supervisor of Die Hard 5 took the ball and ran with the new pipeline, which allowed us to quickly prove the difference these tools were going to make in our ability to create the work faster, with more revisions and hopefully less overtime. They were also critical in helping us work out some of the main bugs in the new workflow.

Due to the success of the Die Hard show, the Star Trek crew showed great interest in adopting the new tools. However, the asset system proved tricky because it meant retrofitting parts of the show that had already been approved using the older pipeline. At that point we switched our focus to moving Star Trek onto our new render pass manager workflow. The tool was adopted quickly, with only small bugs being reported, which I was able to address immediately. The scope of the Star Trek show pushed the render pass manager to some pretty crazy limits, which made it evident that I needed to optimize the code. At the same time, the Trek artists and artists evaluating the tool in other branches were coming up with a number of exciting feature requests that were impossible to ignore. All in all, I was on and off the show helping with development for around eight months.

G: Did you encounter any challenges while working on the project? Were you mentally stretched in any particular way, or learn something new as a pipeline developer, that you would like to share while working on Star Trek Into Darkness?

CA: As a developer at Pixomondo, the biggest challenge is keeping track of how changes affect each and every branch. Changing something that seems insignificant can have devastating effects at another branch. During Star Trek, we were in the process of unifying our pipeline across all branches and all shows to facilitate asset and work sharing. Since each branch has a unique history and pipeline legacy, a developer must tread carefully, since things he or she takes for granted might not work in other facilities. Hitting these kinds of walls usually means taking a detour to fix a different set of problems before making progress on the original task. It can be a slow process and can greatly test people's patience at times.

G: What projects are you working on currently?

CA: I am currently taking some time off to develop some pipeline tools. For anyone who's interested, I have some information about it on my blog.

G: Is the role of a pipeline developer your end goal, or is it a stepping stone towards something else? Do you intend to take on another role in the future?

CA: I enjoy being a supervisor and I wouldn’t mind getting back into that sometime. Ideally, I would like to do more consulting, training, independent software development, and remote support.

G: For people striving to become a VFX artist in the industry, can you offer any advice on how they can get their “foot in the door?” Are there any pitfalls you would recommend avoiding that you personally experienced?

CA: It seems to me that companies looking to hire pipeline developers place great value on computer science or equivalent degrees. Pursuing this area of study is probably a more direct route into pipeline work than the art degree I received.

If you go the computer science route, make sure to set aside time to learn packages like Maya, Max, and Houdini. Many programmers finish school having learned all there is to know about complex programming, such as writing a fluid simulator from scratch, but have no practical knowledge of the software used in the industry or of how to develop for it. If you can figure that out on your own, you'll be way ahead of the game.

Compression Apps: Pros and Cons

Whether you are a seasoned editor or new to video editing, one of the many things that will frustrate you right off the bat is dealing with exporting and compression. Sometimes it can be straightforward, if your client gives you requirements for the format they need their outputs in. Other times, you will find yourself playing a game of compression roulette, trying to get good quality at a small file size only to find the format you chose isn't compatible with your client's needs. Thankfully, there are many applications, such as Apple's Compressor, Adobe Media Encoder, and MPEG Streamclip, designed to alleviate your potential exporting nightmare. In my years as an editor, I've used all of these apps to facilitate deliverables for my clients.

In this article, I will discuss the pros and cons of each application to provide a better understanding of which app may fare better in different situations. As a disclaimer, these pros and cons are based on my personal experience and may not match your experience exactly.


Apple Compressor (sold as part of Final Cut Studio 3: $999, as a separate app: $50)

Pros

  • Great to use when you need to encode your finished file for a DVD
  • Encoding can be automated by creating droplets
  • Greater and more detailed customization than what is allowed in Final Cut Pro 7 or X
  • Has better conversion for slowing down footage using optical flow

Cons

  • Roundtripping from FCP 7 tends to yield poor results and can cause crashes
  • Encoding to H.264 can be really slow at times
  • It’s not a fully 64-bit application
  • Interface hasn’t changed in version 4 and is still a bit confusing for new users
  • Doesn’t take advantage of all cores on a multi-core Mac

Overall, I would use Compressor if I needed to encode a project for DVD or needed specific customization for a client deliverable. Otherwise, it’s the least used encoding application in my toolbox.


MPEG Streamclip (free app from squared5.com)

Pros

  • Can convert to QuickTime, DV, .avi, .mp4, and more
  • Has the ability to open DVD Video TS folders
  • You can batch encode multiple files into one format
  • Preferred app for DSLR users with H.264 footage
  • Works on PC and Mac
  • Can trim, cut and join other movies together

Cons

  • Only converts audio to .aiff, which can result in larger audio file sizes
  • Doesn’t support AVCHD or MXF file conversion
  • Parameters can be confusing for people who aren’t video savvy

Overall, I believe MPEG Streamclip is a must-have in your toolkit if you need quick-and-dirty conversion. It is highly recommended in the DSLR community, and the best part of all is that it's free.


Adobe Media Encoder ($50 a month as part of the Creative Cloud)

Pros

  • Can encode to QuickTime, .wav, .mxf, and many more formats
  • Comes equipped with presets for many multimedia needs, such as web, DVD, broadcast, iOS, Android, and more
  • Can be queued from Premiere and After Effects
  • Two-pass encoding is available for higher quality output

Cons

  • The learning curve is not especially beginner-friendly
  • Two-pass encoding can be slow if you aren’t using a reasonably powerful computer

Overall, I’ve always found Media Encoder to be my encoding application of choice. The number of headaches it’s relieved is second to none. With the next iteration on the horizon with CC, it will only grow stronger and more dependable with time.

That’s my assessment of the popular encoding applications used by video editors. They each have their pros and cons, but I’m a firm believer in using whatever gets the job done best and gives you the fewest headaches. There are many other encoding apps on the market, but these three tend to be the most used and reliable for the various post production tasks that may arise.

I’m the NLE Ninja with AudioMicro asking you to stay creative.


Interview with VFX Artist Josh Bryson


Visual effects artist Josh Bryson talks about digital compositing, the VFX industry, and his experience working on BBC’s Doctor Who.

Josh Bryson is currently a digital compositor working at Stargate Studios. He has worked on projects such as Heroes, The Walking Dead, Revenge, 24, and most recently, BBC’s Doctor Who. His entire body of work, including his upcoming projects, can be reviewed on the Internet Movie Database (IMDb) here.

If you would like to learn more about Josh Bryson you can visit his website at www.brysondigital.com.

During the interview with Josh Bryson, the topic of the unstable atmosphere of the VFX industry as a whole came up. For those of you who are unaware of the situation currently affecting everyone in the post production world, I want to take a moment to provide some necessary context.

In February 2013, the visual effects studio Rhythm & Hues declared bankruptcy. As a result, 254 visual effects artists were laid off. At the 2013 Academy Awards, which took place right around this time, nearly 500 VFX artists protested the mass layoffs outside. Inside, Life of Pi, whose effects had been created by Rhythm & Hues, was nominated for multiple awards, one of them for the studio’s post production work. When Bill Westenhofer, the visual effects supervisor for Rhythm & Hues, gave his acceptance speech, his microphone was cut off almost immediately. Visual effects artists were in an uproar, criticizing director Ang Lee for not acknowledging the extensive VFX work that went into making Life of Pi a great film.

As a form of protest, some visual effects artists and supporters have changed their social media profile pictures to chroma key green, with the following message attached:

“Why have so many profile pictures gone green? Well the company behind the Life of Pi’s stunning visual effects, which made the movie possible, Rhythm & Hues went bankrupt as the film just passed the half billion dollar mark in global ticket sales. The 3D & VFX (visual effects) companies that make the Hollywood blockbuster movies possible, sign on to bad deals typically at a loss, the Hollywood production companies walk away with profits, and artists who dedicate their lives to their craft get short changed on salary, over-time and eventually job security. The green is a form of solidarity and protest for change in our industry.”

If you would like to learn more and stay up to date with the situation in the visual effects industry, one name is recommended above all others: VFX Soldier. An anonymous VFX artist working in the industry, he reports on its issues and exposes adverse practices by both the studios and the post production academies around the globe.

Interview with VFX Artist Eri Adachi

Visual effects artist Eri Adachi, who has worked on projects such as Star Trek Into Darkness, Iron Man 3, Les Miserables, Doctor Who, and Once Upon a Time, recently answered some of my questions about the art of compositing and how they got to where they are today.

https://vimeo.com/41829101

G: Can you explain in your own words what compositing is?

EA: Making something fake look real; the art of combining elements.

G: As a compositor, what programs do you use on a regular basis?

EA: Mainly I use Nuke.

G: How did you decide to become a compositor – I see you started in Japan – what did your journey look like to get you where you are today in London?

EA: Actually, I started my career in London back in 1999, working with a company making visuals for concerts. I worked in the lab making special slides; I really liked the techniques involved and the precision. From there I began working with After Effects for a company called Urban Visuals. Once in Vancouver, I had a chance to work on the feature film “Blades of Glory” at Rainmaker (now Method) as a roto artist in 2006. My VFX career started then.

G: You were a compositor for Star Trek Into Darkness – What scenes or sections of the film were you working on?

EA: I worked on the scene when the Enterprise is under attack, and Kirk is trying to reactivate the nuclear reactor.

G: Did you encounter any challenges while working on the film? Were you mentally stretched in any particular way, or learn something new as a compositor, that you would like to share while working on Star Trek Into Darkness?

EA: Working on shots whose requirements change as you are working on them is one of the biggest challenges for any compositor. Finding a balance between what the 3d department was providing and my own preparation in putting together my shots was one of the biggest challenges on Star Trek.

G: What projects are you working on currently?

EA: I am currently working on 300.

G: Do you have any inspirations? Whether it be from nature or from another artist, where do you draw your creativity from?

EA: My inspiration is the real world, everyday things and how they look, and thinking about how I can use the techniques I know, and illusion, to recreate them in a shot, or take something I observed and make it again.

G: Is the role of a compositor your end goal, or is it a stepping stone towards something else? Do you intend to take on another role in the future? (For example: VFX supervisor, etc.)

EA: I am very interested in compositing and whatever work that leads to, but my interest has always been this part of the VFX world.

G: For people striving to become a VFX artist in the industry, can you offer any advice on how they can get their “foot in the door?”

EA: I find that networking is quite important for me in London!

If you would like to learn more about Eri Adachi, you can visit their website at www.eriadachi.com where you can see their demo reel, breakdown, and more!


Swap Slide Transition in Premiere


One of the many tips I learned when I started editing was to be observant of what I see on the screen. When I wanted to learn how to recreate a transition, effect, or animation and there was no tutorial or breakdown available, I would watch the example over and over to fill in the pieces. By doing that, I learned how to create my own effects and transitions in various editing applications, as well as how to turn those into successful tutorials. What I’ve recently learned in Premiere is how to create the over/under transitions I was used to seeing in FCP 7. The first one I did was a Sliding Page transition. By observing a video clip I saw online, I was able to break it down into its essential elements. In short, it was nothing more than animating the scale and position parameters, switching clips onto their original video tracks, and adding a quick gradient behind them. Once I put the pieces together, it was simple to recreate. I took a similar approach for this transition as well.

The next transition I will show you how to do is a Swap Slide. This transition involves swapping your outgoing clip with your incoming clip.

Swap Slide Transition Setup

First, you want to have two clips on your timeline like the picture below.

Screen shot 2013-05-26 at 3.11.58 PM

Next, add keyframes for position on both clips. For the clip on track 1, I’ll add a keyframe for position at its default value.

Screen shot 2013-05-26 at 3.42.31 PM

Let’s move 13 frames forward and add another keyframe with the clip moved to the right, almost offscreen.

Screen shot 2013-05-26 at 3.44.34 PM

Move 12 frames forward and change the position value back to the default.

Screen shot 2013-05-26 at 3.45.43 PM

Now we need to add the same number of keyframes to the clip on track 2, but instead of moving it to the right, we will move it to the left. Follow these screenshots as a reference.

Screen shot 2013-05-26 at 3.47.21 PM

Screen shot 2013-05-26 at 3.56.49 PM

Screen shot 2013-05-26 at 4.08.29 PM

The final step in creating this transition is to make a blade edit and swap the video clips. First, let’s make a blade edit on the second keyframe of each clip.

Screen shot 2013-05-26 at 4.10.01 PM

Move the clip on track 1 to track 2. Do the reverse for the clip on track 2.

Screen shot 2013-05-26 at 4.11.16 PM

If you do all that, you will get a result that looks like this.
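If it helps to see the timing as numbers rather than screenshots, the keyframe schedule above can be sketched in a short Python snippet. The 1920-pixel frame width and the 90% "almost offscreen" offset are my own assumptions for illustration, not values Premiere uses:

```python
# Sketch of the swap-slide keyframe schedule: each clip gets three position
# keyframes -- start at center, slide almost offscreen 13 frames later,
# then return to center 12 frames after that.

def swap_slide_keyframes(start_frame, width=1920, out_frames=13, back_frames=12):
    """Return (frame, x_offset) keyframes for the outgoing and incoming clips.

    The outgoing clip (track 1) slides right; the incoming clip (track 2)
    mirrors the move to the left.
    """
    slide = int(width * 0.9)          # "almost offscreen" horizontal offset
    mid = start_frame + out_frames    # second keyframe, fully slid out
    end = mid + back_frames           # third keyframe, back at center
    outgoing = [(start_frame, 0), (mid, slide), (end, 0)]
    incoming = [(start_frame, 0), (mid, -slide), (end, 0)]
    return outgoing, incoming

out_kf, in_kf = swap_slide_keyframes(0)
print(out_kf)  # [(0, 0), (13, 1728), (25, 0)]
```

The blade edit on the second keyframe then corresponds to cutting both clips at the `mid` frame and swapping their tracks.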

There you have it: another over/under transition for the FCP converts who now use Premiere. If you want the transition to happen sooner, you can change the timing of the keyframes to your liking. If you are a PC user, this tutorial may be less relevant, as this transition still exists in the Slide category. If you want to purchase a package that includes this transition ready-made, you can get the GenArts Sapphire package or BorisFX’s RED package; both offer it within their vast categories. While they are great to have, they can be expensive if you don’t have the budget, so purchase wisely.

I’m the NLE Ninja with AudioMicro asking you to stay creative.
