Thursday, February 21, 2013
Given that this is my first real effects film (I only had 120-odd effects shots in Leap 2), I’ve been doing a lot of studying up on effects and how to shoot them. When I shot Leap 2, I watched the Bourne films over and over. Now I’m watching a lot of Transformers 1, 2, and 3, The Avengers, Captain America, and Resident Evil: Retribution. I’ve also been watching a bunch of independent short films that are effects heavy. Not only am I looking to learn how to shoot for CG, but also what gives these films their epic scale. Most of it comes down to composition and camera movement.
We will be shooting on my T2i, and though it’s an older HDSLR, it’s still running strong. I was a bit worried about doing effects within the limited colorspace I have, but I don’t think it’ll be a problem. I’ve been shooting tests with the kids of the family I’m staying with, tests of shots I’m planning on using in the film. So far, they’re turning out just fine. In the process, I’m getting practice shooting for effects (the hardest part is visualizing where exactly the CG will be in the frame) and I’m getting to know After Effects more intimately, especially the rotobrush. I think the rotobrush gets a bad rap sometimes for being too sharp, but I’m learning how to tweak it and keyframe its feather to handle motion blur.
Software-wise, I’m cutting in Premiere and dynamically linking effects shots into AE. Once in AE, I have access to its 3D tracker, the aforementioned rotobrush, and Element 3D 1.5. On the last film, After Effects was what made the movie possible. This time it’s AE and, more specifically, its Element plugin. To complement Element, I’m running the latest version of Blender to animate models I get from a Top Secret website (no, it’s not Turbo Squid). A bunch of the models I’m using have been pulled from videogames, so I’m significantly altering their diffuse textures in Photoshop so they won’t be recognized. Element’s biggest strength is also its biggest drawback: in order for E3D to be realtime, its rendering engine is basically the same found in video games. This can make it difficult to achieve the photo-realism that I’m going for, so I’m planning to save Element for shots with a lot of motion blur or shots that have elements in the distance.
I’m currently in the middle of breaking down the script, and for the first time on a movie, I’m scheduling a second unit. But here’s the thing: I AM THE SECOND UNIT. Let me explain. In order to keep costs down, I’ll be shooting everything with the actors at one time. Our schedule is looking to be about a month and a half. This is the first unit. Then, when I’m all done with the actors, I become the second unit director and will shoot all the plates and elements that I’ll need for the effects. There are a lot of establishing shots and plates that I won’t have to shoot with actors, so why schedule them then? If I know a shot will have a digital double, there’s no point in having the actor there. We’ve got too much other movie to shoot.
When I wrote the script, I had planned to shoot the film both in Montana and in Spokane, WA. To cover wages, food, props, and travel/lodging costs, the budget is sitting at about $18,500 (roughly) right now. A major chunk of that is bringing the actors out to Montana and putting them up for the duration of the shoot. I may be better off finding locations an hour from Spokane, and then I can either cut that cost or put the money elsewhere. I haven’t decided on this one yet, still praying about it.