Whether you like it or not, 3D is here to stay. There are close to 30 movies in the format already scheduled to be screened in 2013, and that number is likely to increase. The technology has earned a bad reputation through shoddy post-conversions and dim projections, but there are a few filmmaking teams truly exploring 3D’s artistic potential. Laika, the production company behind Coraline and ParaNorman, is one, but there is also the less lauded work of director Paul W.S. Anderson and his collaborators. Anderson’s films are built with 3D in mind, from the screenplay to the set design through to the shooting itself.

FILM COMMENT spoke with Anderson’s frequent director of photography, Glen MacPherson, who has shot all three of Anderson’s 3D features: Resident Evil: Afterlife (10), The Three Musketeers (11), and Resident Evil: Retribution (12). We discussed the technology’s constant evolution, the challenges it places on the production team, and why you can’t easily fake a punch in 3D.

Final Destination

How did you start working with Paul W.S. Anderson? Did he see your 3D work on The Final Destination (09)?

Yeah, I guess so. I was on another project and they flew me into Toronto to meet with him. I guess they called me in because I had some 3D experience, and he loved Rambo (08) and other non-3D movies I had worked on. I actually shot a film for one of his best buddies, who was the best man at his wedding. So we had friends in common. When you work with someone, you want someone you can hang out with for six months.

Was that the first time you worked in the new 3D process?

I remember the director calling me up, he was going into a meeting for Final Destination 4, and he said, “We’re going to shoot it in 3D,” which made it interesting. After his meeting he asked me if I could meet with the Cameron-Pace Group, who had all the rigs. Avatar hadn’t even been released yet, and they were developing all of these new systems. It was still in its infancy. 3D rigs were still fairly new, but pretty sophisticated. As for the cameras that go on them, at that point we shot with these Sony F23s, small-chip cameras. By the time we got to Resident Evil: Afterlife we were with the Sony F35. Things get better and better every year.

Could you talk about the preparation you and Paul put into a 3D project?

Before we shot Resident Evil: Afterlife, Paul and I went to see a lot of 3D movies. We either went to the theater or set up private screenings at the studio. Paul’s an amazingly quick study, the guy’s really smart. He had never shot a 3D movie, but he did his research and clicked with it. He’s just a natural in 3D. As for preparation, the sets may be a little different. You don’t want to put furniture up against the walls; you want to build some depth into the sets and into the blocking. He storyboards most of the movie, and the boards are drawn to take advantage of the 3D.

Did any of the screenings particularly stand out?

I’m trying to remember all the movies we saw. We watched Journey to the Center of the Earth [08], we saw Coraline [09]. Coraline stood out. That was a terrific movie and a great use of 3D.

Three Musketeers

You used three different cameras for the three films you made with Paul. Could you talk about the cameras you chose, and why they worked for those specific films?

There’s a difference between the cameras and the rigs. The rigs are metal and software and motors and everything that enables you to control the 3D on set. Then you put basically whatever cameras you want on the rigs. The bigger the camera, the bigger the rig is going to be. We saw a system in Germany that was pretty early in its development, and it was so huge. You could make a movie with it, but they were making it with a young child actor, and how is that kid going to see who he’s acting with? He’s going to have this massive camera in front of him that you can’t see around. You end up putting little Xs on the camera to act to. We always try to keep the actual 3D rig as small as possible. On Resident Evil: Afterlife we used Sony F35 cameras. At the time they were the best available. They were just coming off the assembly line at that point, they were very new, not a lot of people had used them. But we did a couple of tests. The thing with those cameras is that we recorded to tape. You’re always tethered: from your camera rigs you’ve got all these cables going to the recording stations. We had four cameras working and four huge recording stations—there’s a lot of hardware to drag around.

For Three Musketeers we shot in Germany, and the Arri Alexa was just coming out. Again, they were just coming off the assembly line. Nobody had really shot a movie with it yet. Robert Richardson was going to start shooting Hugo in 3D with those cameras, and I got a sneak peek at his tests at the Cameron-Pace Group, and said, “Wow, this is a great camera.” I didn’t have a chance to test it or anything, but we decided to use that camera. Once again, it was tethered to recording stations and we had to haul all that around Bavaria. But the technology had gotten better and the stations had gotten smaller and more compact and portable. We shot almost entirely on location, in real castles in Bavaria.

What about Resident Evil: Retribution?

We were shooting in Toronto, and it was a big action movie so the cameras had to be really mobile. There’s no way we’d be able to tether these rigs to recording stations, with car chases and all the stuff we have to do. We really want to be mobile. So my second-unit DP, Vern Nobles, and I put together a 3D system we call Cinesail 3D Systems. This is the one time we didn’t go to Cameron-Pace. At the time there was Element Technica, a California company, building very good 3D rigs. We bought the rigs, and the day they were delivered, the company merged with 3ality and became 3ality Technica. 3ality specialized in software for 3D, and Element Technica was a hardware company, and they merged. So we got these new rigs with great software and hardware, and they were very compact. They have a whole range of rigs, and we bought the Atoms, which I guess are named to convey how small the rigs are. We used Red Epic cameras with them. The reason we did that is that you put memory cards into the cameras; that’s what you record on. You are not tethered to anything. We could put those 3D rigs on the cars for the car chase and just let them go by themselves. No supporting equipment. It’s the first time we’ve been able to do that. So when you’re watching the movie, you feel like, “Holy crap, the cameras never stop moving—tons of action.”

Just before that I had shot Glee: The 3D Concert Movie [11]. We had 12 rigs, so I used six Arri Alexas and six Red Epics. It was mostly a test for me, so that as we went into post we could compare the two cameras and see how they cut together. In color correction and everything, I wanted to see what the difference was going to be. And although there were differences, there was nothing negative about either of them. You could cut them together perfectly.

There was no big difference between the two cameras?

Well, with the Arri Alexa you can actually record to a card as well, but it’s a low-resolution recording. It’s good for television and some features, but visual effects companies don’t like working in lower resolution. So actually, for that concert, all the Arri Alexas were tethered with fiber-optic cables back to recording towers in a truck. The Red Epics were onstage handheld, and we were recording on the cards, no cables anywhere. They definitely react differently to colors and light, but the results with both are spectacular.

Resident Evil: Afterlife

Could you give a basic rundown of what the crew setup is like when you’re shooting in 3D? Is there a stereographer on the set?

First of all, we put together our own 3D system for Resident Evil: Retribution. One of the reasons was that all the other systems took an enormous number of people: studio engineers, recorders, all of those guys back there. Using the Red Epics you record in the cameras and go. So all you need is a guy to download all that data onto a hard drive and process it later. That reduced our on-set crew a lot. We’ve never really brought in a stereographer. On Final Destination, Vince Pace was on the set for four days, just to get us started, and we had done some 3D tests. Then we did it ourselves, so I became a 3D expert pretty quickly on that movie [laughs]. You can go to all sorts of seminars and sit there and listen to people talk and they have charts and graphs, but when you’re there you have a dial in your hand and you go, “Hey, that looks cool.” You learn pretty quickly what looks great and what you have to avoid, it really doesn’t take long.

So what are the things you learned about shooting 3D on the set? What to avoid, what looks good…

Boy, there are a lot of things. It sounds complicated when you talk about it, but when you’re there looking at it you go, “Oh, that hurts. Why? I wonder why that hurts…” You can give something too much 3D volume, which your eyes can’t handle; they’ve got to diverge and converge like crazy. You can really get a lot of eyestrain. If your 3D is always huge, I don’t think your eyes could sit through a two-hour movie. So we try to modulate the 3D. For a lot of the stuff it’s almost 2D, it’s really close, the two cameras are very close to each other, so the 3D volume is not that big. But when the monster shows up, we’ll give it a lot of 3D volume, and that might last for two to three minutes, and then we’ll pull it right back to give your eyes a break.

You can put things too close to a camera, and that’ll hurt. Because the two cameras have two different perspectives on an image, there could be something in one camera that’s not in the other, and that really messes up your eyes. Your eyes don’t know how to deal with that. With contrast sometimes—there are still things with 3D delivery, what they call ghosting. Your glasses are supposed to cancel out the other eye, but they don’t do it 100%. So if you have something that’s super bright in the background it’ll bleed into the other eye, and that’s annoying. They’ve got all sorts of technology these days, anti-ghosting projection technology. As 3D displays get better, all those problems will go away.

Then there’s quick cutting in 3D. All the action scenes in Retribution, the 3D is very minimal, until the ax comes towards you or something like that. We know the fights will be cut very quickly, so we reduce the amount of 3D. No one thinks at that moment, “Well, where has the 3D gone?” There are programs out there, and charts, for people who are afraid of 3D. “I better have this program that tells me what the right 3D is.” I’m like, “How could you do that?” That’s like a program that tells you to shoot a movie all in medium shot, so that every shot is the same. Our 3D is all over the place.

So you figure out what works once you get to the set? No pre-visualization?

I work with the production designer, and we added those glass panels that come down in the foreground of the sets, with the Wesker character on them, so that we could have foregrounds and backgrounds. We also added desks that rise up into them, that we can shoot through. You need foreground objects and mid-ground objects. We plan like that, but once we get there…

On The Final Destination there was no stereographer. It was just me and my second-unit director of photography, and we figured it out together. On Resident Evil: Afterlife it was the same sort of thing. We brought on a crew member, John Harper, who was the convergence puller, one of the guys who controls the 3D on the set. He took to it really well, and understood it really quickly, so we started letting him do the 3D on his own, and we’d look over his shoulder to approve it. Now he’s become our systems tech, making sure all the equipment keeps working, and he’s also our stereographer.

When I was first starting, stereographer-type guys (they called themselves “depth artists”) would call up to instill fear in everyone. “You don’t want to make a 3D movie by yourself, you need an expert out there.” We do it ourselves. And Paul Anderson is very knowledgeable about 3D, so he’ll ask for certain things—more stuff coming out of the screen, or “Let’s go a little smaller here, because the surprise is coming in 10 minutes.” It’s still a family affair, but since we sometimes have four 3D rigs working, it’s Harper who deals with it on set and he gets the stereographer credit.

Resident Evil: Retribution

Could you talk about how you set up that reverse slow-motion shot that opens Resident Evil: Retribution? It’s really gorgeous.

On Resident Evil: Afterlife, which obviously has a lot of slow motion, we had to bring in a special camera, the Phantom Gold, which could shoot 1,000 frames per second. They’re a real pain in the ass to work with in 3D. The cameras are big, and they record to laptops. They are constantly recording, so all you do is tell them the out point. If you want to shoot at 300 fps, you cut, and then you have to count backwards, because at 300 fps you get 14 seconds of footage. So hopefully you got the beginning of the take in that 14 seconds, and then you download it, which takes five minutes, and then you watch it and that takes… forever. And they’re very costly.

When I talked to production about renting our rigs, the system we had developed, and using the Red Epic cameras, one of the things Red promised us was that these cameras would be able to do 288 frames per second in stereo. It’s pretty complicated: the image sensors in the two cameras have to be synced and locked together pixel by pixel, pixel perfect. Plus it’s an insane amount of processing. It didn’t exist at that time. They said, “We’ll for sure have it by October,” which was when we were going to start shooting. So I’m in pre-production meetings with Paul, and he’s like, “Extreme slow-motion, in reverse, four cameras shooting….” And I’m calling Vern, who’s back in Los Angeles putting our system together, and I ask, can we shoot slow motion yet? And he says [whispering], “Not yet.” I’m like, Jesus. Every day we’d have meetings with more slow motion, and I’m wondering when it’s coming.

Two weeks before we started production, that software was enabled, we did some tests, and it worked, and I was able to breathe a sigh of relief. It was pretty tense for me; nobody else actually knew. So we were able to use the Red Epics at anything from 24 up to 288 frames per second. All we had to do was flip a couple of switches. That was very liberating. There’s actually a lot less slow motion in Retribution than in Afterlife, but it’s just great for Paul to know that at any time he can change camera speeds. That didn’t exist before this movie.

The budget on the movie… I like to call it the biggest low-budget movie I’ve ever been on. [The Resident Evil team] are really great at giving movies a really big epic look with not a lot of money and not a lot of sets. We shoot a lot of stuff into the ground and into the sky because we don’t have anything out at the horizon, and green screen is $40,000 a shot. They built the deck of the Arcadia, the ship where the reverse slo-mo occurs, up on scaffolding in a parking lot. We predominantly shot down into the ground and up into the sky, into the clouds where they added the Osprey helicopters later. It was heavily storyboarded. I remember when Paul was writing the movie and he said, “I’ve got the first eight pages written, and it’s a spectacular action sequence all in reverse.” So we shot in high speed—not in reverse obviously, we shot the whole sequence normally, and reversed it in post. And we used the same sequence again, sped back up to 24 frames. We had horrible weather in Toronto; it was always rainy and wet and dismal. The sun would peek out for a second, and every time it peeked out, that’s in the movie.

In the Umbrella headquarters set, there are reproductions of Moscow, Tokyo, and New York City. Is it a combination of location shooting and sets?

We sent our visual effects supervisor out with two guys and a 3D rig, and they shot some things in Moscow, in Red Square. I think they were able to clear Red Square of all tourists for an evening or an afternoon. The full production never went to Moscow or Tokyo, or anywhere else. In fact, when I got the script, I remember talking to a friend of mine who had done production design with Paul Anderson before, and I said, “This is going to be great! We’ll be going to Moscow, to Tokyo and Berlin.” And he said, “You’ll be in a parking lot in Toronto with green screen, you’ll see.” And that’s where we were. A parking lot in Toronto, and all those backgrounds were put in later. I don’t know if we sent anyone to NYC, I think so. Tokyo was mostly stills and laser scanning—I don’t think they brought a 3D rig to Tokyo. They shoot stills and do laser scanning of all the buildings and texture mapping. But a very small crew really. A tiny handful of people. That’s the only way we can afford to do it.

What’s it like working with Milla Jovovich? She does a lot of her own stunts, right?

She does tons of it, yeah. It’s the third movie in a row with Milla. She comes in the morning, says “Morning, Glenny!”, gives me a hug and then we start shooting. She does a lot of modeling still, so she knows about light, and if I’m in a tricky situation she’ll stand there on the mark for 10 minutes just to help me out. Super accommodating. She trains for weeks before the production, for all those fight scenes. She has to get into harnesses, and they pull her up in ropes and pulleys and things. And working with the fight choreography, I don’t know how they remember that stuff. It’s like pretty elaborate dance moves, you have to be at the right place at the right time.

Then also, as Paul said, in 3D fight scenes are hard to fake. In 2D films you can put on a long lens, and pretend to punch somebody and miss them by two feet. But as long as they turn their head at the right time and the camera’s in the right place, it looks like they got hit. And then you shake the camera around a little, with a long lens, and make it look very exciting. You can’t do that in 3D. First of all, shaking the camera doesn’t really work—it’s annoying in 3D. The third dimension is depth, so you can see when somebody misses by two feet, even if the camera’s in the right place and they turn their head at the right time. We learned that on The Final Destination. We were doing a near-miss with a car, and it didn’t look like the car was anywhere near the person. I asked the director what we should do, and he said, “Hit him!” [Laughs.] I remember telling that to Paul. So we went and shot some tests on Resident Evil: Afterlife with fights, and yeah, you have to almost hit the person. You have to brush their nose with your fist. On Three Musketeers with the swordfights, they were hitting each other, there were no near misses, it just doesn’t work. And it was the same thing on Retribution. Milla said in one of those behind-the-scenes interviews, “3D is hard! You can’t fake it as well.” There were times she kicked someone’s hat off, she was that close to their head.

Three Musketeers

Could you talk about working in those Bavarian castles in Three Musketeers?

Oh my God. We started scouting the castles, and the number of restrictions they wanted to put on us… Everything had to be three feet away from all walls, and the amount of light—they have tapestries from the 1700s that can’t be exposed to very much light, so they take great pains to keep it very dark inside the castle. So they said, “You want to put lights in here? That’s not going to happen.” And then there’s the equipment we had to haul around, with the recording towers and all that stuff. At one point, with the producers, I said, “This is not going to happen. There’s no way we can shoot in this castle.” And Paul said, “We have to shoot in this castle, we have to figure out how to do it.”

We talked them into letting us use small fluorescent lighting units that don’t put out a lot of light. We had to bring in our lights, and they did some tests to see how much ultraviolet was hitting the tapestries. So it was a really, really low level of light. I could light through the windows, very minimally, but I was not allowed to put scaffolding outside the windows, where there is extremely rare, century-old cobblestone or something. So we had to get these cranes with a 100-foot reach that we could put out in the road and reach 100 feet over the cobblestone, with a rock ‘n’ roll truss on it with these lights that you could aim through the windows. It wouldn’t touch anything around the castle. There would be just enough light inside so we could photograph.

And we had long fiber-optic cables made that would run from the camera to the recording towers. We had special wheels made with felt pads for all the dollying equipment, and we had people following every crew member. A guy would be carrying a stand two feet above his head, and you’d hear a lot of “Whoa, whoa, whoa!”, and he’d look up and be four inches from some precious chandelier [laughs]. It was really painstaking, but it’s really worth it because they are spectacular interiors. You know that big corridor of mirrors that they walk up and down? I think I had one small handheld light. We shot all that with available, existing light. We had to wait until a certain time of day for the sun to be in the right position.

What is your attitude towards films that are post-converted to 3D?

I can see it immediately when something is post-converted. If I’m shooting a person in 3D, I’ve got two cameras, and they’re spread apart, four inches from each other. One camera is seeing the front of the face, and the other is seeing around the side, more of the ear and all that. That’s what gives you your rounded 3D effect. In post-conversion you can’t get that second perspective. What you can do is cut out the person’s head, duplicate it, and put it in a different space. You’ve got this cut-out person who’s floating in front of the background. You’re not getting around to see the ear like with two cameras. It’s like those effects they do with still photos, where they cut out the foreground and move it against the background. I can see that immediately.

They can be done well, but to do it well costs a lot of money, a lot of labor, and a lot of software, just cutting out all of those elements and then building them back into the image. For instance, it was just announced that Guillermo del Toro’s new movie Pacific Rim is going to be converted to 3D. I imagine it’ll look pretty good, because the software is getting better and it’s less expensive, but it’s not going to be the same as if he shot it in 3D. Conan the Barbarian was post-converted, and they didn’t give it a second thought when they were shooting the movie—a lot of long lenses and shaky camera. Apparently 30 percent of the movie is still in 2D, because they couldn’t convert some of that footage. I think you want to be on set knowing you’re shooting 3D, you want to block the scene and build the sets appropriately for 3D. If you post-convert, you don’t even think about that. And then you hand it off to a bunch of kids in Bulgaria doing the conversion for you, and you’ve lost all control.

It’s frustrating that 3D gets a very bad rap because of these post-conversions, when doing it the right way doesn’t seem so difficult.

It’s mostly a cash grab in a lot of cases, I think. “Damn, let’s post-convert this and get an extra three bucks per ticket.” It gets a bad rap because there’s a lot of bad 3D out there.

The true capabilities of 3D are not being seen as much, so people get frustrated and write it off.

Exactly. But technology does help things. On The Final Destination we didn’t have the ability to see our 3D on set as we shot it. We had our 3D cameras and we had our two images, and you got really good at knowing that the offset between the images was this much. Then later that night we would look at our dailies from the day before, in 3D. So we got really good at knowing what the two 2D images would look like in 3D. That was a really good learning experience. But on Resident Evil: Afterlife there were suddenly 3D-capable televisions and monitors, and the director’s video village was all in 3D. And Paul could say, “What would this look like with a little more 3D,” and we’d do it, and he’d say, “That’s better.” On Retribution, we could deliver 3D images wirelessly to the video village in HD. That’s the way to go for me. We shot the movie in 55 days, as fast as any 2D movie. It doesn’t slow you down, doesn’t take a whole lot of extra time.

And as you said, you didn’t have a massive budget, either.

No! [Laughs.] It’s all gone, man. When we’re shooting, there’s not a penny left. It’s gone before we start shooting.

Is there another project with Paul moving forward?

It’s moving ahead; we’ll see what happens. It’s Pompeii, a 3D period disaster movie, with lots of volcanic ash and lava in the air. It should be pretty cool in 3D.