
Archive for December, 2010

Fallen Star – dubbing and audio sync

During the first round of editing for Fallen Star, it became clear that the sound was going to be a problem. We had used an external microphone to capture the dialogue, but it wasn’t an appropriate type, nor was it positioned consistently, so we ended up with quiet dialogue buried in hiss. Extensive “noise reduction” was considered as a possible solution, but that would still have left the echo, and the sound levels changing from one shot to the next. In the end, we had to bite the bullet and accept that we needed to dub the entire thing. You may not have realised that it was dubbed, in which case I’m glad! But here’s an example video to illustrate the difference between the original audio and the new audio:

Our process went like this:

– First, the original audio was exported from the project, and saved as a series of sound files. Typically these were broken into sections of one or a few lines, depending on the duration of the shot.

– The sound files were then organised into groups, depending on who was speaking in each one. We then arranged recording sessions in front of a studio-quality microphone in a sound-dampened room, one actor at a time.

– Each actor would listen to an audio clip a few times, memorise the timing, and then repeat the line(s) into the microphone. The new dialogue was saved to a separate file, and we’d move on to the next clip.

– Once all the new dialogue was recorded, each audio clip was opened using a multi-channel audio program, and compared to the original audio, by routing the original dialogue into one ear and the new dialogue into the other ear.

– By ‘scrubbing’ through the audio, I was able to compare the old recording directly with the new at very slow speed, picking out every individual sound. I aligned them as best I could, shortening or lengthening each syllable or vowel sound until all the consonants (or any part of the dialogue associated with a notable mouth movement) lined up. (A rough sketch of how the coarse alignment could be automated follows this list.)

– The new audio was then saved and imported back into the project. Because the length and timing now exactly matched the originals, I didn’t have to do any further lip-synching.
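The alignment itself was done entirely by ear and eye, but for the curious, here is a rough sketch of how the coarse part could be automated: cross-correlate the two takes, and the peak tells you how far to nudge the new recording before fine-tuning by hand. The file names are invented, and it assumes both takes are WAV files at the same sample rate.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import correlate

def estimate_offset(original_path, rerecorded_path):
    """Return roughly how many seconds the new take lags the original (negative = leads)."""
    rate_a, a = wavfile.read(original_path)
    rate_b, b = wavfile.read(rerecorded_path)
    assert rate_a == rate_b, "resample one take first if the sample rates differ"

    # Collapse to mono and take the absolute value, so we compare rough envelopes
    # rather than exact waveforms (the two recordings have very different character).
    a = np.abs(a.astype(np.float32).mean(axis=-1) if a.ndim > 1 else a.astype(np.float32))
    b = np.abs(b.astype(np.float32).mean(axis=-1) if b.ndim > 1 else b.astype(np.float32))

    # The peak of the cross-correlation marks the shift that best lines the takes up.
    corr = correlate(b, a, mode="full", method="fft")
    lag = np.argmax(corr) - (len(a) - 1)
    return lag / rate_a

# Hypothetical usage: shift the new take by this amount, then finish the job by ear.
print(estimate_offset("scene12_original.wav", "scene12_rerecorded.wav"))
```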

It was a time-consuming process, admittedly, but the results speak for themselves: clean, flat dialogue with no reverb or hiss, consistent throughout the film. Background noises could then be added over the top.



When making the DVD, I wanted a flashy system of menus for choosing chapters and other options. I had the idea of reconstructing our bridge set as a digital model and using parts of it for the various menu screens. I had made CGI interiors for use in the film itself, but never the bridge; this model was made purely for the DVD, which is why I thought I’d show it off here too.

Bridge model (wireframe)


When making the model, I took a few shortcuts. Firstly, I ignored the helm station and the captain’s chair entirely. I had no intention of doing a full sweep of the whole bridge, and modelling detailed objects is time-consuming, so I left them out; the camera would simply pan across the walls and never show them.

Bridge model (render)


Secondly, when making the two rear terminals, I built just one (accurately based on photographs of the physical panel) and then duplicated it for the other side. In reality the two panels were different, but I felt the discrepancy wasn’t important.

With the 3D models done, I animated a few camera passes and then rendered the final shots. These were then adjusted for colour and brightness, and finally the special effects and text were added. The forward viewscreen window would serve as the main menu for the DVD, so I added the planet Earth behind it and made a holographic screen overlay that played on a loop. For the chapter selection screens, I took clips from the film and squeezed them into the little square screens on the rear terminals. All the effects were animated and rendered out in DVD format, where I added the buttons and highlights.
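The compositing was all done in my editing package rather than in code, but the idea is simple enough to sketch with the moviepy library (the 1.x API is assumed, and the file names, positions and sizes are all made up): shrink a clip from the film and layer it over the rendered menu background where one of the little terminal screens sits.

```python
from moviepy.editor import VideoFileClip, CompositeVideoClip

# Rendered CGI bridge pass that loops behind the chapter menu.
background = VideoFileClip("bridge_menu_background.mp4")

# A short excerpt from the film, squeezed down to fit one terminal screen.
chapter_clip = (
    VideoFileClip("chapter03_excerpt.mp4")
    .subclip(0, 10)
    .resize(width=160)          # shrink to the size of the little square screen
    .set_position((420, 180))   # pixel position of that screen in the render
    .set_opacity(0.9)           # let a hint of the screen behind show through
)

menu_loop = CompositeVideoClip([background, chapter_clip]).set_duration(10)
menu_loop.write_videofile("chapter_menu_loop.mp4", fps=25)  # PAL DVD runs at 25 fps
```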

See the video for how it turned out:


Here is the third and final visual effects showreel from my Fallen Star project, this time focusing on the augmentations made to live-action footage.

Unlike the CGI space scenes or the green-screen scenes, raw live-action footage made up the majority of the shots used in Fallen Star. But most of that footage had to be modified in some way, whether to add laser beams, computer displays or even entire walls. Let’s go through the ones shown in the video…

Lasers! These were straightforward enough – just straight lines given a glow effect. They widen slightly as they fire and narrow again before they disappear, rather than just flicking on and off. They also swipe across slightly in the direction of travel. Real lasers would be instantaneous, but you can take artistic licence with sci-fi weaponry. The good thing about these laser effects is that they’re on screen for just a few frames, so even if the camera movement is all over the place, it’s a piece of cake to track their motion. I added a glow to the impact points and the gun barrels too, just to jazz it up a bit.
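The beams were drawn and glowed in my compositing software, but as an illustration, here is how one frame’s beam could be built in code: a blurred thick line for the glow with a thin bright core on top. This uses Pillow, and the file name, coordinates and colours are all invented.

```python
from PIL import Image, ImageDraw, ImageFilter

def add_laser(frame, start, end, colour=(0, 255, 60)):
    """Overlay a glowing beam onto one frame (an RGBA Pillow image)."""
    # Soft outer glow: a thick, semi-transparent line blurred heavily.
    beam = Image.new("RGBA", frame.size, (0, 0, 0, 0))
    ImageDraw.Draw(beam).line([start, end], fill=colour + (160,), width=9)
    glow = beam.filter(ImageFilter.GaussianBlur(radius=6))

    # Thin, bright core drawn over the glow.
    core = Image.new("RGBA", frame.size, (0, 0, 0, 0))
    ImageDraw.Draw(core).line([start, end], fill=(255, 255, 255, 255), width=2)

    return Image.alpha_composite(Image.alpha_composite(frame, glow), core)

# Hypothetical frame: beam from the gun barrel to the impact point.
frame = Image.open("frame_0241.png").convert("RGBA")
add_laser(frame, start=(120, 300), end=(610, 260)).save("frame_0241_laser.png")
```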

Background enhancements! Our original footage had, for instance, a plain wall where we needed a porthole. So I created a porthole and laid it over the footage, lined up with the wall. The difficult part came when someone walked in front of one of these, at which point I had to do some ‘rotoscoping’: masking out the shape of the person frame by frame so that they still obscured the porthole.

Background stabilisation / extension! We had our footage of the briefing room interior, which was to be seen through the long row of windows from the outside. The actual footage showed only a small area of wall and carpet, so I had to extend this outwards to create a large room. A digital set extension, if you will! This was then lined up with the footage and finally inserted into the composition with the CGI exterior. A similar technique was used for the opening bridge ‘zoom-in’ shot, but there I also had to stabilise the footage to stop the excessive jerking about that our camera was doing. Stabilisation is mostly an automated process; it’s just a matter of finding a constant reference point for the computer to recognise and keep locked in place.
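My software’s stabiliser handled this automatically, but the underlying idea is easy to sketch: pick a reference patch in the first frame, find where it has moved to in each later frame, and shift that frame so the patch stays put. A rough OpenCV version follows; the file names, frame rate and patch coordinates are invented, and a real stabiliser would also cope with rotation and zoom.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("bridge_zoom_raw.mp4")
ok, first = cap.read()

# Reference patch: a high-contrast detail that stays visible throughout the shot.
x, y, w, h = 300, 200, 60, 60
template = first[y:y + h, x:x + w]

writer = cv2.VideoWriter("bridge_zoom_stabilised.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"), 25.0,
                         (first.shape[1], first.shape[0]))
writer.write(first)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Find where the reference patch has drifted to in this frame.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (fx, fy) = cv2.minMaxLoc(scores)
    # Translate the whole frame so the patch sits where it was in frame one.
    shift = np.float32([[1, 0, x - fx], [0, 1, y - fy]])
    writer.write(cv2.warpAffine(frame, shift, (frame.shape[1], frame.shape[0])))

cap.release()
writer.release()
```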
On a similar note, there were a couple of shots where a door was removed and replaced with the corridor behind it. Most of our corridor shots were done on green-screen, but as you can see in the video, I squeezed an open door into one piece of live-action footage too, masking around the foreground as necessary.

Computer screens! This was where things got difficult. Our screens were physically black (or grey) with nothing on them, so every single shot had to be amended to insert the graphics. The process would have been long-winded enough without having to draw around people’s heads and hands whenever they moved in front of a screen, but there was no other way. In hindsight, we should have fitted green panels onto the displays; that way, I could have keyed out the colour and saved myself a lot of time. The sheer number of shots where the screens are visible in the background was ridiculous! I’m glad the end result looks good, though. It’s just as well most of our shots had little to no camera movement – that would have made the process even harder!
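For the record, the green-panel approach I wish we’d taken is just ordinary chroma keying: anything in the green hue range is treated as ‘screen’ and replaced with the graphics, while hands and heads in front of the panel are left alone automatically. A rough single-frame sketch using OpenCV; the file names and threshold values are placeholders.

```python
import cv2
import numpy as np

frame = cv2.imread("console_frame.png")        # live-action frame with a green panel
graphics = cv2.imread("console_graphics.png")  # display graphics, same resolution

# Mark every pixel whose hue falls in the green range as part of the screen.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([80, 255, 255]))

# Soften the mask edge slightly so the graphics don't look pasted on.
mask = cv2.GaussianBlur(mask, (5, 5), 0).astype(np.float32)[..., None] / 255.0

# Where the mask is on, blend in the graphics; everywhere else keep the original.
composite = frame * (1 - mask) + graphics * mask
cv2.imwrite("console_keyed.png", composite.astype(np.uint8))
```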

Transport effect! One of the oldest optical tricks in film is to stop the camera, move something out of the way, and start it up again, to make it look like something has disappeared (or appeared, if you do it the other way around). We did this in the forest, to make our crew appear to materialise out of thin air. Digitally, it’s a simple matter of fading from one shot to the other. To jazz it up, I cut around the actors to isolate their shapes, and then brightened them so it looked like they were glowing. Then I added sparkly effects over the top, which started small and then filled out into the shape of the actors before fading away again. Additional glow effects were added to make it look like the materialisation was lighting up the forest floor slightly.
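Stripped of the sparkles, the materialisation is just a dissolve between the empty plate and the take with the actors, with a glow added on top. A bare-bones sketch of one in-between frame using OpenCV; the file names and weights are invented, and the real version masked the glow to the actors’ silhouettes rather than blurring the whole plate.

```python
import cv2

empty = cv2.imread("forest_empty_plate.png")    # locked-off shot, nobody in frame
actors = cv2.imread("forest_with_actors.png")   # same framing, crew in position
alpha = 0.4                                     # 0 = empty forest, 1 = fully materialised

# Plain cross-dissolve between the two locked-off shots.
dissolve = cv2.addWeighted(empty, 1 - alpha, actors, alpha, 0)

# Cheap glow: blur the incoming shot and add it back in, scaled by the dissolve progress.
glow = cv2.GaussianBlur(actors, (0, 0), sigmaX=15)
frame = cv2.addWeighted(dissolve, 1.0, glow, 0.6 * alpha, 0)

cv2.imwrite("transport_frame_040.png", frame)
```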

Miscellaneous! Little things can make all the difference. The scorch mark on the tree where the laser hit it? That wasn’t real! Although when the camera moved and I had to track it (and blur it) to keep it in the same spot, I did wish it was. More lasers, this time going behind things – I had to mask out the character, tree or whatever else the beam passed behind, which involved a small amount of that lovely rotoscoping again. Blurry eyelids opening? Those were drawn, with the colour fading from black to dark red to simulate light filtering through, and the footage behind them was simply blurred. Finally, the squirrel scanner – I used a combination of automatic motion tracking and frame-by-frame adjustment to make sure the graphics stayed lined up with the scanner device. The opacity was set low enough to let the reflections come through, adding to the believability.
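The scanner overlay boils down to an additive blend once the graphics have been tracked into place: because the graphics image is black everywhere except the display area, adding it at reduced strength leaves the rest of the frame untouched and lets the prop’s real reflections show through. A one-frame sketch with OpenCV, with invented file names, assuming the graphics have already been warped to follow the tracked prop.

```python
import cv2

frame = cv2.imread("scanner_frame.png")       # live-action frame with the prop in shot
overlay = cv2.imread("scanner_graphics.png")  # graphics, black outside the display area

# Additive blend at reduced opacity: the glow sits on top, reflections still read through.
composited = cv2.addWeighted(frame, 1.0, overlay, 0.45, 0)
cv2.imwrite("scanner_composited.png", composited)
```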

That’s it, that’s how it was all done. Wasn’t that terribly fascinating? If you pause the video, you might be able to make out some of the gobbledegook I wrote on the computer displays.
