On 30 Mar 2001 14:09:32 -0800, Paul Rubin wrote:
>d_ruether@hotmail.com (Neuman - Ruether) writes:
>> If you are editing in a suitable NLE (Premiere works well for this),
>> multiple cameras are easily synchronized using the sound track
>> visuals (no need to use the tape time-code).
>
>I'm not sure I understand this. At 30 fps, one hour is supposed to be
>108,000 frames, but the xtal oscillators in the cameras aren't perfectly
>accurate. So Camera A might shoot 107,999 frames per hour and Camera B
>might shoot 108,001 frames per hour.
>
>Towards the end of the hour, the two cameras are out of sync by two
>frames.
>
>If I use an NLE to sync the frames up, I'm still showing Camera A's
>video a little slower than Camera B's. Let's say I edit the two tapes
>into an hour-long split screen. If frame number 107,000 of both tapes
>is on screen at the same time, then the camera B side is showing a
>view that's a little earlier than the camera A side.
>
>Am I missing something?
Except for particular scientific uses, a couple of frames of error
are not very important, and they are easily compensated for in
editing (I've synched three Mini-DV cameras for up to about 1/2 hour
with no problems). In your case of a 2-frame error (I have never
seen it worse than that), one could start the "slow-running"
camera's clip a frame early and let it run a frame long at the end,
and you would never see or hear the error, even with the two audio
tracks mixed in equal parts. One can usually also find a place to
remove a frame here and there without an audible or visual glitch.
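
To put rough numbers on both the question and the fix, here is a
small Python sketch. The figures in it are assumptions for
illustration (a nominal 30 fps and a 2-frame drift over an hour);
plug in whatever you actually measure against the sound track:

  # Rough sketch: how bad is the drift, and how often would a frame
  # need to be dropped from the "fast" camera to stay in step?
  # All figures are assumptions -- substitute measured values.
  FPS = 30.0           # nominal frame rate
  DRIFT_FRAMES = 2.0   # measured drift after one hour
  DURATION_S = 3600.0  # length of the take, in seconds

  drift_per_second = DRIFT_FRAMES / DURATION_S   # frames of error per second
  seconds_per_drop = 1.0 / drift_per_second      # drop one frame this often

  print("Oscillator error: about %.0f ppm" % (drift_per_second / FPS * 1e6))
  print("Drop one frame roughly every %.0f minutes" % (seconds_per_drop / 60.0))

Run as written, it reports roughly 19 ppm of oscillator error and one
dropped frame every 30 minutes, which is why the error is so easy to
hide in an edit.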
BTW, physical camera spacing differences can be more important: if
one camera is 3' from a talking subject and the other is 100', the
time difference between them is about three frames in the audio
tracks. When that happens, I either "slip" the delayed track ahead
or, more usually, sync the audio tracks and let the visuals be
slightly out of sync.
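
For the spacing figure, the same kind of back-of-the-envelope sketch
works; the speed of sound and frame rate below are my assumed round
numbers, not measured ones:

  # Rough sketch of the audio offset caused by camera spacing.
  # Assumes sound at about 1125 ft/s (room temperature) and 30 fps.
  SPEED_OF_SOUND_FTPS = 1125.0
  FPS = 30.0

  def audio_offset_frames(near_ft, far_ft):
      # Extra frames of delay picked up by the more distant mic.
      delay_s = (far_ft - near_ft) / SPEED_OF_SOUND_FTPS
      return delay_s * FPS

  print("%.1f frames" % audio_offset_frames(3.0, 100.0))

With cameras at 3' and 100' this prints about 2.6 frames, which
rounds to the "about three frames" above.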