>"Gary Pollard" <gpollard@allivigator.com> wrote in message

>news:b7tqve$kas9@imsp212.netvigator.com...

 

>> No it doesn't only exist with DV or even only with digital formats. I've

>> been in broadcasting for 20 + years, much of which we've used analogue

>> formats, and we have ALWAYS asked guests not to wear checks or too fine

>> pin stripes for precisely this reason. Even if we are recording directly

>> onto 1" or 2" analogue tape.

>>  Gary

 

On Sun, 20 Apr 2003 22:39:18 +1000, "Native_MetaL" <NativeMetaL@optusnet.com.au> wrote:

>ok thanks Gary do you know whether there is any kind of lens filter that can
>be used to reduce the problem?

 

For a wonderful, REALLY complete discussion of this, see John Dyson's
post (unfortunately detached from this thread...), reproduced below.
The short of it: to reduce artifacting, either allow no motion in
subject or camera; reduce subject and lighting contrast as much as
possible; add a filter to the lens that reduces detail considerably;
or accept the aliasing (which does vary from camera model to model,
and among formats). (I prefer to choose a suitable camera, and then
accept the consequences... ;-)

 

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 

Moire, or the more general problem called 'aliasing,' can occur in any
sampled video format.  However, when the sampling and compression are
properly implemented, the aliasing will be well controlled.

 

Here are some facts about the problem (in no particular order):

 

1)   Moire is one manifestation of a more general problem called
     'aliasing.'
     A) Aliasing happens in a sampled system when the signal has
     higher frequency components than the sampling can represent and
     reconstruct.  As a matter of mathematics, the sample frequency
     has to be at least twice the maximum frequency of the signal
     being sampled.  The signal might even contain noise components
     outside the desired frequency range, and any noise above the
     range that the sampling process can represent will also produce
     aliasing components.
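
     (A tiny numeric sketch of that fold-down, with invented numbers
     rather than anything from real gear: a 9 kHz tone sampled at
     12 kHz produces exactly the same samples as a 3 kHz tone, so
     after sampling the two are indistinguishable.)

import numpy as np

fs = 12_000.0                            # sample rate (Hz); fs/2 = 6 kHz
t = np.arange(32) / fs                   # 32 sample instants

high  = np.cos(2 * np.pi * 9_000.0 * t)  # 9 kHz: above the 6 kHz limit
alias = np.cos(2 * np.pi * 3_000.0 * t)  # 3 kHz: its alias (12 kHz - 9 kHz)

# The sampled values are numerically identical, so nothing downstream can
# tell which tone was really there; that is aliasing.
print(np.allclose(high, alias))          # -> True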

 

2)   Moire (or aliasing) can even show up on FM video recorders such as
     VHS, SVHS, BetaMax, BetaCam SP or even broadcast 1".  Yes, ANALOG
     formats can produce aliasing, because the FM sidebands as recorded
     on tape can wrap around 0 Hz (DC).

 

3)   The pre-filtering needed to minimize the aliasing problem can
     decrease the apparent sharpness and apparent resolution.

 

4)   Sometimes, in non-critical applications, the amount of pre-filtering
     can be minimized to help compensate for potentially low resolution
     or detail from a lower quality source.  The apparent increase in
     detail is bought at the cost of more aliasing artifacts, as the
     sketch just below illustrates.
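
     (Here is a rough 1-D sketch of that trade-off, with made-up
     numbers: a "scene" with coarse 1 kHz content plus fine 10 kHz
     detail is decimated to a 12 kHz rate.  Without a pre-filter the
     10 kHz detail folds down to a false 2 kHz component at nearly
     full strength; a crude low-pass applied first knocks it down a
     lot, at the price of slightly softening the real content.)

import numpy as np

fs, n = 48_000, 4_800
t = np.arange(n) / fs
scene = np.sin(2 * np.pi * 1_000 * t) + 0.5 * np.sin(2 * np.pi * 10_000 * t)

def amplitude_at(sig, rate, freq):
    # Magnitude of one frequency component, normalized to peak amplitude.
    spec = np.abs(np.fft.rfft(sig)) / (len(sig) / 2)
    return round(float(spec[int(round(freq * len(sig) / rate))]), 2)

raw = scene[::4]                                        # no pre-filter
lp  = np.convolve(scene, np.ones(8) / 8, 'same')[::4]   # crude pre-filter first

print(amplitude_at(raw, 12_000, 2_000))   # ~0.5 : folded 10 kHz detail (alias)
print(amplitude_at(lp,  12_000, 2_000))   # ~0.09: pre-filter removed most of it
print(amplitude_at(lp,  12_000, 1_000))   # ~0.95: real content, slightly softened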

 

5)   Moire and other aliasing artifacts create a permanent imprint on
     the video signal.  Once aliasing appears, removing it is much
     harder than removing the offending detail before the sampling
     process.  As a matter of physics and mathematics, removal of
     aliasing after it has appeared in the signal is impossible in the
     general case, although sometimes the signal can be cleaned up a
     little.  (See the small sketch just below.)
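
     (A minimal sketch of why the imprint is permanent, again with
     invented numbers: after decimation, a folded-down alias lands in
     the same band as genuine picture content, so no filter applied
     afterwards can remove one without the other.)

import numpy as np

fs_in = 48_000
t = np.arange(4_800) / fs_in
genuine = 1.0 * np.cos(2 * np.pi * 2_000 * t)    # real 2 kHz content
detail  = 0.5 * np.cos(2 * np.pi * 10_000 * t)   # detail that folds to 2 kHz

sampled = (genuine + detail)[::4]                # both now live at 2 kHz
spec = np.abs(np.fft.rfft(sampled)) / (len(sampled) / 2)
print(round(float(spec[200]), 2))   # ~1.5: one merged component, inseparable later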

 

6)   In sampled-picture-element TV cameras such as CCD units, avoiding
     aliasing requires low-pass filtering the image before it reaches
     the CCD (optically removing the high frequency components that
     cannot be properly sampled by the CCD array).  On older tube based
     cameras, even though the horizontal scan was 'continuous', the
     scanlines essentially sampled (and still do) the video in the
     vertical axis.  (A small numeric sketch of this blur-before-sampling
     idea, on one row of pixels, appears after this list.)

 

     A)  An especially ugly form of vertical aliasing shows up as
         'interlace twitter.'  Interlaced video definitely needs some
         kind of pre-filtering to mitigate the twitter, but progressive
         video can look ugly without pre-filtering too.  When trying to
         reproduce too much detail, even with progressive video, there
         are ugly stairstepping artifacts and even the possibility of
         larger area 'beat products' that appear as moire.

 

     B)  Even in the old analog TV cameras, the scanlines represented a
         sampling in the vertical dimension, so sampling issues had to
         be dealt with for the highest quality.

 

     C)  For alias-free reconstruction, CCD cameras need an optical
         filter that removes the high frequency detail that cannot be
         properly sampled by the CCD imaging array.  If the CCD array
         has too few pixels, sharpness can be improved by decreasing
         the amount of optical prefiltering, but aliasing is made
         worse.  With CCDs, it isn't possible to enlarge the pixels so
         that adjacent samples overlap and filter out the excess
         detail.  With the old common TV camera tubes, the spot size
         could be made large enough to allow that overlap.

 

     D)  One approach used in the old tube TV cameras was to make the
         scanning spot larger than a scanline so that excess vertical
         detail would be removed before sampling.  The problem with
         increasing the spot size was that it often grew the spot in
         both the vertical (desired) and the horizontal (less desired)
         directions.  Aperture correction then compensates the video
         in the horizontal direction for the large (larger than one
         scanline) spot size that mitigates normal spatial aliasing
         and the uglier, time dependent aliasing from interlace.

 

     E)  Some very extreme conditions of aliasing-like behavior (e.g.
         the FM wrap-around on analog VTRs) can look like large area
         waves moving through the video.

 

     F)  Quantization and truncation of DCT coefficients (one of the
         steps in MPEG and DV type video compression) can produce
         effects similar to the optical aliasing effects.  Removing or
         changing the coefficients creates nonlinearities that
         effectively add high frequency distortion signals, which can
         look like excess detail.  I haven't seen 'moire' from the
         errors caused by compression schemes that remove redundancy
         using DCTs, but I have definitely seen effects that look like
         the stair-step form of aliasing.

 

     G)  Even DV25 compression artifacts appear to produce low frequency
         components, likely because the high frequencies created by the
         nonlinearity 'beat' with the sample frequency (result:
         aliasing).  This means that the 'low frequency' components
         become a permanent attribute of the video signal, and
         unfortunately, VHS decks reproduce the low frequency aliasing
         very well.  In essence, DV25 (normal DV) can produce artifacts
         that can degrade the image quality of even a VHS (yes, just
         VHS) distribution copy.

 

     H)  Digital recording schemes with less compression tend to produce
         fewer spurious high frequencies (because they have to truncate
         less detail), so the excess aliasing from the compressed record
         and playback cycle is practically nil.

 

     I)  Very high quality TV cameras have more CCD imager elements
         (sometimes in both the H and V directions), which allows for
         less optical filtering and supports more precise electronic
         digital filtering.  The optical filter still has to remove the
         detail that cannot properly be sampled by the CCD imager, but
         CCDs with many more elements can get by with a less sloppy
         optical filter.  With less optical filtering, the need to
         'compensate' for the optical filter (which obviously removes
         some desired detail as well, a lot like the spot size on a
         tube TV camera) by boosting the VALID high frequencies is
         lessened.  This allows for both less noise and less aliasing.

 

     J)  Even HDTV 720p (progressive) needs filtering to avoid the ugly
         picture from stairstepping and moire.  Interlaced video is
         more critical, because aliasing that jumps around is not just
         ugly but distracting.  Interlace filtering isn't as bad as it
         initially sounds, however, because the electronic portion of
         the filtering can be made dynamic.  It is very difficult to
         create a dynamic 'optical' filter, but an electronic filter,
         especially when using a CCD array that has LOTS more picture
         elements (and, perhaps like mine, double the scanlines), can
         allow for better detail than interlace normally would have,
         yet still remove by far most of the aliasing artifacts.
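
Here is the small sketch promised under 6), on one scanline worth of
pixels.  All numbers are invented, and a simple box average stands in
for the optical low-pass filter (or the larger spot of item D): a fine
~4.4-pixel pinstripe pattern is "photographed" by keeping every 4th
pixel.  Without a blur in front of the sampling, a strong false coarse
wave (moire) appears; blurring first removes most of it.

import numpy as np

x = np.arange(512.0)
stripes = 0.5 + 0.5 * np.cos(2 * np.pi * x / 4.4)        # ~4.4-pixel pinstripes
blurred = np.convolve(stripes, np.ones(4) / 4, 'valid')  # stand-in "optical" blur

print(round(float(np.ptp(stripes[::4])), 2))   # ~1.0 : big false coarse wave
print(round(float(np.ptp(blurred[::4])), 2))   # ~0.1 : pre-blur removed most of it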

 

I hope that this gives some help in understanding the effects of aliasing
in simple video applications, where aliasing can cause moire,
stairstepping and, in certain cases, some dynamic artifacts that mix
aliasing with interlace.  Equivalent kinds of aliasing in audio
applications can cause weird tweets when an instrument (or multiple
instruments) mixes with the sampling frequency.  Even when the aliasing
effects are not as obvious as video stairstepping, the sound can still
be 'harsh' without any extremely gross effects.  It is VERY IMPORTANT to
remove the out-of-band signal components (even noise or distortion)
before the sampling mechanism is applied to the signal.  One
not-so-obvious case for audio, where out-of-band artifacts should be
removed, is to properly dither the audio before sampling.  A low level
dither (properly shaped) helps to break up regular patterns of artifacts
and spread them out so they are less perceptible.
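
(A minimal sketch of that last point, on the quantization side of the
chain and with invented numbers: quantizing a low-level tone without
dither leaves a regular, signal-locked error pattern, while a little
triangular (TPDF) dither spreads the error into plain noise.)

import numpy as np

rng = np.random.default_rng(0)
fs, n = 48_000, 4_800
t = np.arange(n) / fs
tone = 1.2 * np.sin(2 * np.pi * 1_000 * t)     # low-level tone, about 1 LSB

def peak_error_line(signal, dither):
    quantized = np.round(signal + dither)      # 1-LSB quantizer
    err = quantized - signal
    spec = np.abs(np.fft.rfft(err)) / (n / 2)  # error spectrum, in LSBs
    return round(float(spec[1:].max()), 3)     # largest single error line

tpdf = rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)
print(peak_error_line(tone, 0.0))              # a few big signal-locked spikes
print(peak_error_line(tone, tpdf))             # much smaller; error spread as noise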

 

(This was written without a careful review -- it is too late, so I hope
that I am not misleading anyone, and hope that this is adequately
complete.)

        

                John Dyson

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~