"Ed Anson" <EdAnson@comcast.net> wrote in message news:3F175D3F.5080604@comcast.net...

> David Ruether wrote:
>
> > "Ed Anson" <EdAnson@comcast.net> wrote in message news:3F162902.2010101@comcast.net...
> >
> > [...]
> >
> >> OTOH: If the two optical systems gather the same amount of light and
> >> are arranged so that each projects the same image onto the
> >> corresponding sensor, then I agree that AOFBE each pixel will receive
> >> the same number of photons on each sensor. But to do this would
> >> require that the lenses have very different f numbers (speeds), with
> >> resulting effects on depth of focus, cost, etc.
> > In all of this, the lens, etc. can be removed as irrelevant (and was,
> > by the phrase, "all else being equal").
>
> In a practical system, all else cannot be equal.
But in an experimental system, to determine the effect of changing *one*
aspect only (as here, with a question only of the effect of changing the
sensor size [only...;-]), you *can* make all else equal. If after doing
this, you want to add extra questions, that is OK, but irrelevant to the
basic single question that can be answered easily by stripping away what
is not useful in answering the basic question. Or, if I want to include
the irrelevant lens, associated electronics, etc., why not also argue
about the color of the camera grip, whether or not it is raining during
the experiment, etc., etc., etc....;-)
> As I pointed out before, if identical optics are used then the smaller
> sensor records a smaller portion of the image than the larger one, so
> the framed image is not equal. In this hypothetical situation, the
> smaller sensor captures less light per pixel. In that sense, it is
> less sensitive.
This is incorrect as an argument, since you can easily specify that the
lens will cover both sensor sizes, that it is a zoom of constant
relative aperture, and that it can be zoomed to cover the same angle of
view with both sensors. The lens drops out as an issue. If you specify
different lenses that cannot be used equally, you have made the question
of sensor size effect not directly answerable by introducing an
irrelevant "monkey wrench", but that is silly... If you want, you can
assume that almost all zoom lenses for video can be set to cover the
equivalent angle of view of, say, a 70mm lens in 35mm, and that all zoom
lenses can be set to a true f5.6 relative aperture. In this (most
common) case, the lenses perform equally on the two sensor sizes, and
they do drop out as issues...
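To put a rough number on the "lens drops out" case: at the same f-number
the focal-plane illuminance is the same on both chips, so with equal
pixel counts each pixel's photon catch scales with its area alone. A toy
Python sketch (the sensor dimensions and flux figure below are invented
values for illustration, not specs of any real camera):

```python
# With the lens "dropped out" (same f-number, same angle of view, so
# the same focal-plane illuminance), photons per pixel depend only on
# pixel area. The flux number is arbitrary.

def photons_per_pixel(sensor_width_mm, sensor_height_mm,
                      pixels_x, pixels_y,
                      flux_per_mm2=1_000_000):
    """Photons collected by one pixel during the exposure, given a
    uniform focal-plane flux (photons per mm^2)."""
    pixel_area = (sensor_width_mm / pixels_x) * (sensor_height_mm / pixels_y)
    return flux_per_mm2 * pixel_area

# Same pixel count, two hypothetical sensor sizes (linear ratio 1.5):
small = photons_per_pixel(3.2, 2.4, 720, 480)
large = photons_per_pixel(4.8, 3.6, 720, 480)

# The larger sensor's pixels each collect 1.5^2 = 2.25x the photons.
print(small, large, large / small)
```

Same lens behavior, same framing, same f-stop - and the bigger chip
still wins per pixel, which is the whole point...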
> But in real life the optics would be adjusted to frame the same image,
> and that complicates things. To make the image equal, we must make the
> optics different. So "all else being equal" isn't really possible.

Sorry, but this is nonsense...
> > Since this is a discussion of CCD size alone vs. sensitivity, only
> > the CCD area need be a concern. RGB's checkerboard in the rain covers
> > the issue well enough, but perhaps easier to see is a comparison of
> > two photo-electric cell arrays out on the same day (and therefore
> > exposed to the same light intensity, assuming similar orientation,
> > etc.). One is 2"x2", the other is 20'x20'. All the cells in each
> > array are connected in parallel and therefore produce the same
> > voltage. Which produces the higher current? (Or, all the cells in
> > each array are connected in series. Which produces the higher
> > voltage?) In these discussions, it helps to strip out the irrelevant
> > parts (the equal parts of the equation that drop out...;-), then
> > reduce what remains to a familiar and similar situation...
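The parallel-array comparison above can be sketched in a few lines of
Python (the per-cell voltage and current figures are invented, since
only the scaling with area matters):

```python
# Identical cells under identical light each contribute the same
# current; wired in parallel, voltages match and currents add, so
# total current scales with total array area.

CELL_SIDE_IN = 2.0    # each cell is 2" x 2"
CELL_VOLTAGE = 0.5    # volts per cell (made-up figure)
CELL_CURRENT = 0.1    # amps per cell under this light (made-up figure)

def parallel_array(side_inches):
    """Total (current, voltage) of a square array of 2"x2" cells
    wired in parallel."""
    n_cells = int((side_inches / CELL_SIDE_IN) ** 2)
    return n_cells * CELL_CURRENT, CELL_VOLTAGE  # currents add, voltage same

small_i, small_v = parallel_array(2.0)        # 2" x 2"  -> 1 cell
large_i, large_v = parallel_array(20.0 * 12)  # 20' x 20' -> 14400 cells

# Same voltage, 14400x the current from the big array.
print(small_i, small_v, large_i, large_v)
```

Swap the return values around for the series case (voltages add,
current stays the same) and the conclusion is identical: more
collecting area, more output.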
> All this assumes equal light intensity on both sensors. Indeed, that
> is one thing that can be made equal.

Of course, along with all else that is not "sensor size alone"...
There is such a thing as experimental science...;-)
> But suppose that instead we use the same diameter lens with both
> sensors, thereby collecting the same amount of light. In this case we
> need different focal lengths in order to frame the same image. This
> makes the total light hitting the sensors equal, so we get the same
> number of photons striking each pixel.
Totally irrelevant - see above. Or, suppose we only expose the smaller
sensor to bright light (no lens) and the larger only to darkness.
Totally irrelevant again, if we are trying to find out the effect of
sensor size alone. Adding irrelevant conditions adds nothing to the
discussion (this is not rocket science...;-).
> And I think this is the crux of the misunderstanding between the OP
> and others on this topic. They are assuming different sets of
> parameters being held equal. I see some validity to both points of
> view.

None whatsoever, once the question is simplified, which can reasonably
be done (otherwise no science can be done...;-).
> >> My point is: Physics and economics dictate that changing the size
> >> of the sensor requires other things to change as well. That can
> >> make it a bit tricky to predict the performance of a practical
> >> system. And none of this takes into account the effects of sensor
> >> size on the intrinsic noise of the device and the circuit used to
> >> read it.
Yes, we agree on this - but it has nothing to do with the basic
question of whether or not changing the sensor size alone affects the
sensitivity of the sensor. It clearly does...
> > Yes. But the question was a simple one of sensitivity vs. size of
> > the CCD, all else being equal. There are MANY other issues involved
> > in a more complete discussion of the question of video imaging, but
> > these are irrelevant (but interesting...;-).
>
> But my point is that the question isn't really that simple.
But it is. The basic question of whether or not changing the sensor
size alone affects the sensitivity of the sensor can be answered. It
clearly does...

> All that said, I do agree that larger sensors help to make better
> cameras.

Again, we agree on this. But, it was not the answer to the question...;-)
--
David Ruether
d_ruether@hotmail.com
http://www.David-Ruether-Photography.com
Hey, take a gander at www.visitithaca.com, too...!