Olympus-OM


Subject: [OM] [OT] size of converted file using RawShooter
From: Richard Lovison <rlovison@xxxxxxxxx>
Date: Sat, 11 Feb 2006 13:44:07 -0500
Has anyone noticed that the size in pixels of a converted RAW file using
RawShooter or SilverFast DC Pro is larger than one converted with Olympus
Studio or Viewer?  I'm not sure if this has been discussed here in the past
but if so, forgive my redundancy.  I came across this peculiarity on the
Four Thirds forum.  Following is part of the thread:

******************************************************************


Well... I know that the final RGB image isn't what the sensor measures.
For clarity in the continued discussion, I'll explain how I understand it
works; please correct me if I'm wrong:

The sensor measures the monochrome intensity values of individual R-, G-, and
B-filtered photo-sites according to the pattern you showed. When the camera
converts this information to an RGB-coded image, where each pixel has three
intensity values (one each for R, G, and B) rather than only one (the
monochrome filtered value that is measured), *it assigns to each pixel
position calculated intensity values for the two missing colours, averaged
from the intensity values of those colours **measured by neighbouring
photo-sites**.* This process is referred to as "Bayer interpolation", and
it can be done in many different ways. (I have seen at least 7 different
schemes for doing it.)
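That neighbour-averaging step can be sketched as a simple bilinear
interpolation. Note that the RGGB pattern, the weights, and the function below
are illustrative assumptions - the actual in-camera algorithm and mosaic layout
are not public:

```python
import numpy as np

def bilinear_demosaic(raw):
    """Illustrative bilinear Bayer interpolation (NOT the camera's
    actual algorithm).  Assumes an RGGB 2x2 repeating unit:
        R G
        G B
    raw: 2-D array of monochrome photo-site readings.
    Returns an H x W x 3 RGB image."""
    h, w = raw.shape
    # Masks marking which photo-sites carry each colour filter.
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    # Neighbour-averaging weights: green from the 4-neighbourhood,
    # red/blue from horizontal, vertical, and diagonal neighbours.
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    def smear(a, k):
        # Naive zero-padded 'same' correlation (the kernels are
        # symmetric, so this equals convolution).
        pad = np.pad(a, 1)
        out = np.zeros((h, w))
        for i in range(3):
            for j in range(3):
                out += k[i, j] * pad[i:i + h, j:j + w]
        return out

    rgb = np.zeros((h, w, 3))
    for c, (mask, k) in enumerate([(r_mask, k_rb),
                                   (g_mask, k_g),
                                   (b_mask, k_rb)]):
        # Missing values become weighted averages of measured
        # neighbours; dividing by the smeared mask normalises
        # the weights (including at the image edges).
        rgb[..., c] = smear(raw * mask, k) / smear(mask, k)
    return rgb
```

On a uniform grey patch this reproduces the input in all three channels, which
is a quick sanity check that the weights are normalised correctly.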
Because the interpolation *creates* colour values that were not measured, the
resolution of the final image is lower than what one would expect from
looking at the hard numbers of photo-sites without knowing how the colour
values are created. In order to give as high a resolution as possible in the
final image, the camera must use the nearest-neighbour photo-sites for the
calculations, which means that each RGB pixel is the product of a 3x3 array of
monochrome R, G, and B photo-sites. (This doesn't mean that the camera's
resolution is 1/3 of what the sensor photo-site specification states - as
Foveon supporters often claim.)
So there is no immediate reason to use 2614 x 1966 sensor photo-sites to
produce 2560 x 1920 picture pixels. All that is needed is two extra columns
and two extra rows of monochrome filtered photo-sites along each edge - i.e.,
2564 x 1924 sensor photo-sites.
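The border arithmetic works out as follows (a trivial check; the two extra
photo-sites per edge is the post's own assumption, not an Olympus figure):

```python
# Output picture size plus the assumed border of extra photo-sites.
out_w, out_h = 2560, 1920
border = 2                     # extra photo-sites on each of the 4 edges
need_w = out_w + 2 * border    # 2564
need_h = out_h + 2 * border    # 1924
print(need_w, need_h)          # 2564 1924
```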

*In conclusion*: *if* Olympus uses more of the sensor's photo-sites, it is
still a riddle to me *what* they are used for.

The reason I said that Olympus has chosen to discard some pixels is that the
extra image information rendered by other RAW developers is not present in
Studio's renditions. Just to convince myself, I had a look at one of my
images. Here is a comparison of the lower-left corner of the same RAW image
developed with RawShooterEssential (RSE) and with Studio:

[Image: corner_comparison.jpg, 78.7 KB]
<http://www.fourthirdsphoto.com/vbb/attachment.php?attachmentid=226&d=1139570299>
(click for a larger image)

The Studio image is overlaid and aligned *exactly* on top of the RSE image
using prominent singular pixels like those pointed out by the black arrows.
(I used the whole image for this alignment.) As you can see, the information
in the extra rim pixels rendered by RSE is not present in the one from
Studio - not even where it lies very close to the border, like the feature
pointed out by the white arrow. Here is a blow-up of the corner for those who
want to look more carefully (left is RSE, right is RSE with Studio
overlaid):

[Image: corner_comparison_blowup.jpg, 95.7 KB]
<http://www.fourthirdsphoto.com/vbb/attachment.php?attachmentid=227&d=1139570299>
(click for a larger image)

 Quote:
  As you can see there are *5 green* filters for every *2 red* and *2 blue*
filters. Therefore the *2614 x 1966* *sensor pixel* area produces the
following distribution:

1,142,028 *red-filtered sensor pixels*
2,855,069 *green-filtered sensor pixels*
1,142,028 *blue-filtered sensor pixels*

  Not quite right; note that the repeating unit is only 2x2 photo-sites, but
they have shown one extra row and one extra column in that figure. So there
are *2 green-filtered photo-sites* for every *1 red-* and *1 blue-*filtered
photo-site.
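With the 2x2 RGGB unit, the per-colour photo-site counts for the 2614 x 1966
specification come out as follows (simple arithmetic from the 2:1:1 ratio, not
a quoted Olympus figure):

```python
# A 2x2 Bayer unit holds 2 green, 1 red, and 1 blue filter, so green
# sites are half the total and red/blue a quarter each.
total = 2614 * 1966        # 5,139,124 photo-sites
green = total // 2         # 2,569,562
red = blue = total // 4    # 1,284,781
print(total, green, red, blue)
```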


 Quote:
  Please forgive me for stating it again—but it is fundamental to discussing
this stuff—there is no direct correlation between this and the *2560 x 1920*
*picture pixels* which each contain all three color channels.

Well, there *must* be a direct correlation - it is just that we don't know
the exact interpolation algorithm used by the camera. However, I am
personally convinced that each RGB image pixel is the result of a 3x3 array
of physical monochrome sensor photo-sites (anything more would reduce the
final resolution), which requires a total of 2564 x 1924 filtered sensor
photo-sites.

 Quote:
  *Think about this:* Olympus does *not* state how many *sensor pixels* are
in the picture frame area of the E-1 sensor. All they tell us is that
the *final
picture* has a maximum resolution of *2560 x 1920 pixels* after the raw data
has been interpreted. This is where third-party raw developers enter the
story and manage to use some of the incomplete data outside this area to
interpolate a larger field of view containing more *picture pixels*. Notice
that we're still talking *picture pixels*—we are *not* talking *sensor
pixels*. Again, for all we know the *2614 x 1966* *sensor pixels* listed in
the Kodak document is what Olympus is using to create its *2560 x 1920 pixel
picture*.

First Light, I agree that we know nothing - only Olympus knows exactly what
is going on. However:
What makes you think that the data the third-party RAW developers use are
*incomplete* data? We don't know that either... but we *do know* from the
sensor specification that it is able to deliver larger images than those
produced by the camera or Studio. The extra image information, as in the
example I showed, surely isn't made up.
Also, we do *not* know that Olympus *uses* 2614x1966 sensor photo-sites to
create 2560x1920 picture pixels. All we know is that they are at hand.

It would be very interesting to hear from someone with *real* insight into
how RAW images are developed how the extra sensor photo-sites may be used by
Olympus' RAW engine.

==============================================
List usage info:     http://www.zuikoholic.com
List nannies:        olympusadmin@xxxxxxxxxx
==============================================
