PF & CA



PixChick
09-01-2004, 08:05 PM
I would like to ask if anyone could elaborate on the differences between purple fringing and chromatic aberration (if there are any). Some people say they are basically the same thing, and others say they are two different phenomena. I'm interested in the techie details, so don't be afraid of getting too deep! Also, can you tell me what, specifically, causes these two things (or one thing, if they really are the same) to happen? Again, I'd like as much info as I can get. Is there anything that can be done to avoid or lessen the effects besides closing down the aperture (or post-processing)? Thanks for your input.

PixChick :)

D70FAN
09-01-2004, 09:16 PM

Purple fringing is one of the more common types of chromatic aberration.

Stopping down the lens is just a band-aid. As you will see, it goes right to the design of the lens.

You will find a lot of detail on this subject by simply typing chromatic aberration into Google.

But to get you started:

http://www.maa.mhn.de/Scholar/chromatic_aberration.html
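
To put a rough number on how the design of the glass drives this, here's a minimal Python sketch. The Cauchy coefficients below are assumed values approximating a generic crown glass, and the thin-lens scaling is a first-order simplification, not any real camera lens:

# Minimal sketch of longitudinal chromatic aberration in a single glass
# element. Assumptions: Cauchy dispersion coefficients roughly matching a
# common crown glass, a thin-lens model, and a 100 mm focal length at the
# 587.6 nm d-line. Illustrative only.

A, B = 1.5046, 0.00420                    # Cauchy coefficients, B in um^2

def n(lam_um):
    """Refractive index at a wavelength given in microns."""
    return A + B / lam_um ** 2

f_design = 100.0                          # mm, focal length at the d-line
n_d = n(0.5876)

for name, lam in [("blue  486 nm", 0.4861),
                  ("green 588 nm", 0.5876),
                  ("red   656 nm", 0.6563)]:
    f = f_design * (n_d - 1.0) / (n(lam) - 1.0)
    print(f"{name}: f = {f:6.2f} mm (shift {f - f_design:+.2f} mm)")

Blue ends up focusing about a millimetre and a half in front of red for a 100 mm element; that spread is what the multi-element designs discussed in the article are trying to cancel.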

Rhys
09-02-2004, 07:30 AM

I find it very interesting that the purple fringes on the Canons are always (in my experience) on top of the lighter object and never below. That would tend to indicate to me a goof in the software rather than a lens problem.

John_Reed
09-02-2004, 08:15 AM
When you hear "chromatic aberration," I think it's the classical description of how light of different wavelengths refracts differently through a lens, and supposedly it gets worse for long telephoto lenses. Having said that, I think there are also some other factors at work, maybe particular to digital cameras. Look at the following Cormorant photo, taken with my FZ10 with a TCON-17 telephoto extender attached:
http://john-reed.smugmug.com/photos/8052235-M.jpg
If you look closely at the branch the bird is perching on, there's a red fringe above it. But I don't see the fringe elsewhere in the photo, only on that branch. Why? IMO the branch itself is over-exposed, and the red color isn't really CA; it's more of a "blooming" effect, a "spilling" of red in particular into the pixels above the branch. Remember that for this camera, as in most digital cameras using a "Bayer" sensor, 3/4 of the "red" components of the RGB pixels in the final image are interpolated from the roughly 1,000,000 "red" sites on the FZ10's sensor. It could be that when colors saturate, as in an "all white" over-exposure, the result is an over-abundance of pixels in which the "R" component is dominant, either because the interpolating algorithm used by the camera over-emphasized the reds there, or under-emphasized the blues and greens. You can also see that the phenomenon manifests differently depending on the angle of the branch: the horizontal part to the right of the bird shows it pretty strongly, while the diagonal part to the bird's left shows much less evidence of it.
I remember distinctly seeing this "Digital CA" in a photo I took years ago with my Nikon 990 of a Sunflower against a blue sky. There was a thin, bright red border along the petals between the yellow and the blue, but in that case, neither was even over-exposed!
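
The interpolation point above is easy to demonstrate without a lens at all. The toy Python sketch below (a made-up 8x8 sensor with an assumed RGGB pattern and a naive bilinear demosaic, nothing like a real camera's actual pipeline) feeds a perfectly neutral gray scene with one sharp bright/dark edge through a color filter array, so any color in the output comes from the interpolation, not from optics:

import numpy as np

# Toy demonstration: a perfectly neutral scene (R=G=B everywhere) with one
# sharp bright/dark edge, sampled through an assumed RGGB Bayer pattern and
# rebuilt with a naive bilinear demosaic. Any color in the result comes from
# the interpolation, not from the lens. Borders wrap around, so only
# interior pixels are inspected. Not any real camera's pipeline.

H, W = 8, 8
scene = np.full((H, W), 0.1)
scene[:4, :] = 1.0                      # bright top half, dark bottom half

mosaic = scene.copy()                   # neutral scene: every CFA site records the same value
is_r = np.zeros((H, W), bool); is_r[0::2, 0::2] = True
is_b = np.zeros((H, W), bool); is_b[1::2, 1::2] = True
is_g = ~(is_r | is_b)

def demosaic(mask):
    """Keep native samples; fill the rest with a weighted 3x3 average."""
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float)
    vals = np.where(mask, mosaic, 0.0)
    cnt = mask.astype(float)
    num = sum(np.roll(np.roll(vals * k[i + 1, j + 1], i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1))
    den = sum(np.roll(np.roll(cnt * k[i + 1, j + 1], i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1))
    return np.where(mask, mosaic, num / den)

R, G, B = demosaic(is_r), demosaic(is_g), demosaic(is_b)

col = 3                                 # walk down one interior column, across the edge
for row in range(2, 6):
    print(f"row {row}: R={R[row, col]:.2f}  G={G[row, col]:.2f}  B={B[row, col]:.2f}")

On the printed column, the last bright row loses red and the first dark row picks up a blue cast: a one-sided color fringe from a scene that contained no color at all. A real camera's interpolation is far more sophisticated than this, but it shows how the sensor and software can contribute fringing the lens never made.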

Rhys
09-02-2004, 10:54 AM

The effect seems similar to halation in film (where light in the image spreads by bouncing off the silver halide crystals and onto adjacent crystals).

I believe the problem is most likely exacerbated by the small size of the sensor and hence the close proximity of the individual photosites. Of course, chromatic aberration could play a part, although I would assume the lens and sensor are designed to complement each other, so that the red photosites sit where the red light focuses, the blue where the blue focuses, and the green where the green focuses, if the lens were not designed for all three to focus simultaneously. Having said that, the depth of field is so great that this might not be such an issue. Indeed, a smaller sensor could mean a cheaper lens could be used, as the depth of field would be so much greater.
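
For what it's worth, the depth-of-field point can be put in rough numbers. The sketch below uses the usual approximation DoF ~ 2*N*c*s^2/f^2 and completely made-up but plausible figures: a small-sensor camera at 10 mm f/2.8 versus a 35 mm camera at 60 mm f/2.8 framing the same subject at 3 m, with the circle of confusion scaled down by an assumed crop factor of 6:

# Rough depth-of-field comparison between a small-sensor camera and a 35 mm
# camera framing the same subject. All figures are assumed, illustrative
# values, and the formula is the usual approximation DoF ~ 2*N*c*s^2 / f^2
# (valid when the subject is well inside the hyperfocal distance).

def dof_mm(f_mm, N, coc_mm, subject_mm):
    return 2.0 * N * coc_mm * subject_mm ** 2 / f_mm ** 2

subject = 3000.0                        # subject distance: 3 m
crop = 6.0                              # assumed crop factor of the small sensor

cases = [
    ("small sensor, 10 mm f/2.8", 10.0, 2.8, 0.030 / crop),
    ("35 mm frame,  60 mm f/2.8", 60.0, 2.8, 0.030),
]
for name, f, N, coc in cases:
    print(f"{name}: DoF ~ {dof_mm(f, N, coc, subject) / 1000:.2f} m")

Same framing, same f-number, roughly six times the depth of field on the small sensor, which is the slack a cheaper lens could hide in.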

John_Reed
09-02-2004, 11:18 AM
Rhys, I'd have a hard time believing, given the nature of smooth glass lenses and discrete sensor elements, that they could design the lens to direct red light only to red sites, blue light only to blue sites, etc. I think each site picks up the color it's supposed to "see" from the filter it "looks" through, out of the full spectrum of light impinging on the sensor. Your analogy to film halation is well taken, though - I'm pretty sure I see less of this sort of "CA" in photos I took with my FZ1, a 2MP camera with larger sensor sites. I just looked through my Smugmug birds for an example:
http://john-reed.smugmug.com/photos/4622487-M.jpg
This shot was also taken with the TCON-17, at full optical zoom, with the FZ1. There are several opportunities in the photo for some of that red "CA," but I don't see any.

D70FAN
09-02-2004, 01:22 PM
Hi Guys,

Good theories, but read the article. CA is prevalent in just about any common glass lens. It's been a problem since the original telescopes and eyeglasses (I can even see it when I'm driving at night with my glasses on).

By adding multiple elements, different glass compounds, and ED glass, most of the CA can be corrected (though not eliminated) through careful lens design. It seems fairly independent of the recording medium (film or electronic sensor), although focal length also enters into the equation.

John, maybe having f/2.8 at 420mm is part of the cause (even if a minor one). If you stop the same shot down to f/5.6 or f/6.3 (the norm for moderately priced SLR lenses of this focal length), maybe the problem goes away(?).

Anyway, I found the article interesting, as I've never really stopped to think much about the actual cause of the problem.
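
The "different glass compounds" part has a nice first-order recipe behind it: pair a crown element with a flint element so their color errors cancel (an achromatic doublet). The sketch below uses approximate published index values for Schott BK7 and F2 and a thin-lens model; it's a textbook illustration, not a claim about any particular camera lens:

# Sketch of the classic two-glass achromat: pair a crown and a flint element
# so their color errors cancel. Glass data are approximate catalogue values
# for Schott BK7 and F2; thin-lens, first-order model only.

bk7 = {"nd": 1.51680, "nF": 1.52238, "nC": 1.51432}   # crown
f2  = {"nd": 1.62004, "nF": 1.63208, "nC": 1.61503}   # flint

def abbe(g):
    return (g["nd"] - 1) / (g["nF"] - g["nC"])

V1, V2 = abbe(bk7), abbe(f2)
phi = 1 / 100.0                          # combined power, 1/mm (100 mm doublet)

# Achromat condition: phi1/V1 + phi2/V2 = 0, with phi1 + phi2 = phi
phi1 = phi * V1 / (V1 - V2)
phi2 = -phi * V2 / (V1 - V2)

def power_at(g, line, phi_d):
    """Scale a thin lens's power from the d-line to another spectral line."""
    return phi_d * (g[line] - 1) / (g["nd"] - 1)

for line in ("nF", "nC"):                # blue F-line and red C-line
    f_single = 1 / power_at(bk7, line, phi)
    f_doublet = 1 / (power_at(bk7, line, phi1) + power_at(f2, line, phi2))
    print(f"{line}: single element f = {f_single:.2f} mm, "
          f"doublet f = {f_doublet:.2f} mm")

The single element drifts about 1.5 mm in focus between the blue F-line and the red C-line; the doublet holds its focus to within roughly 0.05 mm of the 100 mm design value over the same range. That small residual is the "secondary spectrum" that ED glass goes after, which is the "most, not all" in the post above.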

John_Reed
09-02-2004, 01:42 PM
The article doesn't explain why the CA should be as selective as it is in this example - why it chooses to adorn the white branch and nothing else - nor why it doesn't appear in the second photo of the Finch, unless there's a subtle lens design difference between the FZ1 and FZ10 that rules out CA for the older one. I'll tell you what I'll do: I'll go out there to the same spot and shoot the Cormorant (or at least the branch, if Cormie isn't there), and bracket the exposure and aperture so those can be eliminated as the cause.

D70FAN
09-02-2004, 02:06 PM

From my understanding there are so many variables that it is almost impossible to make a lens that won't show CA at some point across its total focal length range. CA also depends on the light reflected off the subject at any given time. The tree branch is probably a variable, as its reflectivity changes with shifting light and shadow, whereas something like a streetlight is not, since it's a continuous source of light and reflectivity.

Just experimenting with my eyeglasses, I can see that natural, changing light is hit and miss (mostly miss, thank goodness), but I can see purple fringing on the fluorescent lighting in my office as I change my angle of sight. Looking directly at the light there is almost none; looking at an angle makes more CA appear. Not very scientific, but...

...Doh! I shouldn't have done that. Now my eye-glasses are going to drive me crazy. :eek:
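
The eyeglasses experiment actually has a tidy explanation: away from its optical centre a spectacle lens behaves like a prism (Prentice's rule, prism dioptres = decentration in cm x lens power in dioptres), and a prism spreads colors roughly in proportion to its deviation divided by the material's Abbe number. The sketch below uses a made-up -4 dioptre prescription and typical Abbe values for two common lens materials, just to show the trend with viewing angle:

# Back-of-envelope model of color fringing through eyeglasses when you look
# away from the lens centre. Prentice's rule gives the prismatic deviation;
# dividing by the Abbe number gives the chromatic spread. The -4 dioptre
# power and the Abbe numbers are assumed, typical values, not a real
# prescription.

lens_power_D = 4.0                          # magnitude of a -4 dioptre lens
materials = {"CR-39 (V ~ 58)": 58.0, "polycarbonate (V ~ 30)": 30.0}
DEG_PER_PRISM_DIOPTRE = 0.573               # 1 prism dioptre ~ 0.573 degrees

for name, V in materials.items():
    for offset_mm in (0, 5, 10, 15):        # how far off-centre you look
        prism = (offset_mm / 10.0) * lens_power_D          # Prentice's rule
        spread = DEG_PER_PRISM_DIOPTRE * prism / V         # chromatic spread, degrees
        print(f"{name}, {offset_mm:2d} mm off-centre: ~{spread:.3f} deg of color spread")

Zero on axis and steadily more fringing the further off-centre you look, which matches what the fluorescent tubes are showing. Camera lenses behave the same way, which is why this lateral flavour of CA shows up toward the edges of the frame rather than the middle.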

Rhys
09-02-2004, 02:36 PM

This is correct. Any glass will disperse light into its component colours. The trick with lens building (and telescope building) is to combine different kinds of glass with different refractive indexes in such a manner that the overall dispersion is reduced. Plastic lenses, though arguably less prone to the problem, still disperse light, but they aren't used due to other issues.

I wasn't suggesting that the glass causes the different colours to focus at specific levels such that only red, green or blue falls in a specific area. I was suggesting that if the three colours were matched to the lens, then a few microns' difference in the height of each colour's photosites might make a big difference to the individual colour sharpness. That would facilitate the use of cheaper optics and could be achieved via layering the sensors.

It could be - since the majority of fringing seems to be magenta or red - that red poses a particular problem. Indeed, infra-red wavelengths (roughly 700nm and longer) focus in a different place from visible light. Look at the lens barrel on a 35mm SLR and you'll see a red dot or a red line; that red mark is where infinity falls for infra-red, a little before the visible infinity mark, which could indicate a greater propensity for the red end of the spectrum to spread. Since the photosites are already infra-red sensitive, and since the anti-infra-red filters aren't perfect (point your TV remote at your camera and watch the little light flashing through the camera), it's no great leap of the imagination to see that the red could be spreading. As the darker areas of a photo will contain more green/blue, the red won't be so visible, and the software will control it well where the area is supposed to be white. I guess that could cause a problem at the fringes.
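
The point about the red infinity mark can be roughed out with the same sort of single-element model sketched earlier in the thread (assumed Cauchy coefficients for a generic crown glass, thin-lens scaling, nothing specific to any real lens):

# Rough follow-up to the earlier sketch: with the same simple Cauchy model
# for a generic crown glass (assumed values, not any real camera lens),
# estimate where near-infrared light focuses relative to visible light for
# a 100 mm single element.

A, B = 1.5046, 0.00420                  # Cauchy coefficients, B in um^2
n = lambda lam_um: A + B / lam_um ** 2

f_d = 100.0                             # mm at the 587.6 nm d-line
n_d = n(0.5876)

for name, lam in [("red 656 nm", 0.6563), ("near-IR 850 nm", 0.850)]:
    f = f_d * (n_d - 1.0) / (n(lam) - 1.0)
    print(f"{name}: focus sits {f - f_d:+.2f} mm beyond the visible focus")

In this simple model the near-infrared focus lands roughly two to three times further behind the visible focus than deep red does, which is why manufacturers bother engraving a separate IR index on the barrel.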

D70FAN
09-02-2004, 03:28 PM
Sorry, the TV remote trick doesn't work on the D70. ;)

Rhys
09-02-2004, 05:26 PM

Of course not... The D70 uses a bigger sensor that isn't quite as sensitive to the problem, although I'd like to see how it performs with, say, a 2000mm lens.

Plus, with a pro-grade sensor the build quality is much better, hence the IR filter will actually be better. Of course, having a bigger sensor means the individual pixels can be that much bigger and more sensitive, so the IR-blocking filter can be stronger without cutting out too much visible light.

D70FAN
09-02-2004, 05:41 PM

I was only kidding. Actually, there's a big mirror in the way. ;)

Rhys
09-03-2004, 06:37 AM

You can't flip the mirror up and view via the LCD on the back then?

D70FAN
09-03-2004, 09:20 AM

Nope. dSLRs do not use the LCD as a viewfinder, only for menus and picture review. You can lock up the mirror for cleaning, but the display stays blank. I don't miss having the LCD as a viewfinder, as I find the TTL image easier and faster to frame and shoot with.

Rhys
09-03-2004, 09:28 AM

And you can't shoot with mirror lock-up at all?

D70FAN
09-03-2004, 09:46 AM

No, you can't.