
Uneven background illumination used as filter to correct microscope images ?


EVatN

Dear all,

I would like to correct images taken with our microscope for uneven background illumination.

We take images of paint cross sections that are evenly illuminated in the center but tend to get darker, almost black, towards the edges.
This is due to uneven background illumination from the light source in the microscope.

When I take an image of this background illumination, the middle part is white but it fades to almost black at the edges.
I would like to use this background image as a gradient filter (?) or alpha channel (?) to correct the microscope images of my paint samples.
A similar correction can be achieved with Lens Vignetting in Adobe Bridge; see the attached image PAINTafterLensVignetting.

However, I would like to use the image of the background instead, since each magnification / lens has its own 'distortion'.

How can I use the background image as a 'filter' to adjust my microscope images?

I am grateful for any suggestion!
Thanks in advance ...

Best wishes,
Edwin

Attachments: PAINT.JPG, MICROSCOPElight.JPG, PAINTafterLensVignetting.JPG
 
Just for my own clarification, is this a slide prep on a microscope or just a large sample under a stereoscope?

Either way, adding adequate lighting to the set up would be your best solution.
 

Hi
Thanks for replying !
I know, we should improve our input ... But we have a lot of images that were already taken and we would like to use them now for another project.
The paint.jpg is taken under the microscope at 250x magnification. We make a similar image under UV fluorescence light conditions.
At first I considered 'subtracting' the paint.jpg from the microscope background illumination image, but that doesn't work. The bright areas should keep the pixels from paint.jpg, while the darker and even black areas should produce brighter pixels. If you invert the background light.jpg, it shows which areas should be brighter ...
Any other suggestions ?

Best wishes and thanks again !
E/
 
This is a common problem in both microscopy and astrophotography. There is a large amount written on this subject, and the image acquisition software that comes with the sensor on most professional level microscopes includes algorithms to deal with nonuniform illumination.

There are almost always multiple causes of nonuniform illumination, starting with the setup of the microscope (i.e., the condenser or laser illumination system, as well as just about everything else in the optical train). These effects are typically minimized by recording and averaging multiple dark and bright frames to correct for point-to-point differences across the FOV in the gain of the sensor and the transmission of the optical system, light leaks and dark noise of the sensor, stuck (aka "hot") pixels, etc.
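In case you (or anyone reading later) want to script this outside a dedicated package, here is a minimal sketch of the basic flat-field correction, assuming Python with numpy and imageio; the file names are placeholders for your own background and sample images, and the key point is that the background is divided out, not subtracted:

Code:
import numpy as np
import imageio.v2 as imageio

raw  = imageio.imread("paint_sample.jpg").astype(np.float64)      # sample image
flat = imageio.imread("background_light.jpg").astype(np.float64)  # empty, lit field of view
dark = 0.0  # or a real dark frame (light path blocked), if one is available

# Classic flat-field correction: divide by the background, then rescale so the
# overall brightness stays roughly the same (assumes 8-bit input images).
gain = np.clip(flat - dark, 1e-6, None)          # avoid division by zero in black corners
corrected = (raw - dark) / gain * gain.mean()

imageio.imwrite("paint_corrected.png", np.clip(corrected, 0, 255).astype(np.uint8))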

Because precise arithmetic operations are involved in these corrections, they are rarely done in Photoshop, as this is a program primarily for qualitative (not quantitative) image adjustments by photographers, graphic artists and similar folks. This work can sorta-kinda be done in PS, but it's much better to use a program designed for mathematical image processing / analysis such as Matlab's Image Processing Toolbox, Mathematica, ImageJ (by NIH), or custom written software. For your microscopy application, ImageJ is probably your best bet as it has a huge user base among microscopists. I believe Wayne is still around and supporting the program, and even if he isn't, there is a very active community including both discussion forums and an extensive algorithm exchange. For a good introduction see:

http://imagejdocu.tudor.lu/doku.php...ground_illumination_in_brightfield_microscopy

Some of the most common astrophotography software packages also offer these corrections, but they typically include other features (e.g., spectroscopy related) that you would likely have no need for.

If you want to use a general purpose, commercially available software package, take a look at this relatively new product from Germany:
http://www.projects-software.com/denoise-projects-professional/ .

I have this package and use it to remove noise (...it's very good...), but I must admit that I have never used the "dark frame" and "flat frame" correction features, so I can't comment on how well they work. However, if you download the manual:
http://www.projects-software.com/wp-content/uploads/2015/04/DENOISE_projects_professional_EN.pdf

and look at the top of p. 9, and section 3.3.7 on pages 25-32 (or just search within the PDF for "correction images"), you will see how some of these corrections are implemented in that package. There is an absolutely breath-taking example of the use of these features in astrophotography on p. 29: use of these features turns an almost featureless blob of light into a beautiful rendition of the Andromeda galaxy.

If you happen to be familiar with Matlab or Mathematica, just Google either of those names plus terms like "dark frame", "flat frame", "hot pixel", etc., and I'm sure you will turn up many more potentially useful bits of code to do such tasks.
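If you do go down that route, here is a rough numpy/scipy sketch of one common hot-pixel cleanup (replace any pixel that sits far above its local median with that median); the threshold value is purely illustrative and the snippet assumes a single-channel image:

Code:
import numpy as np
from scipy.ndimage import median_filter

def remove_hot_pixels(img, threshold=50):
    """Replace pixels far above their 3x3 local median with that median."""
    img = np.asarray(img, dtype=np.float64)
    local_median = median_filter(img, size=3)
    hot = (img - local_median) > threshold     # isolated, abnormally bright pixels
    cleaned = img.copy()
    cleaned[hot] = local_median[hot]
    return cleaned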

That being said, if it were me, I would just use Image J and forget all the other options, especially Photoshop -- it just isn't designed to perform such tasks.

HTH,

Tom M
 
I just saw what you posted as I was typing my response. One statement you made has me worried: "...But we have a lot of images that are already taken...".

Because these correction algorithms are essentially subtracting / dividing nearly equal quantities, dark and flat frame corrections are almost always done immediately before (or before and after) each data run, NOT after the fact to existing images unless your microscopy setup is completely locked down, bolted in place and no one *ever* adjusts your microscope. This is because the slightest error in these corrections can introduce HUGE erroneous effects on your final image. There have been some attempts made to write analogous algorithms which are effective on existing images, but, as far as I know, they never even get close to what can be done if dark and flat frame correction images are done for each data run.
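To give a feel for the numbers, here is a tiny, purely illustrative calculation (the counts are made up) of how a small drift in the illumination since the flat frame was recorded blows up near the dark edges of the field, where the divisor is small:

Code:
flat_edge, dark_level = 12.0, 10.0   # near the edge: only ~2 counts of usable signal
drift = 1.0                          # 1-count change since the flat frame was recorded
true_gain  = flat_edge - dark_level                # 2.0
stale_gain = (flat_edge + drift) - dark_level      # 3.0
# The corrected pixel value scales as 1/gain, so the result is off by a factor of:
print(true_gain / stale_gain)        # ~0.67, i.e. roughly a 33% error in that region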

If your application isn't very taxing, you might get away with using generic dark and flat frames, but I would regard the results with lots of caution and a large grain of salt, LOL.

Sorry to be the bearer of bad news.

Tom M
 
I may or may not understand the exactness that you need in the image correction, yet to first order this looks like a lens vignetting problem.

The image below shows one of your originals on top and the adjusted image below.
1) Turn image into a Smart Object
2) Apply Camera Raw Filter
3) Go to Lens Corrections Section
4) Adjust Lens Vignetting "Amount" slider (and midpoint if necessary)

In my example below, all I did was set the Amount slider to 80
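If anyone wants to batch something similar outside Photoshop, a very rough scripted analogue (not the actual Camera Raw algorithm, and the numbers are only illustrative) is to multiply each pixel by a gain that grows with distance from the center:

Code:
import numpy as np
import imageio.v2 as imageio

img = imageio.imread("PAINT.JPG").astype(np.float64)
h, w = img.shape[:2]
y, x = np.mgrid[0:h, 0:w]
r = np.hypot(x - w / 2, y - h / 2)
r /= r.max()                                   # 0 at the center, 1 in the corners
amount = 0.8                                   # loosely mirrors "Amount = 80"
gain = 1.0 + amount * r ** 2                   # stronger boost towards the corners
if img.ndim == 3:
    gain = gain[..., None]                     # broadcast over the color channels
imageio.imwrite("PAINT_devignetted.jpg", np.clip(img * gain, 0, 255).astype(np.uint8))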

Hope this helps some.

Attachment: PAINTadjusted.jpg
 
Dear Tom Mann,
Thank you for your useful and extensive reply !
Have just completed a course in ImageJ ... the way to go.
Best wishes,
Edwin
 
