
I'm sorry, but I just don't understand what you are getting at in your last post. 


Of course they are separate numbers. Your comment:

is nothing more than a restatement (albeit in slightly different words) of exactly what I said in my earlier post, and is precisely why the two quantities are given different names, i.e., ppi vs. dpi.


As a concrete example, let's suppose that your file is 2000 pixels wide and the paper onto which it will be printed is 10 inches wide. Then the number of pixels per inch on the paper will be 200. However, to get smooth gradations in color and tone, the printer has to lay down many tiny microdroplets per pixel, so the manufacturer typically offers droplet densities ranging from, say, a minimum of 600 dpi (the coarsest, but fastest, setting) to 2400 dpi or more at the finest (and slowest) setting.
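If it helps to see that arithmetic spelled out, here is a minimal Python sketch using the example numbers above (the three dpi settings in the loop are just illustrative values, not tied to any particular printer):

[CODE]
# Pixels per inch (ppi) on the paper is just file width / paper width.
file_width_px = 2000   # width of the image file, in pixels
paper_width_in = 10    # width of the paper, in inches

ppi = file_width_px / paper_width_in
print(f"Pixels per inch on the paper: {ppi:.0f} ppi")  # -> 200 ppi

# The printer lays down many tiny microdroplets per image pixel.
# Illustrative driver settings, from coarse/fast to fine/slow:
for dpi in (600, 1200, 2400):
    dots_per_pixel = dpi / ppi  # droplets per pixel, along one axis
    print(f"{dpi:>4} dpi -> {dots_per_pixel:.0f} droplets per pixel (linear)")
[/CODE]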


So, yes, this is exactly why I said in a previous post that, for a good print, the dpi number is typically many times larger than the ppi number.


Below is a nice graphic that illustrates this difference. Notice that for every pixel in the file (and on the monitor, if the magnification is set to 100%), there are around a half-dozen or more dots on the inkjet print.

[ATTACH]62792[/ATTACH]


This very nice graphic is from: http://blog.fractureme.com/photography/dpi-vs-ppi-difference/


So, do you now understand why I responded as I did to your question of why 90 ppi on a monitor appears much sharper than 90 dpi on a piece of paper? To be very clear about it: to get the equivalent sharpness of 90 on-screen pixels per inch, you would need many times that number of on-off microdroplets per inch (i.e., printer dpi), say, at the bare minimum, 600 dpi, and, depending on the exact design of the printer, as many as 2000 dpi or more.


Basically, the printer is trying to represent the (say) 256 possible levels of each of the RGB colors (i.e., an almost continuous-tone image) with lots of different on-off patterns of sub-pixel-sized droplets. It's very analogous to why the photos in halftone, bulk-printed material (e.g., newspapers) look so crude unless the number of halftone dots per inch is much higher than one would think.
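To put a rough number on that, here is a back-of-the-envelope Python sketch. It assumes the simplest possible screening scheme, a fixed n x n cell of on/off dots per pixel (such a cell can render n*n + 1 distinct tones); real inkjet drivers use smarter error-diffusion / stochastic screens and multiple droplet sizes, so they get away with lower ratios, but the scaling is the same idea:

[CODE]
import math

def cell_size_for_levels(levels):
    """Smallest n such that an n x n grid of on/off dots can
    represent at least `levels` distinct tones (n*n + 1 levels)."""
    return math.ceil(math.sqrt(levels - 1))

levels = 256                      # tone levels per channel we want to mimic
ppi = 200                         # image pixels per inch on the paper
n = cell_size_for_levels(levels)  # dots per pixel, along one axis

print(f"A {n}x{n} dot cell is needed -> roughly {n * ppi} dpi at {ppi} ppi")
# -> A 16x16 dot cell is needed -> roughly 3200 dpi at 200 ppi
[/CODE]

The exact ratio a given printer needs depends on its screening method and ink set, but the sketch shows why the dpi figure ends up many times larger than the ppi figure.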


HTH,


Tom M


PS - I'm sorry to have to do this, but to remove a possible source of confusion: some of the comments made by MrToM earlier in this thread are just flat-out wrong and should be ignored, e.g., "...Who said it wouldn't? If your settings are correct it'll look just as good....but remember the mediums are completely different...one is light the other is ink...". (a) They won't have equivalent sharpness, as I've just demonstrated; and (b) the difference between an image rendered in emission (i.e., "light") and one rendered in absorption (i.e., "ink") has next to no impact on sharpness / resolution. Rather, the major impact is on things like how very bright tones are rendered (emission can be made almost arbitrarily bright, whereas the whites in a print can only be as bright as the illuminating light), the maximum degree of saturation possible, color gamut, etc.

