Sharpening on a high pixel density monitor

Status
Not open for further replies.

camner

Active Member
Premium Classic Member
Joined
Sep 28, 2008
Messages
737
Location
Tacoma, WA
Lightroom Experience
Intermediate
Lightroom Version
Classic
I recently switched to a 27" iMac with a Retina display (my current second monitor, like my prior monitors, is an NEC PA242W with a standard pixel density display).

I immediately noticed a change in how things looked when I was sharpening images. On the iMac display it was much harder to see the effect of sharpening: even when I applied extreme sharpening (just as a test), I could not see its effect nearly as well.

Poking around the web, I learned that this is a side effect of a high pixel density monitor: the human eye cannot distinguish the changes sharpening makes as easily at roughly 220 ppi as it can at 96 or 72 ppi.
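
For context, the pixel density math is easy to check. A quick Python sketch (it assumes the 27" Retina iMac's 5120×2880 panel and the PA242W's published 1920×1200 / 24.1" spec, which weren't stated in the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

imac = ppi(5120, 2880, 27.0)   # 27" Retina iMac panel: ~218 ppi
nec = ppi(1920, 1200, 24.1)    # NEC PA242W: ~94 ppi
print(f"iMac 5K: {imac:.0f} ppi, NEC PA242W: {nec:.0f} ppi")
print(f"iMac pixels are ~{imac / nec:.1f}x smaller (linear) than NEC pixels")
```

So each sharpening halo is drawn with pixels roughly 2.3× smaller on a side, which is why the same Amount setting looks far subtler on the Retina screen.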

One recommendation I have seen is to sharpen while viewing at 200% (or more) rather than at 100%. Others have said that the upsampling the monitor does to display at 200% makes sharpening harder to judge, because some of the fuzziness one sees at 200% comes from that upsampling rather than from actual image softness.
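
The trade-off is easy to put numbers on: at 200%, each image pixel covers two screen pixels in each direction, so the on-screen density of image pixels is the panel ppi divided by the zoom factor. A rough sketch (the ~218 and ~94 ppi figures carry over from the assumptions above):

```python
def effective_image_ppi(panel_ppi, zoom):
    """On-screen density of *image* pixels at a given zoom (1.0 = 100%)."""
    return panel_ppi / zoom

# A ~218 ppi panel at 200% shows image pixels at ~109 ppi on screen,
# close to the ~94 ppi the NEC shows at 100% -- which is why 200% on a
# high-density display roughly recreates the old 1:1 sharpening view.
print(effective_image_ppi(218, 2.0))   # 109.0
print(effective_image_ppi(94, 1.0))    # 94.0
```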

How do those of you with high pixel density displays deal with this?
 
I noticed you said upsampling; I do what Cletus said and upSCALE on occasion. I use a second monitor at 2.5K, so it's often easier to just use that. But it kind of depends on what I intend to show the image on; with so many Retina devices and 4K displays now, I have sort of defaulted to Retina. The effect seems even more pronounced when I use pixel-shifted images from raw. With those I can crank the sharpening up to ridiculous levels to bring out some intense detail.
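
If it helps to see what cranking the sharpening is doing, the effect is loosely analogous to an unsharp mask (Amount roughly corresponds to strength, Radius to halo size). A minimal sketch with Pillow, using a hypothetical exported file name; this illustrates the general technique, not Lightroom's actual pipeline:

```python
from PIL import Image, ImageFilter

# Hypothetical export -- substitute your own pixel-shift TIFF.
img = Image.open("pixel_shift_export.tif")

# radius = size of the edge halo, percent = strength,
# threshold = minimum tonal difference before sharpening kicks in.
mild = img.filter(ImageFilter.UnsharpMask(radius=1, percent=80, threshold=3))
extreme = img.filter(ImageFilter.UnsharpMask(radius=2, percent=300, threshold=0))

mild.save("mild.tif")
extreme.save("extreme.tif")
# Compare the two at 100% and 200% on the high-ppi display to see
# how much harder the difference is to spot than on a ~94 ppi screen.
```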
 