Today's GOTD is a printer utility, Photo Print Pilot. It started with XP-Man commenting about inaccurate prints, & how he had hoped this app would include some sort of calibration. It doesn't -- it provides the same sort of features you might find in your camera, in your printer's settings, & in the small apps that are often bundled with printers: features for printing a couple, or a bunch, of photos on a single sheet of paper.
FWIW the good thing about Photo Print Pilot is what it's not... Printer makers don't make money off bundled utilities, so they update them as seldom as possible & invest as little as possible -- they're often bloatware &/or junk that works relatively poorly in the current version of Windows. The problem with printer driver settings, camera menus, & the printer's own controls is that all 3 are very much space limited -- the more you can set, the more deeply it's nested in whatever menus. That said, the appeal of Photo Print Pilot might well be regional... in my experience here in the US, photo paper already cut to the smaller print sizes is available very cheaply, often free. My Artisan 730 even has a separate paper tray for it -- that sort of thing is common. Full-sized photo paper OTOH tends to be expensive, & means removing the regular paper I've loaded. Why spend more to do more work, since at the least the individual photos have to be cut from the full sheet?
But I digress, & on to the comment XP-Man made about accuracy... For better or worse I'm a hands-on sort of person who likes to touch stuff. Where some people look at cloth to see the pattern, I touch it to see how it feels. I've done an awful lot of printing over the years -- I can't very well touch the photos on-screen, & handling them is impossible. :) Needless to say, from my 1st color dot matrix printer [one of the 1st models available BTW], I've dealt with issues of quality & accuracy. I've seen color management when it was 1st available as something tacked onto Aldus PageMaker & PhotoStyler, as well as being used by their arch-rival Adobe [who eventually bought them]. I spent weeks [unsuccessfully] trying to get it to work with CorelDraw! years ago. Here's my take.
If you have [or have had] a dog or a cat, you know there are no two totally identical cats or dogs on the planet. If you look up, say, a Collie, you'll find a general description of the breed, but nothing that describes the Collie you know or knew. Unless you create an ICM profile of your exact hardware, it can only describe general characteristics, the same way you can say most Collies stand about this tall. Maybe worse, more or less generic ICM profiles are commonly used as a way to avoid adjusting anything, but then they're not turned off when people go ahead & make those very same adjustments anyway. Here's what's wrong with that...
Any ICM profile based solely on a brand & model [vs. that exact device] is inaccurate to the extent that mass-produced hardware varies, sometimes quite a bit -- the manufacturer doesn't want to spend a lot of cash on something [the ICM profile] with zero return, so it won't be revised when they change components, designs, suppliers etc. while that product is current. When it comes to monitors or displays, the lighting where you view the display has an awful lot to do with what you see [& don't] -- at a concert they might selectively use inverse sound waves to get the same results you'd get with an equalizer in your living room, & oversimplified, you might think of the ambient light waves doing the same sort of thing when/where you're looking at your display. Next, unless you paid an awful lot of money for it, your display can't come anywhere near producing every color, & what your monitor does show you is based on the mode you select in the monitor's setup, affected further by any adjustments you've made.
Now if you're a pro, doing the job properly means using a very expensive calibrated monitor -- not one that came calibrated -- in a room with precisely controlled lighting. The calibration is performed with something like one of the Spyder devices [spyder.datacolor.com], which measures the actual colors everywhere on the screen -- the included software then alters what's sent to the screen so that what you see is accurate. A simple ICM profile in Windows isn't enough. And while room lighting is critical, so is controlling all of the lighting, so it doesn't vary with the time of day. So far you've spent several grand, assuming you did the room lighting [& maybe the light-proofing] yourself.
OK, so you're running Win7, you went to Control Panel -> Color Management & set the Windows default colorspace to sRGB -- good for you. The ICM profile that came with your monitor might say something like it's a bit weak with reds or whatever, but *your* monitor isn't that bad, & besides you've adjusted for that in the monitor's hardware setup menu. So now things look pretty good, & when you open the GOTD page in your browser & hit print, the print reasonably matches your screen. But then you decide to make a print of that photo you took this morning. You open the picture in your image editing app, & the 1st thing it does is apply your monitor's ICM profile -- at that point you really have little idea what that picture actually looks like. *Hopefully* your editing app lets you switch its working profile to sRGB rather than using your monitor's ICM, & *hopefully* it does that correctly so Windows honors it. Now you might see what that picture really looks like, or maybe not...
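Just to make that convert-to-sRGB step concrete, here's a rough sketch in Python using the Pillow library [nothing to do with Photo Print Pilot or anything built into Windows -- the file names are placeholders, & whether your own editor does the equivalent is an assumption on my part]:

    from io import BytesIO
    from PIL import Image, ImageCms

    # Rough sketch only -- "photo.jpg" stands in for whatever picture you took.
    img = Image.open("photo.jpg")

    # If the camera or an editor embedded an ICC/ICM profile, convert the actual
    # pixels to sRGB so what you're editing is plain sRGB, not profile-dependent.
    icc_bytes = img.info.get("icc_profile")
    if icc_bytes:
        src = ImageCms.ImageCmsProfile(BytesIO(icc_bytes))
        srgb = ImageCms.createProfile("sRGB")
        img = ImageCms.profileToProfile(img, src, srgb, outputMode="RGB")
    else:
        # No embedded profile -- most software just assumes the file is sRGB already.
        img = img.convert("RGB")

    img.save("photo_srgb.jpg", quality=95)

Your editing app is [hopefully] doing something equivalent behind its convert-to-profile menu item, whatever it calls it.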
Many of the same factors that make your monitor's more-or-less generic ICM profile less than wonderful also apply to your camera's ICM profile, &/or to the default ICM profile your editing software applies to every picture. Net result = you still may not know what that photo you took really looks like. If you're lucky you can turn off both, setting the photo to sRGB. Now, finally, that image should look pretty much the same on every other display set to sRGB, & that's the best you can hope for. And as long as you save your edited image with sRGB [often you have to save it with some sort of profile attached or embedded -- no choice], any adjustments you made are permanent & mean the same thing on other sRGB setups. If you didn't do all that, you really didn't see what you were adjusting, nor did you see the effects, so who knows -- especially since you can't anticipate what ICM-related settings anyone else uses.
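And the "save it with sRGB attached" part, in the same hedged Python/Pillow style -- again just placeholders standing in for whatever your editor's Save As dialog actually does:

    from PIL import Image, ImageCms

    # "edited.jpg" is a placeholder for the picture you just finished adjusting.
    img = Image.open("edited.jpg").convert("RGB")

    # Build a standard sRGB profile & embed it in the saved JPEG, so ICM-aware
    # software elsewhere knows what the numbers mean instead of guessing.
    srgb = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))
    img.save("edited_srgb.jpg", quality=95, icc_profile=srgb.tobytes())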
All right, so I covered editing, but you want to make a print. You've got your monitor, Windows' default, your image editor, & your photo's ICM all set to sRGB -- if you don't, again who knows. Printer makers often [usually] try to improve on what we print. Many [most or all?] Epson printer drivers default to a vivid setting, hoping to make up for an average photo -- they also hope to compensate for the fact that pictures on a lit screen have to look different than light reflected off ink on paper [it's simply physics]. I found Canon printers cheap & reasonably reliable, but dumped the brand when we went to one shared printer because their built-in image enhancements couldn't be turned off. Pictures looked more vibrant, closer to what you saw on-screen, but when I tried both low- & higher-end models a couple of years ago, none of the Canons could print some shades of red, e.g. you couldn't accurately print a picture with tomatoes in it -- apples yes, tomatoes no. Prints from the Epson I went with look duller [without adding a bit of flair in editing], but they're accurate.
If you're lucky you can turn color management off in your printer's settings, & it will print the photo just like you see it on screen, though you have to allow for colors that are impossible to match using 4 colors of ink, & you have to account for the difference it makes looking at reflected rather than direct light. However, there's a decent chance you can't completely get rid of every color enhancement built into your printer, so you have to adjust the printer settings, hopefully saving those settings somehow, e.g. as some sort of profile. Even if you can't, once you know what they are you can write them down & apply them every time you use that quality setting with that type of paper. To explain what paper has to do with it: to get away from individual dots you can see [albeit you might need a magnifying glass], inkjet ink can react with the paper. Printers can also flood the surface with ink, relying on a surface that's sort of like peach fuzz [on a much smaller scale] to keep the ink from flowing around too much. When you get into that sort of chemistry there are lots of things that can affect it, but for now I think it's enough to say your printer does its job differently depending on the paper & quality setting, so if you save printer settings, save them for each paper type [& maybe brand] & quality setting.
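If writing those down on paper isn't your thing, even a trivial lookup table does the job -- purely an illustration, the paper names, setting names & numbers below are made up, not anything a real driver exposes:

    # A bare-bones note-keeping scheme: saved adjustments keyed by
    # (paper type, quality setting). Everything here is made-up example data --
    # substitute whatever settings your own printer driver actually offers.
    saved_settings = {
        ("glossy photo", "best"):  {"color management": "off", "saturation": -10},
        ("matte photo", "best"):   {"color management": "off", "saturation": -5},
        ("plain paper", "normal"): {"color management": "off", "saturation": 0},
    }

    def settings_for(paper, quality):
        # Fall back to just "color management off" for combos you haven't tested yet.
        return saved_settings.get((paper, quality), {"color management": "off"})

    print(settings_for("glossy photo", "best"))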
You can of course try to use the printer's factory ICM profile, & sometimes it may even work, provided of course you don't mind whatever enhancements or adjustments it performs. Note that you should probably set the ICM profiles for printing in Windows & in your editing software if/when you want to use the printer's ICM. Note too that scanners also have ICM profiles, with most of the gotchas monitors & printers exhibit. To those who say ICM is some super important feature...
If you take a picture of a flower outside [using daylight rather than artificial lighting], the picture should look like the flower on your camera's screen, & the same on an un-adjusted [as in just plugged in, brand new] PC/laptop monitor, without using ICM. You might be able to plug your camera into that monitor directly, & if that's possible, see a picture that looks like that flower [no PC, no ICM]. Likewise a new printer should print a reasonable likeness of that flower at any resolution & on any type of paper it can handle, as long as nothing's altered, i.e. without ICM. If you uploaded that image to get a print from the local drugstore, a color copy of that print made on an all-in-one should look reasonably like the print, all without a PC or Windows or ICM. Most current monitors are basically the same thing as an HDTV without the tuner -- you think the cable company or the cable box worries about that HDTV's ICM?
The folks who make your monitor realize that if it distorts colors they won't sell very many -- printer companies have always known the same thing. We'll buy a monitor or HDTV, plug it in, done. Same with printers. A business isn't going to calibrate every monitor, nor do more than make sure shared printers work. Printer makers know that if, for example, handouts for meetings & presentations are miles off, not only will people be upset, but their brand is out the door. The end result is that printers & monitors *just work* for most purposes as-is.
Some brief history FWIW: the industry started seriously working on color profiles in the early 90's -- serious apps back in the day like PageMaker came with several basic ones. At the time you were talking about pros used to color-checking hard copy manually -- something ICM never has replaced -- but you were also talking about monitors & graphics cards that, compared to a decade later, were still in their infancy. For the consumer it hasn't really evolved much beyond that, e.g. you generally use sRGB or Adobe RGB (1998). Corel tried managing color by including a calibration setup in CorelDraw! that encompassed the monitor, scanner, & printer -- they also sold packages including a cheaper Spyder-type device. Several years [versions] ago Adobe included an app that ran with Windows to adjust & manage your color calibration. Neither proved to have a long & productive life. There's some discussion now about eliminating gamma settings &/or adjustments as sort of irrelevant to current displays.