I don't actually know, but perhaps seeing something as pixelated requires being able to resolve the edges of an individual pixel.
A better test would be to see whether people can detect a difference in the pixel "phase" (for lack of a better word) of some pixel-drawn object: show identically shaped objects whose internal pixel organization differs because of their position on the screen. Do an ABX-style test and see if you can tell the difference; a rough sketch of the idea follows below.
I'll bet you can tell the difference on a classic display. I'll bet you can't on the iPhone 4+.
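To make the ABX idea concrete, here is a rough sketch in Python with Pillow. Everything in it is hypothetical (the disc, the half-pixel offset, the supersampling factor); it just renders one shape at two sub-pixel phases so the rasterized pixel patterns differ:

    import random
    from PIL import Image, ImageDraw

    def render_disc(phase_x, size=64, radius=20.0, scale=8):
        # Supersample, then downsample, so a fractional offset changes
        # which physical pixels end up partially covered.
        big = Image.new("L", (size * scale, size * scale), 0)
        draw = ImageDraw.Draw(big)
        cx = (size / 2 + phase_x) * scale
        cy = (size / 2) * scale
        r = radius * scale
        draw.ellipse([cx - r, cy - r, cx + r, cy + r], fill=255)
        return big.resize((size, size), Image.LANCZOS)

    a = render_disc(0.0)       # disc centered on a pixel
    b = render_disc(0.5)       # same disc, shifted half a pixel
    x = random.choice([a, b])  # the unknown stimulus
    # Show a, b and x side by side; if the subject can reliably say which
    # of a/b matches x, the pixel structure is still visible.

On a low-DPI display the two discs should look visibly different; on a "Retina" one they shouldn't.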
> Oh for heaven's sake ... Vision and visual acuity is not this simple.
You just gave me flashbacks to the old chestnut of humans allegedly seeing at a fixed "fps", which conveniently tops out at whatever nice round number the nearest TV and/or theatre can produce. I bet that's still being propagated daily on forums.
I love my retina displays, but Apple's marketing annoys me as I can still see pixellation at times. This is a quick game to see if you can dispute the marketing claim.
Printing is a whole different ball of wax. Because the printing process layers inks to achieve colors, the range of colors you can produce is a function of the printer's ability to lay down ink. Typically the printer's color gamut is "lousy" at its highest resolution (as low as 8 colors), and only at a smaller "effective" resolution does it cross the point where it can print all the colors you would expect. This second number is often called the 'screen resolution', not because of CRT screens but because of silk-screen printing. So early HP printers would have a "resolution" of 300 DPI but a screen resolution closer to 120 DPI.
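A back-of-the-envelope version of that trade-off, using the classic halftone rule of thumb that a cell n printer dots wide can show n^2 + 1 tone levels per ink (the numbers are illustrative, not HP's actual screening):

    def tone_levels(printer_dpi, screen_lpi):
        # Each halftone cell is (dpi/lpi) dots on a side; the number of
        # inked dots in the cell gives cell_dots + 1 distinct levels.
        cell = printer_dpi / screen_lpi
        return int(cell * cell) + 1

    print(tone_levels(300, 300))  # 2  -- pure on/off at full resolution
    print(tone_levels(300, 120))  # 7  -- per ink; roughly the "8 colors" regime
    print(tone_levels(300, 60))   # 26 -- smoother tones, much coarser screen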
In the photography business people talked about 'lines': the width of the smallest contrasting-color line that the film could reproduce. Hence the groups of lines you saw in standard test-image targets.
CRTs used a shadow mask, a way of blocking the electron beam around the differently colored phosphors. You were limited to 1/3 to 1/4 of the shadow-mask resolution, as you needed at least three phosphor patches to reproduce color.
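In other words (a toy calculation with a made-up patch density):

    # A full-color pixel needs at least one R, one G and one B patch, so
    # color resolution is at most a third of the raw patch resolution.
    patches_per_inch = 300       # hypothetical shadow-mask figure
    print(patches_per_inch / 3)  # 100 full-color "pixels" per inch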
I quoted Apple's Retina claim on the site: ‘The Retina display’s pixel density is so high, your eye is unable to distinguish individual pixels’. (They have many variations on this).
So for me a real retina display would make a single white pixel on a black background invisible at normal viewing distances to a normal eye.
A main reason your definition isn't used much is that it is highly dependent on the relative brightness of the dot and its background. You can see a dot that has a millionth of the area of an iPad pixel, as long as it radiates the same number of photons onto your retina. [edit: make that a thousandth or so. Something a million times as bright might kill the cell it lands on before it can send out a signal. http://www.displaymate.com/iPad_ShootOut_1.htm#Backlight_Pow... claims 7W for an iPad backlight; that works out to over 2W for a million pixels' worth of light. You wouldn't want to shine a 2W laser into your eye. If you don't believe that, google '2W laser pointer' on YouTube.]
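Sanity-checking that arithmetic (assuming an iPad-3-class 2048x1536 panel; the 7W figure is from the displaymate link):

    backlight_w = 7.0                   # displaymate's backlight figure
    pixels = 2048 * 1536                # ~3.1M pixels on the panel
    per_pixel_w = backlight_w / pixels  # ~2.2e-6 W per pixel
    # A source "a million times as bright" over one pixel's area emits
    # what a million ordinary pixels emit:
    print(per_pixel_w * 1e6)            # ~2.2 W -- laser-pointer territory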
This is why @2x is silly. The only way to stay ahead of the constant march of technology is with vector graphics. Otherwise, your @2x image is going to look blurry on an @3x display (or zoomed in).
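A toy illustration of why (Python with Pillow; the drawn circle stands in for a vector asset, and all the names are made up):

    from PIL import Image, ImageDraw

    def rasterize(scale, base=32):
        # Stand-in for rendering the vector original at a given scale.
        img = Image.new("L", (base * scale, base * scale), 0)
        ImageDraw.Draw(img).ellipse(
            [4 * scale, 4 * scale, 28 * scale, 28 * scale], fill=255)
        return img

    at2x = rasterize(2)
    on3x = at2x.resize((32 * 3, 32 * 3), Image.BILINEAR)  # @2x asset, stretched
    native = rasterize(3)  # re-rasterized from the source: crisp at any scale

Comparing on3x and native side by side shows the interpolation blur I'm talking about.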
The "Retina" display is just a marketing term which means that a person with 20/20 vision cannot distinguish individual pixels at a certain distance. Movie screens do not need to have a high ppi ratio because you view the screen from a greater distance. If you have better than 20/20 vision, or press your face up against the screen, you will be able to resolve individual pixels quite easily.
While I appreciate that you're specifically referencing Retina claims and therefore you really do need a "Retina" display for it to matter, I'd suggest against actively locking out everybody else. It's generally an interesting question about a given display and one's personal eyesight even without the hook of testing a marketing claim.
[being able to locate an individual high-contrast pixel] != [being able to distinguish individual pixels]
Vision and visual acuity is not this simple. See, e.g., http://www.michaelbach.de/ot/lum_hyperacuity/
[edit: made the comment slightly nicer ... :/]