Of course you can pull the DP connections out, that's not the problem. The problem is that when you put that display on a Windows machine, you end up with a bunch of legacy Windows software and web graphics content that looks unreadably small, followed by unhappy users who return those fancy DP panels to your retailers.
Of course, the same thing is true of iOS and OS X software, which is precisely the reason "Retina" (as distinguished from arbitrary 200+ DPI hardware) displays are always shipped in resolutions that exactly double those of pre-existing devices. So even though the legacy software doesn't handle it, Apple, by controlling the framework, is able to make it work more or less seamlessly.
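The point about exact doubling can be shown with a little arithmetic (this is just a sketch of the scaling math, not Apple's actual API; the function name is mine): at a 2x scale factor every legacy coordinate lands on a whole physical pixel, so a DPI-unaware app can be pixel-doubled with no resampling blur, whereas a fractional factor like the 1.5x common on Windows HiDPI setups cannot.

```python
def to_physical(logical_px, scale):
    """Map a logical (point) coordinate to physical pixels."""
    return logical_px * scale

# 2x "Retina" scaling: every legacy coordinate maps to an integer pixel,
# so unaware apps can simply be pixel-doubled cleanly.
print(all(to_physical(x, 2.0).is_integer() for x in range(1000)))  # True

# A non-integer factor (e.g. 1.5x) leaves half-pixel positions behind,
# forcing either blurry resampling or tiny unscaled UI.
print(all(to_physical(x, 1.5).is_integer() for x in range(1000)))  # False
```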
Dell and HP don't have that freedom, they need help from Microsoft, and Microsoft, well... At least Windows 8 does high DPI natively, though the MS decision was basically to jettison legacy visuals and hope no one runs the old software anymore.
I have computers in datacenters that I can effortlessly SSH into when I am feeling the (frequent and chronic) urge to Dick About With Computers And/Or Software.
However, the one that connects to my display(s) is simply a tool for running Chrome, Skype, FaceTime, Mail.app, TweetDeck, Spotify, and SSH, and, as such, should not ever break or otherwise hinder the incredibly simple tasks which ARE ITS ONLY JOB.
This is why the only computers I own that I touch on a regular basis all have fruit on the back. It's a very simple calculus.
The same reasons [0] are why I have a Chromebook. CrOS is limited, sure, but the limits make the thing bloody near invincible (and amazingly productive).
[0] Hangouts are luckily the VoIP of choice for most I work with
Think about what your laptop is doing that we would have had no hope of doing with desktop Linux just a few years ago (if ever, depending on the hardware).
Sharing access to video capture and audio hardware between software written (sort of) by Apple, Microsoft (Skype, so -ish), Spotify, etc. would have been unthinkable. You would have been restarting your primary machine if you made the mistake of starting any two apps that tried to grab your QuickCam at the same time.
<-- Down votes go here and by all means, please don't respond.
Interesting. My personal workstation runs Windows and I too effortlessly SSH and RDP into many headless or virtual machines to dick around and (occasionally) to do actual work. Hey, I also run Chrome, Skype, TweetDeck, Spotify, SSH and the like and I have been for years...on the same install of Windows. It's like your calculus doesn't work at all in my universe.
Actually...I don't think you're doing calculus at all.
Also, have you ever seen the source of Chrome? It's not that simple. Lay-people think it's simple, but software developers know better.
"A browser" is simple from a "a process that talks to the network and draws things on screen" standpoint. I'm not suggesting that Chrome isn't a complex application - just that it's self-contained and very well tested and doesn't do anything special with the OS or hardware (drivers, USB, video modes, etc).
My workflow would work on Windows, too, aside from the fact that it'd suck. (Example: Unplug an external USB audio interface while a song is playing. OSX skips a beat and then plays via internal speakers, at the previous internal-speaker-volume-level.)
I could travel with luggage that isn't Pelican/Incase, too— but I don't.
I laughed at the "incredibly simple" bit myself. I'm sure it must seem that way to people who live at the top of the stack doing javascript hackery and MongoDB maintenance. It's amusing how few people, even in this world of open source everywhere, have ever bothered to do something as simple (in an only-slightly-ironic sense of "simple") as build an OS distribution. Android might be the "simplest" of this kind of thing, and it's still a 10G+ tree.
I had a Thinkpad with a 15.4-inch 1920x1200 screen, back when such wonders existed, and I had no big problems with its 150 DPI running Windows XP, even less with Windows 7. Yes, there were some really old applications that wouldn't react to system-wide DPI settings, but that was a small minority.
I think there is simply no excuse for the regression in laptop resolutions - I can't even find a laptop that will give me 1200 lines anymore, they give me "full HD" at best, which is less. And we can see proof in all the comments saying that they would buy this guy's hack to plug a hi-res screen into a laptop.
I had a Compaq in 2004 with the same display. And indeed, Windows works pretty well up to about 150-170 DPI. But I have a 10" Acer tablet with a 1920x1080 screen sitting next to me, and the legacy Windows desktop is basically unusable. Yes, it can be made to work if you are tolerant of VERY SMALL BUTTONS, largely unreadable text in graphics assets, and know where to find all the font settings. But it pretty much sucks, and no one sane would try to use it that way.
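The gap between those two machines is easy to quantify from resolution and diagonal size alone (a quick sketch; the helper function is mine):

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The 10" 1920x1080 tablet: well past the point where the legacy
# desktop stops being usable at 100% scaling.
print(round(dpi(1920, 1080, 10)))    # 220

# The old 15.4" 1920x1200 panel: comfortably inside the ~150-170 DPI
# range where Windows still copes.
print(round(dpi(1920, 1200, 15.4)))  # 147
```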