It sucks to be a technology enthusiast. Trying to keep pace with the steady (and, practically speaking, warranted) improvements in hardware is a loss-making business. Getting somebody to buy your old hardware off you is such a challenge that, for the time spent, I'd probably earn more as a coolie carrying sacks of rice off the pier. The alternative, then, to getting one's "money's worth" out of superseded hardware is to relegate it to other, lesser duties.
And so I did for my pair of unsellable Philips 190CW7CB monitors, which were replaced by high-resolution ViewSonic VX1940w monitors last year. The pair ended up in the office I am currently stationed at: one to supplement my laptop so I get dual-display goodness just like at home, and the other to replace the miserable 15" monitor attached to an equally depressing Pentium 4 desktop that serves as the workstation connected to the corporate network. (We are not allowed to connect our own computers to the network.)
Life became good. Halfway.
It was a breeze on my laptop. But on this aging desktop, something was not quite right with its NVIDIA GeForce2 MX adapter; it simply could not recognise the monitor's native resolution of 1440x900. The latest driver I could obtain for this venerable GPU was version 93.71, circa 2006. The nearest resolutions it offered were 1360x768 and 1600x900. Why an obsolete GPU would offer HD video aspect ratios of 16:9 instead of computer desktop ratios of 16:10 is completely beyond me. For the following week I was forced to squeeze a 1600x900 signal onto a 1440x900 grid of pixels. Never mind that everything looked skinny; the text was a mess of blur. Still readable, thankfully, but it soon proved to be the quickest way to develop a migraine.
Life became bad. And Panadol's quarterly net income went up.
It seemed I was doomed to destroy my eyesight faster than Mangekyou Sharingan users. Mindful of all the blood my eyes were tearing out, I initially tried replacing the "Default Monitor" driver with Philips' own definition, but Windows XP blissfully rejected it, stating it could not find anything "better" than what it already had. Snob. What else could I do? Then the Lord opened my eyes to something I had been blind to in the past.
Custom Timings! I can tell the card to output the video signal at a rate other than the stock options! I have always been an NVIDIA customer, but now it is time to become a fanatical supporter. After much trial and error, I finally got the dang video card to output at a rate appropriate for the monitor's consumption.
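For the curious, the arithmetic behind a custom timing is simple: the card scans out more pixels per line and more lines per frame than are visible (the extra "blanking" pixels), and the refresh rate is just the pixel clock divided by that total. A minimal sketch below, using the CVT reduced-blanking figures commonly quoted for 1440x900 at 60 Hz; these are illustrative numbers, not necessarily the exact values I typed into the NVIDIA control panel.

```python
# Sketch of the arithmetic behind a custom video timing ("modeline").
# Assumption: CVT reduced-blanking values for 1440x900 @ ~60 Hz
# (active 1440x900, totals 1600x926, pixel clock 88.75 MHz).

def refresh_rate(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    """Vertical refresh = pixel clock / total pixels scanned per frame."""
    return pixel_clock_hz / (h_total * v_total)

pixel_clock = 88.75e6   # Hz
h_total = 1600          # visible 1440 + horizontal blanking
v_total = 926           # visible 900 + vertical blanking

print(f"{refresh_rate(pixel_clock, h_total, v_total):.2f} Hz")
```

Tweaking the totals or the clock is essentially what the "Custom Timings" dialog lets you do by hand, so long as the result stays within what the monitor will accept.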
So much for plug-and-play devices. Long live manual explicit tuning.