Getting DVI Working?

Hey,
I just got my new system, but I can’t seem to get DVI working on a Samsung 930BF connected to a PCIe GeForce 6600GT… When I first installed Windows, DVI worked fine (in fact, I only had the DVI cable connected, no VGA), but that was with the default Windows display driver.

As soon as I installed the NVIDIA ForceWare drivers, DVI stopped working. The card came with a 70-something version, and even updating to the latest, 81.85, didn’t help.

DVI works fine during startup (before Windows loads), but then the DVI signal cuts out and the monitor cycles between its Analog and Digital inputs, trying to find a signal. After a few tries, it goes into standby. In the end, I just plugged in the VGA cable and used analog.

Right now, I’ve got both the DVI and VGA cables plugged in, running on analog. If I press the input button on the monitor to switch to Digital, it automatically switches back to Analog (because it doesn’t find a signal on the DVI cable). During the boot screens, however, I can switch to Digital just fine and it works… but as soon as Windows reaches the desktop, the digital signal drops out and the monitor switches back to Analog.

Since I’ve got both DVI and VGA cables plugged in, the ForceWare driver shows two monitors: VGA as #1 and DVI as #2. I figured maybe I needed to make #2 the primary display and then switch the monitor to DVI, so there’d actually be a signal on the DVI cable… No luck.
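
For what it’s worth, here’s a quick check I can run to see what Windows itself reports for each output. It’s just a sketch using the plain Win32 `EnumDisplayDevices` call (nothing NVIDIA-specific), but it should at least show whether the DVI head is flagged as attached to the desktop or as primary; each head of the 6600GT should show up as its own `\\.\DISPLAYn` entry:

```c
#define _WIN32_WINNT 0x0500
#include <windows.h>
#include <stdio.h>

/* Dump every display device Windows knows about, along with its
   state flags, to see which output the driver actually considers
   attached/primary. Plain Win32, nothing NVIDIA-specific. */
int main(void)
{
    DISPLAY_DEVICE dd;
    DWORD i;

    dd.cb = sizeof(dd);
    for (i = 0; EnumDisplayDevices(NULL, i, &dd, 0); i++) {
        printf("%lu: %s (%s)\n", i, dd.DeviceName, dd.DeviceString);
        printf("   attached to desktop: %s\n",
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? "yes" : "no");
        printf("   primary:             %s\n",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? "yes" : "no");
        dd.cb = sizeof(dd); /* reset the size field before the next call */
    }
    return 0;
}
```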

I even tried cloning displays 1 and 2, so there should be a signal on both outputs, but no: the monitor shows a black Digital screen for a moment, then switches back to Analog.
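
One thing I haven’t been able to rule out is the driver pushing a mode or refresh rate over DVI that the 930BF rejects (which would fit with the boot screens working, since those use basic low-resolution modes). As a rough sketch of what I mean, something like this should force a conservative 1280×1024 @ 60 Hz on the second head; note that `\\.\DISPLAY2` is just my guess for the DVI output’s device name (use whatever the enumeration above prints):

```c
#define _WIN32_WINNT 0x0500
#include <windows.h>
#include <stdio.h>

/* Try to force a conservative 1280x1024 @ 60 Hz mode on the DVI head.
   "\\.\DISPLAY2" is a guess -- substitute whatever device name the
   enumeration above printed for the DVI output. */
int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize             = sizeof(dm);
    dm.dmPelsWidth        = 1280;
    dm.dmPelsHeight       = 1024;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_TEST first: ask the driver whether the mode is valid
       without actually switching to it. */
    if (ChangeDisplaySettingsEx("\\\\.\\DISPLAY2", &dm, NULL,
                                CDS_TEST, NULL) != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver rejected 1280x1024@60 on that head.\n");
        return 1;
    }
    /* Mode is valid -- apply it for real. */
    ChangeDisplaySettingsEx("\\\\.\\DISPLAY2", &dm, NULL, 0, NULL);
    return 0;
}
```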

I also tried unplugging the VGA cable to force it to use DVI, but the monitor doesn’t detect a signal; it just alternates between Analog and Digital for a while, then turns off.

I’m lost…

Any help would be appreciated :)