# DVI vs VGA/SVGA

1. Oct 4, 2007

### Astronuc
### Staff: Mentor

Just got a new high-end workstation at work, but the graphics card has two DVI ports (the newer digital video interface) and we have only VGA monitors at the moment. The builder should have used a card (XFX) with both VGA and DVI ports. But what's done is done.

Mild inconvenience.

So I quickly learned that there are such things as DVI-to-VGA adapters: the VGA monitor cable plugs into the VGA (female) end, and the DVI (male) end plugs into the DVI port on the card.

DVI monitors are a little bit more expensive. Just something to consider when purchasing a new computer.

2. Oct 4, 2007

### mgb_phys

Remember that the DVI-to-VGA adapter doesn't actually convert DVI to VGA.
The DVI standard optionally carries the analog VGA signals on spare pins, the four extra pins alongside the flat blade contact.

There is no requirement for the graphics card to supply the VGA signal, and it is often present on only one output. The port should be labelled DVI-I if it carries both DVI (digital) and VGA (analog) signals.

3. Oct 4, 2007

DVI-to-VGA adapters aren't too expensive (about $15 at Best Buy... possibly cheaper at (say) Dell). So, are you going to run a dual-monitor setup?

4. Oct 4, 2007

### Astronuc

### Staff: Mentor

Not at the moment. There is no need. The box was built with a card that has two DVI ports rather than one DVI and one VGA/SVGA port on the same card. Since we didn't have a DVI monitor, only VGA, I was faced this morning with having to buy a monitor; I called a couple of nearby places, and they either didn't have a DVI-I adapter or didn't know what it was. But I found a DVI-I (Belkin) adapter (not converter) at Staples, which cost $30. That saved me a $60 DVI cable and ~$240 for a DVI monitor.

The workstation is essentially a computational engine for running simulations. We access it across a LAN for I/O, so we don't need a fancy monitor. The output is further processed on other machines.