Strange D-SUB <-> DVI adapter/reduction (6,10,11 PIN DVI)

postcd (Weaksauce, joined Nov 24, 2016, 96 messages)
[Image: dvi-d-sub.png]

Hello, I have an adapter ("reduction") where the DVI female connector has only 11 pins (highlighted in red in the image). The image also shows other pins, but I used it only to illustrate which pins my DVI connector actually has (only the red ones). The other side of the adapter is a male D-SUB connector, like the one on the right of the image above.

Here is the DVI pin scheme, which suggests, as I understand it, that the RGB video signal will pass through as analog, not digital (I am unsure how bad that is), and that it will not support resolutions above 1920 × 1200 at 60 Hz. For higher resolutions I would need full dual-link DVI, which means the connector would need the full field of 24+ pins.

I want to ask what the meaning and purpose of my connector is. When I searched for images, I could not find any adapter like mine; all of them have more pins. I currently have no opportunity to test whether it transfers video at all, or what the quality difference is between mine and a standard single-link DVI-D/DVI-I.

Dual vs. single DVI: "Dual link DVI has more pins and allows for a higher resolution and faster refresh rates. Single link can display up to 1920x1080 @ 60 Hz and dual link can display up to 3840x2400 @ 41 Hz." (src)
So if one's monitor supports a resolution higher than 1920x1080, it is better to get dual-link DVI, which has the full field of 24+ pins.
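As a rough sanity check on the single-link vs. dual-link limits quoted above: digital DVI bandwidth comes down to the TMDS pixel clock, which tops out at 165 MHz per link. A minimal sketch, assuming a flat ~5% blanking overhead (an assumption; real overhead depends on the exact video timing, e.g. CVT reduced blanking):

```python
# Rough DVI bandwidth check.
# Assumption: ~5% blanking overhead, similar to CVT reduced-blanking
# timings; actual timings vary per mode.
SINGLE_LINK_MHZ = 165.0              # max TMDS pixel clock per DVI link
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ  # dual link doubles the data lanes

def pixel_clock_mhz(width, height, refresh_hz, overhead=1.05):
    """Approximate required pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * overhead / 1e6

def fits_single_link(width, height, refresh_hz):
    return pixel_clock_mhz(width, height, refresh_hz) <= SINGLE_LINK_MHZ

print(round(pixel_clock_mhz(1920, 1080, 60), 1))  # ~130.6 MHz
print(fits_single_link(1920, 1080, 60))           # True
print(fits_single_link(2560, 1440, 60))           # False -> needs dual link
```

So 1920x1080 @ 60 Hz sits comfortably inside a single link, which matches the quoted limit.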

My monitor's native resolution is 1920x1080, so single link is fine for me. As I understand it, I have an analog DVI connector, so my only remaining doubt is the difference between analog and digital signal quality. What do you think?
 
Looks like it's a cost-reduced DVI-A to VGA adapter where the unused pins were removed to save money (VGA only needs about 10 signals to work). But I think your image is incorrectly marked; it should look like this for it to work as a DVI-A adapter:

[Image: upload_2018-10-26_5-31-13.png]


The resolution limits of dual-link vs. single-link apply only to the digital part of DVI. The resolution capability of the analog signal depends on the graphics card itself and the monitor connected to it, and the image quality is extremely sensitive to cable quality. Typically, VGA topped out around 2048x1536 @ 85 Hz.
 
The major problem with using VGA on an LCD is that the LCD has to synchronize with the input signal without the benefit of a pixel clock. This can result in a blurry image (usually fixed by invoking the "auto adjustment" option in the monitor's settings).

So, use a digital input if possible.
 
The resolution capabilities of the analog signal are dependent upon the graphics card itself and the monitor connected to it, and the image quality is extremely sensitive to cable quality. Typically, VGA was limited to 2048x1536 @ 85Hz

This was interesting information, thank you for sharing it. In my case I have:
- an AMD RX 560 gaming graphics card with a DVI output
- and on another PC, Core i5 integrated graphics with an HDMI output on the motherboard

The monitor is https://eu.aoc.com/en/products/e2260swda and has D-SUB and DVI-D inputs (digital only, dual link).

Will this setup likely result in a good, sharp image?

And I am looking for a cheap device (a KVM switch) that would let me use multiple computers with one monitor (and a USB keyboard + mouse). That monitor has only one digital input (DVI), though, and so far I have found only a ~15 USD device that supports VGA and USB ports, not DVI.

invoking some "auto adjustment" option in the monitor settings). ... So, use a digital input if possible.

Thank you

Btw.: if my graphics card and the mentioned monitor both have DVI connectors with the full field of 24+ pins (dual link, for higher resolutions), should I buy a new cable if my current one is only single-link DVI-D (lower maximum resolution, but one that is the maximum my monitor supports anyway)?