Why do cards still have DVI ports?

I am trying to figure out if both HDMI and DisplayPort will become dominant, or just one of them.
 
AMD, Intel, Nvidia, and a few other big industry players agreed five years ago that DisplayPort would replace VGA and HDMI would replace DVI. The big PC hardware companies see a place for both.
 
HDMI is resolution-limited until a new version comes out. I need DVI to run my 2560x1440; I use 2x DVI and one DP to give me 7680x1440.
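
Rough back-of-the-envelope on why (my own sketch - it assumes a typical ~9% blanking overhead rather than real CVT timings, and the link ceilings below are approximate pixel-clock limits):

    # Rough pixel-clock estimate (hypothetical helper, not an exact CVT calculation)
    def pixel_clock_mhz(width, height, refresh_hz, blanking=1.09):
        return width * height * refresh_hz * blanking / 1e6

    limits_mhz = {                    # approximate pixel-clock ceilings
        "single-link DVI": 165,
        "dual-link DVI": 330,
        "HDMI 1.2": 165,
        "HDMI 1.3/1.4": 340,
    }

    mode = pixel_clock_mhz(2560, 1440, 60)   # ~241 MHz
    for link, limit in limits_mhz.items():
        verdict = "fits" if mode <= limit else "exceeds"
        print(f"2560x1440@60 (~{mode:.0f} MHz) {verdict} {link} ({limit} MHz)")

So 1440p60 is out of reach for single-link DVI and older HDMI, but fine over dual-link DVI, HDMI 1.3+, or DisplayPort.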
 
DisplayPort is the VESA standard port, royalty-free to use.

HDMI is a port developed by the HDMI consortium, made up of major manufacturers who do charge royalties and therefore make money from its use.

Since HDMI makes money (also called the HDMI tax) for a select group of big display makers, it is unlikely to go away, even though in reality DisplayPort does everything HDMI does.
 
Far more commonplace - not that many displays have DisplayPort compared to HDMI. HDTVs, for example, always have HDMI but pretty much never DisplayPort. If one output is all you get, HDMI is definitely the one to go with. Nobody's likely to be running 4K off a Chromebook, so HDMI has everything you need, pretty much.
 
Too bad my Korean panel needs dual-link DVI; I'll be stuck using it for a while. Hopefully GPU manufacturers keep it around for a little while longer, because I need a GPU upgrade.
 
I wouldn't be surprised to see non-reference versions come with dual-link DVI even if it's dropped from reference next gen.
 
AMD's upcoming refresh should still keep DVI, but Fiji will certainly not. (3 DP + 1 HDMI)
 
Gotta use an Nvidia card; I use CUDA daily. I will probably get the best $300-ish GPU I can if dual-link DVI starts becoming rare. Otherwise I'll have to hope there's a good converter from other cables to dual-link DVI, but from what I've seen, there are very few dual-link DVI options for anything.
 
The thing is, compatibility issues are solved with adapters. I've had to deal with those things since my first video card. If someone still has a CRT with only VGA input, then they should have to buy a DP/HDMI-to-VGA adapter. People who are buying PC parts today should not have to 'suffer' because someone somewhere hasn't bought a monitor since 1995.

Yeah, yeah, first world problems...
 
That only makes sense if you're removing the less commonly used interface.
HDMI and DVI are still far more common than DisplayPort. Furthermore, if I had the choice I would use DVI over the other two interfaces any day of the week. I'd much rather have a chunky connector and cable than deal with the overscan issues of HDMI or the stability issues of DisplayPort. Neither, of course, is the fault of the interface itself but rather of graphics drivers' implementations of them, but the fact remains: DVI is something you just plug in and forget about. It 'just works' - neither HDMI nor DisplayPort can profess to achieve this in a PC environment.
 
Such is the folly of data packet-driven digital A/V signals. I do like the design of the DP connector, though. It's thick, long, and locking. By contrast, HDMI is short, narrow, and reliant on friction. I am so happy not having to deal with the screw locks on VGA or DVI connectors anymore.
 
It's still good for those who run non-G-Sync monitors that can do 144 Hz. I see they are limiting more modern cards to one DVI port rather than two.
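
Same rough math as earlier in the thread (~9% blanking overhead assumed, my own sketch), to show why dual-link DVI still covers it:

    active_mhz = 1920 * 1080 * 144 / 1e6     # ~299 MHz of active pixels
    print(f"1080p @ 144 Hz ~= {active_mhz * 1.09:.0f} MHz pixel clock "
          f"(dual-link DVI tops out around 330 MHz)")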
 
Frankly, all of those standards are pretty crappy. We should have broadcast engineers dictating this stuff, but Hollywood gets involved and requires a bunch of BS like HDCP, and we end up with consumer interfaces designed by non-broadcast engineers to satisfy Hollywood's demands.

Personally I'd rather we use HD-SDI or something. Of the ones we have for consumers, I do prefer DisplayPort over HDMI, but it does seem unlikely to catch on outside of PCs. But HDMI was just way too terrible.
 
Why do cards still have DVI: Because DVI is the ultimate digital connector. Plus it can also carry analog signals and be passively adapted to BNC or VGA. DVI has NO BANDWIDTH LIMIT. DP, HDMI and the like HAVE LOW, FIXED BANDWIDTH LIMITS. Choosing between DVI and DP/HDMI is as simple as deciding whether you want limitless bandwidth or severely limited bandwidth. HDMI and DP are objectively inferior to DVI. Anyone claiming otherwise does not know what they are talking about.
 
Why does a typical Chromebook these days come with HDMI instead of DisplayPort?

Because people are plain dumb. I will never buy a laptop with an HDMI port. I am currently using the Chromebook Pixel, which uses mDP.
 
Well, this guy disagrees with you about DVI having no bandwidth limit.

Analog has limits just like digital. When referring to video signals, the maximum frequency of the DAC is a limit. And I can't even find any notes on the DAC on any modern video card. The highest I've seen back in the good ol' days of ridiculous-resolution CRTs was about 400 MHz. Modern 4K displays can beat that for total pixel bandwidth, and DisplayPort 1.3 will blow that away completely.
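
To put rough numbers on that (my own back-of-the-envelope: active pixels only, 24-bit colour, and the DP 1.3 figure assumes 4 lanes of HBR3 with 8b/10b coding):

    dac_limit_mhz = 400.0                       # the ~400 MHz RAMDAC mentioned above
    uhd60 = 3840 * 2160 * 60 / 1e6              # ~498 Mpixel/s, before blanking
    dp13 = 4 * 8.1e9 * (8 / 10) / 24 / 1e6      # ~1080 Mpixel/s at 24 bpp

    print(f"400 MHz DAC      : ~{dac_limit_mhz:.0f} Mpixel/s")
    print(f"4K @ 60 Hz needs : ~{uhd60:.0f} Mpixel/s")
    print(f"DP 1.3 carries   : ~{dp13:.0f} Mpixel/s")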

"Analog has no fixed bandwidth limit" the same way digital has no fixed bandwidth limit - the wire has no limit, it's solely based on what you've got at both ends of that wire.

If you plug an IBM XGA adapter into an IBM 8514 monitor, you'll have a pretty firm limit.

For reference, I have an old high-end Sun CRT that I use with multiple systems through BNC connectors and adapters (including VGA). This sucker can do very high resolutions (2048x1536 @ 85 Hz) at amazing quality.

But my cheapo 1440p LCD still looks better for many uses.
 
"maximum frequency of the DAC is a limit"

What counts as the DAC's frequency limit? Unlike digital connections, there is no clear answer. The maximum frequency of a DAC is not a fixed, hard limit. A 400 MHz DAC will usually take at least 500 MHz, maybe even 600 MHz, before the signal starts getting noticeably degraded. And it isn't until you REALLY start pushing a lot of data through the DAC (talking AT LEAST double what it is rated for) that the degradation becomes significant enough to warrant using a lower resolution.
Digital pretty much works or doesn't. If you go over what the transmitter is rated for, you get an absolutely useless signal.
 
Still using DVI here. Heck, I still have VGA on the servers. I have slow DSL too, but I live out in BFE Ohio. I also have no need for HDMI, especially when DVI is perfectly functional.

I read a report a couple of years ago, and much (65%) of the USA is still on dialup and will likely remain on dialup. Providers will not spend the hundreds of millions it takes to bring service to a few hundred people. The village I live in is seasonal with a population of barely 800, and Time Warner will not run service out here (it ends juuuuuust outside of town), so all we have is slow dialup, DSL, or satellite.

People need to remember that just because you don't use something doesn't mean it isn't still in use by a bunch of other people. I know more people who have and use older stuff than any of the newer stuff.
 
thread needs to be bumped...

[image of the card]

^ this card just got revealed :)
 
I'm still using DVI but intend to upgrade to a DisplayPort UltraSharp with my next video card upgrade.
 
For me, I bought one of those cheap Korean 1440p monitors - getting HDMI meant having a scaler, which means much more processing/input lag. I'm not sure if that's a limitation of HDMI or just the way that monitor is built.
 