Resolution problem w/ DVI>HDMI cable

ne0-reloaded
[H]ard|Gawd
Joined: Jul 1, 2003
Messages: 1,216
I just bought a DVI>HDMI cable for my Samsung LN-S3251D LCD, and I'm having some problems with the resolution. I was using a VGA cable before, but I thought a digital connection would provide a better picture. With the new cable plugged in, 800x600 was the only resolution that gave me the most viewable space (even though there's about 1.5" of black space surrounding it). With the VGA cable, the entire screen was filled no matter what resolution I used. Also, using my normal resolution of 1360x768 or 1280x720 makes text unreadable and makes the actual viewing area noticeably smaller. 1280x1024 fills up more than the entire screen, but it puts lines on the display (it looks interlaced). Is there any way to make the display run at 1280x720 or 1366x768 and have it fill the entire screen?

Here's my setup if anyone needs to know:
9800 Pro
Catalyst 6.3 with CCC
MCE 2005 SP2

Thanks
 
I am outputting from a 9800 Pro to an LCD projection TV. Somewhere in the resolution settings of the CCC is an area where you can create a custom resolution; that is where I solved my resolution problems. I am at work right now, so I can't check the actual details. I also played a little with the overscan settings on the TV itself.
 
How are you connecting your PC to your display? I've found out that my problem may be because I'm connecting over HDMI. HDMI only accepts HD resolutions; since 1366x768 isn't HD, the HDMI input scales it down to 1280x720. If it were a DVI connection, there'd be no problem. The only thing I'm confused about is why the viewable area was reduced so much while using 1280x720. The scaling I can understand, but why the reduction in size?
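If the set really does resample a 1366x768 signal down to 1280x720, the scale factors involved are non-integer, which would explain the blurry text. A quick sanity check of the ratios (the two resolutions are from the posts above; the arithmetic itself is just illustrative):

```python
# Scale factors if a 1366x768 desktop is resampled to 1280x720 by the TV.
src_w, src_h = 1366, 768   # panel / desktop resolution from the thread
dst_w, dst_h = 1280, 720   # HD mode the HDMI input falls back to

sx = dst_w / src_w
sy = dst_h / src_h

print(f"horizontal scale: {sx:.4f}")  # roughly 0.937, non-integer
print(f"vertical scale:   {sy:.4f}")  # 0.9375 exactly
# Neither factor is 1.0, so every pixel gets resampled, which smears
# single-pixel features like small text.
```

Since both factors are just under 1, every desktop pixel is blended across neighboring TV pixels rather than mapped 1:1, which is why text suffers the most.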
 
I currently use component, as I was having some problems with the DVI out on the 9800. With a 6600GT, DVI to HDMI worked just fine. I have to run a custom resolution regardless of DVI or component. It is something like 11-something by 8-something, just under the 1280x720 standard. I am not sure why it shrinks it down so much for you. I just kept tweaking until I got something that filled the screen properly. Movies and the desktop look great now.
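For what it's worth, a custom resolution that lands "just under" the standard mode is usually compensating for the TV's overscan. A rough way to generate candidates is to shrink the base mode by an assumed overscan percentage; the 2-8% range and the rounding down to multiples of 8 (many cards only accept custom modes on 8-pixel boundaries) are my assumptions here, not something from the thread:

```python
def overscan_mode(width, height, overscan_pct, step=8):
    """Shrink a base mode so it fits inside a display that crops
    overscan_pct percent off each axis, rounding each dimension down
    to a multiple of `step`."""
    scale = 1 - overscan_pct / 100
    w = int(width * scale) // step * step
    h = int(height * scale) // step * step
    return w, h

# Candidate custom modes starting from the 1280x720 standard:
for pct in (2, 4, 6, 8):
    w, h = overscan_mode(1280, 720, pct)
    print(f"{pct}% overscan -> {w}x{h}")
```

Around the higher overscan percentages this lands in the same 11-something-wide ballpark as the custom modes mentioned above, which fits the "keep tweaking until it fills the screen" approach.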
 
I toyed around in the CCC and found a resolution that was better. It's 11-something by 7-something; it fills up the majority of the screen, but there are still some black bars around the edges. I'd still like to use 1360x768, since that's closest to the display's native resolution, but it just won't work over HDMI. I'm not sure what your display is, but could you see if you can do 1360x768 on either card? I don't want to go back to VGA, but that's the closest to 1:1 pixel mapping I can get.

Why are you using the 9800 over the 6600? From what I've been reading, the 6-series NVIDIA cards with PureVideo provide the best quality for DVD playback, as well as improved scaling. I was going to buy a 6600, but if it's not going to look any better, I think I'll stay with the ATI card.
 
Due to some budgetary concerns, I actually sold my newer rig based on the 6600GT to a coworker who wanted a mid-range gaming rig. I wanted to sell off my old rig and keep the new one, of course, but I couldn't sell him an old AGP system, particularly since he wants to upgrade in a year. So unfortunately I can't tinker any to help you out. I do know that the 6600 was much easier to set up and use than the 9800. Have you gone into the TV menu to try to reduce the black border with the over/underscan settings?

The 6600 looked great, but so does my 9800. Since I didn't do a side-by-side comparison, I can't say whether one looked better than the other. I use the PureVideo decoder for DVDs, and I am looking to start using ffdshow to further tidy things up. To be honest, things look so good right now that I don't really feel the need to keep tinkering. I am focusing on streamlining everything so that my wife can use it more easily.

Sorry I can't be more help.
 