New Samsung 4K for everyone.

Did you get a chance to try PC mode with the 1219 firmware?

I have a second JS9000 headed my way; I love the first one so much that I decided to get another, possibly for a different room. If I update the firmware on that one I assume it will go to 1220, so I will run tests on it to see what kind of numbers I get.

Does anyone know if firmware for Samsung TVs can be updated via USB? If so, can I get firmware 1219 somewhere??

I find it odd that I post this and the next day Samsung has a firmware update out...

There are firmware downloads on Samsung's support website. I have never used them, though. There must be a way to flash them (presumably via USB), but I have never tried.

I also don't think they post EVERY release. Last time I looked there was just one file there, and it wasn't the most recent (I forget the details).
 
Did you get a chance to try PC mode with the 1219 firmware?

I used it extensively on my desktop, but never in a game or anything where input lag was of importance. I never really noticed it being more responsive, but that doesn't mean it wasn't. Input lag is one of those things you usually only notice if it is too high :p

I will have to check how PC mode feels in-game on 1220 when I get home.
 
HUGE NEWS!!!

Firmware 1219 on the JS9000 in PC mode has 0ms input lag!!!!



I just bought the 55" JS9000 yesterday and used my Leo Bodnar input lag tester on it with firmware 1209. Game mode got 22.8ms and PC mode got 42.2ms.

I then decided to update the TV's firmware to 1219 and retest the lag. Game mode stayed the same, but PC mode went to 0ms!!! This is with UHD Color on!!!

Before people say my lag tester is broken or something: it's not. I have tried this 30 times now to confirm, going back and forth between Game and PC mode.

I have tested about 40 TVs and about 15 PC monitors with this lag tester. This is the lowest lag result I have ever seen! The second lowest were the Samsung S24D590PL, which got 9.3ms on the middle bar, and the BenQ BL3201PT, which got 10.2ms.

In the end, even with the numbers in front of me, I was in shock, so I tried some of my most timing-sensitive games for input, like SF3: Third Strike and Burnout Paradise. To my amazement, these games felt like playing on a CRT when it comes to input responsiveness!!

Below is a picture of firmware 1219 in PC mode showing 0ms input lag, followed by a photo of Game mode and the lag it gets.


PC Mode UHD color on:



Game mode UHD color on:




Ready for SFV :)


Please let this be legit!

EDIT: I wish the Bodnar could test at 4K.
 
Could it be that anything below one frame registers as 0.0?

Possibly, but I have seen Leo Bodnar results under 16.67ms on 60Hz panels before, so my guess is probably not.
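
For reference, here is the quick frame-time math behind that 16.67ms figure (just a back-of-the-envelope sketch using the lag numbers quoted in this thread; it says nothing about how the Bodnar actually rounds its readings):

```python
# Frame-period math for a 60Hz panel, using lag figures from this thread.
refresh_hz = 60.0
frame_ms = 1000.0 / refresh_hz  # ~16.67 ms per refresh

for lag_ms in (0.0, 9.3, 10.2, 22.8, 42.2):
    print(f"{lag_ms:5.1f} ms ~= {lag_ms / frame_ms:.2f} frames at 60Hz")
```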

I can't imagine this is a real result. I suspect there is some test-method or equipment issue at play here. Something the TV is doing must somehow be confusing the Leo Bodnar unit.

Maybe asking the man himself would be pertinent here?

One thing that comes to mind: Samsung knows that people now use lag testers. Could they have figured out a way to detect one and enter a cheat mode when it is used?

(I hope this is not the case, but you never know, especially in the TV industry where all the players play fast and loose with the specs)
 
First the good news

http://www.amazon.com/gp/product/B00988GMLG

This upscaler actually works. I simply use a DVI-to-HDMI converter into the scaler, and an HDMI cable from the scaler to the TV.

Then the bad news

The PC and the scaler negotiate 1366x768 over HDMI and 1360x768 over DVI, both at 60Hz.

Using just the DVI cable directly into the TV results in "no signal" on the input again.

1360x768 at 60Hz is officially supported by the TV, but it only works once Windows has loaded. It is the protocol negotiation in the TV that is broken. The upscaler box negotiates just fine with the PC, even when not logged into Windows.

I will ask my retailer for a refund for the box I had to purchase to get this working. That is going to be interesting.

Setup

Luckily for this case, the DVI output initializes first on the 980 Ti. I need to change the TV input to the scaler to see the screen, and back again once Windows loads. I just switch between HDMI1 and HDMI2.

I do lose one HDMI input.

This is the 65" 7005. Firmware 1213.

Note that if you get this particular product from Amazon, there are no cables supplied. You need one DVI to HDMI cable, and one HDMI cable to get this to work. That is on top of the direct cable from the PC to the TV.

If you live outside the US, like I do, you will need to get your own 5V/1A power supply.

And I look forward to the next firmware update, as my TV is nowhere near that miraculous 0.0ms lag.
 
I am looking at the Samsung 48" JS8500. I have a 980 Ti, which will run some games at 4K 60fps, but I would probably need to run newer games at 1080p until a better GPU comes out... What is this TV's scaling like with 1080p? Does it look exactly the same as a native 48" 1080p TV would? Also, does scaling 1080p add input lag? Thanks
 
I am looking at the Samsung 48" JS8500. I have a 980 Ti, which will run some games at 4K 60fps, but I would probably need to run newer games at 1080p until a better GPU comes out... What is this TV's scaling like with 1080p? Does it look exactly the same as a native 48" 1080p TV would? Also, does scaling 1080p add input lag? Thanks

Honestly, anything other than 4K looks like crap. This set needs PC mode/4:4:4 or Game mode/4:2:2 with UHD Color at 4K to really appreciate it and get this TV looking its best. I tried scaling 2560x1600, my older Dell U3011's resolution, and it looks blown up and washed out. I would recommend getting another 980 Ti if you can, to enjoy this TV.
 
Honestly, anything other than 4K looks like crap. This set needs PC mode/4:4:4 or Game mode/4:2:2 with UHD Color at 4K to really appreciate it and get this TV looking its best. I tried scaling 2560x1600, my older Dell U3011's resolution, and it looks blown up and washed out. I would recommend getting another 980 Ti if you can, to enjoy this TV.

Disagree on the second 980 Ti. I play PS4 at 1080p and it looks pretty damn good, as the scaler was built for a 1080p source. 2560x1600 is not a native scaling resolution; 1080p is. However, compared to 4K, 1080p will not look as good. I'm playing Witcher 3 with adjusted settings and it plays at 4K 60fps fine. 4K at non-ultra settings looks much better than 1080p at ultra because of 4x the resolution. If you need 60fps at 4K, you can definitely do it. There is no rule that says you must play a game with everything at ultra. It's really difficult to tell the difference between ultra and non-ultra at 4K. Huge difference between 4K and 1080p, though.

I don't understand the ultra-everything-or-bust mindset. I remember when 30fps was all you could get with the latest and greatest video cards, and ultra or uber settings were for graphics cards two years in the future. Boy, have things changed.
 
I don't understand the ultra-everything-or-bust mindset. I remember when 30fps was all you could get with the latest and greatest video cards, and ultra or uber settings were for graphics cards two years in the future. Boy, have things changed.

For me the reasoning goes like this:

The artistic design of the game was intended to be displayed with all the bells and whistles. When you shut those off, you are not getting the experience as intended. You wouldn't put up with watching a movie if many of the special effects were disabled, so why be so quick to accept it in a game?

My expectation is that with all in-game settings maxed, 16x AF, and at least 4x MSAA (or FXAA/MLAA), I should get a fixed 60fps synced to the refresh and never drop below it.
 
Zarathustra[H];1041800182 said:
For me the reasoning goes like this:

The artistic design of the game was intended to be displayed with all the bells and whistles. When you shut those off, you are not getting the experience as intended. You wouldn't put up with watching a movie if many of the special effects were disabled, so why be so quick to accept it in a game?

My expectation is that with all in-game settings maxed, 16x AF, and at least 4x MSAA (or FXAA/MLAA), I should get a fixed 60fps synced to the refresh and never drop below it.

Sorry, I totally disagree with that analogy. Special effects are a totally different concept from AA, bloom, etc. AA was created to remove jaggies, which are hardly noticeable at 4K. Vignette and the like are not even close to movie special effects. Turning one off doesn't affect the quality much, but losing the special effects in a movie makes it unwatchable. HairWorks is the biggest quality difference you can make out with your eyes, yet turning it off barely makes a difference to me, other than increasing frame rates by 20%.

Different strokes, I guess. I have no problem spending more money on things that make a difference, but $700 more just so I can turn on those effects? I'll pass.

Edit: settings that hurt your fps but barely make a visual difference: shadows, grass, water, grass density, foliage, motion blur, etc. I've done comparisons, turning them up to ultra and back down, and I could not make out the difference, or it was very subtle.
 
AGREED. Something happened with firmware 1219 in PC mode that gives it 0.0ms. I can say that PC mode now feels much more responsive than Game mode, and Game mode is 22ms...

So I never tested this on 1219, but I just did some comparisons on 1220 by simply moving the mouse pointer rapidly around the desktop.

To me, Game mode still feels more responsive. I can't say by how much, but it is noticeable to me.
 
Zarathustra[H];1041772537 said:
Shift might be the wrong word.

The blacks become lighter around the edges. This may very well be because I sit only 2.5 feet from a 48" screen, so I have viewing-angle issues around the edges.

The weird thing is that they are much more noticeable in Game mode than in PC mode. I don't notice it in games at all, only on a plain black desktop, so it really isn't THAT noteworthy.

This is usually how - when I am looking at the desktop - I notice that I forgot to switch back to PC mode after ending a game session :p

So I figured out what was causing this: it was local dimming set to Standard. I must have turned it off in PC mode but forgotten to in Game mode.

I turned it off, and I like it much better.
 
Disagree on the second 980 Ti. I play PS4 at 1080p and it looks pretty damn good, as the scaler was built for a 1080p source. 2560x1600 is not a native scaling resolution; 1080p is. However, compared to 4K, 1080p will not look as good. I'm playing Witcher 3 with adjusted settings and it plays at 4K 60fps fine. 4K at non-ultra settings looks much better than 1080p at ultra because of 4x the resolution. If you need 60fps at 4K, you can definitely do it. There is no rule that says you must play a game with everything at ultra. It's really difficult to tell the difference between ultra and non-ultra at 4K. Huge difference between 4K and 1080p, though.

I don't understand the ultra-everything-or-bust mindset. I remember when 30fps was all you could get with the latest and greatest video cards, and ultra or uber settings were for graphics cards two years in the future. Boy, have things changed.

So, you spend 2 grand on a high-end 4K TV and at least $650 on a graphics card, only to play at lowered settings? Most of us [H] brethren are more the high-end, bleeding-edge type of people. I am an eye-candy whore, and it bugs me when I have to dial the settings back on a game I want to play. Call it OCD or narcissism, but I like what I like.


Zarathustra[H];1041800182 said:
For me the reasoning goes like this:

The artistic design of the game was intended to be displayed with all the bells and whistles. When you shut those off, you are not getting the experience as intended. You wouldn't put up with watching a movie if many of the special effects were disabled, so why be so quick to accept it in a game?

My expectation is that with all in-game settings maxed, 16x AF, and at least 4x MSAA (or FXAA/MLAA), I should get a fixed 60fps synced to the refresh and never drop below it.

Agreed... That's why it sucks that in Witcher 3, for whatever reason, I can't hit above 40 FPS in SLI. I heard it's a software issue, because whether I drop the settings and turn off HairWorks or crank up the settings, I stay at 40 FPS.
 
I'm also saving for a Tesla Model 3. You guys spend money on whatever floats your boat. However, the argument that you're not hard unless you spend $1,300 on two video cards is beyond lame. If you're a graphics whore, you can turn everything to ultra and play at sub-60fps. Or you turn down settings and play at 60fps. Or even turn everything on and play at 1080p at 60fps. But pushing dual 980 Tis as a prerequisite for 4K... nope, I ain't gonna buy it. Post pictures of 4K at ultra everything vs. 4K at high and tell me there are huge, gigantic differences. Did you all try it yourself, or are you just imagining that it looks awful?
 
Agreed... That's why it sucks that in Witcher 3, for whatever reason, I can't hit above 40 FPS in SLI. I heard it's a software issue, because whether I drop the settings and turn off HairWorks or crank up the settings, I stay at 40 FPS.

I've been having inconsistency issues in Red Orchestra as well.

I have side monitors, so I usually load up the GPU stats so I can view them during play. My framerate will frequently drop below 60fps while the GPUs are only loaded at around 55-60%... And there is no way it's the CPU either; it's clocked at 4.8GHz right now...

SLI may be better than CrossFire, but the best solution is still a single powerful GPU. Hopefully the next generation will bring one powerful enough for the games I want to play.
 
I'm also saving for a Tesla Model 3. You guys spend money on whatever floats your boat. However, the argument that you're not hard unless you spend $1,300 on two video cards is beyond lame. If you're a graphics whore, you can turn everything to ultra and play at sub-60fps. Or you turn down settings and play at 60fps. Or even turn everything on and play at 1080p at 60fps. But pushing dual 980 Tis as a prerequisite for 4K... nope, I ain't gonna buy it. Post pictures of 4K at ultra everything vs. 4K at high and tell me there are huge, gigantic differences. Did you all try it yourself, or are you just imagining that it looks awful?

No one said it wasn't "hard". Different people have different priorities, and that is fine. I just don't like spending a ton of money on half measures, so if I have already spent 2 grand on a monitor, I'm going to spend an additional $650 to get another GPU to make it shine.
 
I can't find a way to get into the BIOS.
Anyway, I intend to upgrade my motherboard / CPU / PSU.
Do you think that with a new motherboard / CPU / PSU I would have a chance, if there is a second (onboard) GPU?
 
I can't find a way to get into the BIOS.
Anyway, I intend to upgrade my motherboard / CPU / PSU.
Do you think that with a new motherboard / CPU / PSU I would have a chance, if there is a second (onboard) GPU?

I believe patch 1213 is hosed. I remember the BIOS screen going missing when I had my GTX 670, and I remember it going missing on and off with my first 980 Ti. It's probably because of the firmware. Samsung is obviously continuously improving this set, as indicated by the ever-improving input lag and the BIOS issue being largely resolved by 1219. If you have a secondary monitor, use it temporarily until they release 1219 for you.

If you want to upgrade your system, then go for it. However, I don't think that's going to resolve the issue if the firmware is the cause of the missing BIOS on HDMI.
 
I believe patch 1213 is hosed. I remember the BIOS screen going missing when I had my GTX 670, and I remember it going missing on and off with my first 980 Ti. It's probably because of the firmware. Samsung is obviously continuously improving this set, as indicated by the ever-improving input lag and the BIOS issue being largely resolved by 1219. If you have a secondary monitor, use it temporarily until they release 1219 for you.

If you want to upgrade your system, then go for it. However, I don't think that's going to resolve the issue if the firmware is the cause of the missing BIOS on HDMI.

One of the members at AVForums got a response from Samsung's tech support:

https://www.avforums.com/threads/samsung-uexxju7000-owners-thread.1947994/page-36#post-22499776

Samsung is aware of the issue so that's something.
 
Pumped to see all the reviews for all of these.

I'm still waiting for hardware to come out so I don't have to spend $600 on a GPU to run 4K.
 
...so I don't have to spend $600 on a GPU to run 4K.

You may need to wait a few generations for that.

Currently a single $650 980 Ti is not enough for anything but older or less graphically intense titles, or really low settings.

Keep in mind that I can't get acceptable frame rates at 4K in Metro 2033, a five-year-old game, even today with two 980 Tis in SLI.

I'd say the entry point for 4K gaming today is at least two 980 Tis, or $1,300 in GPUs. Some in here haven't even been happy with that, opting for a third.

Remember, based on pixel count alone we are talking about four times as much work as 1080p, and the work required to maintain frame rates at higher resolutions is rarely linear with pixel count; you usually need more power than a linear, pixel-count-based estimate would suggest.
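
Rough numbers, just to put the two resolutions side by side (a back-of-the-envelope sketch; the 60fps figure is only an assumed starting point, and real GPU load scales worse than linearly):

```python
# Pixel-count comparison: 4K vs 1080p (illustrative only).
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels
print(f"4K / 1080p pixel ratio: {res_4k / res_1080p:.1f}x")

# If a card manages 60 fps at 1080p, a naive linear estimate for 4K would be:
fps_1080p = 60
print(f"Naive 4K estimate: {fps_1080p * res_1080p / res_4k:.0f} fps (optimistic)")
```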

Add to that that multi-GPU solutions rarely scale well, so if you go beyond the ability of a single GPU, expect it to get expensive.

If you want to play new top-of-the-line titles at 4K with a single $350 GPU, I think we are a LONG way off.

I'm not hopeful for single-GPU 4K gaming in the next generation (Pascal). Top-end performance growth typically amounts to ~30% per generation, and I'm not sure 30% over a Titan X is going to cut it. The generation after that, a single GPU may be viable, but it will likely be the top-of-the-line $1,200 Titan, not the mid-range stuff. Give it another generation or two after that (so four generations from now) and we will probably be comfortable with mid-range GPUs at 4K.
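
To show how that ~30% compounds (purely illustrative; it assumes the 30% figure holds every generation, which it may not):

```python
# Compounding an assumed ~30% per-generation gain over today's top single GPU.
per_gen_gain = 1.30
for gen in range(1, 5):
    print(f"After {gen} generation(s): ~{per_gen_gain ** gen:.2f}x today's flagship")
# ~1.30x, ~1.69x, ~2.20x, ~2.86x: a few generations before even mid-range
# parts have that kind of headroom at 4K.
```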

That is - unless titles in the meantime get more demanding :p

This whole thing reminds me of when I first got my Dell 2405FPW back in early 2005. I had a 6800GT at the time, and it could not handle Half-Life 2 at 1920x1200. Eventually it made me tired of games I couldn't play at the screen's native resolution, so I just kind of stopped playing until 2010, when I got into games again...

And it also reminds me of when I got my 30" Dell U3011 and my video card struggled with 2560x1600..
 
Pumped to see all the reviews for all of these.

I'm still waiting for hardware to come out so I don't have to spend $600 on a GPU to run 4K.

I wish video card prices would go down, but I just don't see that happening. If you look at the 28nm graphics market, prices have climbed dramatically over the last three years despite improved yields and economies of scale. TSMC, which manufactures all the graphics cards (for now), also has Apple contracts that are far more lucrative than anything GPU-related. TSMC's focus is on making money, not making GPUs. On the plus side, CPUs are almost irrelevant now, so the money you used to spend on mobo/CPU upgrades can go into more video cards. Give it another two years and the CPU will be a total afterthought, unless developers start utilizing the onboard GPUs to great effect. I expect the 14/16nm generation of GPUs to be even more expensive. The hardware requirements on the GPU side for things like the Oculus Rift are extremely steep (the minimum is an R9 290 / GTX 970).

It blows my mind that I bought my Sandy Bridge for $220 Canadian in January 2011, paid $80 for RAM and $180 for the motherboard, and at 4.4GHz it's still plenty to drive even a GTX 980 Ti. Would spending $800 on a new Skylake setup yield more 4K performance than adding another 980 Ti? HELL NO, not even close. At most I might get a couple more fps in a few titles.
 
Pumped to see all the reviews for all of these.

I'm still waiting for hardware to come out so I don't have to spend $600 on a GPU to run 4K.

Just like when 2560x1600 was the top resolution and needed two GPUs to run anything playable. Any pixel count above 2560x1600 will require a top-end card; there's no way around it.
 
I have a question, guys: has every firmware update on the Samsung 4K TVs reduced the input lag?
Thank you
 
I have a question, guys: has every firmware update on the Samsung 4K TVs reduced the input lag?
Thank you

I haven't noticed much difference in the last few, at least not in Game mode.

PC mode does seem a little better as of late, but it is tough to tell.

When it has changed, though, all recent changes have been positive. Not since the very early firmwares on some 6xxx models has input lag worsened after a patch. (And even on those, the most recent firmwares are better.)
 
I have a question, guys: has every firmware update on the Samsung 4K TVs reduced the input lag?
Thank you

Typically, no. You have a much better chance of reduced lag by buying newer models that are designed with less lag. 4K TVs truly optimized for gaming have not yet been made, but the industry is slowly moving that way. That's why many have opted to wait when it comes to 4K.
 
Am I the only one here who's happy gaming at 4K with just a GTX 960!?!?!

Seriously though, I'm able to run Diablo 3, Fallout 3, New Vegas, Skyrim, Arkham City, Deus Ex 3, and World of Warcraft all at 4K, even with a bit of AA, with just a GTX 960 and a four-year-old computer (a Q9550 OC'd to 4GHz with 4 gigs of RAM). I imagine newer games would need some more horsepower, but I would think one GTX 980 Ti would be sufficient.
 
It all depends on the performance that you're willing to live with and what level of gfx detail you're willing to live without. Some people are only content with a constant 60fps and you're definitely not going to get that in the newer demanding games with a single card, be it a GTX 960 or a Titan X.

Most of the games that you listed are older, so they probably run decently at 4K on a 960. Certainly playable, though I wonder what kind of performance you're seeing. Not that it really matters, as long as it looks and feels good to you. I know I won't maintain a solid 60fps with my 980 SLI in every title, but I haven't loaded up a game that made me feel like the performance was lacking, either.
 
Zarathustra[H];1041801045 said:
I'd say the entry point for 4K gaming today is at least two 980 Tis, or $1,300 in GPUs. Some in here haven't even been happy with that, opting for a third.

I disagree with that statement.
Metro is not a good benchmark; the game was made to make everything cry.
There's no way I'm going to use that game to justify getting two 980 Tis.

99.9% of games today run fine at 4K with everything on ultra, even The Witcher 3 and GTA V (yes, I own both).
Not at 60fps on ultra, but most people don't actually need 60fps. Those who absolutely need 60 (or 120, or 144) fps in a game are a minority.
You're saying 60fps or bust, but you're saying that because your TV can't do more than 60Hz. If our Samsungs supported a 120Hz refresh rate, you'd jump on the 120fps bandwagon, which kind of makes the point moot.

I'm fine with 40fps in many games; I'm currently running both of those games at about 40-45fps on a single 980 Ti.
 
Right now I have an older CPU: an Intel i7 920 @ 2.67GHz, 8GB DDR3, and a GTX 980 Ti.
Do you think the CPU bottlenecks the GPU? I do not like to overclock; I want to upgrade the CPU + mobo + PSU soon.
And one more question: do you guys use HDMI 2.0 cables for 4K?
Are there any advantages vs. HDMI 1.4 cables at 4K?
 
I second Nebell.

980 Ti SLI is for hardcore gamers. The Samsung line of TVs is not for hardcore gamers. There is lag, 60Hz only, HDMI only, and no G-Sync or FreeSync either. The one thing these TVs do have is great size.

There are trade-offs for that size with these TVs. If you can live with them, a single 980 Ti strikes a great balance against those trade-offs, if you need high-end gaming, that is. If not, a 960, 970, or 980 will fit your needs as well. After all, the 960 does just fine gaming at 1080p.

Also, use 980 Ti SLI with these TVs and you start pumping out a lot of heat. A heat bomb of a PC is not the best of moves for this kind of usage, not in a living room.

My TV typically consumes 115W, while the DAC sits at a constant 70W. The power amplifier maxes out at 400W, hopefully a lot less typically. The list goes on. The PC, for typical desktop work, strolls along below 140W, even overclocked to 4.5GHz.

Add everything up, throw 980 Ti SLI in there, and it easily ends up closing in on 1000W while gaming. An overclocked 980 Ti offers minimum FPS very close to stock 980 Ti SLI, while keeping heat down.
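
For what it's worth, here is roughly how I get to that ~1000W figure (a rough tally; the SLI gaming-load number is my own ballpark, not a measurement):

```python
# Approximate living-room power tally while gaming (assumed/rounded figures).
watts = {
    "TV (typical)":                          115,
    "DAC (constant)":                         70,
    "Power amplifier (worst case)":          400,
    "PC gaming with 980 Ti SLI (assumed)":   450,  # CPU plus two ~200-250W cards
}
print(f"Approximate total: {sum(watts.values())} W")  # right around 1000 W
```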
 
... And one more question: do you guys use HDMI 2.0 cables for 4K?
Are there any advantages vs. HDMI 1.4 cables at 4K?

I use the AudioQuest Cinnamon cable at 1m. I would not use anything worse than that. Using a cable not designed for, or not fully supporting, 2.0 seems strange if you plan to push HDMI 2.0 to the very limit, as 4K/4:4:4/60Hz does.
 
Okay, Nilsen, and what are the advantages of this AudioQuest Cinnamon cable vs. some regular HDMI 1.4 cables?
Where do you see improvements?
 
Right now I have an older CPU: an Intel i7 920 @ 2.67GHz, 8GB DDR3, and a GTX 980 Ti.
Do you think the CPU bottlenecks the GPU? I do not like to overclock; I want to upgrade the CPU + mobo + PSU soon.
And one more question: do you guys use HDMI 2.0 cables for 4K?
Are there any advantages vs. HDMI 1.4 cables at 4K?

There is no such thing as an HDMI 2.0 cable. You need Category 2 ("High Speed") cables; if the cable is within spec, it should work.
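
As a rough illustration of why the cable matters at 4K/60 (back-of-the-envelope only; 10.2 Gbps and 18 Gbps are the commonly quoted HDMI 1.4 and 2.0 link-rate ceilings):

```python
# Approximate bandwidth for 4K / 60Hz / 8-bit RGB (4:4:4), active pixels only.
width, height, fps, bits_per_pixel = 3840, 2160, 60, 24
active_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Active pixel data alone: ~{active_gbps:.1f} Gbps")  # ~11.9 Gbps

# Blanking intervals and TMDS 8b/10b encoding push the on-wire rate to roughly
# 17.8 Gbps for the standard 4K60 timing, well past HDMI 1.4's ~10.2 Gbps limit
# and right up against HDMI 2.0's 18 Gbps, hence the need for an in-spec
# Category 2 (High Speed) cable.
```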
 
Right now I have an older CPU: an Intel i7 920 @ 2.67GHz, 8GB DDR3, and a GTX 980 Ti.
Do you think the CPU bottlenecks the GPU? I do not like to overclock; I want to upgrade the CPU + mobo + PSU soon.
And one more question: do you guys use HDMI 2.0 cables for 4K?
Are there any advantages vs. HDMI 1.4 cables at 4K?

Have you looked into a Westmere Xeon? Roughly 100 bucks (plus making the overclocking jump), and you'll find that Skylake won't be that much of an upgrade.
 
Have you looked into a Westmere Xeon? Roughly 100 bucks (plus making the overclocking jump), and you'll find that Skylake won't be that much of an upgrade.

Or find an EVGA SR-2 mobo, put two of these X5675s in it, and wait another six years until Intel releases something with 12 cores / 24 threads running at 4.2 to 4.4GHz :D
 
Have you looked into a Westmere Xeon? Roughly 100 bucks (plus making the overclocking jump), and you'll find that Skylake won't be that much of an upgrade.

It would certainly be an upgrade, but Westmere cores have a pretty big IPC deficit compared to Skylake.

It hasn't been much from generation to generation, but the single-digit percentages add up over multiple generations.

I would agree that as a "hold me over" strategy $100 isn't a bad investment, but a Westmere, even at 4.4GHz, will still be significantly slower than a Skylake at 4.7GHz.

I'd say the Skylake would be as much as 75% faster in some benchmarks.
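
Just to show how those single-digit gains compound (illustrative only; the ~10% per generation is an assumed average, not a measured figure):

```python
# Compounding assumed per-generation IPC gains from Westmere to Skylake
# (Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake = 5 steps).
ipc_gain_per_gen = 1.10                  # assumption for illustration
ipc_ratio = ipc_gain_per_gen ** 5        # ~1.61x IPC
clock_ratio = 4.7 / 4.4                  # the clocks mentioned above
advantage = ipc_ratio * clock_ratio - 1
print(f"Estimated Skylake advantage: ~{advantage * 100:.0f}%")  # ~72%, in the same ballpark as 75%
```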
 
Am I the only one here who's happy gaming at 4K with just a GTX 960!?!?!

Seriously though, I'm able to run Diablo 3, Fallout 3, New Vegas, Skyrim, Arkham City, Deus Ex 3, and World of Warcraft all at 4K, even with a bit of AA, with just a GTX 960 and a four-year-old computer (a Q9550 OC'd to 4GHz with 4 gigs of RAM). I imagine newer games would need some more horsepower, but I would think one GTX 980 Ti would be sufficient.

How do the dialogue and text in Fallout 3 / New Vegas / Skyrim look in Game mode?
 