During the course of our research, we developed the following proofs of concept (PoCs):
A PoC that demonstrates the basic principles behind variant 1 in userspace on the tested Intel Haswell Xeon CPU, the AMD FX CPU, the AMD PRO CPU and an ARM Cortex A57 [2]. This PoC only tests for the ability...
Yeah, not sure I understand your reasoning. The black bars are annoying, but the back of the wall is not? Either way, any break in the screen kills immersion. The fact that you can accept the bezel and the back of the wall but not the black bars just means that you believe the bezel...
Man, can't run away from elitist attitudes. Sites that agree with MY critical assessments, they're good! Sites that disagree, they're shill sites.
My opinion is best...because science!
You've all said your piece. Now move your trollin' along. Go find these perfect 40" monitors that don't...
At the risk of bringing this topic back when I should let it die, this is my last statement in my own defense. I did not propagate any false information. PWM artifacts are one form of motion blur (not clarity), and the smearing of a DC backlight is another form of motion blur, which is...
So did you read the link I posted up there that shows the SCIENCE behind strobing to eliminate the "apparent" appearance of motion blur? Don't confuse your opinion with facts.
Do you own a Samsung or are you just trolling? The 6 series has bad PWM blur. I admit that, which is why I returned...
BOTH have motion artifacts due to the slow pixel response of LCD. Neither is better than the other. The improved backlight of the 7 series and up alleviates motion artifacts, as shown via low PWM. However, the smearing of a non-strobing backlight is not "better." They are both bad, but it's up to the...
What is motion blur? Any lack of motion clarity is motion blur. The smearing effect caused by non-PWM monitors is not motion clarity either. I interpreted them as the same thing, even if those aren't the exact words he used.
I'm open to the definition of what "motion clarity" is.
And you know what else is funny? He claims that PWM causes motion blur, which I don't believe is true. Motion blur is a result of LCD sample-and-hold technology and its response rate. Non-PWM has an awful smearing blur, whereas PWM has a different blur with less smear. PWM decreases the effect of...
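The sample-and-hold argument above can be put in rough numbers with the standard persistence model: eye-tracking blur width is about the motion speed times the fraction of each frame the image stays lit. This is a simplified sketch, and the speed and duty-cycle values are only illustrative, not measurements of any Samsung set:

```python
def blur_px(speed_px_per_frame, duty_cycle):
    """Approximate eye-tracking blur width in pixels.

    speed_px_per_frame: how far the object moves each refresh.
    duty_cycle: fraction of the frame the backlight is lit
                (1.0 = pure sample-and-hold, <1.0 = PWM/strobing).
    """
    return speed_px_per_frame * duty_cycle

# 960 px/s at 60 Hz = 16 px of travel per frame
speed = 960 / 60
print(blur_px(speed, 1.0))   # sample-and-hold: 16.0 px smear
print(blur_px(speed, 0.25))  # 25% duty strobe: 4.0 px
```

This is why a strobed or low-duty PWM backlight can look sharper in motion even though the panel's pixel response is unchanged: it shortens how long each frame is held on screen, not how fast the pixels switch.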
The sine waves are definitely different, with a series of intermittent steps to smooth them out. I believe you said the 7 series was not as bad as the 6 series, and there's a reason for that.
6 series PWM: http://i.rtings.com/images/reviews/ju6500/ju6500-backlight-large.jpg
7 series PWM...
I've noticed the latest firmware made the screen brighter, as if the gamma went up. I noticed it when playing games. I wonder if they messed with the backlight?
I sold my PS4. The grass isn't greener on the other side. Witcher 3 ran atrociously on it and stuttered like a mother. Not only that, but the graphics are turned down by default. On PC, you have an option, but you people bitch that your system can't run them at high, so you run to the PS4, which...
Believe it or not, I chose the Samsung TV over a bunch of 4K monitors due to its great 1080P. Most older games don't scale well past 1080P. I hook up my PS4 to it. Most monitors look awful at non-native resolution which was why I returned them all. Also, sometimes I need 1080P/60FPS for...
I play games on my 40" in 21:9 format all the time. It's basically a 34" 3440x1440P for when I feel like widescreen, and regular 4K 16:9 when I feel like normal aspect. I can also do 16:9 1440P in the middle of my giant screen whenever I feel like it. There's no need to pigeon-hole myself into...
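For what it's worth, the 21:9-window claim above is easy to check with a little geometry. Assuming square pixels and a 40" 3840x2160 panel (the only assumptions here), the physical diagonal of any centered pixel window scales with its pixel diagonal:

```python
import math

def region_diagonal(panel_diag_in, panel_w_px, panel_h_px, region_w_px, region_h_px):
    """Physical diagonal of a centered pixel region on a flat panel.

    Assumes square pixels, so physical size scales linearly with pixel counts.
    """
    ppi = math.hypot(panel_w_px, panel_h_px) / panel_diag_in
    return math.hypot(region_w_px, region_h_px) / ppi

# 21:9 letterbox on a 40" 4K panel: 3840 x (3840 * 9 / 21) ≈ 3840 x 1646
print(round(region_diagonal(40, 3840, 2160, 3840, 1646), 1))  # → 37.9
# centered 16:9 2560x1440 window
print(round(region_diagonal(40, 3840, 2160, 2560, 1440), 1))  # → 26.7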
I believe Patch 1213 is hosed. I remember the BIOS missing when I had my GTX 670. I also remember it kind of missing on/off with my first 980TI. It's probably because of the firmware. Samsung is obviously continuously improving their set as indicated by the ever improving input lag, and the BIOS...
I'm also saving for a Tesla model 3. You guys spend money on whatever floats your boat. However, the argument that you're not hard unless you spend 1300 on two video cards is beyond lame. If you're a graphics whore, you can turn ultra on everything and play at sub 60fps. Or you turn down...
Sorry, totally disagree with that analogy. Special effects is a totally different concept than AA, bloom, etc. AA was created to remove jaggies which are hardly noticeable at 4K. Vignette, etc. are not even close to movie special effects. Turning one off doesn't affect the quality much, but...
Disagree on the second 980TI. I play PS4 in 1080P and it looks pretty damn good as the scaler was built for 1080P source. 2560x1600 is not a native resolution. 1080P is. However, compared to 4K, 1080P will not look as good. Im playing Witcher 3 with adjusted settings and its playing 4K 60fps...
I went through MSI, Giggabyte, and Zotac. I have yet to find a card that can maintain 1500+ at 4K on the Witcher 3. Maybe my bad luck or maybe most of these 1500+ claims are on 1080P.