LG 48CX

Big Navi is getting unveiled just 1 week from now. If I still can't get a 3080 by the time it launches then I'll just go team red and test out FreeSync lol. Maybe it won't have any stuttering issues, who knows.
 
Big Navi is getting unveiled just 1 week from now. If I still can't get a 3080 by the time it launches then I'll just go team red and test out FreeSync lol. Maybe it won't have any stuttering issues, who knows.

That would be really funny. But honestly I would love to see it after the awful Ampere launch.
 
V-sync can't, to my knowledge, do large-range VRR on its own, no. If it could, very few people would ever have bought a G-Sync monitor.
But I tried disabling V-sync, and that changes the mode from VRR to Fixed and hides the VRR option in Windows (and to even turn VRR back on again, G-Sync needs to be turned on first). So V-sync is either a dependency, or by some strange magic the Windows VRR toggle keeps G-Sync alive after it's disabled, but without glitching.

Further things that point to it truly working:
*BFI (TruMotion) is grayed out; if it were just V-sync, it should be possible to enable.
*Also consider what's more likely: that an option called VRR does VRR, or that NVIDIA stealth-added VRR to V-sync without mentioning it?
*The green button spam says VRR
V-sync (in general) has no frame rate or refresh rate limits. It isn't dependent on or limited by VRR in any way.

V-sync matches the GPU to the monitor. It will only output full frames, thus no tearing. But if a fully rendered frame isn't ready in time for the next refresh, it will display a duplicate. Thus stutter, and input lag.

G-Sync matches the monitor to the GPU. As soon as a fully formed frame is ready, it displays it. Stutter is greatly reduced (only frametime stutter remains), as is input lag. That's why people pay for it.

You want V-sync on when you use G-Sync to eliminate tearing when the frame rate exceeds the VRR range. But you can always leave it off. They're separate settings, not dependent on each other. They enhance each other but don't rely on each other.
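A toy example might make the difference concrete. This is just my own simplified Python sketch (it ignores render queues, pre-rendered frames and the VRR range floor, and pretends the GPU renders one frame at a time), but it shows how one slow frame forces a repeat on a fixed 120Hz refresh, while VRR simply shows it the moment it's done:

Code:
import math

REFRESH_HZ = 120
REFRESH_INTERVAL = 1000.0 / REFRESH_HZ   # ~8.33 ms between fixed refreshes

# Made-up GPU render times in ms; the third frame is a slow one
render_times = [7.0, 7.5, 12.0, 7.2, 7.1]

def vsync_present(times):
    """Fixed refresh + V-sync: a finished frame waits for the next refresh tick,
    and each new frame can only appear a tick (or more) after the previous one.
    A missed tick means the old frame is repeated = stutter + added lag."""
    shown = []
    t = 0.0
    last_tick = 0
    for rt in times:
        t += rt                                        # frame finishes rendering here
        tick = max(math.floor(t / REFRESH_INTERVAL) + 1, last_tick + 1)
        shown.append(round(tick * REFRESH_INTERVAL, 2))
        last_tick = tick
    return shown

def vrr_present(times):
    """VRR (G-Sync/FreeSync style): the panel refreshes as soon as a frame is done."""
    shown, t = [], 0.0
    for rt in times:
        t += rt
        shown.append(round(t, 2))
    return shown

print("V-sync present times (ms):", vsync_present(render_times))
print("VRR    present times (ms):", vrr_present(render_times))

With the fixed refresh the gaps between frames come out as 8.3 / 16.7 / 8.3 / 8.3 ms (one doubled interval = a visible hitch), while with VRR the gaps are just the render times themselves.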

I think the most likely option is that Windows VRR does exactly what Microsoft says it does. They gave it that name when they released it and clearly outlined its purpose. If the name is misleading, then it's a bad name. But regardless of the label they gave the toggle, it toggles what they said it does.
 
In regard to the Linus video: I scrubbed through it and only found a brief mention of 200Hz while he was actually messing with the monitor on his desktop, plus a few subjective words on how smooth he felt it was. I just want to remind people that the desktop is pretty much always at max fps, e.g. 200fps on a 200Hz monitor. On a 4k screen with very-high-to-ultra settings in demanding games, let alone adding things like RTX and HairWorks, mods, etc., you are probably going to get much lower frame rate averages, let alone minimums. You can dial some settings in/down, but some games will be lucky to hit 100fps average.

For example

Horizon Zero Dawn ultimate quality 4k
RTX 3090 ~ (59fps) - 79 fps average - (..99?)
RTX 3080 ~ (50fps) - 73 fps average - (..93?)

Assassin's Creed Odyssey Ultra High DX11, TAA
RTX 3090 ~ 94 fps average (prob something like 74 <<< 94 >>> 114)
RTX 3080 ~ 87 fps average

Metro Exodus Ultra, DX12, TAA
RTX 3090 ~ 89 fps average (prob something like 69 <<< 89 >>> 109)
RTX 3080 ~ 76 fps average

Dirt Rally 2.0
RTX 3090 ~ 80 fps average (prob something like 60 <<< 80 >>> 100)
RTX 3080 ~ 70 fps average

Borderlands 3 Bad Ass, DX12, TAA
RTX 3090 ~ 76 fps average (prob something like 56 <<< 76 >>> 96)
RTX 3080 ~ 65 fps average


For reference....

25ms / 16.6ms <<< 13.3ms >>> 11.1ms / 9.52ms

at 40fps / 60fps <<< 75fps >>> 90fps / 105fps
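(Those frame times are just 1000 divided by the frame rate; here's a quick Python check if anyone wants to plug in their own numbers:)

Code:
# Frame time in milliseconds = 1000 / frames per second
for fps in (40, 60, 75, 90, 105, 117, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")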


So on most of the heavy-hitting games you aren't even hitting 120fpsHz average, so nowhere near 120fpsHz solid... (let alone 200fpsHz or higher on high-rez monitors capable of that).
If you dial your settings in (down) to get 100fps average you'd be getting around
(70) 85fpsHz <<< 100fpsHz >>> 115fpsHz (117 capped).

DLSS could help with some games, but then people will be tempted to turn on RTX/raytracing for even more eye candy, so they still aren't likely to be getting or maintaining 120fpsHz. That's the general case; there are some games with much higher frame rates of course, but they are the exception (Doom Eternal: 3090 = 204fps, 3080 = 177fps).

I have high hopes for Cyberpunk with DLSS. I'll turn raytracing off in order to get around 100fps average if I find I have to, though.
 
Those numbers for AC Odyssey are too high (I have clocked a lot of hours on that game). It's about 60fps average (70 for the 3090) at 4k on ultra. Some places in the game run a bit better of course but some places will also have you dropping lower. The benchmark on that game is actually good and representative of real gameplay.

But yeah, even with some settings lowered, VRR is critical for a good 4k experience - and 120Hz is plenty for the time being. Still wouldn't mind a super high Hz mode for older games or competitive games I play on the lowest settings. Like Diabotical (arena shooter), which I'm playing at 250fps on the CX right now with my 1080 Ti.
 
120Hz is definitely plenty for now. In a few more GPU + CPU cycles we will have hardware that's capable of doing over 120fps at 4k and hopefully LG will have 240Hz OLEDs by then.
 
G-SYNC Off + Windows VRR On + frame rate limit just results in Vsync stuttering. It's smooth only at a locked 120 fps. The TV reports Fixed mode instead of VRR mode, as expected. The Windows VRR setting is just a compatibility setting.

NVIDIA supports non-validated FreeSync monitors (VESA Adaptive Sync) over DisplayPort which doesn't use their database of timings. Unchecking Display Specific settings is probably taking a similar code path for HDMI VRR.
 
Stupid questions: I just went ahead and ordered this 48in monitor... I currently have a GTX 1080 Ti. What's the best I can get out of this with G-Sync? 1440p 120Hz? Also, can I use my existing HDMI 2.0 cables until I am able to upgrade to an RTX 3090 + HDMI 2.1 cable? Thanks
 
Stupid questions: I just went ahead and ordered this 48in monitor... I currently have a GTX 1080 Ti. What's the best I can get out of this with G-Sync? 1440p 120Hz? Also, can I use my existing HDMI 2.0 cables until I am able to upgrade to an RTX 3090 + HDMI 2.1 cable? Thanks

The bad news: you won't be able to utilize G-Sync/VRR as it's not supported on that series.

The good news: you WILL be able to run full 4K at 120Hz...but not at 4:4:4 without a GPU that supports HDMI 2.1. I'm currently running 4K/120Hz at 4:2:0 on my 1080Ti, but I find it perfectly acceptable for anything except small red text which is extremely distorted. But it's rare for me to encounter red text, so I'm dealing with it until I can get a 3080. I could also get around it by running 4:4:4 at a lower resolution or refresh rate, but it's not a big enough problem for me to bother switching.
 
Stupid questions: I just went ahead and ordered this 48in monitor... I currently have a GTX 1080 Ti. What's the best I can get out of this with G-Sync? 1440p 120Hz? Also, can I use my existing HDMI 2.0 cables until I am able to upgrade to an RTX 3090 + HDMI 2.1 cable? Thanks
Unfortunately no G-Sync support on the 1080 Ti with this TV; that requires a 20-series card or newer. Highest you can get is 4K 120Hz at 4:2:0. Yes, you can use your HDMI 2.0 cables.
 
Those numbers for AC Odyssey are too high (I have clocked a lot of hours on that game). It's about 60fps average (70 for the 3090) at 4k on ultra. Some places in the game run a bit better of course but some places will also have you dropping lower. The benchmark on that game is actually good and representative of real gameplay.

But yeah, even with some settings lowered, VRR is critical for a good 4k experience - and 120Hz is plenty for the time being. Still wouldn't mind a super high Hz mode for older games or competitive games I play on the lowest settings. Like Diabotical (arena shooter), which I'm playing at 250fps on the CX right now with my 1080 Ti.

I was just going from review sites, but yes, that is even further away from 120fpsHz solid (8.3ms) or 117 capped (8.55ms).

From what I looked up, the tick rate of Diabotical is 125, which is around 8ms per tick, or 8ms between each frame of updated game world delivered. There is more to the prediction code than that, and it also depends on other players' own server/tick-rate relationships, but in general: while you could experience higher frame rates locally, say a solid 200fps on a 200Hz monitor (5ms per frame) or a solid 240fps on a 240Hz monitor (4.16ms per frame), unless you were playing on a LAN vs other players on the same LAN (and the game allowed higher local server tick rates), you wouldn't be seeing any newer game world states any sooner than the server's 8ms world-state slices deliver them, including your opponents' and teammates' actions and location states among other things. You'd still get more blur reduction and motion definition locally, which is appreciable even just from an aesthetic viewpoint. Whether it would provide any actual scoring benefit is debatable, since that tick rate isn't even factoring in your ping time (+ ??ms) and using interp_2 (tick rate in ms x2) to avoid the 250ms penalty on any packet loss.

For example, a good server game with 128-tick servers and interp ratio 2 (to avoid huge 250ms hits on missed packets) would have 15.6ms interpolation + 25ms to 40ms (your ping). So say 41ms to 56ms just for your own actions, not counting lag compensation between other players. Let's say 56ms for now on the higher 128-tick servers (though most games run much longer ticks). 56ms is 6.6 frames of time on a 120Hz monitor at 117fps solid (8.5ms per frame). So you aren't seeing new world updates for 6 or 7 frames at a time, maybe worse depending on how they line up with your next local (8.5ms) frame draw.

On a more traditional 64-tick, 22-tick, or 12-tick online game the numbers go up by a lot:

128 tick at interp_2 = 15ms + (25 - 40ms) ping = 40ms to 55ms ~~~> 5 to 7 frames before new action/world state data is shown (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms) ping = 186ms to 206ms ~~~> 22 to 24 frames

If you set interp_1 then the tick interpolation time would be halved (minus 1 frame, 2 frames, 5 frames, and 10 frames respectively), but any lost packet at all would hit you with a 250ms delay, which at 8.5ms per frame is about 29 frames.

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

For reference, tick rates of some common online games:

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick
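If anyone wants to play with these numbers themselves, here's the rough back-of-the-napkin Python I'm using. Same assumptions as the table above (interpolation buffer = interp ratio x tick time, your ping added on top, a locked 117fps locally); it ignores the fancier lag compensation tricks, so treat it as an estimate, not gospel:

Code:
# Rough estimate of how many local frames old the server's world state is
# by the time you can react to it. Assumptions: delay = interp_ratio ticks
# of interpolation buffer + your ping; local frame rate locked at 117 fps.

FRAME_MS = 1000 / 117            # ~8.55 ms per frame at a 117 fps cap

def frames_behind(tick_rate, ping_ms, interp_ratio=2, frame_ms=FRAME_MS):
    tick_ms = 1000 / tick_rate                   # time between server updates
    delay_ms = interp_ratio * tick_ms + ping_ms  # interpolation buffer + ping
    return delay_ms, delay_ms / frame_ms

for tick, ping in [(128, 25), (128, 40), (64, 40), (22, 40), (12, 40)]:
    delay, frames = frames_behind(tick, ping)
    print(f"{tick:>3} tick, {ping} ms ping -> ~{delay:.0f} ms = ~{frames:.0f} local frames behind")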
 
I don't think 250fps helps me much and I'm no professional anyway, but I can't use G-Sync on the CX yet because of the 1080 Ti - so the next best thing is super low latency and good smoothness + nearly invisible tearing.

If I could use G-Sync, I'd be playing at a locked 117fps + G-Sync + V-sync any day. I am totally addicted to the perfectly smooth and tear-free experience of VRR; it's a night and day difference to my eyes (but I also adore OLED picture quality). Been using G-Sync since 2014 and this is the first time since then that I find myself gaming without it. It's rough, you have no idea how much this botched Ampere launch hurt me lol

Anyway I was just giving an example of a potential use case for 240hz+ on a 4k display. And heck, for that type of game it could even be limited to 1080p I wouldn't mind. But yeah, I'm only asking for flawless OLED 4k 120hz VRR at this time. That will content me for a long while.
 
This is a TV (LG CX 48). I wholly agree about the dearth of monitors in that size range though.
Pricing is why... we're already in a place where the 55" was $1200 recently. Once you go lower the profits just dry up. I do have the Gigabyte Aero OLED these days so computer monitor usage is coming along. I do have the 55" CX and just sit further away than I normally would.
 
I'm all for high Hz. I'm hoping that along with AI upscaling, a very low latency interpolation will eventually be developed to multiply frames x3, x10, etc. That way the very high Hz screens won't just be a number most games can't reach or maintain. Once you get a decent 100fps frame rate (even using DLSS/AI upscaling), with some future low-latency and artifact-free interpolation you could theoretically multiply that x3 to 300fps. 300fpsHz would have greatly reduced blur (even if the motion definition stayed near 100fps). 100fps x10 to hit 1000fps on a 1000Hz monitor is the longer-term goal, because at that point you'd be at a 1ms / 1px blur like a graphics-professional CRT (FW900)... which is essentially "zero" blur.

This kind of blur, charted below, would happen when moving the entire viewport at speed. With slower mouselooking, slight movement, slower turning speeds, etc. it wouldn't necessarily exhibit as much blur. At 120fpsHz (solid) it's more of a vibration blur "within the lines". The more you sink toward 60fpsHz, the more it becomes a smearing blur outside of the lines.

[Blur Busters chart: display motion blur vs refresh rate - www.blurbusters.com]
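The rule of thumb behind that chart, as I understand it from Blur Busters: on a sample-and-hold display, while your eye tracks a moving object, blur is roughly the panning speed multiplied by the frame persistence, so at full persistence (no BFI/strobing, fps matching Hz) it works out to panning speed divided by refresh rate. A quick Python version with an assumed 1000 px/sec pan, just to show the scaling:

Code:
# Rule-of-thumb sample-and-hold motion blur while eye-tracking a moving object:
#   blur (px) ~= panning speed (px/s) * persistence (s)
# Assumes full persistence (no BFI/strobing) and frame rate matching refresh rate.

PAN_SPEED_PX_S = 1000   # assumed panning speed, purely for illustration

for hz in (60, 120, 240, 1000):
    persistence_ms = 1000 / hz
    blur_px = PAN_SPEED_PX_S * persistence_ms / 1000
    print(f"{hz:>4} Hz -> {persistence_ms:5.2f} ms persistence -> ~{blur_px:.1f} px of blur at {PAN_SPEED_PX_S} px/s")

That's where the "1000Hz is basically 1px / CRT-like" idea comes from.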

For now, on older and less demanding games I'd still appreciate higher-than-120 Hz a lot aesthetically, considering the blur reduction. I just don't agree with the competitive-edge margins some people claim, especially when it comes to online games with the whole chain of tick rate, ping times, server prediction code, etc. between all players and the server... and even more so for games with a lot of demanding graphic detail, where you are perhaps dialing in settings plus relying on VRR to ride a roller coaster of frame rates. Hopefully that would mean keeping the middle and high range nearer to 100 - 117fps on a 120Hz screen to utilize the higher Hz, but you still wouldn't be at 117fps SOLID across 2/3 of the graph. Even that is a challenge to achieve in some games, so an over-120Hz screen is irrelevant for those games.
 
I was just going from review sites, but yes, that is even further away from 120fpsHz solid (8.3ms) or 117 capped (8.55ms).

From what I looked up, the tick rate of Diabotical is 125, which is around 8ms per tick, or 8ms between each frame of updated game world delivered. There is more to the prediction code than that, and it also depends on other players' own server/tick-rate relationships, but in general: while you could experience higher frame rates locally, say a solid 200fps on a 200Hz monitor (5ms per frame) or a solid 240fps on a 240Hz monitor (4.16ms per frame), unless you were playing on a LAN vs other players on the same LAN (and the game allowed higher local server tick rates), you wouldn't be seeing any newer game world states any sooner than the server's 8ms world-state slices deliver them, including your opponents' and teammates' actions and location states among other things. You'd still get more blur reduction and motion definition locally, which is appreciable even just from an aesthetic viewpoint. Whether it would provide any actual scoring benefit is debatable, since that tick rate isn't even factoring in your ping time (+ ??ms) and using interp_2 (tick rate in ms x2) to avoid the 250ms penalty on any packet loss.
Do a lot of people actually get high Hz monitors with the sole purpose of competitive gaming? I'd be very surprised if there were many on a level where that would make a significant difference. The only reason I make the point is that I've seen competitive players use 60Hz to no significant detriment (e.g. YouTuber Rocket League Gaming) and basically say that the major benefit of higher Hz really just comes down to a preference for smoother motion. The idea that people should make their monitor purchasing decisions off the promise of "owning newbs" rather than what actually looks best seems, to me, propagated by companies that are just trying to sell gimmicky monitors. That's why a lot of once high-end monitors (e.g. Acer Predator 27" 1440p 144Hz G-Sync) are pretty junky and have relatively poor IQ. Many high-end monitors are still like this, and get sold to impressionable gamers whose preferences are basically dictated to them by the companies they buy products from.
 

I would say it depends on the person. One of my friends plays no better at 144Hz than he does at 60Hz even though he can see the difference in overall smoothness and clarity. At 240Hz, yeah, there are definitely very few people who would significantly benefit from it. I think from NVIDIA's own research, wasn't the % difference in gaming performance going from 144Hz to 240Hz in the single digits? And I think that testing was done on professional players as well. Nothing really noteworthy for those of us who just play for fun and not for prize money. 240Hz looks better than 120Hz of course, but it doesn't magically make all of us better players.
 
Imho the last 2 comments nailed it. Professional gamers playing for prize money in competitive shooters are the group that actually benefits from those extreme high refresh rate monitors.

For casual gaming and heck, even competitive racing it's completely irrelevant.

I was competitively into iRacing a few years back (would still be if time permitted) and I was racing on triple 40" 60Hz 1080p TVs. Not once did I wish that I had a 360Hz 32:9 monitor.

For playing competitive shooters on the other hand I'd love to have my old CRT back. No LCD/OLED on the planet can currently match what my 30 year old CRT had to offer for that specific purpose - and that's the void those high refresh monitors (hopelessly) try to fill.

Right now I want a big fat gaming display for casual, relaxed gaming and a 120+Hz 8k HMD and a graphics card that could drive that for competitive racing. At least one of those wishes morphed from a unicorn into reality thanks to LG. For the other ones I just pray to the availability gods..
 

I'd just like to point out again that professional gamers in studio competitions are playing LAN games vs other players on the same LAN.

Most of the tests I've seen, including a big Linus one with three different people (including Linus) taking turns playing, were comparisons of local (LAN) gameplay I think, running the same map over and over vs bots to see the exact same timings as a bot crosses a blind corner's threshold.

Once you frame it in online gaming you have to factor in:
--- the online latency/ping (15ms - 25ms best case for most people, if the server even lets you pick one manually??) and the data transmission relationship to and from the server
....the server code/prediction interp, at around 15ms to start with on top of your ping on a 128-tick server (most games' servers run much longer ticks, so 30ms+ in those cases)
....everyone else's action states with their own latency compensation
....the quality of the game/server's net code

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

You are probably multiple frames of local fpsHz behind the actual server state "slice" getting updated, and that also goes for when your actions and positions resolve to everyone else. The net code can do some trickery, but in raw numbers let's say for example you are playing on a 128-tick server and you aren't willing to risk a 250ms disconnect penalty for any lost packets, so you are using interp_2 (the default).
That would mean 15ms interp + say 25ms ping (being generous) = 40ms between server state updates, which is 5 frames at 120fpsHz solid, or 5.8 ~ 6 frames at 144fpsHz.

So I highly doubt it matters from a scoring perspective. When even a 125-tick or 128-tick server is serving you world-state slices multiple frames behind your display's own fpsHz, even at a solid 120fpsHz or 144fpsHz, I think it's more about aesthetics. And a lot of games run much longer ticks than that.

128 tick at interp_2 = 15ms + (25 - 40ms) ping = 40ms to 55ms ~~~> 5 to 7 frames before new action/world state data is shown (at 117fps solid)
64 tick at interp_2 = 31.2ms + (25 - 40ms) ping = 56ms to 71ms ~~~> 7 to 8 frames
22 tick at interp_2 = 90ms + (20 - 40ms) ping = 110ms to 130ms ~~~> 13 to 15 frames
12 tick at interp_2 = 166ms + (20 - 40ms) ping = 186ms to 206ms ~~~> 22 to 24 frames

If you set interp_1 then the tick interpolation time would be halved (minus 1 frame, 2 frames, 5 frames, and 10 frames respectively), but any lost packet at all would hit you with a 250ms delay, which at 8.5ms per frame is about 29 frames.

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
For reference, tick rates of some common online games:

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick
 

Latency isn't the only advantage of going from 144Hz to 240Hz, so even if the latency advantage is nonexistent, the other advantages of moving to 240Hz are still there. Just again, those advantages won't make the majority of us better players even if we can see and appreciate them, like the reduction in sample-and-hold blurring. My Omen X27 offers better sample-and-hold motion clarity than my CX, but does that mean I play Warzone any better at 1440p with higher frame rates and less sample-and-hold blurring on my Omen vs my CX at 4k with lower frame rates + more sample-and-hold blurring? Nah.

 
"The TV market is not really an area we follow at TFTCentral as we are focused primarily on desktop monitors."

Times change :p
 
120Hz is definitely plenty for now. In a few more GPU + CPU cycles we will have hardware that's capable of doing over 120fps at 4k and hopefully LG will have 240Hz OLEDs by then.
Really hoping we don't need to wait till after Hopper/RDNA3 to get those 4K frames locked above 120fps. Let's see what the Hopper/RDNA3 MCM designs can do; who knows, maybe they'll blow monolithic dies out of the water.
 
Thing is, games and tech will always push frame rates back down - RTX/raytracing, HairWorks, even PhysX do that, for example. Just scene complexity alone, and view distances with animated objects and shadows/lighting throughout the distance, can crush frame rates if devs were to push the limits higher. There is a reason corridor shooters tend to have much higher frame rates, for example. The challenge for devs is to whittle games down to fit "realtime" (often using background and view-distance tricks)... not the other way around.

So what I'm getting at is: you can eventually get a GPU to play today's games, with their current graphics limits/caps, at over 120fps, but the eye candy dial will just get cranked up even further in future game generations, and there are sometimes patches and/or mods to the previous generation's games made to utilize the higher graphics power. It's an arms race, and the GPUs are always going to lag behind, since the highest graphics fidelity possible lives in a render farm for CGI work, rendering frames very slowly across multiple PCs/CPUs. I think at some point a really good form of interpolation is going to have to be developed to multiply a healthy frame rate of, say, 100fps several times over, as mentioned in this Blur Busters article:

https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/

However, AI upscaling at least seems promising for somewhat higher frame rates, though it's being marketed partly as a counterbalance to the RTX/raytracing hit on fps. I'm more interested in 100fps average or higher. Now I'll have to seek out DLSS-supported titles rather than SLI titles :rolleyes:.
 
Yea I think it's more software magic than hardware grunt that will get us those super high framerate + super high res combos on recent titles.
 

I really hope LG considers adding some sort of BFI + VRR feature on their next sets. BFI is absolutely freaking amazing when used under the right conditions, and I would immediately upgrade to a C11 if it had a "BFI-sync" function. Yeah, it's probably never going to be compatible with HDR due to killing brightness levels, but if I had to choose between having HDR and then negating all that image quality with sample-and-hold blur, vs having perfect motion clarity in SDR, I would choose the latter.
 
Big Navi is getting unveiled just 1 week from now. If I still can't get a 3080 by the time it launches then I'll just go team red and test out FreeSync lol. Maybe it won't have any stuttering issues, who knows.

I'm sure the bots won't fuck that launch up at all either......
 
Just got my CX48 and it's defaulting to 350% in "change the size of text, apps, and other items." At this scaling, text is super sharp but way too big. However, if I drop it down to anything less, I start to notice some fuzziness around the text. What is everyone running at?
 
Just got my CX48 and it's defaulting to 350% in "change the size of text, apps, and other items." At this scaling, text is super sharp but way too big. However, if I drop it down to anything less, I start to notice some fuzziness around the text. What is everyone running at?

1. Make sure you are running 4:4:4/RGB chroma with the latest firmware update. Older firmware downsampled to 4:2:2 at 120Hz (60Hz was fine).
2. Make sure you have screenshift disabled.
3. Make sure you have the input labeled as PC in order for text to display properly.
 
1. Make sure you are running 4:4:4/RGB chroma with the latest firmware update. Older firmware downsampled to 4:2:2 at 120Hz (60Hz was fine).
2. Make sure you have screenshift disabled.
3. Make sure you have the input labeled as PC in order for text to display properly.
Done all that. Why is it so huge?

EDIT - Turning off quickmode fixed it.
 

Done all that. Why is it so huge?

You'll have to sign out of Windows after changing the scaling value for it to take effect. If you still have fuzzy text after doing all the above, then I'm not sure what else to suggest. You could try toggling screenshift on and then back off, as it's been bugged for a while now.
 
Man, coming from years of LCD use to OLED, I'm not sure how I feel about everything looking so dark and blackish; the whites (in Explorer and browser windows) seem subdued. And I've got the brightness turned up to like 80! Does it just take a while to get used to?
 
Man, coming from years of LCD use to OLED, I'm not sure how I feel about everything looking so dark and blackish; the whites (in Explorer and browser windows) seem subdued. And I've got the brightness turned up to like 80! Does it just take a while to get used to?

Your brightness or OLED light level is 80? The OLED light setting in particular can have a huge effect. Mine's set to 25, and yeah it took a little getting used to but at night any setting much higher than that would cause me to squint when white windows were displayed, lol. And the default setting was eye-searing. But, I was coming from a previous OLED, not an LCD. If you've been running your LCD at an extremely high brightness then maybe that's it. But I assure you this thing gets PLENTY bright for most people unless your room lighting is insane or something. Check your picture mode settings and also maybe cycle through the different picture modes to see what they look like.
 