LG Ultragear 27" OLED 240hz 1440P 27GR95QE-B

Sorry to hear it didn't work out, LittleBuddy, and hope you find one you like! (After trying and returning two other monitors before this one, I know how frustrating that can be.)

It'll be interesting to see what that mini-LED will be like! Hopefully you can find something that works in the interim.
 
sgupta, you gave up your PA32UCG for the 27-inch OLED? That's insane, my man. All you had to do was add the local dimming option to the shortcut menu in the OSD; then it would have been just a click away to toggle on/off.

I run the PA in HDR 24x7 as I'm lazy. I still haven't seen HDR pop as good as the PA32UCG's, and I own a C2 for the bedroom and a QN95A for the lounge.
 

The PA32UCG had quality control issues which would have prevented me from keeping it anyway (3 dead pixels, a stuck pixel, and something in the screen [an eyelash or plastic shaving] - I eventually got the latter to disappear, but who knows if it would have come back). It also didn't like to come back from standby properly, and I'm unsure whether that was an issue with my unit or just general behavior.

Even so, I really didn't want to have to turn off local dimming for desktop use, since that's the majority of what I do, and I had some (albeit minor) viewing angle issues in the corners with such a big display since I sit fairly close. I hadn't expected the local dimming to be that distracting to me for desktop use.

I'm glad you love the UCG and I agree the HDR on them is phenomenal and pretty unparalleled, but for me, this OLED has been a much better fit (price aside, but the savings were really nice too). Even if the UCG was perfect, I don't really regret the trade and think I made a good choice.

If you mainly view HDR, though, or need its capabilities, I agree the UCG is a beast, and the color calibration was fantastic out of the box. I also like that it came with a colorimeter. It's a great monitor [assuming you get one without the QC issues] - just turned out to not be the best fit for me.
 

That's fair. I'm on the PC from the moment I wake to the time I go to sleep, from grading work to leisure, so having an OLED as a PC monitor makes little sense in my case, especially when it will be used 14 hours a day.
 

Yup - very understandable - I'd be super worried about burn-in with that kind of use. Mine gets its usage, but not nearly that much.
 

I think if you're really determined to make OLED work as a PC monitor, you need two screens: a decent LED for productivity/color grading, and the OLED for content consumption (HDR gaming/movies). That way the bulk of the usage is done on the LED and you have the OLED for leisure/entertainment.
 

That'd be neat, but I only have room for one screen with my current setup unfortunately. If I did have that setup, if anything I'd probably prioritize sRGB/SDR work on the OLED since that's where it shines most IMO, then do HDR stuff on the LED since it can get brighter. But as a display that does everything, this seems to handle it all pretty well so far.
 
Here are my settings after a week of adjusting to the monitor. I like my monitors dark, so what works for you would probably blind me. I can run any dark game without problems with these settings; Darktide and the Dead Space remake still look fine.
I do have the gamma slightly turned up in game to compensate for the lower brightness. The problem was that when I first had the monitor, I had the Windows 11 desktop gamma down all the way, so it was screwing with the overall gamma on the monitor.

Gamer 1

60 Brightness
60 Contrast
Gamma Mode 2
Color Temp Cool
Black Equalizer 50


Nvidia settings

50 Brightness
50 Contrast

0.81 Gamma

Found lowering the gamma gives better contrast in games. I just upped the brightness and contrast to 60 from 46 because a few games were too dark, and went with Gamma Mode 2 instead of 3.
 
I returned my display today for a refund. I had a flickering issue with G-Sync + HDR enabled that was making me feel sick, and I couldn't get used to gaming at 1440p - too much aliasing and lack of detail for me.

TFT Central put out a settings video if anyone is interested.


The display was pretty awesome though; I wish 4K displays were as fast as this LG. So I guess now I'll wait for LG's 1560-zone mini-LED.


Thank you for posting this; I was wondering what was going on in some games. I disabled G-Sync and the flicker goes away. I mean, my PC is good enough that I really don't need G-Sync; I had FreeSync monitors and just used Nvidia cards without the cross compatibility, and that worked for me. I was getting flicker in Wild Hearts and didn't know what was going on.
 
G-Sync shouldn't cause flicker normally, and its benefits have nothing to do with whether or not you have a beefy computer. It syncs the monitor's refresh with the video card's output frames to eliminate tearing.
 
The monitor probably wasn't tested before LG released it into the wild. Luckily LittleBuddy posted that; otherwise I wouldn't have figured it out. It could be an Nvidia driver problem or a monitor problem.
The flicker was bad with G-Sync enabled - it basically made anything unplayable, like a little strobe light going off in the background.
 
I'm thinking that's why it's FreeSync Premium rather than FreeSync Premium Pro, since VRR and HDR don't work properly together all the time.
 
It seems like VRR without G-Sync works well for me, at least in the games I've tried so far?
I turned VRR off on the monitor just to experiment and noticed a lot more lag, so it's definitely doing something, even if G-Sync isn't on.
I find all the distinctions between these things a tad confusing.
 
I just had this problem loading up Wild Hearts; other games were fine. G-Sync is only really needed if your PC is slow and can't produce a steady frame rate.
Say you only get around 30-40 fps and notice tearing in the game - G-Sync will make it smooth. But if you're already getting 144+ FPS, you're not going to notice it.
 
Wrong, as explained above.
 
OK, I see that you can have Adaptive Sync on the monitor (something new to me) and have it off in the Nvidia settings on the software side, so maybe it's built into the monitor and you don't need to enable it with software.
 
So looking at the review on RTINGS, it appears that you cannot adjust the white balance while using sRGB mode? Ugh... Why do monitor companies lock certain features out so arbitrarily?
 

In sRGB mode, the available adjustments are Brightness, Contrast, and R/G/B channels (and Picture Reset).
Grayed out controls are Sharpness, Gamma, Color Temp, Six Color, and Black Level.

I don't see "white balance" anywhere specifically. But I agree it's odd certain settings are locked out in certain modes. I also wish certain features like DAS or the ability to not use ABL (as Gamer modes appear to do) were toggleable in other modes. That said, I'm pretty happy with my settings at this point even with the unavailability of certain options.

Calibration modes can also be calibrated to sRGB if you have the equipment (colorimeter), but only Brightness is adjustable there (though you specify things like a target gamma and color space when you calibrate).
 
I'm thinking that's why it's FreeSync Premium rather than FreeSync Premium Pro, since VRR and HDR don't work properly together all the time.
It seems FreeSync Premium Pro requires specific brightness levels, input lag, response times, and other monitor specifications to match the certification. It's strange that the Dough/Eve version of this META-MLA WRGB panel is FreeSync Premium Pro certified while LG's own monitor isn't.
 
I picked one of these up tonight from the local Microcenter. Sold my LG C2 OLED to a friend. I found with the 42" C2 I was getting headaches and eye strain from extended gaming sessions (5+ hours). Decided to drop down to 27", I also picked up a regular IPS 27" too (LG 27GL83A) as a secondary monitor. I imagine I was sitting close to the 42" C2, my desk is 28" deep. The 27" OLED fits much better.

I have the monitor in "Gamer 1" mode and I turned off the energy saving feature. To me, text looks just as clear, if not clearer, than it did on the C2; I don't see what the big deal is about "fuzzy" text on this monitor. I didn't think I would notice the difference between 240Hz and 120Hz - I was wrong. For desktop use, while it's not as big an upgrade as 60Hz to 120Hz, it definitely feels much smoother, definitely noticeable. I haven't had a chance to test games yet.

I personally LOVE the matte coating on this monitor. The C2 OLED would drive me crazy with the reflections, I've never been a fan of glossy monitors. Anytime I would game I would have to close the blinds in the room, middle of the day and I have the light on. No more! Will report back with more feedback when I've had a chance to use it a bit more.
 

Congrats!

I too think text looks good on the monitor. I just assumed I wasn't particularly sensitive to text fringing since it does seem to be a common complaint with these. I actually work with text quite a bit on here, and while I can notice it, it just doesn't seem to bother me.

Yeah the 240hz is really nice. I accidentally switched it to 60 the other day in Windows and I was like "WTF is wrong??" hehe.

Also, I've been playing a game not using DLSS [on my video card] and thought the performance was good. Playing around, I enabled it the other day (I always thought it was unnecessary/could introduce artifacts if your system was beefy enough to run without it) and got a HUGE performance boost, to the point I can't even believe how responsive this monitor can actually get. (It was decent before, but this is a whole other level as far as how good/smooth it feels.) Pretty insane! I didn't use the frame counter but I'm quite sure I was hitting the 240 with DLSS on. It was like butter - definitely not going back to DLSS off.


I get the matte criticisms and would love to compare to glossy someday, but I don't mind the matte finish either tbh.
 

lol. Yeah my 4090 is totally overkill at this point. It was great for 4K, but of the games I've tested so far tonight, none are pushing the 4090 that hard at all. Oh well, guess I won't need to upgrade for a while.
 
In sRGB mode, the available adjustments are Brightness, Contrast, and R/G/B channels (and Picture Reset).
Grayed out controls are Sharpness, Gamma, Color Temp, Six Color, and Black Level.

I don't see "white balance" anywhere specifically. But I agree it's odd certain settings are locked out in certain modes. I also wish certain features like DAS or the ability to not use ABL (as Gamer modes appear to do) were toggleable in other modes. That said, I'm pretty happy with my settings at this point even with the unavailability of certain options.

Calibration modes can also be calibrated to sRGB if you have the equipment (colorimeter), but only Brightness is adjustable there (though you specify things like a target gamma and color space when you calibrate).
According to Rtings, they still couldn't rein in the color as much as the sRGB mode. I think the color space was still 112% of sRGB after calibration.

Really annoying. I don’t want to drop $1K on a monitor to have even less calibration options than my $219 monitor from 2019.
 

Hmmm...I haven't seen RTINGS review this one yet - curious where you saw that?

I just did the LG Calibration Studio calibration when I did it for sRGB, but it seemed to have a dE of well under 2 for everything. Visually, not too much difference switching between sRGB and Calibration - the main difference is I could set target brightness and gamma.
 
https://www.rtings.com/monitor/reviews/lg/27gr95qe-b

Scroll down to Color Accuracy (Post-Calibration). This is the first time in memory I've seen a monitor score worse post-calibration than pre-calibration. Are they doing something wrong? Also - the LG Calibration Studio calibration that you did - is it a true hardware calibration? Are the settings you apply to the screen done in hardware and thus hold for whatever device you use?
 

Oop thanks for the link - for whatever reason I couldn't find the review on the site (but I was browsing on my phone).

Yeah those are odd results, but they're clearly calibrating with their own method since I know the LG Calibration Studio only calibrates to the Calibration 1 and Calibration 2 presets. I think the Gamer modes default to a wider color space, so not knowing how they calibrated, it's hard to say.

LG Calibration Studio is streamlined hardware calibration - I used an X-Rite colorimeter (I think the iDisplay Plus; it was borrowed). It's a very simple calibration, but it worked well. (My only other experience with calibration was calibrating my TV with CalMan for Sony, and that was a similar but more complicated process that took quite a bit longer, though it also calibrated more presets, etc.)

For LG's software, you basically hook up the colorimeter, pick the color space (sRGB and P3 are available... I think maybe another but don't recall... I went with sRGB for Calibration 1 and P3 for Calibration 2), pick the color temperature (6500K in my case), gamma (I went with 2.2), and brightness target (I went with 160 nits, which equated to 83 on Calibration 1). Then it runs through the color testing process and saves the calibration to the monitor. The whole thing takes under 15 minutes.

Yes, the calibration settings are saved to the presets in the monitor, so you could hook it up to any device and still have them. It also reminds you when you select that preset of the calibrated brightness (in both nits and the recommended setting for that nit level), color temperature, and gamma for that preset (though not color space). You can also adjust brightness after calibration, but it'll still remind you of what you initially calibrated to. For example, I bumped it up one notch to 84 brightness since I wanted just over 160 and the results were just under, but it still shows calibrated to 163 nits at 83. I'm very happy with the calibration; I think this has to be a lot closer than whatever they got in the Gamer modes, but it is very close to the sRGB preset.
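For anyone curious what those calibration targets mean in practice, here's a tiny back-of-the-envelope sketch (my own illustration, not anything LG Calibration Studio does internally): with a ~160-nit white target and a pure 2.2 gamma, each signal level maps to roughly peak_nits * signal^2.2.

```python
# Illustrative sketch only: what a 160-nit / gamma-2.2 calibration target
# implies for mid-tones (my own arithmetic, not LG Calibration Studio output).
PEAK_NITS = 160.0   # brightness target picked during calibration
GAMMA = 2.2         # gamma target picked during calibration

def target_luminance(signal: float) -> float:
    """Expected luminance in nits for a normalized (0-1) video signal level."""
    return PEAK_NITS * (signal ** GAMMA)

for level in (0.25, 0.50, 0.75, 1.00):
    print(f"{level:.0%} signal -> {target_luminance(level):6.1f} nits")
# ~7.6, ~34.8, ~85.0 and 160.0 nits respectively: a 2.2 gamma keeps shadows
# well below a linear ramp, which is what the calibration verifies against.
```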

The only thing I wish it did have was the ability to turn on the less aggressive auto brightness limiting or turn on DAS for the calibrated presets. As it stands, the Gamer modes seem to have less ABL than other modes and also enabled DAS, and you can't do either in the Calibrated modes. THAT SAID, in actual gaming, I haven't noticed ABL to be an issue, and I don't really notice a performance hit or lag from not having DAS either, so real-world, these aren't so much problems as niceties.
 
Also, I've been playing a game not using DLSS [on my video card] and thought the performance was good. Playing around, I enabled it the other day (I always thought it was unnecessary/could introduce artifacts if your system was beefy enough to run without it) and got a HUGE performance boost, to the point I can't even believe how responsive this monitor can actually get. (It was decent before, but this is a whole other level as far as how good/smooth it feels.) Pretty insane! I didn't use the frame counter but I'm quite sure I was hitting the 240 with DLSS on. It was like butter - definitely not going back to DLSS off.

OLED scales very linearly in motion blur: quadrupling the frame rate gives you 1/4th the display motion blur.

That's why, for OLED, DLSS makes a very good strobeless motion blur reduction technology -- future variants that use reprojection-based frame generation will have even less input lag and will make tomorrow's 1000fps 1000Hz displays practical. Increasing frame rates by 10x on an OLED reduces motion blur by 90% without requiring strobing.
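As a rough back-of-the-envelope illustration of that scaling (my own sketch of the standard sample-and-hold persistence math, not a measurement from this monitor):

```python
# Rough sketch of sample-and-hold persistence math: on a near-zero-GtG display
# without strobing, per-frame persistence (MPRT) is roughly 1/framerate, and
# eye-tracked motion blur scales directly with it.

def persistence_ms(fps: float) -> float:
    """Approximate MPRT in milliseconds at a given sustained frame rate."""
    return 1000.0 / fps

def blur_px(fps: float, speed_px_per_sec: float) -> float:
    """Approximate blur trail width in pixels for eye-tracked motion."""
    return speed_px_per_sec * persistence_ms(fps) / 1000.0

for fps in (60, 240, 1000):
    # example: an object panning across the screen at 960 pixels/second
    print(f"{fps:>4} fps: ~{persistence_ms(fps):.1f} ms persistence, "
          f"~{blur_px(fps, 960):.1f} px blur at 960 px/s")
# 240 fps -> ~4.2 ms / ~4 px, 1000 fps -> ~1.0 ms / ~1 px: quadrupling the
# frame rate cuts blur to roughly a quarter, and 10x cuts it by ~90%.
```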

BFI will still be important for retro material and low framerate material, but the more games that can make BFI obsolete, the better. I hope game engine makers implement reprojection-based frame generation technologies.
 
BFI for modern titles too? If you want to turn on Ray Tracing...
 
Yep.

But raytraced titles are modern titles, and raytraced applications can access reprojection-based optimizations, where 100fps is converted to 1000fps in a lagless manner (assuming UE5 integration, for example).

I envision that by the end of the decade, one can simply use lagless reprojection techniques (where frame generation APIs have access to mouse/keyboard/6dof motion input) to convert raytraced titles to 10x the frame rate.

Instead of today's 2:1, 3:1 and 4:1 frame generation, which is very laggy, we'd have tomorrow's 10:1 frame generation that is lagless.

Let's conceptualize 100fps being reprojected to 1000fps, where you have 100fps of original frames, and 900fps of reprojected frames, but every single frame has input-read awareness;

Reprojection is not interpolation; reprojection can access input reads (mouse input and keyboard strafes), for freshest low-lag 6dof coordinates; that's why it was invented for virtual reality but should be brought to PC gaming;

Retroactive reprojection can even reduce frame rendering latency (10ms frametimes) to reprojection latency (1ms frametimes), if you're using a dual rendering pipeline (an original-frames pipeline and a reprojection pipeline).

Reprojection works bidirectionally along the time dimension, so the 10 input reads that occur after a frame is generated can be used to fast-forward (reproject) it to the future while waiting for the next fully rendered original frame to finish. Initially, the laglessness would be client-side (e.g. mouselook, strafes, turns, pans, etc.), at least until per-object reprojection becomes available (e.g. movement of enemies). Technically, it should be feasible to reproject to keep gametime:photontime continually relative.
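To make that dual-pipeline idea more concrete, here's a heavily simplified, hypothetical sketch (placeholder functions only, not any real engine or headset API): an expensive render path produces original frames at ~100fps, while a cheap warp step re-presents the latest frame with a fresh input read at every refresh.

```python
# Hypothetical sketch of input-read-aware reprojection (all functions are
# placeholders, not a real engine API): original frames render slowly, while a
# cheap warp re-presents the newest frame using the latest camera pose.
import time
from dataclasses import dataclass

@dataclass
class Frame:
    image: str          # stand-in for a rendered color (+ depth) buffer
    pose: float         # camera pose the frame was originally rendered from

def read_input_pose() -> float:
    return time.perf_counter()                  # freshest mouse/keyboard/6dof sample

def render_original_frame(pose: float) -> Frame:
    return Frame(image="full render", pose=pose)    # expensive (~10 ms) path

def reproject(frame: Frame, pose: float) -> str:
    return f"warp '{frame.image}' by {pose - frame.pose:+.4f}"  # cheap (~1 ms) path

REFRESH = 1.0 / 1000                            # present at ~1000 Hz
RENDER_INTERVAL = 1.0 / 100                     # original frames at ~100 fps

latest = render_original_frame(read_input_pose())
next_render = time.perf_counter() + RENDER_INTERVAL

for _ in range(100):                            # short, bounded demo loop
    pose = read_input_pose()                    # input read happens every refresh,
    presented = reproject(latest, pose)         # so mouselook/strafes stay low-lag
    if time.perf_counter() >= next_render:
        latest = render_original_frame(pose)    # occasionally refresh the source
        next_render += RENDER_INTERVAL
    time.sleep(REFRESH)                         # stand-in for waiting on vsync
```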
Input-read-aware frame generation technology can reduce latency; Linus Tech Tips confirmed it:



This video is kind of an early-bird sneak preview of the technology making the 4K 1000fps UE5 raytraced future possible on approximately RTX 4090-class performance (or only barely more than that).

That demo is even downloadable!

The underlying graphics don't need to be low detail -- reprojection is absurdly low GPU overhead. I reprojected 30fps to 360fps on a mere Razer laptop RTX 2080: a whopping 12:1 reprojection ratio. But I also discovered that going from 100fps to 360fps, reprojection artifacts drop dramatically. If you're starting from a low triple-digit frame rate (e.g. 100fps) and going to a much higher frame rate, the artifacts are even less than DLSS 3 motion interpolation. So experiment with reprojecting 100fps to 360fps on your 360Hz monitor, and see if you can easily tell it apart from native 360fps 360Hz -- it's very, very hard to do so.
 
BFI would be cool if they made it work with VRR - which shouldn't be that hard to do in theory.
However, with issues like fluctuating brightness even without BFI, we might not see BFI & VRR at the same time for some time to come...

The reprojection idea is cool, but it's much easier to make a shader-less demo with a few primitives than a modern game with it. That said, this tech already exists in the VR world; I saw it years ago on the Oculus Rift along with frame-rate interpolation. It made games running at 45 fps feel roughly like 90 fps. Even input latency was not an issue. What was an issue, however, were artifacts - sometimes subtle and in places somewhat jarring. Still, I could see this tech adopted for games on normal displays. Especially with BFI it could make a real difference.
 
While I am doing a fair number of strobe/BFI contracts behind the scenes, I am pressing hard for brute framerate-based motion blur reduction to gain much more traction by the end of the decade. Feeding off a small percentage of my contract work, I'm actually spending a significant amount of time/funds towards the future 1000fps 1000Hz ecosystem, since I've developed a new Blur Busters Master Plan all around it. So I plan to release some white papers/articles about it sometime this year. Reprojection is actually easier to deploy than one thinks -- the biggest problem is integration of input reads into the reprojection API, so it would ideally need to be done as a sort of UE plugin.

What I found:
- ASW 2.0 had far fewer artifacts than ASW 1.0 because it used the depth buffer, so the jarring artifacts mostly disappeared; and
- Sample-and-hold reprojection has no double-image artifacts; it's simply extra motion blur on lower-framerate (unreprojected) objects.

In tests with the downloadable demo (which is more akin to ASW 1.0 quality than ASW 2.0 quality), using 100fps as the base frame rate, which is above the flicker fusion threshold, also eliminates a lot of reprojection artifacts. To understand the stutter-to-blur continuum (sample-and-hold stutter and blur are the same thing), see www.testufo.com/eyetracking#speed=-1 and look at the 2nd UFO for at least 20 seconds. Low frame rates vibrate like a slow-vibrating music string, and high frame rates blur like a fast-vibrating music string.

Once reprojection starts from a base frame rate above the flicker fusion threshold (e.g. 100fps), the vibrating stutter is blur instead, and the sample-and-hold effect ensures there's no double-image effect. Now, apply the ASW 2.0 algorithm instead of ASW 1.0, and the artifacts are less than DLSS 3.0's.

Some white papers will come out later in 2023 to raise awareness among researchers, developers, GPU vendors, etc. OLED has visible mainstream benefits even in non-game use (240Hz and 1000Hz are not just for esports anymore) if it can be done cheaply, with near zero-GtG displays. People won't pay for it if it costs unobtainium, but reprojection is a cheap upgrade to GPUs. And it will help sell more GPUs in the GPU glut, and also expand the market for high-Hz displays outside esports.

120Hz vs 240Hz is more mainstream-visible on OLED than LCD, and the mainstreaming of 120Hz phones, tablets and consoles is raising awareness of the benefits of high refresh rates. One of the white papers I am aiming to write is about the geometric upgrade requirement (60 -> 144 -> 360 -> 1000) combined with the simultaneous need to keep GtG as close to zero as possible. Non-flicker-based motion blur reduction is a hell of a lot more ergonomic; it's achieved via brute framerate-based motion blur reduction; reprojection ratios of 10:1 reduce display motion blur by 90% without strobing, making 1ms MPRT possible without strobing -- and I've seen it with my eyes. It's the future; but I need to play the Blur Busters Advocacy role to educate the industry (slowly) over the years -- like I did with LightBoost in 2012 to, um, spark the explosion of official strobe backlights.

The weak links preventing Hz-vs-Hz visibility -- the spoonfeeding of GtG-bottlenecked, jitter-bottlenecked refresh rate incrementalism -- mean that 240Hz-vs-360Hz, only a 1.5x difference, is literally throttled to only about 1.1x because of various other factors (slow GtG, jitter). Average users don't care about that. Even high-frequency jitter (70 jitters/sec at 360Hz) vibrates so fast that it's just extra blur; and that's on top of the GtG blur mess, on top of the pure, nice linear-blur MPRT.

Now, in a blind test, more Average Users (>90%) can tell 240Hz-vs-1000Hz on an experimental 0ms GtG display (monochrome DLP projector) in a random test (e.g. www.testufo.com/map becoming 4x clearer) much better than they can tell apart 144Hz-vs-240Hz LCD. You did have to zero-out GtG, and go ultra-dramatic up the curve, but just because many can't tell 240Hz-vs-360Hz or 144Hz-vs-240Hz, doesn't mean 240Hz-vs-1000Hz isn't a bigger difference to mainstream! (Blind test instructions were posted in the Area 51 of Blur Busters Discussion Forum).

On a zero-GtG display (where all motion blur is MPRT-only), for 240fps 240Hz versus 1000fps 1000Hz, the blur difference is akin to a 1/240sec photograph versus a 1/1000sec photograph (4x motion clarity difference):

[Image: display persistence blur equivalence to camera shutter]


As an indie that plays the Refresh Rate Advocacy role, I try to find ways to convince companies -- to show them there's a market in pushing frame rates AND refresh rates up geometrically (and catching an even bigger % of the niche). The high-Hz market is slowly becoming larger and larger.

I'm really focused on firing all the cylinders of the weak-links motor. 240Hz OLED just solved the GtG bottleneck, reprojection will solve the GPU bottleneck, and 1000Hz OLED is being targeted before the end of the decade.

But yes, BFI is going to be perpetually important for games that can't use reprojection, and legacy material (I love saying "60 years of legacy 60fps 60Hz material"), so I'm pushing REALLY hard with multiple parties to add BFI to OLED. It's a challenge, since some don't think BFI is needed.

I'm doing both routes.
 

Diablo 4 looks amazing on this, but I'm not really sure if there are darker areas in the game. I'm running 50 percent brightness on Gamer 1, 60 contrast, with gamma slightly lower in the NV control panel - but guess what, there is an in-game gamma slider.
 
Please tell the manufacturers that the customers want it. Lots of us enthusiasts would love being able to have our cake and eat it too with OLED and BFI. I would jump on it in an instant. No cap. I’m a crt tube head that’s been waiting for this moment.
 
Loving this monitor more and more.

Some tips for this monitor: turn off energy saving, lower the blue color to 44 in the monitor settings, and if you're on Nvidia, in the control panel go to "Adjust desktop color settings" and under "Color accuracy" check the box to override to reference mode.

FFXIV in SDR:

[screenshot]
 
That's the content that OLED shines really well on. I've noticed even 10,000-LED-count FALD LCDs still struggle on this specific type of material. FALD is superior for larger amounts of bright pixels than that, but if you're a lover of horror/space/dungeon/etc...

Please tell the manufacturers that the customers want it. Lots of us enthusiasts would love being able to have our cake and eat it too with OLED and BFI. I would jump on it in an instant. No cap. I’m a crt tube head that’s been waiting for this moment.
Still doing my best. You've seen what I've done to successfully bring 60Hz single-strobe to a few models.

That being said, 240Hz OLED BFI (when it finally arrives) will initially be limited (by backplane limitations) to no less than 1/240sec of motion blur (~4.2ms). So you can have 60fps material with 75% of its motion blur removed, like www.testufo.com/blackframes#count=4&bonusufo=1 -- this TestUFO animation works fantastically on a 240Hz OLED, producing adjustable persistence for 60fps (4.2ms, 8.3ms, and 12.5ms persistence) via 1:3, 2:2 and 3:1 black frame insertion options. This would not be too terribly difficult for a 240Hz OLED nameplate to add to their firmware. I do hope that I will finally be able to convince 1 or 2 of them to add this by the end of the year, but no guarantees. To big vendors, it is kind of a niche feature, but it is popular among the Blur Busters audience. Let's see what happens this year.
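As a quick sanity check on those persistence figures (simple illustrative arithmetic only, not any vendor's firmware behavior): at 240Hz each refresh slot is 1/240 sec, and a 60fps frame occupies four slots, so showing it for 1, 2, or 3 slots and blanking the rest gives the numbers above.

```python
# Illustrative arithmetic: persistence of 60fps content on a 240Hz panel with
# black frame insertion, showing each source frame for N of its 4 refresh slots.
REFRESH_HZ = 240
SOURCE_FPS = 60
slots = REFRESH_HZ // SOURCE_FPS        # 4 refresh slots per 60fps source frame

for visible in (1, 2, 3, 4):            # 1:3, 2:2, 3:1 BFI, and no BFI
    persistence_ms = visible * 1000.0 / REFRESH_HZ
    reduction = 1 - visible / slots     # vs. plain 60Hz sample-and-hold
    print(f"{visible}:{slots - visible} BFI -> {persistence_ms:4.1f} ms persistence "
          f"({reduction:.0%} less motion blur)")
# 1:3 -> ~4.2 ms (75% reduction), 2:2 -> ~8.3 ms (50%), 3:1 -> 12.5 ms (25%),
# 4:0 -> 16.7 ms (no reduction, i.e. plain 60Hz sample-and-hold).
```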

In the best-case scenario, 48Hz-120Hz would gain optional adjustable-persistence BFI. For simplicity of expectations, the first BFI would be for 60Hz and 120Hz only.
 