LG Ultragear 27" OLED 240hz 1440P 27GR95QE-B

Has anyone found a way to make the colors look good?

Especially when using HDR. It looks so bad side by side with a 27GL850-B from 3 years ago; a lot of colors are just washed out or dull. Even a DualUp I have near it has better vibrance and more saturated colors, and that's not a great panel at all.

Bought it for 750€ thinking it would be good at that price and that I could ignore some of the complaints about it, and here I am, thinking about returning it.

Brightness is not a problem; I use all my displays at about 120 nits. Motion clarity is amazing and could be a super selling point if you could maintain 240fps everywhere, but you won't, even with a 4090 and DLSS 3, and there is no 120Hz BFI. Even ignoring the text rendering problems, the colors are an issue: no matter what I try with the RGB levels, they never look as pretty as on some OLED TVs or great IPS panels.

Also, the HDR. WTF LG, not letting the customer calibrate the RGB? Activating HDR gives 2 modes: Gamer 1, aka yellow mode, where whites are literally yellow, or Gamer 2, aka blue mode, where whites are blue.

I get that PC users have been after OLED monitors for so long that now everyone gets excited by them, but this one at this price? Doesn't seem like a good deal, tbh. This should be a 500€ display (even with the PC tax vs TVs), not much more. I wouldn't want to be one of the customers who paid 1200€ just 3 months ago. I mean, you can even purchase a 55" S95B for 950€; wth are they thinking, charging about 1000€ (when not on sale) for this 27"?

If this were a TV and not a monitor, and LG tried to charge 1000€, it would sell literally zero units.

Oh yes, and the cherry on top: when connected through HDMI, if you use anything at full screen, the monitor automatically switches to 4K 120Hz and doesn't let you change it manually. That forces you to use a DP cable without enough bandwidth for 1440p 240Hz 12-bit, or to edit resolutions with CRU to be able to use the 48Gbps HDMI input. But half of the time after editing with CRU, at least with NVIDIA, the driver suddenly loses G-Sync compatibility with the display.
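For anyone curious why DP falls short here, a rough back-of-the-envelope check; the blanking figures below are approximate CVT-RB-style assumptions, not this monitor's exact EDID timings:

```python
# Rough uncompressed-bandwidth estimate for 2560x1440 @ 240 Hz, 12-bit RGB.
# Blanking figures are approximate CVT-RB-style estimates, NOT this
# monitor's exact EDID timings.

h_active, v_active = 2560, 1440
h_total, v_total = 2720, 1525   # active + assumed blanking
refresh_hz = 240
bits_per_pixel = 12 * 3         # 12 bits per channel, RGB

pixel_clock = h_total * v_total * refresh_hz          # pixels/sec
data_rate_gbps = pixel_clock * bits_per_pixel / 1e9   # payload bits/sec

print(f"Needed: ~{data_rate_gbps:.1f} Gbps")          # -> ~35.8 Gbps
# DP 1.4 (HBR3) carries ~25.9 Gbps of payload after 8b/10b coding,
# while HDMI 2.1 FRL at 48 Gbps carries ~42.7 Gbps after 16b/18b coding.
print("DP 1.4 payload: ~25.9 Gbps -> not enough without DSC")
print("HDMI 2.1 payload: ~42.7 Gbps -> fits")
```

So the uncompressed stream needs roughly 36Gbps, which overflows DP 1.4 (hence the DSC or CRU tricks) but fits within HDMI 2.1's 48Gbps FRL link.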

what a disastrous release.
 
I just don't use HDR, don't use the stock stand (too high), and don't use adaptive sync (makes my eyes hurt).
 
I picked up the Corsair version of this monitor (27QHD240) and had a chance to set it up last night. Coming from a 2016-era G-Sync TN gaming panel, it just looks so so much better! After picking up a 77A80J for my living room a few years ago it became unenjoyable to use my computer monitors for gaming due to the lack of contrast (I finally knew what I was missing). Now I feel like I actually want to use my computer again.

I don’t have much experience with HDR on PC and my limited testing didn’t show it looking better than SDR in the one game and few minutes I tried, but I hope to play with it more soon. I’m not expecting much as Corsair didn’t bother to get any HDR certifications for the monitor.

Overall the Corsair seems like a solid alternative to the LG if you want something with a longer warranty and express burn in protection.
 
Does the Corsair have a VESA mount on the back? How is the stand?
It does have a VESA mount (100mm x 100mm).

The stock stand is solid. Just two legs in a V pattern, but I didn’t notice any wobble. It rotates, swivels, adjusts height, etc.

The appearance of the monitor is one of the main reasons I went with it (in addition to the warranty). It doesn’t have a gamer aesthetic at all, and there are no RGB lights on the back.

The LG has been on sale, though. You can get the LG plus a few years of store warranty (like GeekSquad) for the same price as the Corsair at the moment. If I wasn’t an hour and 45 minute drive from the nearest Best Buy I might have given the LG a shot, as I like how you can hardware calibrate SDR on that one.
 
Do you set a frame rate cap for games that is below the monitor refresh rate? Enable g-sync with full screen and windowed? And turn v-sync on in the control panel and off in the game?

I’m still needing to do more testing, but a frame rate cap below my screen’s max refresh rate, set at a level where I don’t experience wide FPS swings, was working great in Diablo IV last night.
 
Do you set a frame rate cap for games that is below the monitor refresh rate? Enable g-sync with full screen and windowed? And turn v-sync on in the control panel and off in the game?

I’m still needing to do more testing, but a frame rate cap below my screen’s max refresh rate, set at a level where I don’t experience wide FPS swings, was working great in Diablo IV last night.
My case was with a game hard-locked to 60fps with G-Sync enabled; first time I've seen the flickering.
 
I picked up the Corsair version of this monitor (27QHD240) and had a chance to set it up last night. Coming from a 2016-era G-Sync TN gaming panel, it just looks so so much better! After picking up a 77A80J for my living room a few years ago it became unenjoyable to use my computer monitors for gaming due to the lack of contrast (I finally knew what I was missing). Now I feel like I actually want to use my computer again.

I don’t have much experience with HDR on PC and my limited testing didn’t show it looking better than SDR in the one game and few minutes I tried, but I hope to play with it more soon. I’m not expecting much as Corsair didn’t bother to get any HDR certifications for the monitor.

Overall the Corsair seems like a solid alternative to the LG if you want something with a longer warranty and express burn in protection.

Hmm: nicer stand, HDMI 2.1, 3-year warranty, zero-dead-pixel guarantee, burn-in protection (however that is defined by Corsair), matte screen finish, claimed 450 nits peak brightness, 800 nits @ 10% APL, 1,000 nits @ 3% APL. Waiting on reviews.
 
I wonder if the LG can't offer the same brightness and colour accuracy in HDR as the other displays using this panel by design. Either the engineering team mishandled it (for example, the lack of a better heatsink), or they don't want to fuck with the customers of the panel and just offer a baseline monitor for them to look at, or they know the displays will burn in and don't want to cover the warranty costs.
 
I'm gonna go with the latter. You'd think the maker of the panel is more in touch with its capabilities, and since they are a large consumer electronics company they have a reputation to uphold. Whereas if you're Asus, you can make up excuses to f#$% people and limit production to keep prices elevated.
 
For great demo material for any OLED...

...try the newly released DLSS-compatible System Shock remake, and configure it to DLSS Balanced. It keeps 200fps+ even at 3440x1440, with excellent low blur performance.


(The DLSS-compatible remake has been out since June 1st, 2023!)

Set everything to maximum if you have a 3000 or 4000 series card, but you may need to bring "Fog Quality" and "Shadow Quality" one notch lower.

For System Shock to perform the best (and most CRT-like too):
  • Monitor Adaptive-Sync = ON
  • Monitor Profile = Game (I treat myself to 100% brightness for this specific game)
  • NVCP G-SYNC = ON
  • NVCP VSYNC = ON
  • NVCP Framerate Cap = 235 (or use RTSS)
  • Game Settings VSYNC = OFF (framepaces better with VRR)
  • Game Settings Motion Blur = OFF (unless you get headaches from stroboscopic stepping effects)
  • Game Settings everything Ultra (except Fog Quality / Shadow Quality, 1 notch lower for 3000 series GPUs)
  • Game Settings Difficulty = all set to "Normal" (level 2 of 3) except Cyberspace set to "Easy" (it's balanced on the too-difficult side)
  • Game Settings Fullscreen = Exclusive (not borderless)
  • Game Settings DLSS = Balanced or Performance
  • Windows Settings HDR = OFF (System Shock, alas, performs better in SDR mode, but simulates HDR very well in SDR mode)
  • Mouse Settings = 2000 Hz + 3200dpi + low ingame sensitivity (smoother CRT-like motion during smooth mouselook)
And now you can do 3440x1440 at 200fps+ even on an RTX 3080. It's a very low-jitter engine, pretty well optimized, and the game is a perfect glove-fit for OLED. Now if you're on a 3060 and trying to push an ultrawide, you may have to back off the settings further, but it's lovely to go almost fully Ultra at 200fps+ while properly demoing OLED performance.

Also, the framepacing in system shock is delightfully smooth, so I don't see G-SYNC flickering inside System Shock.

Even mere mouse jitter can kill 120-vs-240Hz differences, so try to use a 2000Hz poll rate or even 4000. Most new mouse sensors supporting 2000Hz poll rates do a sublime job at 3200dpi in System Shock -- and high DPI really makes a difference in reducing jitter during the slow mouse turns that are more common in solo games (unlike fast flick turns). The ability to mouselook around (while eyetracking the mouselook) and see the motion blur reduction benefit of high-framerate high-Hz OLED is a pretty neat experience. You can use mouse software to lower DPI when exiting to desktop. High Hz and high DPI really dejitter the mouse-turn feel in certain games like System Shock, given the fast GtG of OLEDs. 120-vs-240 is more visible with a 2000Hz mouse poll rate (research paper confirmed).

Research paper confirms that 2000Hz+ is beneficial to dejittering (dl.acm.org/doi/10.1145/3472749.3474783)
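To put some illustrative numbers on the high-DPI point (the 2 inch/sec hand speed below is a made-up example, not a measurement):

```python
# Sketch: why high DPI + a high poll rate smooths SLOW mouse turns.
# The hand speed here is an illustrative assumption, not a measurement.

hand_speed_ips = 2.0  # inches/sec -- a slow, deliberate mouselook

for dpi in (800, 3200):
    for poll_hz in (1000, 2000):
        counts_per_second = hand_speed_ips * dpi
        counts_per_report = counts_per_second / poll_hz
        print(f"{dpi} dpi @ {poll_hz} Hz: {counts_per_report:.1f} counts/report")

# 800 dpi @ 1000 Hz -> 1.6 counts/report: the motion quantizes into a
# coarse 1-or-2-count staircase. 3200 dpi @ 2000 Hz -> 3.2 counts/report
# in much finer steps, so slow eyetracked turns look smoother.
```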

This is important for blur-busting fans on OLED if you want to easily see 120-vs-240 during mouseturns/mouselook/etc, since any form of motion blur reduction (either strobing, or the brute-framerate method on fast-GtG displays) can amplify the visibility of jitter. So you don't want to skip optimizing your mouse jitter when it comes to 240Hz OLED. The slow-mouselook dejitter recipe is "high poll rate, high DPI, low in-game sensitivity".

As long as the game behaves -- and thank our lucky stars, the System Shock remake does! Now, I also use Process Lasso to tame my background software (e.g. cloud sync) and quit any CPU hogs, since 200fps+ on OLED looks better if you de-jitter it properly (all weak links: VRR, framepacing, mouse, background software, shader recompiles, texture streaming pauses, whatever jitters). A major refresh rate race weak link, most certainly, is jitter -- even at the 2ms timescales. Jitter needs to be a tiny fraction of a frametime to stay invisible, and at 200fps+ your frametimes are tiny, so ever-smaller jitter becomes a bigger problem. Control your jitter to get your money's worth out of 240Hz OLED.

Now, for older eyes, admittedly, 4x blur differentials (e.g. 60-vs-240, or 120-vs-480, or 240-vs-1000Hz of future OLEDs) are much more visible, and the 2x blur differential scales linearly only with zeroed GtG and zeroed jitter. Any nonzero GtG and any nonzero jitter = less than a 2x blur difference between 120-vs-240. Diminishing returns apply here, but it's about geometrics (after you control GtG and jitter). In a zero-GtG zero-jitter situation, 240-vs-1000 is more visible even to grandma than 120-vs-240, due to the geometric nature (like a 1/240sec SLR photo versus a 1/1000sec SLR photo), assuming sufficient spatial resolution to see the motion blur differentials. But you can only maximize Hz differentials with zero GtG and zero jitter. So, as you upgrade your Hz, you need to control your jitter more.

Otherwise, if you do not control your jitter properly, then 120-vs-240 is not night and day -- jitter of any form can pretty much ERASE the difference between 120-vs-240!!! So that's why I post the jitter-free System Shock settings above.

The perfect blacks of System Shock and its ultra-saturated colors really show off OLEDs, which are more than an order of magnitude better than an LCD at the extreme contrasts between perfect blacks and ultra-saturated colors. So any OLED christening ceremony needs to include contrasty, cyberpunk-style material to properly push your OLED's capabilities; whether you have WOLED or QD-OLED, both perform spectacularly with System Shock.

I played the 1994 version of System Shock way back in 1994, and it's really nice to play the now-completed faithful remake, which pushes the OLED framerate excellently and makes an OLED shine in blacks, colors and motion performance. In many situations you've got literally hundreds or thousands of islands of neon dots/pixels/lines in the midst of darker-than-LCD blacks -- which really shows off OLED per-pixel illumination control. You haven't tried 240Hz OLED properly unless you've tried 200fps+ cyberpunk material (neons + darks) with proper dejitter optimizations.

Save often though!
 
The optical output's sound quality is alarmingly bad. I wonder how they messed this up, and if the testers even had ears. 3.5mm is better but suffers from noise.
 
Even mere mouse jitter can kill 120-vs-240Hz differences, so try to use a 2000Hz poll rate or even 4000. Most new mouse sensors supporting 2000Hz poll rates do a sublime job at 3200dpi in System Shock -- and high DPI really makes a difference in reducing jitter during the slow mouse turns that are more common in solo games (unlike fast flick turns).
Wouldn't a mouse with motion sync (VRR for mice) at 1000Hz be enough? Most of them already implement it and it only adds 1ms of input lag, so there's no need to purchase a Razer 4K for 200€.

Motion sync should eliminate any kind of jitter from mice. Also, a lot of games, even super popular ones like Apex, don't support 4000Hz; they start lagging and stuttering constantly with anything over 1000, some games with anything over 500, so 1000Hz + motion sync is usually the safest bet.
 
Motion sync should eliminate any kind of jitter from mice. Also, a lot of games, even super popular ones like Apex, don't support 4000Hz; they start lagging and stuttering constantly with anything over 1000, some games with anything over 500, so 1000Hz + motion sync is usually the safest bet.
Correct -- not all games work well with 2000 Hz. But System Shock apparently does!

It's true that motion sync makes a big difference. I have it enabled, but 2000Hz Motion Sync still made a further improvement over 1000Hz on a 240Hz OLED in System Shock, even though the sensor rate in 1000Hz mice is much higher than the poll rate. The benefits are very tiny, but the improvement is there (at least for me). I'm way more picky about motion quality (motion blur) than other attributes, especially in non-competitive situations.

Yes, I will complain that Cyberpunk 2077 didn't seem to benefit from 1000Hz-vs-2000Hz, which is crummy.

But 2000Hz did improve System Shock mousefeel for me (it might not for you -- but at least give it a try). The game is able to spray framerate out of the wazoo -- its minimum system requirement is a GTX 670 -- which means lots of headroom to perform fantastically on modern RTX cards. The developers clearly optimized it to be framerate-futureproof, so it scales really well with jitter optimizations.

From a motion-clarity priority POV, rather than a latency POV, jitter optimizations are quite important. When you're prioritizing motion clarity (as more important than latency) for solo gameplay, looking at all the jitter causes is valid. 8000Hz is much more problematic, but some games perform really well with 2000Hz, so that tends to be the sweet spot.

Even high-frequency jitters (70 jitters/sec at 360fps 360Hz) are still problematic, even if mostly 'invisible'. It is simply extra motion blur (like a fast-vibrating music string), a la the sample-and-hold stutter-to-blur continuum (TestUFO demo; watch the 2nd UFO for 30 seconds as it accelerates/decelerates in frame rate). Notice how the stationary vs moving UFO looks very different! So, whether you have 20 jitters/sec or 100 jitters/sec or 200 jitters/sec -- it's degrading your motion quality. Low frequencies will vibrate (like a bass music string), while high frequencies add extra persistence blur (like a high-frequency music string that just looks blurry when it vibrates).

Just as you saw with UFO looking different in the above link depending on stationary-vs-moving eye -- there are 4 image/motion quality factors to be concerned about:
1. Stationary eye, stationary image (e.g. staring at a photograph or while standing still in game)
2. Moving eye, stationary image (e.g. moving enemy on a stationary background)
3. Stationary eye, moving image (e.g. background images panning past as you stare at crosshairs)
4. Moving eye, moving image (e.g. eyetracking during a pan / scroll / slow turn / slow mouselook)

Jitter optimization has the biggest effect on item #4. Not everyone notices (e.g. people using item #3 almost exclusively -- a stationary stare at CS:GO crosshairs), but my eyes are all over the screen in certain games like System Shock. When you're immersing yourself in a gigantic screen and you're sitting real close (18"-20"), so a 45" screen covers literally almost 90 degrees of your FOV, your eyes have a tendency to just follow the panning/turning/scrolling -- and too much motion blur can create a motion-sickness situation (motion blur sickness is a more common ailment in VR, but it also affects today's increasingly large monitors). Remember, this jitter-as-blur mainly affects situation (4), and different games will behave differently, with your eye habits behaving differently.

Remember, back in the 1024x768 XGA days, when you were using your favourite CRT on your 3dfx Voodoo2 SLI in 1999 -- yes, 4000 pixels/sec wasn't eyetrackable. But on today's 4K-wide displays like the 45" 240Hz OLEDs, 4000 pixels/sec takes 1 second to cross from left edge to right edge. A 1 millisecond error at 4000 pixels/sec is a 4-pixel offset (in jitter error). Now if you vibrate that 70 times a second (beyond flicker fusion), that's 4 extra pixels of motion blur added to whatever MPRT and GtG you have! So -- see! -- those milliseconds matter even more up the diminishing curve of returns, if you're motion-quality priority in today's "simultaneous resolution + refresh rate insanity". The human-visibility noise floor is pushed down to tinier milliseconds with 4K-wide, fast-GtG, high-Hz, high-framerate displays.
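That pixel arithmetic, as a tiny sketch:

```python
# The arithmetic above: a timing error (ms) becomes extra motion blur
# (pixels) at a given panning speed.

pan_speed = 4000        # pixels/sec -- edge to edge in 1 second
jitter_ms = 1.0         # per-frame timing error

extra_blur_px = pan_speed * (jitter_ms / 1000.0)
print(f"{jitter_ms} ms of jitter at {pan_speed} px/s ~= {extra_blur_px:.0f} px of added blur")
```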

Trying to linearly scale the refresh rate race (120-vs-240) as an exact 2x motion clarity improvement -- requires you to halve all your error margins (GtG / jitter / etc) while also simultaneously doubling your frame rate. (Or better yet, 4x improvements, since geometrics are more human visible, ala 60-vs-240, or 240-vs-1000, but you need to quadruple your frame rate and quarter your error margins including GtG and jitter). Otherwise anything (slow GtG, jitters, etc) can throttle Hz differences. I'm not much of a fan of refresh rate incrementalism for that reason too (e.g. 240Hz vs 360Hz LCD is a 1.5x difference throttled to 1.1x difference due to slow GtG and jitters). But with OLED, the fast GtG means the refresh rate race can scale more linearly -- if everything else is controlled.

1ms of random jitter (worsened by system inaccuracies) in a 4ms MPRT is very borderline visible, as a ~25% worsening in motion quality (the ratio of jitter to MPRT). As soon as jitter reaches half of MPRT, it starts (barely) throttling motion clarity; once jitter (in ms) equals or exceeds MPRT (in ms), it becomes bothersome and can actually halve motion clarity. The MPRT(100%) of 120fps OLED is 1/120sec = 8.33ms, and the MPRT(100%) of 240fps OLED is 1/240sec = 4.17ms.
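The MPRT arithmetic, as a quick sketch:

```python
# Full-persistence MPRT of a sample-and-hold display is just the frame
# duration; the jitter:MPRT ratio estimates how much the jitter degrades
# the remaining motion clarity.

def mprt_ms(fps: float) -> float:
    """MPRT(100%) in milliseconds for a sample-and-hold display."""
    return 1000.0 / fps

for fps in (120, 240):
    print(f"{fps} fps: MPRT = {mprt_ms(fps):.2f} ms")
# -> 120 fps: 8.33 ms, 240 fps: 4.17 ms

jitter_ms = 1.0
print(f"jitter:MPRT at 240 fps = {jitter_ms / mprt_ms(240):.0%}")  # ~24%
```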

Now, if you whac-a-mole all the worthy jitter error margins, you can get significant net motion clarity improvements (e.g. 50%+ clearer motion) if you peel/solve as many layers as possible off the jitter onion. Doing this boring optimization on a non-strobed LCD doesn't improve things as much due to the slower GtG, but the ultrafast GtG of OLEDs amplifies the visibility of jitter and stutters (the fast GtG is even why perfectly framepaced 60fps looks more stuttery on OLED than it does on LCD -- and why you ideally want to go well beyond 60fps when playing on OLEDs if you hate jitter/stutter). So it is more worthwhile to optimize jitter when you're playing in the 200fps+ stratosphere on a 240Hz+ OLED.

If you are motion-quality priority, it's always good to bump the poll rate up a notch if the game and the system can handle and benefit from it. It won't help all games, but it clearly helps some, even with Motion Sync enabled. Test your poll rate in specific games and go with the highest rate that the specific game feels smooth at. 500Hz works best in some games, but 2000Hz works best in others. You can use mouse-profile switching on a per-app basis.
 
Anyone know if LG has any kind of community managers anywhere, to ask them to fix the HDMI firmware bug that they fixed on DisplayPort but never on HDMI?
 
Anyone know if LG has any kind of community managers anywhere, to ask them to fix the HDMI firmware bug that they fixed on DisplayPort but never on HDMI?
Nope. Best thing to try is emailing a tech support guy and hoping for the best.
 
Been playing UO on my first day of vacation. Much easier on the eyes than playing on an LCD; I lowered the res for scaling and turned the refresh down to 60Hz. That's basically the reason why I quit the game 20 years ago: I got rid of my huge CRT setup, and LCD monitors were like a Blazer.
 
Been playing UO on my first day of vacation. Much easier on the eyes than playing on an LCD; I lowered the res for scaling and turned the refresh down to 60Hz. That's basically the reason why I quit the game 20 years ago: I got rid of my huge CRT setup, and LCD monitors were like a Blazer.

60Hz!

Ironically, I have some expertise in vision ergonomics, where lowering the refresh rate is sometimes a solution. I told someone to try 24Hz in Windows on their OLED, and it helped reduce motion sickness.

There's something about intermediate frame rates (40-60fps) that causes nausea in some people, so running software at 24Hz is a workaround, unless you can run content at framerates far beyond blur/stutter/flicker thresholds -- e.g. 240fps 240Hz. The other workaround is to intentionally enable the GPU Motion Blur effect (for some, it's an "Accessibility Setting" due to headaches/nausea from things like stroboscopic stepping effects).

My theory is that motion that's above stutter thresholds, but below blur thresholds, has a bothersome effect on some people. There's also people with the medical condition of Akinetopsia ("motion blindness").

There are about 4 thresholds, triggered at different milestones:

  • Very approximately ~10fps = motion stops being a slideshow
  • Very approximately ~100fps = flicker stops being visible, sample-and-hold stutter turns into blur (stutter-to-blur continuum; watch the 2nd UFO for 30 seconds)
  • Very approximately ~1000fps = persistence motion blur disappears for 30-degree FOV 24" 1080p
  • Very approximately ~10,000fps = stroboscopic effects stop being visible, and persistence motion blur disappears for 180-degree FOV 16K resolution (e.g. VR or the MSG Sphere display)

The retina refresh rate is ~20Kfps at ~20KHz, though you need ultra geometrics for 90% of the population (Average Joes) to see it -- 4x differences, e.g. 240-vs-1000, not the nearly-invisible-to-most-non-esports 240-vs-360.

One effect that I've discovered is that if you still have persistence motion blur, but stutter is gone, the motion can be a little weird and nauseating to some people (e.g. soap opera effect -- interpolation doesn't undo camera-shutter motion blur).

I love high frame rates and high refresh rates, but I have an understanding where some frame rates can be a trigger for motion sickness / nausea / medical vision conditions. Even I watch my 24fps movies uninterpolated in Hollywood Filmmaker Mode.

Also, the fast pixel response of OLEDs tends to make perfect 60fps "stutter" more than on an LCD. The stutter-to-blur continuum threshold is at a higher point on faster-GtG displays. It's tied to the flicker fusion threshold (the vibrating edge effect of stutter -- frequencies/beats/harmonics of the edge flicker of stutter and judder). Slower GtG muddies this and lowers it slightly, while faster GtG raises it somewhat. So 60fps can feel like a slightly lower frame rate on OLED than 60fps on LCD, and you need to throw a bit more framerate (e.g. 80fps+) at it to compensate for the extremely fast pixel response.

There's an uncanny valley effect -- people who are bothered by intermediate frame rates (e.g. 48fps or 60fps) but prefer low frame rates (e.g. 24fps) and high frame rates (e.g. 240fps+ and/or strobed), because intermediate frame rates on sample-and-hold are "simultaneously smooth and stutter free, yet motion blurry". Such people can prefer "the old-fashioned feel of low framerate" as the bigger vision comfort, at least until framerates are high enough to also eliminate motion blur.

There are other reasons, e.g. perfect framepacing at 60Hz but crappy framepacing at 240Hz, so some console ports for PC play better at 60Hz than at 240Hz, since the snap-to-grid effect (refresh cycles are like a one-dimensional grid during VSYNC ON) is a stutter-filtering effect for 60fps legacy content! The other trick is to use a good framerate capper (e.g. RTSS) to fix the bad framepacing, by using an RTSS-powered 60fps cap to de-stutter bad 60fps console ports. RTSS uses microsecond-accuracy capping algorithms, so it can 'help' a bad console port's erratic framepacing by means other than switching to 60Hz.
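For illustration, the core of a precise frame capper looks something like this -- a minimal sketch of the technique (absolute frame deadlines plus a sleep-then-spin wait), not RTSS's actual implementation:

```python
# Minimal sketch of a precise frame limiter -- the de-stuttering idea
# behind an RTSS-style 60fps cap, NOT RTSS's actual code. Frames are
# scheduled against an absolute deadline so frametimes stay even
# instead of accumulating drift.

import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(render_one_frame, num_frames: int = 600) -> None:
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_one_frame()
        next_deadline += FRAME_TIME  # absolute schedule: errors don't pile up
        # Coarse sleep first, then a short spin-wait for sub-ms accuracy.
        while (remaining := next_deadline - time.perf_counter()) > 0:
            if remaining > 0.002:
                time.sleep(remaining - 0.002)

# Example: run_capped(lambda: None) paces 600 no-op "frames" at 60 fps.
```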

There's even the converse -- people who get headaches at 24fps, with higher refresh rates and higher frame rates (120fps+) solving the problem. Everybody reacts differently to high frame rates.

Big vision-ergonomics rabbit hole -- but "Good To Know" stuff.
 
I really think it doesn't matter with Ultima Online because it's a 2D sprite game; higher refresh rates were never around back in 1997. I just know using 60Hz is better with the game since it really doesn't tear or have regular frames like a normal 3D game would. I just know I like the lower refresh better. Not sure if the game takes advantage of it or not when 240Hz is enabled, since the game is a 25-year-old antique.
 
I picked one of these 27GR95QEs up yesterday after struggling to decide which IPS FALD screen I wanted. In the end, it turns out I wanted none of them. What I really wanted was OLED. I want motion clarity and perfect blacks. I would have preferred a 4k OLED, but honestly, I was not going to get 240fps on a 4k display. The 1440p is a good compromise in order to get the higher fps.

I was really concerned about the brightness after hearing the complaints. I'm not concerned anymore. It gets brighter than I need it to. I run my LG 38GN950 at lower brightness than this OLED can produce, and that monitor gets pretty darn bright in SDR. I play in a dark room, dedicated as a computer room / office, usually with the lights off. Plenty, plenty bright. I wouldn't want it brighter. Also, some of you guys must have Superman vision to be able to see fringing around text. I'm reading this site and other sites on the monitor and text looks fine to me. And I've got it about 18 inches from my face, maybe less.

This monitor will do double duty as PS5 and PC monitor. On the PS5, I like how I can choose to run it at 4k/120Hz (and let the monitor down-sample the image), or finally use native 1440p if I want. All HDR options detected automatically. Only thing not available is ALLM, which is more of a TV thing.

How long do you think it'll be before I burn health and mana globes into the screen from playing Diablo 4 and Path of Exile exclusively about 15-20 hours a week? On the bright side, since those are the only 2 games I really play, the burn-in will cover itself up... lol.

Anyway, super happy with this purchase. The image quality and motion clarity have to be seen to really be appreciated. So good.
 
I was going to update the firmware on my new 27GR95QE using the on-screen display app, but it states it's already up-to-date. Wild.
 
So I ended up buying this recently and I'm just not happy with it. It is too dim, and the colors don't look right in game because I have to turn brightness and black equalizer to 100 (I tried everything in between). Whites are more like dirty grey, so I knew it might be an issue.

The HDMI bug doesn't allow me to play Fortnite in 1440p, as it just keeps giving me a black screen every 5 seconds. I can play 1440p in Battlefield 2042, but I can't get 240Hz, so the screen smears as I turn even with G-Sync on. I ended up playing at 1920x1080 in both games just to give it a full test. It wasn't bad, but the dimness is a deal breaker for me.

I went back to my BenQ XL2546K, which just looks so much better in BF2042 even with lower frames (DyAc+ on Premium). Maybe I should try the Asus PG27AQN?
 
So I ended up buying this recently and I'm just not happy with it. It is too dim, and the colors don't look right in game because I have to turn brightness and black equalizer to 100 (I tried everything in between). Whites are more like dirty grey, so I knew it might be an issue.

The HDMI bug doesn't allow me to play Fortnite in 1440p, as it just keeps giving me a black screen every 5 seconds. I can play 1440p in Battlefield 2042, but I can't get 240Hz, so the screen smears as I turn even with G-Sync on. I ended up playing at 1920x1080 in both games just to give it a full test. It wasn't bad, but the dimness is a deal breaker for me.

I went back to my BenQ XL2546K, which just looks so much better in BF2042 even with lower frames (DyAc+ on Premium). Maybe I should try the Asus PG27AQN?
Had some issues with mine, try these settings and the DP cable that came with the monitor. Mine seems bright enough but I'm in a dark room.


https://www.youtube.com/watch?v=13tT2ec2tkk
 
Had some issues with mine, try these settings and the DP cable that came with the monitor. Mine seems bright enough but I'm in a dark room.


https://www.youtube.com/watch?v=13tT2ec2tkk


Thanks man, I did try those settings but it's a no go for me. Even if this monitor was $400 I wouldn't keep it due to the dimness and the 4K bug (tried 2 different DP cables), which I couldn't get rid of. PG27AQN is on deck though.
 
Had some issues with mine, try these settings and the DP cable that came with the monitor. Mine seems bright enough but I'm in a dark room.

Same. I'm not discrediting anyone who says this monitor doesn't get bright enough. I get your point of view if you need to deal with lights or windows. All I can say is that if you can play in a dark room, it's really awesome.
 
Same. I'm not discrediting anyone who says this monitor doesn't get bright enough. I get your point of view if you need to deal with lights or windows. All I can say is that if you can play in a dark room, it's really awesome.
I get it; I wish mine was brighter, but it was a big step up from my LG 27GP850-B. Also, I don't feel it's worth the 850 I paid, maybe 600ish.
 
I just picked one of these up and it is going back. The brightness changing on its own is just ridiculous. I had read that turning off power saving mode would fix it but it did not.
 
I just picked one of these up and it is going back. The brightness changing on its own is just ridiculous. I had read that turning off power saving mode would fix it but it did not.
What picture mode are you using where it does that?
 
What picture mode are you using where it does that?
I was using Vivid and switched to Gamer 1, and now it does not do that. Still, I see many other issues. It will still dim a bit in parts of some games, which is annoying, as even at full brightness this monitor is not very bright to begin with. I get this weird black crush in some games that seems to come and go. I have also seen some insane flickering when starting some games. Some games I launch for the first time with this monitor default to 30Hz and the wrong aspect ratio. And G-Sync just does not seem to work right in some games I have been testing, as I still get tearing and/or stuttering. And yes, I have known how to use G-Sync for years now, and it works fine in most games.
 



Just upgraded my VESA stand. My first one had a forked metal base and was kinda ugly; this one is by far nicer. I have 3" of space from the bottom of my PC table.
The LG stand doesn't go low enough for my setup, otherwise I would use it. I tried using it like 6-7 times; no go, it feels like I'm looking up all the time, which causes more eyestrain. Still have Adaptive Sync off, since with this thing on it causes me to blink like crazy.
 
I've been using this monitor since Black Friday and I'm really pleased with it. Those pure blacks are what I was looking for. Right from the start, I applied the settings in the TFTCentral video. For me, someone who generally reduces the brightness of my displays and increases the contrast, those settings were perfect. 120 nits is plenty for desktop use (Gamer 1 mode) and I use the Gamer 2 mode for most of my games. I'm also using BetterClearTypeTuner as suggested in this thread, with the Grayscale option. Text looks best to me in this mode. At first I wasn't sure if I would keep it, only because of the price, but now I would not go back to anything else, at least for a few years!
 
I’ve started experimenting with this monitor and different input sources. I recently tried an Apple TV 4K (Gen 3) on it, and holy hell it looks good. Ironically my old Samsung Q90R also looks better on the Apple TV 4K.

This tells me something; it’s not the monitor that’s the issue… it’s the source. The LG Cx and other OLED TVs have some post processing done to the image to make it look better, while the LG 27” OLED monitor does not; it shows the source as it is presented. People need to stop comparing this monitor to OLED TVs. The OLED TV is almost always going to look better, but it is at a slight cost of input latency. The LG 27” OLED is amazing if you give it a good image source.
 