Samsung Odyssey Neo G9 57" 7680x2160 super ultrawide (mini-LED)

Ah yes, next year's 39" and 45" OLEDs will be interesting as a replacement for this 57" Neo.

But only if they have a decent resolution. I don't want any of that disgusting 83 PPI of the current 45" OLED monitors, otherwise I would have jumped on them already.
 
Ah yes, next year's 39" and 45" OLEDs will be interesting as a replacement for this 57" Neo.

But only if they have a decent resolution. I don't want any of that disgusting 83 PPI of the current 45" OLED monitors, otherwise I would have jumped on them already.

5120x2160 at 45 inches would make it 123 PPI, so it would be a slight downgrade in PPI from the Neo G9, but maybe that's not too bad.
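If anyone wants to check the math, here's a rough Python sketch (flat-panel approximation, ignoring curvature):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(7680, 2160, 57)))  # Neo G9 57": ~140 PPI
print(round(ppi(5120, 2160, 45)))  # rumored 45" 5K2K: ~123 PPI
print(round(ppi(3440, 1440, 45)))  # current 45" ultrawide: ~83 PPI
print(round(ppi(2560, 1440, 27)))  # 27" 1440p: ~109 PPI
```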
 
123 PPI is 100% good for me.

Hell, I even like the ~110 PPI of 27" 1440p!

So if those rumored OLEDs next year really are 39" or 45" at 123 PPI, I would be ecstatic!
 
What FPS are you guys getting in graphically intense games at full res and max graphics + full RT enabled?

I have a 4090 and a 12900K.

And I am getting only 50-58 fps in Cyberpunk 2077 and Alan Wake 2. That's with DLSS Performance!

It's so hard to bear! DLSS Performance really looks like arse! I need to use DLSS Quality at a minimum, but that will drop me to 40 fps.

Man, I love the image quality of this monitor, but damn, I do not like the performance in AAA games.

I don't know what to do. I almost feel like getting rid of this monitor because it's pretty much unusable with current-gen GPUs for gaming at this res, and that's my sole use for this monitor.

If I can't have at least 70 fps in games at max detail, then what's the point? The experience of playing at sub-40-55 fps is seriously hindered no matter how good the image looks!
DLSS Balanced is the sweet spot for Cyberpunk IMO. But it's just a really damn demanding game. I would simply try to play it at, say, a 5120x2160 custom res or at 3840x2160.

The way I see it, the super ultrawide's main benefit is a lot of seamless desktop space for work. I would never buy one as a gaming-only monitor because you run into a lot of inconveniences.
For gaming, you need to pick your compromises. I'd rather use a 5120x2160 or 6144x2160 custom resolution to reduce FOV distortion and improve performance, with some black bars on the sides. Not supporting this out of the box and requiring hacky workarounds sucks big time.
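If you want a rough idea of what a letterboxed custom resolution costs you on the 7680x2160 panel, here's a quick sketch (assuming 1:1 pixel mapping with no GPU scaling; the resolutions are just examples):

```python
PANEL_W, PANEL_H = 7680, 2160  # native resolution of the 57" Neo G9

for w, h in [(5120, 2160), (6144, 2160), (3840, 2160), (5120, 1440)]:
    side = (PANEL_W - w) // 2    # black bar width on each side
    top = (PANEL_H - h) // 2     # black bar height top and bottom
    share = w * h / (PANEL_W * PANEL_H)
    print(f"{w}x{h}: {side}px side bars, {top}px top/bottom bars, "
          f"{share:.0%} of native pixels to render")
```

5120x2160 keeps the full panel height and renders about two thirds of the native pixel count, which is where most of the performance win comes from.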
 
123 PPI is 100% good for me.

Hell, I even like the ~110 PPI of 27" 1440p!

So if those rumored OLEDs next year really are 39" or 45" at 123 PPI, I would be ecstatic!
The 39" is 3440x1440. The 45" is unconfirmed, but the 5120x2160 panel was slated for late 2024 production in previous road maps so it might come out only in 2025. I hope I am wrong.

i'd definitely get a 45" 5120x2160 curved OLED.
 
The 39" is 3440x1440. The 45" is unconfirmed, but the 5120x2160 panel was slated for late 2024 production in previous road maps so it might come out only in 2025. I hope I am wrong.

i'd definitely get a 45" 5120x2160 curved OLED.

IIRC the 32" QD OLED panel was also supposed to be in production either Q2 or maybe even Q3 2024 but supposedly Dell/Alienware is going to have the monitor out in Q1. We'll see if that ends up being true but if it does then there's a chance these things come out much earlier than expected.

EDIT: Never mind, I stand corrected. They are both 3440x1440.

 
Is it possible to adjust the size of the image on the screen to fit a 5120x1440 resolution so it is not stretched? So there would be some black bars on each side.
 
Is it possible to adjust the size of the image on the screen to fit a 5120x1440 resolution so it is not stretched? So there would be some black bars on each side.
I am pretty sure you can. I ran 2560x1440 or 3840x2160 (can't remember which one I tested) with black bars all around when I first got the monitor.
 
Here's my G95NC review!

Preface:

I owned the original G9, which was replaced by the G9 Neo roughly 2 years later. I've always loved the G9, but the original did have quite a few issues: notably a 2x6" glitch strip lower-left of center that would appear for a split second every 4-6 hours, and extremely muted / washed-out HDR, so it was always a choice between decent SDR colors or amazing brights via HDR.

The G9 Neo addressed these issues, offering vastly improved SDR and HDR color performance. Initially HDR was similarly broken on the G9 Neo, but this was fixed roughly 4-6 months into its life cycle with a firmware update, after which HDR had nearly as good color reproduction as SDR.

I was initially on the fence about the dual-4K G9 Neo proposition. For one, I had just upgraded to the Neo in 2021, and I was extremely worried that an RTX 4090 under a waterblock wouldn't be able to drive dual-4K resolution, as the math put the pixel count at 2.25x that of 5120x1440. At the time of its release I just so happened to be in the market for at least one 8TB Samsung 870 QVO ($400), and with my military veteran discount bringing the price down to $2,250 plus the $500 promotional credit, I was lured into a purchase.

I purchased the monitor on the 22nd of September and it was delivered on the 28th, and yes, the box is massive, but not as big as a refrigerator as many are claiming. The panel arrived with no defects. On the day of setting up I had quite a bit of buyer's remorse and anxiety that there would be a dead pixel or two, or some other issue, or that I would damage it while transferring the original G9 Neo off the Ergotron arm + HD pivot; luckily, no issues whatsoever.

My initial reaction after powering it up was... disappointment. It took some fiddling around with the color profiles to get the image to an acceptable appearance while web browsing, viewing windows, and other media content. I initially settled on the RPG profile @ 40 Brightness and 30 Contrast, Local Dimming on High and Contrast Enhancer disabled.

I wanted to see how HDR looked, and in the first few games I tried it did not look right at all. In fact, it suffers from the same problems the original G9 did: HDR is COMPLETELY broken on this panel, and yes, I've basically fiddled with everything. I managed to get it to look barely acceptable using the Gaming Standard preset with Brightness @ 50, Contrast @ 45, Local Dimming on High and Contrast Enhancer on High. The issue with HDR is multifold: not only is it completely washed out in terms of color, just like the original G9, but it also has a really annoying bug where "HDR Tone Mapping" resets to Static even though the UI says it's Active, and the end user may be unaware of this. The solution is to set it to Static and then back to Active. This is extremely annoying, as you must navigate something like 15 button presses with the unergonomic monitor interface to accomplish it, and it basically must be done every time you start a game or change the render resolution. Simply changing the DLSS quality setting in Cyberpunk 2077 will trigger this bug.

So after my initial day with it I wasn't happy; I would give the monitor maybe 3 out of 5 stars. It is sharper, that's for sure.

But having patience, I stuck with it, and eventually, out of curiosity while playing Assassin's Creed Valhalla in HDR (washed out!), I wanted to see how the game looked in SDR, so I disabled HDR while the game was running. This was when I discovered a valid workaround; no, not just a workaround, a DRAMATIC improvement in the image compared to the G9 Neo. What I discovered was that the panel was in this weird place where it was still using the HDR "Gaming Standard" preset with High Local Dimming, Contrast Enhancer on, and brightness and contrast nearly maxed out, and the peak nits were about 80% of those of an HDR image, but with SDR color!

Having a eureka moment two weeks after getting this monitor was euphoric. I closed the game, disabled HDR within Windows, fired it back up, this time using the HDR "Gaming Standard" profile, and MY GOD, it was like HDR but with SDR color! In fact, it's so good it's better than the G9 Neo in HDR after they fixed it with the firmware update! I was so impressed with the result that I was STUCK in the game for an additional 3 hours, until about 4 am!

The next day I woke up and couldn't wait to try all of my other games, and sure enough, they all looked absolutely AMAZING in SDR faking HDR via this method. I swear to you, light sources are about 80% as bright as in HDR, but with amazing SDR color. Upon further experimentation I realized that the presets actually tint the image in a manner that cannot be adjusted within the GUI: "Gaming Standard" applies a bluish / nearly greenish tint to whites, whereas "RPG" applies a very subtle red to whites and is maybe 5-10% dimmer. I have now settled on the RPG profile with the following settings:

Brightness: 50
Contrast: 40
Sharpness: 10
Color: 30
Local Dimming: High
Contrast Enhancer: High

I use this profile for web browsing / outside of gaming:

Profile: RTS
Brightness: 45
Contrast: 30
Sharpness: 10
Color: 30
Local Dimming: High
Contrast Enhancer: Disabled

Having solved the image issue, I can say WOW. Seriously, the textures, the entire image: I've been on 1440p for the past decade and 2160p absolutely is sharper at a 2-3 ft viewing distance. Games look INCREDIBLE. This monitor absolutely destroys the G9 Neo. In fact, I've gone back to revisit games I recently played and WOW, the colors, the crispness: Marvel's Guardians of the Galaxy, Spiderman Remastered, The Witcher 3, Cyberpunk 2077, The Ascent, Sekiro, Forza Horizon 5. My god, this monitor is GORGEOUS faking HDR brightness with this workaround.

Performance:

I was expecting crippling performance, but oddly the 2.25x increase in pixel count does NOT translate into a 2.25x reduction in performance with the 4090; I'm seeing a 2x reduction in games on average. Having also upgraded from a 3090 to a 4090, my performance is about the same as driving the predecessor panel with the predecessor GPU. Yes, the 4090 absolutely rips at this resolution. As for VRAM usage, it's above 15 GB on average now, so don't try to drive this panel without at minimum a 4090!
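For anyone checking that 2.25x figure, the raw pixel math (just a pixel-count ratio, not a performance model):

```python
dual_4k = 7680 * 2160   # G95NC: 16,588,800 px
old_g9  = 5120 * 1440   # Neo G9 49": 7,372,800 px
uhd     = 3840 * 2160   # single 4K: 8,294,400 px

print(dual_4k / old_g9)  # 2.25x the old Neo G9's pixel count
print(dual_4k / uhd)     # exactly 2x a single 4K display
```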

Here's a performance run-down:

System:
i9 12900k @ 5.1 GHz @ 1.375v, E-Cores disabled
MSI 4090 Gaming X Trio @ 2700 MHz core @ .970v undervolt, +600 MHz memory
4x8 4000 MHz Trident Z Royal DDR4 15-16-16-36
Windows 10
G95NC firmware update 1003 (no difference to the power-on / wake-from-sleep handshake / latency issue, nor the windowed-app G-Sync-induced momentary black screen problem. Haven't checked if HDR was fixed; with this SDR hack I couldn't care less!)

Cyberpunk 2077: 65-75 FPS all settings maxed, Path Tracing enabled, DLSS 3.5 enabled, DLSS FG enabled, DLSS: Performance
The Witcher 3: 75-80 FPS all settings maxed, RT on and maxed, DLSS FG enabled, DLSS: Balanced
Guardians of the Galaxy: all settings maxed, DLSS: Quality 70-80 FPS
Spiderman Miles Morales: all settings maxed, RT on and maxed, object distance: 10, DLSS FG: enabled, DLSS: Quality: 90-110 FPS (sadly the snowfall midgame breaks DLSS FG and FG must be disabled, still seeing 60-75 FPS no FG!)
The Ascent: all settings maxed, DLSS: Quality: 75 FPS
Forza Horizon 5: all settings maxed, DLSS FG enabled, DLSS: Quality: 110 FPS
God of War: all settings maxed, DLSS: Quality: 120 FPS
Jedi Survivor all settings maxed, DLSS FG enabled, DLSS: Quality: 80-90 FPS
Sekiro: all settings maxed, 90-120 FPS
Red Dead Redemption 2: all settings maxed, DLSS Quality: 90-110 FPS
Far Cry 6: all settings maxed, FSR enabled: 90-110 FPS
Watch Dogs Legion: all settings maxed, additional detail: 0, RT quality: High, DLSS: Quality: 70-80 FPS (WOW)
Doom Eternal, all settings maxed, DLSS FG on, DLSS: Quality: 120 FPS everywhere

To sum up, my performance did not simply halve. For example, previously my FPS in both Forza Horizon 5 and Spiderman Remastered with FG enabled was roughly 160-180 FPS, whereas now it's more than half of that; I want to say the performance retained is closer to 80%? Really pleasantly surprised by this, and with how well the VRR works, 75 FPS feels fantastic on this monitor and is perfectly playable. Hell, 65 FPS on this panel feels good.

Positive observations:

Although this panel is not G-Sync certified, VRR works better than it did on the G9 Neo before it. Previously G-Sync / VRR would not work @ 120 Hz, and one had to enable 240 Hz mode before firing up a game (I ran the former panel @ 120 Hz because I didn't want to burn it out prematurely), and this was always a hassle. It also seemed to stumble a bit down low @ 60-70 Hz. This is no longer the case: there is zero stutter even as low as 60 Hz. Under 60 Hz is problematic, but it seldom if ever dips under 60 Hz.

Negative observations / ongoing issues:

Certain windowed apps such as Asus Armoury Crate, Razer Synapse, and ReShade cause the panel to turn off for 3-5 seconds with G-Sync enabled; the solution is to disable G-Sync before attempting to use these apps.

The monitor has a handshake issue at present and will not wake when waking the PC. I've gotten into the habit of turning on the panel a few seconds after waking the PC from sleep. If you turn the panel on too early and there is no signal, it goes back to sleep; too late, and there is severe input latency in Windows on the desktop etc. and the PC must be restarted.

I have occasional popping noises, but WAY less than the G9 Neo before it, as others have noted they seem to come from the upper left of the panel.

I have no dead pixels or other issues.

My initial rating of 2-3 stars is now easily 5 stars, even with the issues. The SDR image of this panel at dual 4K, with about 80% of the brightness and contrast of an HDR image and none of the drawbacks, is utterly hypnotizing. Any game I fire up, I prepare to sit there for 4 hours or more because it's pure eye candy revisiting all of my recent titles: the texture clarity, the image sharpness, the colors and contrast. My god, this is visual crack.

5 stars hands down.

Here's to hoping they resolve the power-on handshake issue, the HDR, and the G-Sync stuttering with windowed apps, but to be honest, HDR being broken was the best thing that could have happened: SDR with the aforementioned HDR hack results in an image that is vastly superior to the HDR on the predecessor panel, even after they fixed its washed-out colors. Honestly, I couldn't care less if they fix HDR. To all of those curious, please do try these settings and tell me your thoughts. Totally, completely blown away by this panel.

Oh, and the best part is that my Ergotron's HD pivot is JUST strong enough when completely tightened to hold the panel without drooping. You're going to want this arm if you get this panel: I have an Ikea Bekant with a depth of 31.5", and even then the factory stand puts the panel right in your face. With the Ergotron arm, if you completely loosen the screw at the base you can push the panel nearly all the way back, and it's infinitely more comfortable. I pull it in roughly 6" when I game.

I also managed to put the $500 credit towards a pair of 8TB 870 QVO drives while they were on sale, bringing them to $120 after applying the promotional credit and making this a $2,375 purchase for the monitor plus $800 worth of storage. I am incredibly happy with this monitor. Words cannot describe how incredible games look on it faking HDR whilst retaining the color of SDR.

I highly recommend this monitor IF:

You can swing its purchase price.
You have at minimum an RTX 4090
You have the patience to tinker and mess around with settings as I did (but I basically did the work for you with this review)
You have an Ergotron Arm + HD mount or a massive desk.

With anything less than an RTX 4090 you're going to be in for an unpleasant experience.
The 4090 can handle this monitor! 75 FPS on this panel with all settings maxed feels great!

The cherry on top of this upgrade is that with the 4090 I was starting to see a CPU bottleneck in select titles, e.g. Watch Dogs Legion, and was contemplating an entire platform upgrade, which with a full loop is a hassle, not just an expense. I nearly upgraded to a 13900KS last year and I'm glad I held off (only a handful of titles were exhibiting a CPU bottleneck); well, now I no longer have a CPU bottleneck anywhere! As long as developers and modders continue to implement Frame Generation in newer titles, I may be relatively future-proof with the 4090 going forward.

Thanks for reading.
Hi,
I was wondering what arm you are using? I just replaced the 49" G9 with the 57" and my Ergotron HX is not able to hold the weight of the tilt :(

For everyone's info, I'm using it with a Mac mini M2 Pro. After updating the firmware I'm able to get 7K and 120 Hz with DisplayPort to Thunderbolt.
It worked at some point with HDMI but then suddenly stopped.

The only problem with using DP to Thunderbolt is getting to the menu: the monitor goes black if you do, and you have to unplug it and plug it back in.
 
Does anyone here have a 12900K and an RTX 4090 @ 3 GHz with this monitor?

Are you able to load up Cyberpunk and tell me if you see the GPU usage drop from 100% to 75-80% within 30 seconds of loading a save game?

This is happening to me, and I am trying to figure out why my GPU usage, and consequently the FPS, drops so rapidly. I go from 60 fps to 40-45 fps, and then if I go into a menu/title screen or load a save game the FPS goes back up to 60+ for about 20-30 seconds, then it immediately begins to slowly decline to a steady 40-45 fps. Quite frustrating; why won't it stay at 60 fps or so if that is what it can initially push out?

EDIT:
Well, I just realised that DLSS Balanced isn't even rendering at 8K, so the lower res it's rendering at might be the cause of a CPU bottleneck? Just a theory; I have no idea if it actually has any merit.
 
Does anyone here have a 12900K and an RTX 4090 @ 3 GHz with this monitor?

Are you able to load up Cyberpunk and tell me if you see the GPU usage drop from 100% to 75-80% within 30 seconds of loading a save game?

This is happening to me, and I am trying to figure out why my GPU usage, and consequently the FPS, drops so rapidly. I go from 60 fps to 40-45 fps, and then if I go into a menu/title screen or load a save game the FPS goes back up to 60+ for about 20-30 seconds, then it immediately begins to slowly decline to a steady 40-45 fps. Quite frustrating; why won't it stay at 60 fps or so if that is what it can initially push out?

EDIT:
Well, I just realised that DLSS Balanced isn't even rendering at 8K, so the lower res it's rendering at might be the cause of a CPU bottleneck? Just a theory; I have no idea if it actually has any merit.
I'd get rid of any overclocks, clean install drivers and see if that solves the problem.

Techpowerup's 7800X3D review's Cyberpunk 2077 test at 4K puts the 12900K as a fraction of an fps slower than the 7800X3D. You would be almost entirely GPU limited at 8Kx2K even if you use DLSS.
 
I'd get rid of any overclocks, clean install drivers and see if that solves the problem.

Techpowerup's 7800X3D review's Cyberpunk 2077 test at 4K puts the 12900K as a fraction of an fps slower than the 7800X3D. You would be almost entirely GPU limited at 8Kx2K even if you use DLSS.
All good I found the culprit!

It is DLSS Frame Gen!

It drops my GPU usage from 100% to 75-80% within 30 seconds of playing CP2077, and it never recovers from that unless I enter a title screen or load a save game!

When I turn DLSS FG off, I get 96-100% GPU usage constantly; it never dips or drops unless I turn DLSS FG on again. Even though I lose a chunk of FPS when I disable DLSS FG, it actually feels nicer to play the game because of the instantaneous response, rather than the noticeable input latency that DLSS FG introduces.

Still, I am curious to know why DLSS FG drops GPU usage so much and so rapidly in CP2077, and if there is a fix for it?
 
All good I found the culprit!

It is DLSS Frame Gen!

It drops my GPU usage from 100% to 75-80% within 30 seconds of playing CP2077, and it never recovers from that unless I enter a title screen or load a save game!

When I turn DLSS FG off, I get 96-100% GPU usage constantly; it never dips or drops unless I turn DLSS FG on again. Even though I lose a chunk of FPS when I disable DLSS FG, it actually feels nicer to play the game because of the instantaneous response, rather than the noticeable input latency that DLSS FG introduces.

Still, I am curious to know why DLSS FG drops GPU usage so much and so rapidly in CP2077, and if there is a fix for it?
That's weird. I'd try setting the display to 4K 16:9 and see if it works as expected then.

I just tested and on my LG CX 48" + 4090 + 13600K my GPU utilization is always 98-99% at 4K DLSS Balanced with all bells and whistles.
 
That's weird. I'd try setting the display to 4K 16:9 and see if it works as expected then.

I just tested and on my LG CX 48" + 4090 + 13600K my GPU utilization is always 98-99% at 4K DLSS Balanced with all bells and whistles.
I will try 4K res later tonight and see what occurs.
 
Very nice snoopi. Try going lower on the scale with DLSS. The display resolution is half of 8K, as it is essentially two 4K displays as a single unit, and DLSS Ultra Performance's entire reason for existence was to help with 8K. So at half of 8K... you might want to try DLSS Performance and see how it goes. Assuming you don't just dump PT and max out regular RT.

Using FG to boost sub-60 fps is always going to feel kinda funky. Which is ok, as it wasn't really intended for that.
 
Very nice snoopi. Try going lower on the scale with DLSS. The display resolution is half of 8K, as it is essentially two 4K displays as a single unit, and DLSS Ultra Performance's entire reason for existence was to help with 8K. So at half of 8K... you might want to try DLSS Performance and see how it goes. Assuming you don't just dump PT and max out regular RT.

Using FG to boost sub-60 fps is always going to feel kinda funky. Which is ok, as it wasn't really intended for that.

I did try all DLSS levels (Quality to Ultra Performance), plus Reflex on/off, and the same thing always occurs: the GPU usage drop happens very rapidly ONLY when DLSS FG is enabled.

But when DLSS FG is disabled, GPU usage never drops, so I am 100% confident this is some kind of DLSS FG bug, well, in my case at least.

I can get 60 fps @ 8K, max detail, full RT/PT, DLSS FG off.

DLSS Balanced will drop me to 45 fps with DLSS FG off.

DLAA or native res (no DLSS) won't even render 1 frame, lol; it almost froze the image. I have to use the Escape key to bring up the menu screen to go back and change settings to a DLSS-enabled scenario, lol.

DLSS Quality will drop me to 30 fps.
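For context, here is roughly what each DLSS preset renders internally before upscaling to this panel's 7680x2160, assuming the commonly cited scale factors (approximate; individual games can use their own factors):

```python
NATIVE_W, NATIVE_H = 7680, 2160

# Commonly cited per-axis DLSS scale factors; treat these as approximations.
SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, s in SCALE.items():
    print(f"{mode:17s} -> ~{round(NATIVE_W * s)}x{round(NATIVE_H * s)} internal")
```

So even Performance mode is still pushing roughly a 3840x1080 internal render before the upscale.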
 
I did try all DLSS levels (Quality to Ultra Performance), plus Reflex on/off, and the same thing always occurs: the GPU usage drop happens very rapidly ONLY when DLSS FG is enabled.

But when DLSS FG is disabled, GPU usage never drops, so I am 100% confident this is some kind of DLSS FG bug, well, in my case at least.

I can get 60 fps @ 8K, max detail, full RT/PT, DLSS FG off.

DLSS Balanced will drop me to 45 fps with DLSS FG off.

DLAA or native res (no DLSS) won't even render 1 frame, lol; it almost froze the image. I have to use the Escape key to bring up the menu screen to go back and change settings to a DLSS-enabled scenario, lol.

DLSS Quality will drop me to 30 fps.
Exactly the same scenario as I discovered, so you are not alone. I have a 7700X and a 4090. With Frame Gen my usage drops to around 80% after a few seconds. Without Frame Gen it's 100%.
 
Exactly the same scenario as I discovered, so you are not alone. I have a 7700X and a 4090. With Frame Gen my usage drops to around 80% after a few seconds. Without Frame Gen it's 100%.
Interesting.

Do you think that means there is a CPU bottleneck, or just a DLSS FG bug?

Why does GPU usage drop to 80% when DLSS FG is enabled?

It basically means that enabling FG is utterly pointless and counterproductive for me.

Because with FG enabled I can get, say, 60 fps, but after a few seconds GPU usage will drop from 100% to 80%, and hence I will be left with only 45 fps, because that is approx. 20% less than what 100% GPU utilisation would output.

If I disable FG, then I also get 45 fps, BUT without the FG latency.

So in my case, it is absolutely pointless to enable FG until this can be fixed.
 
I did try all DLSS levels (Quality to Ultra Performance), plus Reflex on/off, and the same thing always occurs: the GPU usage drop happens very rapidly ONLY when DLSS FG is enabled.

But when DLSS FG is disabled, GPU usage never drops, so I am 100% confident this is some kind of DLSS FG bug, well, in my case at least.

I can get 60 fps @ 8K, max detail, full RT/PT, DLSS FG off.

DLSS Balanced will drop me to 45 fps with DLSS FG off.

DLAA or native res (no DLSS) won't even render 1 frame, lol; it almost froze the image. I have to use the Escape key to bring up the menu screen to go back and change settings to a DLSS-enabled scenario, lol.

DLSS Quality will drop me to 30 fps.
You could try swapping the DLSS DLL files to the latest versions and see if that helps in any way.

Otherwise all that seems pretty similar to what I got trying to run the game with 2x 4K screens in Nvidia Surround. Native res and DLAA were 1 fps if even that.
 
You could try swapping the DLSS DLL files to the latest versions and see if that helps in any way.

Otherwise all that seems pretty similar to what I got trying to run the game with 2x 4K screens in Nvidia Surround. Native res and DLAA were 1 fps if even that.

That kind of performance and yet some people want actual 8K monitors which is 4x 4K screens lol. Yeah yeah I know not every game is path traced or even ray traced, but moving forward almost every game will have some form of ray tracing and I can't imagine what kind of awful performance you're going to get trying to ray trace at 8K even on an RTX 6090.
 
You could try swapping the DLSS DLL files to the latest versions and see if that helps in any way.

Otherwise all that seems pretty similar to what I got trying to run the game with 2x 4K screens in Nvidia Surround. Native res and DLAA were 1 fps if even that.
Pretty sure I already updated the DLSS files.
 
That kind of performance and yet some people want actual 8K monitors which is 4x 4K screens lol. Yeah yeah I know not every game is path traced or even ray traced, but moving forward almost every game will have some form of ray tracing and I can't imagine what kind of awful performance you're going to get trying to ray trace at 8K even on an RTX 6090.
I mean people want 8K displays mainly for desktop use and would game at 4K on them.

Samsung really messed up by not supporting 5Kx2K and 6Kx2K out of the box on their super ultrawide. Those would be great compromises, with higher performance and less FOV distortion.
 
I mean people want 8K displays mainly for desktop use and would game at 4K on them.

Samsung really messed up by not supporting 5Kx2K and 6Kx2K out of the box on their super ultrawide. Those would be great compromises, with higher performance and less FOV distortion.

Yeah, but if you are gaming at "4K" while actually DLSS upscaling to 8K, that is even more demanding than just running 4K + DLAA, which already has terrible frame rates in heavy RT/PT. I was not a huge fan of RT and would just turn it off in every game until PT came out and I found it to be really game-changing to the visuals; I'm pretty sure Nvidia will continue to push for more and more PT in future titles. So for me at least, 8K is just stupid even if I were to buy it for desktop use, because it's going to be unplayable in any PT game even if you use DLSS Performance, which renders at 4K. Ultra Performance would render 1440p internally and then upscale to 8K, but from the initial testing done by DF it looks pretty bad.
 
I'd love to be able to run 5K, 6K, and some UW resolutions letterboxed on a big 8K 1000R Ark or flat screen etc. in the future, but so far I think DSC won't allow for that.

I hope that some years ahead they might actually add motion-vector handles to everything in game-engine development and OS + peripheral drivers, so that frame generation could do multiple frames accurately: frame gen informed by the actual vectors of everything, instead of an AI only guessing vectors by comparing two frames with no real vector information/handles given to it. (This would also help VR/MR/XR, as they are going to need very high resolutions in the future.)

The other possibility would be if some manufacturer actually made very well-performing upscaling internal to an 8K gaming TV/monitor, on the monitor end of the equation, maybe kind of like how G-Sync chips were added (Nvidia Shields have AI upscaling on them too)... one that wouldn't add lag or have other issues and would boost the rez *after* the 4K + high-Hz signal got to the TV. Not holding out hope for that any time soon, if ever, though. I think Display Stream Compression was tech's answer for that for now instead.

I prioritize HDR for those kinds of visuals, but ray tracing is too demanding to be worth it IMO. I was one who had zero interest in 4K when it was stuck at 30 Hz, for example, or even when it could do 60 Hz with low frame rates. The pixel-density increase in picture quality wasn't worth the performance crash. I kept a 1440p Cinema Display for desktop/apps/imagery next to one of the first 27" 1080p 120 Hz screens at one point too (non-G-Sync 120 Hz at that point).

It's hard for me to get away from using more than one screen, given the different techs and capabilities. I'd consider one of those hydraulic pillar arms people use to hide their TV behind things, but instead to raise a gaming screen up in front of a big 8K desktop/app screen and then lower it down, lol. I'd use DisplayFusion and some Stream Deck keys to make the 8K screen cut out its space in a custom virtual monitor array or something, leaving the gaming-screen area empty on the 8K until I was done gaming, then hit a button to switch the 8K screen back to full. DIY native-rez gaming space, superimposed. Sounds a little crazy, but I've considered doing it in the future.

Regardless, I feel like 8K of screen real estate would be worth having a screen just for desktop use alone until tech catches up on the gaming side, and using a different, more gaming-capable 4K or higher monitor on the side. I'll probably end up doing that once more 8K screens/gaming TVs come to market in competition with each other, even if I had to put them side by side across two walls along the corner of a room or something instead of some periscope up/down or sliding scheme :LOL:

 
I'd love to be able to run 5K, 6K, and some UW resolutions letterboxed on a big 8K 1000R Ark or flat screen etc. in the future, but so far I think DSC won't allow for that.

I hope that some years ahead they might actually add motion-vector handles to everything in game-engine development and OS + peripheral drivers, so that frame generation could do multiple frames accurately: frame gen informed by the actual vectors of everything, instead of an AI only guessing vectors by comparing two frames with no real vector information/handles given to it. (This would also help VR/MR/XR, as they are going to need very high resolutions in the future.)

The other possibility would be if some manufacturer actually made very well-performing upscaling internal to an 8K gaming TV/monitor, on the monitor end of the equation, maybe kind of like how G-Sync chips were added (Nvidia Shields have AI upscaling on them too)... one that wouldn't add lag or have other issues and would boost the rez *after* the 4K + high-Hz signal got to the TV. Not holding out hope for that any time soon, if ever, though. I think Display Stream Compression was tech's answer for that for now instead.

I prioritize HDR for those kinds of visuals, but ray tracing is too demanding to be worth it IMO. I was one who had zero interest in 4K when it was stuck at 30 Hz, for example, or even when it could do 60 Hz with low frame rates. The pixel-density increase in picture quality wasn't worth the performance crash. I kept a 1440p Cinema Display for desktop/apps/imagery next to one of the first 27" 1080p 120 Hz screens at one point too (non-G-Sync 120 Hz at that point).

It's hard for me to get away from using more than one screen, given the different techs and capabilities. I'd consider one of those hydraulic pillar arms people use to hide their TV behind things, but instead to raise a gaming screen up in front of a big 8K desktop/app screen and then lower it down, lol. I'd use DisplayFusion and some Stream Deck keys to make the 8K screen cut out its space in a custom virtual monitor array or something, leaving the gaming-screen area empty on the 8K until I was done gaming, then hit a button to switch the 8K screen back to full. DIY native-rez gaming space, superimposed. Sounds a little crazy, but I've considered doing it in the future.

Regardless, I feel like 8K of screen real estate would be worth having a screen just for desktop use alone until tech catches up on the gaming side, and using a different, more gaming-capable 4K or higher monitor on the side. I'll probably end up doing that once more 8K screens/gaming TVs come to market in competition with each other, even if I had to put them side by side across two walls along the corner of a room or something instead of some periscope up/down or sliding scheme :LOL:

Even if we ignore RT/PT, the newest games are still ridiculously demanding at 4K. Remnant 2 has no RT at all, but it will run at sub-60 fps on a 4090 at native 4K; now imagine how bad the FPS is going to be if you run it at 4K DLSS upscaled to 8K. It seems like being very taxing on the system is going to be a common theme with UE5 titles, and a whole bunch of games are going to be made on UE5, like The Witcher 4, Hellblade 2, etc. So trying to play any of those games at 4K -> 8K DLSS isn't going to yield a good experience IMO.
 
Yeah, but if you are gaming at "4K" while actually DLSS upscaling to 8K, that is even more demanding than just running 4K + DLAA, which already has terrible frame rates in heavy RT/PT. I was not a huge fan of RT and would just turn it off in every game until PT came out and I found it to be really game-changing to the visuals; I'm pretty sure Nvidia will continue to push for more and more PT in future titles. So for me at least, 8K is just stupid even if I were to buy it for desktop use, because it's going to be unplayable in any PT game even if you use DLSS Performance, which renders at 4K. Ultra Performance would render 1440p internally and then upscale to 8K, but from the initial testing done by DF it looks pretty bad.
I'd expect it would be either "set the game to 4K, then use DLSS on top of that for 4K DLSS scaling factors" or "use 8K + DLSS", especially if we get e.g. 8K @ 60 Hz + 4K @ 120+ Hz type display solutions.

While you can use DLSSTweaks to alter the scaling factors for DLSS, it would be nice if Nvidia could provide a more granular solution eventually.
 
Well, I just upgraded from a 12900K to a 14900K on the exact same system, and the FG issue still happens!

Once FG is enabled in CP2077, GPU usage drops by 20-25%, resulting in nearly the same FPS as with no FG, where 100% of the GPU is used.

So it can't be a CPU bottleneck!
It has to be a game or DLSS bug.

EDIT:
I tried Dying Light 2 and that works normally with FG; GPU usage is at 100%. With or without FG, GPU usage is 100%.

Now that I know the issue is not related to the resolution of this monitor, I will stop discussing it here unless members still want to. Thanks guys.
 
Even if we ignore RT/PT, the newest games are still ridiculously demanding at 4K. Remnant 2 has no RT at all, but it will run at sub-60 fps on a 4090 at native 4K; now imagine how bad the FPS is going to be if you run it at 4K DLSS upscaled to 8K. It seems like being very taxing on the system is going to be a common theme with UE5 titles, and a whole bunch of games are going to be made on UE5, like The Witcher 4, Hellblade 2, etc. So trying to play any of those games at 4K -> 8K DLSS isn't going to yield a good experience IMO.

Remnant 2 isn't from a big publisher/dev team, so it might not be optimized well even on UE5, but UE5 does have a high graphics-ceiling potential. I've always said that graphics ceilings are arbitrarily set by devs in the first place anyway. Their challenge is more about whittling games down to fit "real time".

By the time manufacturers start really competing with 8K screens we'll probably be on the Nvidia 5000 series (and onward later), which should be a bump up. Hopefully frame gen will continue to mature too.
 
Remnant 2 isn't from a big publisher/dev team, so it might not be optimized well even on UE5, but UE5 does have a high graphics-ceiling potential. I've always said that graphics ceilings are arbitrarily set by devs in the first place anyway. Their challenge is more about whittling games down to fit "real time".

By the time manufacturers start really competing with 8K screens we'll probably be on the Nvidia 5000 series (and onward later), which should be a bump up. Hopefully frame gen will continue to mature too.

Maybe it's unoptimized, but again, it doesn't even have RT; UE5 titles that implement a wide range of RT are going to run just as bad if not worse. Unless the RTX 5090 is somehow more than twice as fast as a 4090, I don't see it making heavy RT/PT playable at 4K. I already have to drop down to DLSS Balanced and rely on FG to get acceptable frame rates on a 4090, which means rendering 2227x1253 internally before upscaling to 3840x2160. A 5090 won't suddenly make me able to play at the same settings with acceptable frame rates at 3840x2160 DLSS upscaled to 7680x4320. Of course you could just turn off all the RT effects, but to me that's kinda like turning off HDR; you lose a lot of eye candy.
 
Yes, I'd turn off RT in that scenario, as I said in my previous reply. But I'll probably end up with an 8K just for desktop/apps/imagery and vids rather than doing any heavy gaming on it, outside of isometric and turn-based stuff.
 
Yes, I'd turn off RT in that scenario, as I said in my previous reply. But I'll probably end up with an 8K just for desktop/apps/imagery and vids rather than doing any heavy gaming on it, outside of isometric and turn-based stuff.

That's a use case I can get behind. It's the idea of a single 8K screen for doing everything that seems silly to me; we just aren't quite there yet in terms of having the power to drive it.
 
That's a use case I can get behind. It's the idea of a single 8K screen for doing everything that seems silly to me; we just aren't quite there yet in terms of having the power to drive it.
Let's say that Samsung releases a 55" 8K Ark model with a 120 Hz panel, or a dual-mode "8K @ 60 Hz + 4K @ 120 Hz" panel, since that's a thing now. To run 8K @ 120 Hz even on the desktop, our current port options would be:

Port | 8K 120 Hz 8-bit | 8K 120 Hz 10-bit
HDMI 2.1 (48 Gbps) | DSC 3.0x | DSC 3.75x (3.0x is just shy of 120 Hz, but chroma subsampling would do)
DP 2.1 UHBR13.5 | DSC 2.0x | DSC 2.5x or 3.0x
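A rough sanity check of the bandwidth math behind that table (a sketch only: it uses a crude blanking fudge factor instead of real CVT-RB2 timings and approximate link payload rates, so the exact DSC ratios differ a bit):

```python
def raw_gbps(w, h, hz, bpc, blanking=1.10):
    """Approximate uncompressed video data rate in Gbit/s (3 color channels)."""
    return w * h * hz * bpc * 3 * blanking / 1e9

need = raw_gbps(7680, 4320, 120, 10)  # 8K @ 120 Hz, 10-bit RGB
print(f"~{need:.0f} Gbps uncompressed")
print(f"DSC needed on HDMI 2.1 (~42 Gbps payload): {need / 42:.1f}x")
print(f"DSC needed on DP 2.1 UHBR13.5 (~52 Gbps payload): {need / 52:.1f}x")
```

With those assumptions you land around 3x compression on HDMI 2.1 and around 2.5x on UHBR13.5 for the 10-bit case, which is roughly what the table shows.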

So just to be realistic since these are really pushing it, it would be 8K @ 60 Hz for desktop, but a 120 Hz panel could be used for 4K @ 120 Hz gaming. Then you have options:
  • Running games at 4K -> integer scaled to 8K. But this would probably limit you to 60 Hz since the res remains at 8K.
  • No scaling for 4K at smaller size on the screen. Might have to be able to move the ARK closer to you, which might not be feasible.
  • 8K @ 60 Hz + DLSS Performance for 4K render res. Too demanding for AAA RT/PT games.
  • 8K @ 60 Hz + DLSS Ultra Performance for 1440p render res. Within reason for AAA RT/PT games, as long as you are ok with 30-60 fps.
  • 4K @ 120 Hz + DLSS. Should work for same performance as current 4K displays. Hopefully the display can do the integer scaling tho!
I don't think anyone is expecting to game at 8K. If anything, I would expect many games to really struggle even launching at 8K, as their memory management etc. might not be up to it. Cyberpunk is a good example: without DLSS, at 8Kx2K it is at sub-1 fps levels and seems to really get messed up, to the point where even getting back to the menu is a challenge.

I do agree it's a challenging scenario, but I'd still love to see it become a reality. To me these 77"+ 8K TVs and 32" 8K monitors are the stupidest formats for the resolution. 50-60" would be a great size but it's also large enough that realistically it's your only display and it better be curved too.

Dual/triple 4K screens is still far more cost effective and avoids a lot of issues. I think my next setup might be a 40" 5120x2160 + 27-32" 4K which should avoid most of the problems of the Neo G9 57" while retaining most of its perks.
 