Samsung Odyssey Neo G9 57" 7680x2160 super ultrawide (mini-LED)

Has anyone seen anything online about when laptops with DisplayPort 2.1 are likely to come out? My work monitor is a 7-year-old 40 inch Philips 4k, the first of that form factor to ever come out, still going strong. All the ultrawides have been too small in height so far, but this one is finally almost the same height, with extra width and an even better ppi. I know that I could build a desktop with the latest AMD GPU, but I would rather just upgrade my work laptop to something that can power this. And even though we don't yet know what this monitor can do via HDMI 2.1, throughput-wise it doesn't seem technically possible to run it fully over that. I haven't seen a single thing online about laptops with DP 2.1, so I thought I'd ask here.
 


There are some laptops with a DP 1.4 output, but it's not really standard on laptops. A few of the current top gaming laptops have mini DP 1.2, which is lower rez/bandwidth. At least one has DP 1.4, but typically they have HDMI 2.1 and USB-C display capability. Even if you got a DP 2.1 laptop 2 or more years after it became standard in desktop gpus, a lot of laptop gpus, even if they sound like the same model number-wise (e.g. "rtx 3080 laptop"), are usually weaker than desktop ones to begin with and throttle power delivery/performance vs heat a lot more than desktop gpus . . so not well suited for pushing a very high rez + very high Hz.

You'd only need the full bandwidth when running the full resolution of this at the highest Hz. You will probably need a desktop "5000" series gpu on the nvidia side of things to get DP 2.1 (assuming they'll have DP 2.1 on the "5000" series), or an AMD desktop gpu, which already has it. I could be wrong, but DP 2.1 probably won't show up in laptops until much later and in few models. There are some gpu passthrough boxes you can get for a laptop, but at that point you're probably better off building a micro-ATX or mini-ITX machine with a full gpu and using a small dufflebag + portable screen.

. . . . . . . . . .

This calculator from Linus Tech Tips is pretty neat. I set the link to default to 7680x2160, 240 Hz, 10bit using Display Stream Compression (DSC) 2x, which fits into DP 2.1's bandwidth, but you can swap values using that calculator to whatever you want.

https://linustechtips.com/topic/729...ssion=dsc2.0x&calculations=show&formulas=show

So that fits into dp 2.1 if using 2x compression via DSC. If you were using hdmi 2.1 and DSC 2x was available you could get full rez 10 bit up to around 150 fpsHz or so, so not that bad really especially considering the resolution you'd be pushing.
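If you want to sanity-check modes without the web calculator, here's a rough Python sketch of the same arithmetic. The effective payload rates and the flat ~7% blanking allowance are my own approximations (the LTT calculator models timings more exactly), and the function name is just illustrative:

```python
# Rough link-budget check for the mode above: 7680x2160, 240 Hz, 10-bit RGB, DSC 2x.
# Payload rates are approximate effective figures after encoding overhead.
LINKS_GBPS = {
    "DP 1.4 (HBR3)":      25.92,
    "HDMI 2.1 (FRL 48G)": 42.67,
    "DP 2.1 (UHBR20)":    77.37,
}

def needed_gbps(h, v, hz, bpc, dsc_ratio=1.0, blanking=1.07):
    """Approximate bandwidth in Gbit/s for an RGB signal, optionally DSC-compressed."""
    return h * v * (3 * bpc) * hz * blanking / dsc_ratio / 1e9

mode = needed_gbps(7680, 2160, 240, 10, dsc_ratio=2.0)   # ~64 Gbps
for name, cap in LINKS_GBPS.items():
    verdict = "fits" if mode <= cap else "does not fit"
    print(f"{name:20s} {verdict} ({mode:.0f} of {cap} Gbps)")
```

Which lines up with the above: 240 Hz at 10bit with DSC 2x fits into DP 2.1, but not HDMI 2.1 or DP 1.4.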

. . . . . . . . . . . .

I think it could be advantageous if it were possible for mfgs to put better AI upscaling (for gaming) in the screens themselves at some point, so that you could send high Hz 4k or other ultrawide (x1200, x1600) rez to the display at the max of the port/cable bandwidth and then AI upscale it to 8k or 2x4k ultrawide resolution on the screen end of the equation. I haven't heard any rumor of anything like that tech being developed though (i.e. adding an nvidia AI upscaling module to a gaming TV or gaming monitor to bypass the port/cable bottleneck).

A long time from now there will probably be an HDMI version upgrade for more 8k and 8k+ bandwidth, and eventually that will probably trickle into gpus, laptops, consoles, and stand-alone players. I wouldn't expect that or a DP 2.1 laptop for quite a while yet though, as DP 2.1 isn't really in full force yet itself. Maybe some mfg will do DP 2.1 in a "5000 series" full desktop power gpu in a laptop at some point sooner, like you are hoping, but I wouldn't count on it.

That said, it's probably not the best display to use a gaming laptop on outside of 2D desktop/app real estate and maybe windowed/smaller 1:1 rez gaming ("smaller" like full 4k heh). You don't need DP 2.1 to do that, only if you want to get very high Hz at 4k+ or the full 4k "doublewide" resolution. With DSC 2x available, you also wouldn't run out of HDMI 2.1 bandwidth at the full resolution until a little above 150 fpsHz. You could also theoretically fit ~ 275 fpsHz 4k 10bit into HDMI 2.1 using DSC 2x.
 
Thank you for the detailed writeup and the link. So actually assuming DSC 2x is genuinely visually lossless for typical work ie text editing and browsing, then 120hz/144hz will be fine via a laptop with HDMI 2.1, and that's more than enough.

I get the feeling that they'll also enable the use of 2x HDMI ports, the way that high-res monitors used to a long time ago, so if you have a desktop Nvidia card you just connect both, set it up in surround, and be able to get 240hz out of it with a 4000 series GPU. Given Nvidia's dominance, I can't see them releasing this without the ability to do demos of the full capability using 4090 PCs.
 
Thank you for the detailed writeup and the link. So actually assuming DSC 2x is genuinely visually lossless for typical work ie text editing and browsing, then 120hz/144hz will be fine via a laptop with HDMI 2.1, and that's more than enough.
I have never been able to tell any difference with DSC in use so I just don't feel it's worth even a consideration. The bigger problem is that DSC does not work with everything. I don't know how PC laptops deal with it, but Macs have historically had broken DSC more often than not. It does seem to work on my M2 Max Macbook Pro 16" though at least with my current 4K 144 Hz display over USB-C -> DP 1.4.

I get the feeling that they'll also enable the use of 2x HDMI ports, the way that high-res monitors used to a long time ago, so if you have a desktop Nvidia card you just connect both, set it up in surround, and be able to get 240hz out of it with a 4000 series GPU. Given Nvidia's dominance, I can't see them releasing this without the ability to do demos of the full capability using 4090 PCs.

2x HDMI will work for Picture by Picture mode, but that is likely to limit the resolution to something without DSC, like 4K 120 Hz, without HDR or VRR support. This is fine for desktop use of course. Ideally I'd like to run 21:9 + 11:9 like on the Samsung CRG9 I had.

I tried running two Samsung G70A 4K 144 Hz displays via Nvidia Surround and the only way to get it working was at 4K 120 Hz and even that required some trickery.

I do agree they want to make this also something that appeals to Nvidia owners. I have a 4090 that would work great with this display aside from the lack of DP 2.1.
HDMI 2.1 + DSC 3.0x compression should be capable of 200-220 Hz @ 10-bit and 240 Hz @ 8-bit color. If it has the cheaper 40 Gbps ports then that might get limited to 175-190 Hz @ 10-bit color.

Which would be realistically fine. If you look at the benchmarks I did with dual 4K displays in Nvidia Surround earlier in this thread, it's hard to get much beyond 150 fps in most games if you stick to full res or narrower ultrawide resolutions. 4K 240 Hz might still be possible if the display just scales it correctly instead of stretching it. Getting the max possible refresh rate over HDMI is going to be mainly headroom for games that truly can run at extreme framerates.

So far the display has been demoed with AMD GPUs and DP 2.1 and I expect they will do that going forward. Gamescom is in late August and IFA Berlin in early September so maybe we will see the display (or the TCL version) there.
 


3x compression fits even more bandwidth yep. I guess 2x vs 3x DSC compression doesn't matter as long as it's still lossless compared with 2x. That calculator on LTT is pretty handy.

Like you indicated and according to that LTT calculator, theoretically if a display were capable of it DSC 3x compression over DP 2.1 could get almost 380 fpsHz at 7680x2160 10bit (and hdmi 2.1 could get ~ 220 fpsHz).

These 4x "doublewide" screens are being developed to use DSC so hopefully won't be an issue there with windows gaming. I'm assuming you are saying dsc doesn't work with every random display type or laptop's ports, laptop gpu systems rather than it not working with OS or apps, games.


So actually assuming DSC 2x is genuinely visually lossless for typical work ie text editing and browsing, then 120hz/144hz will be fine via a laptop with HDMI 2.1, and that's more than enough.



https://www.displayport.org/faq/#tab-display-stream-compression-dsc

How does VESA's DSC Standard compare to other image compression standards?



Compared to other image compression standards such as JPEG or AVC, etc., DSC achieves visually lossless compression quality at a low compression ratio by using a much simpler codec (coder/decoder) circuit. The typical compression ratio of DSC range from 1:1 to about 3:1 which offers significant benefit in interface data rate reduction. DSC is designed specifically to compress any content type at low compression with excellent results. The simple decoder (typically less than 100k gates) takes very little chip area, which minimizes implementation cost and device power use, and adds no more than one raster scan line (less than 8 usec in a 4K @ 60Hz system) to the display’s throughput latency, an unnoticeable delay for interactive applications.

What testing did VESA perform to determine that DSC is visually lossless?



During the development of DSC, the contributing VESA member companies performed ongoing visual tests to uncover visual compression artifacts using different image types and image motion. This testing was used to fine tune the DSC codec (coding / decoding algorithm). For the final DSC codec, the visual performance of DSC was evaluated through clinical testing by VESA in collaboration with member companies. The evaluation included a statistically significant number of observers who viewed many images over four image categories including artificial engineered images, text and graphics, such as street maps or different examples of printed material, people, landscape, animals and stills. Overall, observers completed nearly 250,000 subjective image comparisons. VESA members also concluded subjective testing as a far more robust method to verify visually lossless quality rather than using objective metrics, such as, PSNR which typically designates one value for an image. The results of this testing indicated that DSC met the visually lossless criteria set by VESA. The details of testing methodology and results were published by VESA in early 2015.
 
Also found this on reddit, a reply that might be of interest:

. . . . . . . . . . . <QUOTE> . . . . . . .

Short: Your GPU, your dock, and your adapter need to support DSC. Your cable and your monitor don't need to support DSC.

Long:

  • Your GPU must support DSC. DSC starts from here as it is your GPU that compresses the outgoing display streams with a DSC encoder.
  • Your dock must also support DSC. Thunderbolt 3/4 docks and DP 1.4 Alt Mode docks are the mainstream types of docks that support DSC at the moment.
  • The USB-C to DP adapter must support DSC. All USB-C to DP 1.4 adapters support DSC, which is what you probably have. Edit 2: I just saw your adapter is a DP 1.2 adapter, that means none of your setups are running DSC most likely.
  • The monitor doesn't need to support DSC. Why? Because all DP 1.4 ports and adapters have what's called a "DSC decoder" where it decompresses DSC-encoded display streams so that they can work on monitors that don't have DSC.

This is why it's possible to run DSC-enabled display streams to a DP 1.2 monitor, which doesn't have DSC enabled.

Your cable does not need to support DSC.

There are exceptions to all of this such as when you're dealing with a Thunderbolt monitor, but that's a different topic.

Edit: What's happening in each setup:

Your M1 MacBook supports DP 1.4 and DSC. It is running over Thunderbolt 4/USB4. DP 1.4 has enough bandwidth to natively support 4K60 without DSC, but it is likely running DSC in order to preserve bandwidth for data. Edit: it’s not running DSC as your adapter is DP 1.2. It’s running native 4K60 without compression.

ASUS ROG Zephyrus G has a GTX 1660 Ti GPU, which supports DP 1.4 and DSC. It is running over DisplayPort 1.4 Alternate Mode. It is actually likely not running DSC as DP Alt Mode is supplying 2 Lanes of HBR3 equal to 12.96Gbps, more than enough to run 4K60 without DSC.

ASUS Chromebox is using an Intel integrated GPU that only supports DP 1.2; it does not support DSC. It is running over DP Alt Mode and supplying only 2 Lanes of HBR2, equal to 8.64Gbps, which is only enough to run 4K30.

. . . . . . . < end QUOTE> . . . . .


EDIT: According to nvidia the display does have to support DSC. But the reply I quoted above stated that all DP 1.4 ports and adapters (and so probably all ports above that, like HDMI 2.1) can decode DSC, so maybe there is a way to force it and make the display's (gaming tv/monitor) port decode it.

Nvidia:
  • When a display is connected to the GPU and is set to DSC mode, the GPU may use two internal heads to drive the display when the pixel rate needed to drive the display mode exceeds the GPU’s single head limit. This may affect your display topology when using multiple monitors. For example if two displays with support for DSC are connected to a single GeForce GPU, all 4 internal heads will be utilized and you will not be able to use a third monitor with the GPU at the same time.
  • If the GPU detects that a display supports DSC, DSC mode will be enabled automatically. Some displays may allow you to disable DSC by changing the communication link from the displays internal settings (eg. changing the DisplayPort mode from Displayport 1.4 to DisplayPort 1.2)
  • NVIDIA DSR, NVIDIA DLDSR and NVIDIA Image Scaling are not supported when DSC mode is enabled.
  • To determine if your PC monitor, notebook display or TV supports Display Stream Compression, please refer to your display manufacturer.
To learn more, please visit:
https://vesa.org/vesa-display-compression-codecs/dsc/
 
. .

https://www.pulse-eight.com/News/BlogDscVsCsc
Comparing DSC VS CSC
Display Stream Compression (DSC) standard was released in 2015 and quickly became an industry leading CODEC, recognised for delivering "visually lossless" mezzanine compression. Whereas CSC is an effective but simpler video technique that lowers bandwidth by reducing the amount of chroma information, which is unrecoverable. Below are some comparisons on video quality of the same source when comparing the two technologies.

[Images: DSC vs CSC video and pixel comparisons (RGB)]
 
Revisiting this reply one more time just to clarify. .

Thank you for the detailed writeup and the link. So actually assuming DSC 2x is genuinely visually lossless for typical work ie text editing and browsing, then 120hz/144hz will be fine via a laptop with HDMI 2.1, and that's more than enough.

The bandwidth concern is more about running games at very high fpsHz on this screen, which is 2x 4k rather than 8k. 8k would take a lot more bandwidth than even these screens. You can get 80 Hz on the desktop at native rez 10bit over HDMI 2.1 on these "4x double-wide" screens using no compression, which is very usable on a desktop for apps that are mostly static. So you really don't need to use any compression on the desktop outside of gaming if you don't want to.

. . .

DP 1.4, 7680x2160 rez 10bit:

~ 50 Hz using no compression
~ 95 Hz using DSC 2x compression
~ 144 Hz using DSC 3x compression

. . .

HDMI 2.1, 7680x2160 rez 10bit:

~ 80 Hz using no compression
~ 150 Hz using DSC 2x compression
~ 220 Hz using DSC 3x compression

. . .

DP 2.1, 7680x2160 rez 10bit:

~ 143 Hz using no compression
~ 269 Hz using DSC 2x compression
~ 380 Hz using DSC 3x compression
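As a sanity check, here's a minimal Python sketch of where those cutoffs come from, assuming CVT-RBv2-style timing (80 pixels of horizontal blanking, at least 460 µs of vertical blanking) and approximate effective payload rates per link. It lands on or within a few Hz of the figures above, HDMI being the least exact:

```python
LINKS_GBPS = {"DP 1.4": 25.92, "HDMI 2.1": 42.0, "DP 2.1": 77.37}

def needed_gbps(hz, h=7680, v=2160, bpc=10, dsc_ratio=1.0):
    """Approximate link bandwidth (Gbit/s) needed at a given refresh rate."""
    h_total = h + 80                     # CVT-RBv2 horizontal blanking
    v_total = v / (1 - 460e-6 * hz)      # keep vertical blanking >= 460 us
    return h_total * v_total * hz * 3 * bpc / dsc_ratio / 1e9

def max_hz(link_gbps, dsc_ratio=1.0):
    """Highest whole refresh rate that still fits on the link."""
    hz = 1
    while needed_gbps(hz + 1, dsc_ratio=dsc_ratio) <= link_gbps:
        hz += 1
    return hz

for link, gbps in LINKS_GBPS.items():
    plain, dsc2, dsc3 = (max_hz(gbps, r) for r in (1.0, 2.0, 3.0))
    print(f"{link:9s} ~{plain} Hz plain, ~{dsc2} Hz DSC 2x, ~{dsc3} Hz DSC 3x")
```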
 
I'm assuming you are saying dsc doesn't work with every random display type or laptop's ports, laptop gpu systems rather than it not working with OS or apps, games.
I'm saying that DSC support can be erratic. As an example, technically the AMD Radeon 5300M I had in my previous 2019 Intel Macbook Pro 16" should work with DP 1.4 DSC, but I was never able to get it working and others have reported issues with it using various displays, even with DSC breaking in OS updates. This has been a long running Apple issue but other laptops might have similar problems.

If you have a modern desktop AMD or Nvidia card, there's no point worrying about it. DSC works.

If you need to connect a laptop, that's where things get iffy. Workaround for the Samsung superultrawide would be to just use two display outputs and run in Picture by Picture mode - that's likely to work at least at 4K 60 Hz per side.
 

Yep, agree that high refresh is definitely much more relevant for gaming, but there's definitely a noticeable difference between 80 Hz and say 144 Hz for desktop silky smoothness. So most likely people using this monitor for work would want to set a higher refresh rate and rely on DSC, as long as it doesn't lead to any weird experiences, which it sounds like it shouldn't.
 
I'm getting more excited about this monitor for flight sim use. But alas, I think this may be on the backburner awaiting the NVIDIA 5000 series for DP 2.1.
 

1. 1000-nits peak, 240Hz, Custom Heatsink, Semi Glare = asus-rog-swift-oled-pg49wcd-with-49-qd-oled-panel-144hz-and-1000-nits

. . 144Hz according to the hyperlinked page. The Samsung and MSI versions are 240hz though supposedly so this format will be available in 240hz versions.
. . only 1440 px tall Zzzzzz
. . a short screen, probably ~ 13" height Zzzzz
. . even if you sat up exceptionally close to where it wouldn't be too short anymore to your perspective, the ppd would end up being sub-optimal at that point (and the sides would be far flung into your periphery).
. . 1800R curve = 1800mm radius or focal point of the curve. 1800mm = 71 inches. So you are going to be sitting far inside of the actual focal point of that curve, much farther inside of it than even on a 1000R ~ 40" radius display. It's probably going to get a lot of distortion on the sides, more and more the farther from the middle of the screen. When you are at the focal point of a curve, all of the points on the screen are equidistant from your eyes and pointed right at you; the closer you sit than that, the farther the focal point falls behind you. IMO that makes 1800R a bad choice of curvature for gaming/media. If you are spinning your head/chair to different portions of the screen, to different slices of screen windows, it wouldn't be bad, but a head-on full-field display of dynamic media and gaming is not the same thing. 800mm = 32 inches, which to me would be good for most screens - unless you had something like an 8k ark-like 55" screen at 1000R/1000mm/40" someday, mounted on a tv stand etc. Usually there is some wiggle room on 1000R screens to sit closer and use the sides for immersion in racing/flight games, but 1800R is way too broad imo.

The Asus ROG Swift OLED PG49WCD is a large 49″ super ultrawide display offering a 5120 x 1440 resolution OLED technology panel. It’s a competing model to those already announced from Samsung and MSI, and uses a Samsung QD-OLED panel but in the case of the Asus this is only 144Hz, whereas the other models are 240Hz. This seems like an odd choice given their 27″ OLED screen is 240Hz already and a 240Hz capable panel exists.

https://tftcentral.co.uk/news/samsu...0-x-1440-qd-oled-panel-and-240hz-refresh-rate

https://tftcentral.co.uk/news/msi-t...ltrawide-qd-oled-panel-and-240hz-refresh-rate



The 59" 4k-doublewide this thread is about is like two 32" 4k screens side by side. Imo it would have been better at 750 to 800R rather than 1000R but it's still better than 1800R. It's also around 15.5" tall though your view distance will determine the perceived height to your perspective - so sitting closer can help the perceived height of either screen but it will also will lower the PPD and make the curvature's focus farther off target --> worse uniformity and distortion. 1440p 1800R ~ 13" tall is not for me personally.




. . . . .

2. The link seemed to bring me to the same asus rog swift oled, but I searched the specs and found: "ASUS PA32UCXR" - 32" 4K 60Hz IPS, 2304-zone, 1600-nits, 97% DCI-P3, self-calibration, Thunderbolt 4, Q3 2023

. . 32 inch 4k is good for on top of a desk viewing angle and PPD wise.
. . 60 Hz IPS yet has far less zones than the macs, less than 1/4 of the mac screens.
. . if it's like the other pro art displays, it would have a boxy housing with air vent grills + active (audible) fan cooling profiles. Which is good to avoid aggressive ABL in order to get HDR brightness on large %'s to full screen with a long sustained duration - but it's still something to consider if that kind of thing might bother you.
. . both this and the 10,000 zones macs are a poor choice for gaming. They are 60hz and even those models that aren't limited to 60hz usually have slower gaming/display specs overall besides.


. . . . .

3. asus-launches-a-135-inch-microled-screen

Specifications:

ProArt Cinema PQ07

Panel size (diagonal): 135″ (16:9)
Resolution: 4K UHD (3840 x 2160)
Pixel pitch: 0.7815 mm
Panel backlight/type: Micro LED
Brightness: 2000 cd/m² (peak brightness)
Color saturation: 95% DCI-P3
Display color: 10-bit
HDR support: HDR-10
Special features: Fully scalable to different sizes, shapes, and ratios; with numerous customization options

.
. . . it's 135 inches so not really a consumer item or price. Seems more like a proof of concept and/or corporate showpiece.
. . . 135 inches at 4k . . 60 deg viewing angle to be inside of your central viewing angle (and at 64 PPD) starts at 94 inch viewing distance, almost 8 feet (7' 10").
. . . how micro is "micro" at 135 inches? Not factoring in viewing distance for PPD or anything, this thing is 33 PPI, so 33 pixels' worth of LEDs per linear inch if it's truly per-pixel emission - a pixel pitch of roughly 0.78 mm. (For reference, a 55" 4k is 80 ppi, 48" = 92 ppi, 42" = 105 ppi, 36" = 122 ppi, 32" = 138 ppi. 8k would be double those.)
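If anyone wants to check those PPI figures, here's a tiny Python helper (the 16:9 geometry of a 4k panel is the only assumption):

```python
import math

def ppi_4k(diagonal_in, h_px=3840, v_px=2160):
    """Pixels per inch of a 16:9 panel with the given diagonal."""
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)
    return h_px / width_in

for d in (135, 55, 48, 42, 36, 32):
    p = ppi_4k(d)
    print(f'{d}" 4k: {p:.0f} ppi, pixel pitch {25.4 / p:.2f} mm')
```

The 135" case comes out to ~33 ppi and a ~0.78 mm pitch, which lines up with the spec sheet above.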


. . . . .

Still interesting to see the tech advances with heatsinks and more zones, and especially microled - even if it's still going to be oversized and out of reach for a long time yet.
 
The 59" 4k-doublewide this thread is about is like two 32" 4k screens side by side. Imo it would have been better at 750 to 800R rather than 1000R but it's still better than 1800R. It's also around 15.5" tall though your view distance will determine the perceived height to your perspective - so sitting closer can help the perceived height of either screen but it will also will lower the PPD and make the curvature's focus farther off target --> worse uniformity and distortion. 1440p 1800R ~ 13" tall is not for me personally.

750 to 800R is very aggressive - I don't think it would work well for anything other than maybe games.

Viewing distance is a tricky thing, particularly for middle aged people or older. The eye stops being good at adjusting focus at different distances. This means that you're kind of stuck with the distance where your eyes can focus well, or you need glasses with a prescription which is specific to the distance you want.
 
While there is some wiggle room to sit a bit closer, the nearer you sit relative to the radius or focal point of the curve, the farther you are pushing that focal point behind you, so you'll get more and more distortion the farther the displayed content is from the middle of the screen.

That might not matter as much if you were rotating your chair and head like a turret at the desk, viewing segments of the monitor as desktop app windows like a multi-monitor setup minus bezels, but for gaming across the full field of the screen it would be sub-optimal, and worse the farther that focal point gets from you.


You can sit closer of course but when you sit closer than the focal point of the curve the curve distorts things since the pixels are instead pointing at a location behind you rather than where you are sitting.

The solid blue is the 1000R viewpoint, the transparent field is when sitting nearer. There is some wiggle room like the first example but the 2nd example is poor imo.


[Diagram 1: sitting slightly inside the 1000R focal point]




[Diagram 2: sitting far inside the 1000R focal point]


The focal points you'd start from as a basis are

1.8 meter for 1800R/1800mm ~ 71 inches (almost 6 feet).

1 meter for 1000R/1000mm ~ 40 inches.

.8 meter for 800R/800mm = ~ 32 inches. With some wiggle room that is much closer to desktop mounted distances.
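Putting numbers on that in Python, with an illustrative 24 inch on-desk viewing distance plugged in to show how far behind you the curve's center ends up (the 24" figure is just an assumption for the example):

```python
def focal_point_in(radius_mm):
    """Curvature rating (1800R/1000R/800R = radius in mm) converted to inches."""
    return radius_mm / 25.4

SEATED_IN = 24   # assumed on-desk viewing distance for illustration
for r_mm in (1800, 1000, 800):
    focal = focal_point_in(r_mm)
    print(f"{r_mm}R: focal point ~{focal:.0f} in away; seated at {SEATED_IN} in, "
          f"the curve's center is ~{focal - SEATED_IN:.0f} in behind you")
```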

. . .

It's a catch-22 because the farther away you sit from ultrawides/super-ultrawides, the shorter they appear to your perspective and the more belt-like they become. Most people are sitting far inside of the curvature, far short of the focal point of ultrawides - by necessity since they are mounted on a desk (and they would look too short sitting far enough away besides). If they had a screen size (incl. physical height) vs. curvature that were balanced enough to keep the screen somewhat taller to your perspective it would be a lot better in my opinion.



I'd take a 55" 8k ark-like screen at 1000R, mount it on a floor stand and sit 36 to 40" away from it depending what I was playing. With the shorter heights of the ultrawides/super-ultrawides it brings you back to:

perceived screen height -VS- curvature/distance from focal point of curve -VS- distortion (and uniformity depending on panel type).


. . . . . . . . .

Here is the Xeneon Flex at 800R. It's nowhere near as long as the super ultrawides but gives an idea of the curvature:

[Images: Corsair Xeneon Flex 45" bendable OLED shown curved to 800R]


. .

https://www.pcmag.com/reviews/corsair-xeneon-flex

But once curved, the screen looks good. It curves to about 800R, which is even tighter than the Samsung Odyssey Ark’s big, swooping curve of 1000R. (That 1000R measurement means that a ring of these monitors edge to edge would complete a circle with a radius of 1,000mm. The lower the number, the more pronounced the curvature.) The curvature makes the screen look bigger than it is, helps reduce glare, and delivers deeper perceived blacks, which enhances the immersive feel while gaming. Out of the three most common curvatures monitors offer (1800R, 1500R, and 1000R), 1000R was previously considered closest to the curvature of our field of vision, but that claim now goes to 800R.

That "1000R was considered closest to our field of vision, but that claim now goes to 800R" is only accurate when you are talking about the screen being up close mounted on a desk though - as like I said, it's a function of the "R" radius of the curvature of a circle, like the focal point of a curve/lens.

At the viewing distance of the curvature, all pixels would be pointed directly at you like tiny laser pointers.

[Diagram: every pixel pointed at the viewer when seated at the curve's focal point]


As you sit closer than the ~ 40 inch view distance that 1000R eyeball is at, the curve would have to curve more to match you being in the center of a smaller sphere, and vice versa. If you were at 1800R you'd have to sit back almost 6 feet (~ 71 inches) away to get the same equidistance / focal point.

For reference, viewing a 32" 4k - which the 57" 7680x2160 screen is essentially two of side by side - you would get a full field view at around 60 degrees horizontal, which is a 24 inch view distance. You could think of the screen as a 32" 4k centrally with a 1920px-wide wing added to each side. The farther away you sit from that range, the shorter the screen will appear to your perspective, which is no good imo. I wouldn't want shorter than what a 32" screen at normal 32" viewing distances gets.

There is a little wiggle room to use the sides for immersion, but when you are sitting 24" away (610R, equivalent to a 60 deg view on a 32" 4k height-wise) to 30" away (762R, equivalent to a 50 deg view on a 32" 4k height-wise) from a 40" radius (1000R) screen to give it enough perceived height, a curve matched to that closer range would have been a better starting point. I'd rather the actual curve matched a full-field view from my regular seating position in the first place, and then sit closer at times - warping it some (increasingly toward the far ends) when playing racing/flight games - with a high ppi screen. On a 32" 4k, a 24" view distance still gets 64 PPD and a 30" view distance still gets 77 PPD.
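For anyone who wants to plug in their own numbers, here's that viewing-distance/PPD arithmetic as a small Python sketch for a flat 32" 4k panel. The PPD figure is the simple horizontal-pixels-per-degree approximation used above, not an exact per-pixel measurement:

```python
import math

def distance_and_ppd(fov_deg, diagonal_in=32, h_px=3840, v_px=2160):
    """View distance (inches) for a given horizontal FOV on a flat 16:9 panel,
    plus the rough pixels-per-degree figure (h_px / FOV)."""
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)
    distance = (width_in / 2) / math.tan(math.radians(fov_deg / 2))
    return distance, h_px / fov_deg

print(distance_and_ppd(60))   # ~24 in away, ~64 PPD
print(distance_and_ppd(50))   # ~30 in away, ~77 PPD
```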
 
Last year, I enjoyed getting to try the Odyssey Neo G8, and although I really ended up disliking the curve and the thick AR coating on that thing, it was still fun to get to try it out. I was hoping that this year I could have a similar summer experience trying out this Neo G9, but for whatever reason, it seems like it's not coming out in the summer like the previous model did. Does anyone here have any idea when Samsung might release this?
 

There are only rumors. Someone on YouTube said that Samsung people hinted that it will be released in about two months.
 
For a sim-racing only setup, and based on what is known so far, would you all suggest waiting for the Neo G9 57, or pulling the trigger on the OLED G9 49" now? My GPU is a 4090, which looks sufficient for either monitor based on testing earlier in this thread. Thanks for any insight you can spare. The folks on here clearly know their stuff.
 
Based on early Reddit reports, the OLED G9 seems like it has plenty of firmware issues so I'd probably wait a few months for the worst to be fixed.

For a simracing rig the OLED G9 might be the best option. It's going to have faster pixel response times and the 5120x1440 res is easier to run closer to 240 fps than 7680x2160. The Neo G9 57" is going to be a better all around productivity and gaming display.

I don't get why Samsung made the OLED G9 only 1800R though, when OLED should be much easier to curve to 1000R than a mini-LED display.
 

Thank you for the input. I was disappointed when I saw the 1800R curvature of the G9 OLED. I would have definitely preferred 1000R (or even 800R for my use case). The main thing drawing me towards the 57" is the increased FOV both vertically and horizontally. I calculated it will be 144 degrees horizontally and 42 degrees vertically for the 57", and only 112 horizontally and 37 degrees vertically for the 49". Eye to screen distance will be 20", so I'm also a little worried that 1440p will not be high enough resolution, but as you mentioned, 240hz will not be achievable for some time on the Neo G9 57.
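Those FOV figures can be reproduced with a bit of geometry. The Python sketch below treats each panel's nominal width as the chord of its curvature arc and puts the eye 20 inches from the center of the screen on its axis; the panel dimensions are approximations and the function is just for illustration:

```python
import math

def curved_fov(width_in, height_in, radius_mm, eye_dist_in):
    """Approximate horizontal/vertical FOV (degrees) of a curved screen,
    treating the nominal width as the chord of the curvature arc."""
    R = radius_mm / 25.4                          # curve radius in inches
    phi = math.asin(width_in / 2 / R)             # half-angle of the arc at its center
    edge_x, edge_y = R * math.cos(phi), R * math.sin(phi)
    eye_x = R - eye_dist_in                       # eye on the screen's central axis
    h_fov = 2 * math.degrees(math.atan2(edge_y, edge_x - eye_x))
    v_fov = 2 * math.degrees(math.atan(height_in / 2 / eye_dist_in))
    return h_fov, v_fov

# Neo G9 57" (32:9, ~54.9" x 15.4", 1000R) and OLED G9 49" (~47.2" x 13.3", 1800R)
print(curved_fov(54.9, 15.4, 1000, 20))   # ~ (144, 42) degrees
print(curved_fov(47.2, 13.3, 1800, 20))   # ~ (112, 37) degrees
```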
 

2025 for Ada-Next GPUs. By that time there might even be a better display than the G9.

https://www.hardwareluxx.de/index.p...ration-ada-lovelace-nachfolger-erst-2025.html

 
Ya it seems like the OLED G9 is the better buy, but I will still wait a few months regardless I think.
 
https://tftcentral.co.uk/articles/s...rom-samsung-improvements-and-changes-for-2023
The G9 will use the Gen2 panel to its highest capabilities; sadly it doesn't come with a heatsink like the ASUS version, and it has an inappropriate 1800R curve which shouldn't be used on a 49" 5120x1440.

Based on the reports and measurements in reviews, it doesn't look like there's any significant difference when compared to Gen1 other than the higher refresh rate and possibly slightly better text quality. In particular, no real improvement in brightness.
 
That's true based on RTINGS' 65" S95B vs 65" S95C comparison: a 50-100 nit maximum uplift across multiple HDR and SDR tests.
 
Listed for preorder in Canada: https://www.bestbuy.ca/en-ca/produc...g-monitor-ls57cg952nnxza-black-white/17160882

With a September 11 release date!

Not much new specs-wise, other than that it weighs a whopping 15.4 kg without the stand and has 3x HDMI ports.

Lack of USB-C is disappointing as that would make for an easy connection to my Mac. I'm guessing that since USB-C would have been limited to DP 1.4 speeds anyway, they considered more HDMI 2.1 ports to be the better option. Thankfully I have an M2 Mac so I can just use the HDMI 2.1 port on that.
 
  • Like
Reactions: elvn
like this
Interesting. I ran it through a calculator for usd and my sales tax.

Before Tax Price: $2,525.00
Sales Tax: 8.75% or $220.94
After Tax Price: $2,745.94
 
So looks like it will be $2,499 in the US. Not too bad really. Going to wait for it to appear on the US Samsung site for military discount.
 
Going to buy this to replace my PG32UQX since it's basically two of them side by side, albeit less bright but with way faster response times. Really hope that rumor of DisplayHDR 1000 is true so we don't have to deal with Samsung's local dimming ABL.

Even a 4090 will be rough with this in AAA titles and will require DLSS/frame gen. Otherwise I'm okay playing some games in 16:9.
 
It has fewer total pixels than DLDSR 2.25x of 4k (5760x3240), because it's fake 8k (7680x2160). So anyone with a 4k display can try running that DLDSR mode to see how demanding it would be.
 



It's essentially two 32" 4k screens side by side with no bezels, but with a 1000R ~ 1000mm ~> 40" radius curve.

Or you can think of it as a central 32" 4k with another 32" 4k split and added to each end as wings.

You should be able to run 4k or slightly wider than 4k at times if you want something more central, or you can use DLSS AI upscaling + DLSS AA and sharpening (quality mode of course) on games that support it. If it were based on the equivalent ultrawide rez wider than a 32" 1440p screen, it would be upscaling from 5120x1440 up to 7680x2160.

There are a number of games that already get over 120 fpsHz in 4k without DLSS when using a top tier card, without RTX tanking the frame rate, or at very high+ / ultra-minus settings. DLSS would be even faster though, and this is a 240hz screen.

4k = 8,294,400 pixels

5120 x 1440 = 7,372,800 pixels

The performance even with a 3070 at 5120x1440 seems good in this video. I haven't found any 4000 series charts/vids at 5120x1440 at a glance. Without DLSS would be a better measure since you'd probably be upscaling "from" (after) 5120x1440 on one of these screens rather than up "to".
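For reference, the pixel counts being compared here (plain Python arithmetic):

```python
modes = {
    '3840x2160 (4k)':                  3840 * 2160,
    '5120x1440 (49" super ultrawide)': 5120 * 1440,
    '7680x2160 (57" doublewide)':      7680 * 2160,
    '5760x3240 (DLDSR 2.25x of 4k)':   5760 * 3240,
}
for name, px in modes.items():
    print(f"{name:33} {px:>12,} px  ({px / (3840 * 2160):.2f}x 4k)")
```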





..frame amplification (nvidia's version is DLSS Frame Generation) tech that should mature over time/gpu gens. The higher the PPD / pixel density, the tinier any occasional edge/fringe artifacts should be on DLSS/frame amplification tech too.

..the "5000 series" gpus will come out sooner or later (with dp 2.1 I'm assuming). Time flies.

So either way you can get higher fps on the screen, let alone on less demanding games, yet still get a ton of bezel-free desktop real-estate when not gaming or on the sides when gaming in a middle window.

. . . .

The con I have with these kinds of screens is that in order to see the whole screen in your 60 to 50 degree human central viewing angle you'd have to sit at the radius or focal point of the curve. With the height of what would be a 32" 4k, sitting that far away (with the screen mounted on a simple thin rail spine'd tv stand for example) would turn this screen into a short belt-like screen to your perspective. So you have to sit closer, at mounted-on-desk distances, which will distort the screen the farther from the middle the pixels are. 800R would be 800mm -> 32 inch radius, which would work a lot better as a focal point for what is equivalent to a 32" 16:9 screen height-wise, but this screen's specs are what they are. Still impressive specs overall but will have to see some good reviews of them.

Really hope that rumor of DisplayHDR 1000 is true so we don't have to deal with Samsung's local dimming ABL.

Just so you know, TCL is going to release one of these too. While they are based on the same panel their specs/performance might be a little different.

I'm really hoping for something like a 1000R 55" 8k version of the ark someyear, by any mfg not necessarily samsung, with all of the wrinkles ironed out features wise but I'm seriously considering one of these "4k doublewide" or more like "half 8k" screens by the end of the year in the meantime if there are no issues with them.
 
Holy moly, it's up at $3299US

That means in Aussie dollars it will be like around $5-6K. Sheeezzzzuz....
 
You could still get screen space in your periphery for immersion while sitting at the focal point of a curve - you'd just need to design a screen that is a larger portion of that circle, more degrees of arc.

To achieve this they could either design a screen with different dimensions or make a more aggressive curve, or some combination of the two.

Here is an extreme example of a 1000R, 1000mm, ~ 40inch radius screen at around 180 degrees:

[Diagram: a 1000R screen extended to ~180 degrees of its circle]


Fewer degrees than that, but more than your 50 to 60 deg human central viewing angle, would still work well, providing immersion with a portion of the screen space in the periphery. Say +30 deg each side = a 120 degree arc.

[Diagram: a 1000R screen covering ~120 degrees of its circle]


When you sit inside of the focal point of a curved screen, way closer than the focal point of the curve, the pixels will be off axis almost like a gradient. The farther from the center of the screen you get the more and more off axis the pixels will be. Off axis pixels are bad for distortion and uniformity.

[Diagram: sitting far inside the focal point of a 1000R screen]



While both of the two previous diagrams have about half of their viewable screen space in the viewer's periphery, the difference is that the 120deg example's viewer is at the focal point of the curve, so all of the pixels are pointed directly at them along the horizontal axis. The 120deg example uses a longer screen; alternately you could make a more aggressive curve. The point being, it's better to be sitting near the center of the circle that the screen is an arc of.

. . . .

If these 1000R screens were instead something like 800R it would be 800mm radius or focal point which equals ~ 32 inches. That's a lot more manageable of a distance for a screen with a 32" 4k's height. 750mm ~> 30 inch. 700mm = ~ 28 inches. etc. The R value determines the focal point of the curve or how large of a circle the curved screen is a semi-circle of. They could design a screen longer in relationship to that 1000R (wider, more degrees of the circle) for more immersion but imo the height would still have to be tall enough. Otherwise probably easier to just make the curve more aggressive for the same result as long as the resulting PPD is high enough when sitting at the radius/focal point of the curve. It is what it is though considering what's available for the near future.
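To put rough numbers on that trade-off, here's a small Python sketch of how many degrees of its curvature circle a screen spans, and how wide a given arc would need to be; the example widths are approximate:

```python
import math

def degrees_of_circle(width_in, radius_mm):
    """Degrees of the curvature circle spanned by a screen of the given arc width."""
    return math.degrees(width_in / (radius_mm / 25.4))

def arc_width_in(radius_mm, degrees):
    """Arc length (inches) of a screen spanning the given degrees of a circle."""
    return math.radians(degrees) * radius_mm / 25.4

print(degrees_of_circle(54.9, 1000))   # the 57" doublewide spans ~80 deg of its 1000R circle
print(arc_width_in(1000, 120))         # a 120 deg arc at 1000R would be ~82" of screen
print(arc_width_in(800, 120))          # at 800R, 120 deg needs only ~66" of screen
```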
 
Holy moly, it's up at $3299US

That means in Aussie dollars it will be like around $5-6K. Sheeezzzzuz....


That's bestbuy.ca (Canada).

[Image: 57-inch super ultrawide price converted to USD]


+ 8.75% tax here ~~~~> + $218 = $2710.55

That's within range of how much a 77" 4k OLED might be on year end prices for the last few years.
 