I'd guess that somewhere after 200fpsHz or so (solid or minimum, not average) the motion definition aspect of higher fpsHz probably has diminishing returns.

Having seen some higher refresh stuff, I will say where it is most noticeable to me is trying to track a moving object with your eye. Like IRL, if you watch traffic go by and hold your eyes on a fixed point, the cars are blurry, but follow a car with your eye and it is crisp and clear. You start to get that on the 360Hz displays. When you move a window around on the desktop, the text is much more readable than you see with even 240Hz. It's nice and something I'd totally love to see more of, but also not a big deal, particularly in games, because you are usually going to cap out before you hit the high refresh rates anyhow. Even some visually simpler games will cap out because of CPU limits. Until we start to get more frame generation and reprojection, I don't see super high refresh rates being that useful.

That said, I certainly won't say no. If I have the option of two of the same kind of monitor but one has higher refresh, I'll get it if it doesn't cost too much more. I just won't prioritize it over other features I want at this point. So long as I can get 120, the rest is gravy.
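For what it's worth, the diminishing-returns intuition falls out of napkin math for sample-and-hold displays, using the rough "blur = tracking speed x frame time" rule of thumb Blur Busters popularized. A quick sketch; the tracking speed is an assumption for illustration, not a measurement:

```python
# Napkin math for eye-tracking motion blur on a sample-and-hold display.
# Rule of thumb: perceived smear while your eye tracks a moving object is
# roughly speed (px/s) * frame time (s). Illustrative numbers only.

def blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Approximate smear width in pixels while tracking motion across the screen."""
    return speed_px_per_s / refresh_hz

speed = 1920  # e.g. a window dragged across half a 4K screen in one second (assumed)
for hz in (60, 120, 240, 360, 480):
    print(f"{hz:3d} Hz -> ~{blur_px(speed, hz):4.0f} px of smear")

# 60 Hz -> 32 px, 240 Hz -> 8 px, 480 Hz -> 4 px: every doubling halves the
# smear, so each step up is less visible than the last, consistent with
# diminishing returns somewhere past a couple hundred Hz.
```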
I just won't prioritize it over other features I want at this point.
I upgraded from an XB271HU 165Hz monitor to a 240Hz one, and with a key set to swap between a 120fps cap and a 240fps cap, I absolutely cannot tell a difference at all.
But like you said, if you aren't getting those kinds of frame rate averages in the first place it won't matter appreciably in regard to motion clarity and motion definition/smoothness.

Ya, at this point that is my biggest consideration with higher refresh rate monitors in terms of "worth it": even if I CAN notice it... am I actually going to have any games where I get to, other than maybe staring into a corner? I like eye candy, I like resolution... so I pay for that with framerate. For the moment, I don't see that changing. I hope the Blur Busters guys are successful in getting GPU makers and game devs to work more on things like frame generation and reprojection (in demos it amazes me how much reprojection can make something feel smoother than the render rate) to the point where you do want a super high refresh monitor... but until then I won't worry about it. I'll get one if available (I'm watching the 240Hz 4K OLEDs with interest) but I won't prioritize it.

Maybe the problem is that when we get old enough to finally have enough money to buy these fancy monitors, we can no longer utilize them fully because of old age.

Some of that for sure. Man, I wish I had the sound system I do now back when I was a teenager with awesome hearing.
The future of high refresh rate is likely to involve things like:

- Native res becomes irrelevant and we settle on a "DLSS Quality on a 4K display" level of render resolution, since that seems to produce an "as good as native 4K" image in the best cases, in my own experience.
- More frames are generated by AI. Right now Nvidia's DL frame gen generates every other frame; the next step is likely 2 or 3 generated frames per rendered one, or some variable mix of 1-3 generated frames for each real frame if that results in better accuracy.
- Variable render target: DLSS + frame gen. It's already hard to tell the difference between DLSS Balanced and Quality on a 4K screen, so if the system could switch between them on the fly as needed to maintain a frame rate target, it would hardly be noticeable to the player (a toy sketch of that switching logic follows below).
- Reprojection solves some of the "feel" issues of a low real framerate driving a high AI-generated framerate.

Meanwhile, the higher refresh rates now available on 1080p monitors are likely to trickle down to higher resolutions. We already have 1440p at 240 Hz; next up is 4K 240 Hz on more LCDs and OLEDs, then most likely 1440p 360 Hz as 1080p displays push to 480 Hz.
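The frame gen math in the second bullet is simple enough: 60 rendered fps with one generated frame per real frame is 120 fps out; with three per real frame it's 240. And to make the "switch on the fly" bullet concrete, here's a toy sketch of what such a governor could look like: watch recent GPU frame times and step the upscale mode up or down to hold a frame-time budget. The class, thresholds, and control logic are all hypothetical (this is not Nvidia's API); the 0.67/0.58/0.50 input scales are just the commonly cited DLSS preset ratios.

```python
# Toy sketch of a dynamic-upscaling governor: pick the render resolution
# (as a fraction of output resolution) from recent GPU frame times so the
# game holds a frame-rate target. Hypothetical logic, not any vendor's API.
from collections import deque

MODES = [("Quality", 0.67), ("Balanced", 0.58), ("Performance", 0.50)]

class UpscaleGovernor:
    def __init__(self, target_fps: float = 240.0, window: int = 60):
        self.budget_ms = 1000.0 / target_fps
        self.times = deque(maxlen=window)
        self.mode = 0  # start at Quality

    def on_frame(self, gpu_ms: float) -> tuple[str, float]:
        self.times.append(gpu_ms)
        avg = sum(self.times) / len(self.times)
        if avg > self.budget_ms * 1.05 and self.mode < len(MODES) - 1:
            self.mode += 1          # over budget: drop render resolution
        elif avg < self.budget_ms * 0.80 and self.mode > 0:
            self.mode -= 1          # lots of headroom: raise it back
        return MODES[self.mode]

gov = UpscaleGovernor(target_fps=240)
for ms in [3.9, 4.1, 4.6, 4.8, 4.7]:   # simulated GPU frame times (ms)
    name, scale = gov.on_frame(ms)
print(name, scale)  # drifts to "Balanced 0.58" as the average exceeds ~4.17 ms
```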
More frames are generated by AI. Right now Nvidia's DL frame gen generates every other frame; the next step is likely 2 or 3 generated frames per rendered one.

From some Blur Busters forum convos I had with Mark R., I think in order to get a lot of frames there might have to be a paradigm shift in development, where the OS devs, game devs, and peripheral/driver devs all build systems that broadcast and share vector information with each other. VR already does this to a degree with the headset and hand controllers, so in VR there is some actual vector information being used in addition to guessing motion vectors between two frames, whereas the Nvidia version relies solely on guessing between two frame states, with no vector information provided at all. The way Nvidia (and the OS and peripherals/drivers) do it now is uninformed by comparison to VR, and especially by comparison to what it could be if an advanced vector-sharing system were developed on PC between all of the important development facets (Windows OS, peripheral/driver devs, game devs). Because Nvidia's frame insertion is just guessing between two frames, it can be fooled into thinking things are moving when they aren't, or not moving when they are (especially in third-person orbiting-camera games), and it just makes bad guesses at times. If it were informed of peripheral vectors by the OS and drivers, plus in-game entities' vectors by the game devs' code, the system would be a lot more accurate and would probably be able to generate multiple frames. Hopefully it will progress to that kind of system eventually, and we get there some year.
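A toy illustration of the difference vector information makes, using the orbiting-camera case mentioned above. Everything here is assumed numbers and nobody's actual implementation: the "uninformed" path guesses from the last two frames, the "informed" path gets the real angular velocity from the engine.

```python
# Toy contrast between "uninformed" frame generation (guess motion from the
# last two rendered frames) and "informed" generation (the game hands over
# the actual motion vector). Scenario: a point orbiting the camera target,
# like a third-person orbiting camera. Purely illustrative.
import math

R, OMEGA = 100.0, 3.0         # orbit radius (px) and angular speed (rad/s), assumed

def orbit(t: float) -> tuple[float, float]:
    """Ground-truth position of the orbiting point at time t."""
    return (R * math.cos(OMEGA * t), R * math.sin(OMEGA * t))

dt = 1 / 30                   # rendered-frame interval
t = 1.0
p_prev, p_now = orbit(t - dt), orbit(t)
truth = orbit(t + dt / 2)     # where the generated in-between frame should land

# Uninformed: assume straight-line motion continuing from the last two frames.
guess = tuple(n + 0.5 * (n - p) for p, n in zip(p_prev, p_now))

# Informed: the engine reports the angular velocity, so rotate along the arc.
a = OMEGA * dt / 2
informed = (p_now[0] * math.cos(a) - p_now[1] * math.sin(a),
            p_now[0] * math.sin(a) + p_now[1] * math.cos(a))

print(f"two-frame guess error: {math.dist(guess, truth):.4f} px")
print(f"vector-informed error: {math.dist(informed, truth):.4f} px")
# The linear guess drifts off the arc (and the error grows quadratically with
# how fast the camera orbits); the informed prediction stays on it.
```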
The future of high refresh rate is likely to involve things like:

Or display manufacturers realize that won't make them any money, so they invent something like monitors with an inverse curve following the shape of our butts. Or Samsung just releases a flat G9 in 32:18 format and claims it is their invention and has never been done before.
One thing to add: We are in a weird spot. The RTX 4090 moved gaming to solid 4K 120 fps and above territory in one swoop, but at the same time the current batch of games, especially UE5 games, are pulling everything back down closer to 60 fps. So GPUs and games are at the same time outpacing monitors and not making full use of their capabilities, at least without reducing quality settings.
Just last year I was thinking "4K 120 Hz is more than enough"; now I feel like I want 4K 240 Hz for the headroom.
IMO, the latest UE5 games have visuals that absolutely do not justify the hardware requirements. Does Remnant 2 look good? Sure... but it doesn't look anywhere near good enough to justify running at 45fps on an RTX 4090 at 4K, without any ray tracing at all. Immortals of Aveum does have ray tracing in the form of Lumen, but once again the visuals on display do not justify the hardware they demand. The fact that many studios want to switch to UE5 really worries me, because it seems like this engine is complete hot trash when it comes to optimization, or the graphics/fps ratio, or however you want to name it. The only games that should run poorly on a 4090 at 4K at this point are fully path traced ones; anything that's mostly raster should be breezed through unless, again, the engine/optimization is garbage.

I'd say with both games it's mostly an art direction problem. UE5 can look absolutely incredible, and while it definitely needs more work to improve performance, the main issue with the games you mention is that they don't look better than, say, Doom Eternal while running at a quarter of the speed.
And that is what worries me; just look at UE4 and how many times developers totally failed to address shader compilation stutter. The issue became so prevalent that it led Digital Foundry to make a whole video about it, and it got to the point where not only Alex from DF, but myself and many others, would literally frown whenever we found out a game was running on UE4, because it was 90% likely to have shader compilation stutter. If developers could not address such a simple issue on UE4, I am seriously doubtful they will ever get their shit together when it comes to optimization on UE5. Much like how UE4 became synonymous with shader compilation stutter, I think UE5 will become synonymous with "runs like dogshit".

UE 5.2 does some things to address shader compilation stutter afaik, so it will hopefully be solved by a standard feature eventually. It is a true bane on games, though, and not such a difficult problem that it can't be 90% solved by building shaders at start.
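The "build shaders at start" fix is conceptually simple. A hedged sketch of the idea below; all the names are hypothetical stand-ins, and a real engine would enumerate variants from its materials and render passes rather than a hardcoded product:

```python
# Sketch of "build shaders at start" to avoid compilation stutter: compile
# every pipeline variant during the loading screen on worker threads, so
# gameplay never compiles on first use. All names here are hypothetical.
from concurrent.futures import ThreadPoolExecutor
import itertools, time

def compile_variant(variant: tuple) -> str:
    """Stand-in for a driver PSO/shader compile (the slow, stuttery part)."""
    time.sleep(0.01)                     # pretend compile cost
    return f"pso:{'/'.join(variant)}"

# The variant space a real game would enumerate from its content.
variants = list(itertools.product(
    ("opaque", "masked", "translucent"),      # blend modes
    ("static", "skinned"),                    # vertex factories
    ("depth", "gbuffer", "shadow"),           # passes
))

cache = {}
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    for v, pso in zip(variants, pool.map(compile_variant, variants)):
        cache[v] = pso
print(f"precompiled {len(cache)} variants in {time.perf_counter() - start:.2f}s")

# During gameplay, lookups are pure cache hits -- no mid-frame compiles.
assert ("opaque", "static", "gbuffer") in cache
```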
the main issue with the games you mention is that they don't look better than say Doom Eternal

There's a lot of subjectivity here (as I would imagine many find the latest Zelda or Baldur's Gate 3 to look better than some hard-to-run UE title; like you say, art style and execution are often more important than raw technical capacity), but they certainly look way more complex, and I would imagine they have way more triangles, a higher field of view, and more dynamic objects.
Maybe the engine is well optimized but a bit ahead of its time, maybe it is not well optimized, or maybe the few released game implementations just don't scale across CPU cores well enough.

I think game engines are just hard to get to scale to tons of cores. Not all tasks can be divided into parallel parts, and there isn't always a solution to that. Also, many of the things you can make parallel might be so trivial it doesn't matter. You might have a game that has 80 threads and tries to do as much in parallel as possible, but there's still one thread that hits real hard, another that hits pretty hard, and then a bunch of much smaller ones where you could stack 60 of them on a core and it would be fine. In that kind of situation you won't see scaling with high core counts, even if each thread gets its own core, because it doesn't matter.

It's easy to just shit on devs and act like if they REALLY wanted to they could make games scale better to high core counts... but the fact that it doesn't seem to happen, even on big engines like Unreal, should indicate that maybe it isn't as easy as gamers would like it to be.
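The "one thread that hits real hard" point is basically Amdahl's law. With made-up but plausible numbers, you can see why core count stops mattering:

```python
# Toy model of why game frame rate stops scaling with cores: one serial
# "heavy" thread sets a floor that extra cores can't touch (Amdahl's law).
serial_ms = 8.0      # the one thread that "hits real hard" (assumed)
parallel_ms = 24.0   # work that splits cleanly across cores (assumed)

for cores in (1, 2, 4, 8, 16, 32):
    frame_ms = serial_ms + parallel_ms / cores
    print(f"{cores:2d} cores: {frame_ms:5.2f} ms/frame ({1000 / frame_ms:5.1f} fps)")

# 1 core: 32 ms (31 fps) ... 32 cores: 8.75 ms (114 fps). Past ~8 cores the
# serial thread dominates and doubling cores barely moves the frame rate.
```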
I think game engines are just hard to get to scale to tons of cores. Not all tasks can be divided into parallel parts...

Extremely hard for sure, and there's always a cost to syncing them, and a game could need or want to do it hundreds of times per second.

The issue with the latest UE5 games isn't CPU core scaling though. You are straight up GPU limited on a 4090 and will get sub-60fps if you don't use upscaling, and the visuals that you get in return aren't worth it one bit. If a game runs at 40fps on a 4090 it had better look ultra mind-blowing, and none of the UE5 games do.
Guess that depends on what you want. I haven't seen any of the UE5 games IRL, so I can't speak to them. What I can say is that I think people can get unrealistic in their demands and crap on new games unfairly. I've been playing Jedi Survivor recently and it looks GREAT. I saw lots of comments that it doesn't look any better than the last one, doesn't justify its hardware requirements, has crap ray tracing, etc. I disagree completely. It looks amazing. The improvement is incremental, to be sure, but I like it.

Part of the problem might be that people expect each improvement to be mind-blowing or something. As we push closer and closer to photorealism, each improvement will be more subtle. I guess you can argue it isn't worth it, but in that case my answer is: turn down the settings. They are there to be turned down if you want.
Eventually AR glasses will get enough PPI (and thus effective PPD) and functionality that people will probably start using them instead of flat screens and phones. Right now there are various "sunglasses-like" models on the market that provide a floating flat screen composited into real space. However, they are only 1080p, and their tracking is clunky (even tracking just to keep the screen pinned in space), since they are still at a very early stage. In the long run they should get 4K and 8K per eye, and people will be playing holographic games in midair, on floors or tables, with mapped overlays in rooms, etc., as well as mapping flat screens into real space in front of them. You can already see some of this with VR headsets and early-gen AR glasses, but it is pretty rudimentary so far.

Going to have to get a lot lighter before people will be willing to use them as daily drivers. Just wearing goggles, period, is a pain to do all day (ask anyone who has to wear PPE), and the headsets are too heavy to be even that comfortable. That's going to be the biggest issue with Apple's headset (aside from looking like a dork). A headset is fine for a short gaming session; nobody is going to want to wear one for 8 hours of work.

They'll need to get down to more like the weight of glasses before they are the kind of thing people will consider replacing screens with. I won't say it'll never happen, but it is a long way off.
Ya, if they can get good glasses-based AR that are not much heavier than normal glasses and can also be prescription, I can see plenty of people, including me, having a real interest in that. The question will be how good they can make it work and how light and unobtrusive they can be.

Yeah, I would be all for that. Afaik the upcoming Apple Vision Pro is able to stream a 4K screen from your Mac and display it, but that's still pretty limiting compared to having multiple displays in the virtual space. And I have a hard time believing the Vision Pro in its current form would be comfortable enough to wear all day long for work.
can also be prescription

At least one of the sunglasses models I've read about has a diopter adjustment range. It apparently only covers nearsightedness, so it doesn't work for everyone, but it shows it can be done. And depending on how extreme your farsightedness is (how near things have to be before they get blurry), VR/AR might be fine as-is, since the virtual screen's focal point is usually a decent distance away.
"The Rokid Max AR Glasses include an adjustable diopter of 0.00D to -6.00D. This means that you can adjust the strength of the lenses to match your eye prescription. This will help you to see clearly without having to wear any inserts or wear your glasses over the AR glasses."
No. It's for nearsightedness only. You're better off buying an Nreal or Rokid Max, as those allow special lenses via external glasses that clip onto the device.
Ironically, the worse your near sightedness is, the bigger the screen.
Usually, glasses like this or VR headsets work fine for farsighted users without adjustment, as the screen is projected to your retina as if it were 3-5 meters away. Companies that offer prescription lens inserts, like VR Optician, even tell people with presbyopia to use their far-distance values.

However, I guess this can vary depending on the model, and there is some limit.
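The optics here are just reciprocal arithmetic: a diopter is 1 divided by distance in meters. A quick illustrative check of why a virtual image placed a few meters out is easy on farsighted eyes, while the adjustment range targets nearsighted ones:

```python
# Diopters are 1 / focal distance in meters, which is why a virtual screen
# "placed" a few meters away is easy on farsighted eyes while myopes need
# the negative-diopter adjustment. Illustrative arithmetic only.

def accommodation_needed(virtual_distance_m: float) -> float:
    """Diopters of focus needed to see an image at this apparent distance."""
    return 1.0 / virtual_distance_m

for d in (0.5, 2.0, 3.0, 5.0):
    print(f"image at {d} m -> {accommodation_needed(d):.2f} D of focus needed")

# A screen at 3-5 m needs only 0.20-0.33 D, close enough to "far" for most
# presbyopes/hyperopes. A -6.00 D adjustment range instead covers myopes
# whose far point is as near as 1/6 m ~= 17 cm.
```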
that's still pretty limiting compared to having multiple displays in the virtual space

Theoretically, as long as your usage scenario didn't involve looking at all three displays at the same time, AR capable of showing one screen's worth of pixels could swap screens. Perhaps as you look away from one to the other they'd swap, or they could just flip like playing cards or virtual desktops, a cube, etc. Not exactly the same as seeing things in your peripheral vision, at least for now, but it could work.

The sunglasses form factor has a narrow FOV, but it could use some kind of HUDs/indicators/animated tabs/popups/balloons for off-screen information, then flip that cube/page/desktop to the "other screen". That, or swap between a quad of 1080p on 4K glasses (or a quad of 4K on 8K glasses) shown as four windows on the virtual screen, then activate whichever one you want at full screen, or two in a split screen, 2+1, etc. when you want to see something larger. With AR you can still see your existing physical screen(s) too, so you could have a few smaller screens on the desk for extra real estate and then flip to their mirror on the glasses when you wanted to.

Once glasses' "screens" hit 8K, that is a lot of real estate on a single virtually large central screen (4x 4K), so you really wouldn't need multiple monitors; it would be capable of showing more information than three 4K screens. That's what I'd like to do with a physical 8K screen in the next few years in the meantime: no bezels between the desktop screen window spaces. Maybe I'd still need a separate gaming screen for higher Hz for a while.
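For what it's worth, the pixel math behind "one 8K canvas = four 4K screens" checks out:

```python
# Pixel-count check for "one 8K canvas = a 2x2 grid of 4K screens".
uhd_4k = 3840 * 2160            # 8,294,400 px
uhd_8k = 7680 * 4320            # 33,177,600 px
print(uhd_8k // uhd_4k)         # -> 4: an 8K canvas tiles four full 4K windows
print(uhd_8k // (1920 * 1080))  # -> 16 full-HD windows on the same canvas
```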
MacOS already lets you flick between virtual desktops and apps that have been fullscreened, so a single floating 4K AR screen is not really an improvement over a physical screen in this regard. Being able to make, e.g., an 8K screen area or multiple monitors would make a big difference. Or an ultrawide screen that fits the viewing area nicely.

Ideally you could just run MacOS inside the Vision Pro without needing a laptop at all, and it could then draw all your apps as floating windows on as big a canvas as you'd like. But it seems it will be basically iOS in AR glasses.