Keep in mind that modern emulator front-ends like RetroArch can re-clock games to eliminate stutter and correct the audio pitch.
Though old arcade games may run around 55Hz, newer systems are typically ±2% of 60Hz.
I doubt most people would notice - or care - if a Neo-Geo game is running at 59.185606Hz or 60.0Hz, or if a SNES game is running at 60Hz instead of 60.08Hz.
Games stuttering when the refresh rate wasn't exact, and audio pitch errors or glitches when games weren't run at their original speed, are no longer issues today.
That's not to say that Variable Refresh Rate tech isn't important - you will probably notice if a game is sped up from 55Hz to 60Hz, and VRR eliminates V-Sync latency - but for most of the games/systems that people want to emulate, it's less of a problem now.
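To put numbers on that, here's a quick sketch using the rates quoted in this thread (it assumes the display runs at exactly 60Hz, which real monitors only approximate):

```python
import math

DISPLAY_HZ = 60.0  # assumption: the display runs at exactly 60Hz

# Native refresh rates quoted in this thread
native_rates = {
    "Neo-Geo": 59.185606,
    "SNES": 60.08,
    "Mortal Kombat": 54.706840,
}

for name, hz in native_rates.items():
    ratio = DISPLAY_HZ / hz
    speed_error = (ratio - 1.0) * 100.0
    # If the audio is resampled to stay in sync, its pitch shifts by the
    # same ratio - expressed here in cents (100 cents = one semitone).
    pitch_shift = 1200.0 * math.log2(ratio)
    print(f"{name}: {speed_error:+.2f}% speed, {pitch_shift:+.1f} cents")
```

The 60Hz-class systems come out well under the 2% figure, while Mortal Kombat's rate is nearly 10% off - which is why the sub-60Hz games are the hard case.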
I'd be more concerned about having a good strobe option on the display than VRR support, since 2D graphics blur horribly on sample-and-hold displays - and all VRR modes must use S&H.
> Actually, this is incorrect. Run Samurai Shodown II in Retroarch and you'll see the same scrolling and flicker irregularities. What's even worse about Retroarch is that it screws up variable refresh monitors, too, making it impossible to get perfectly smooth movement on them. Retroarch is garbage.
If it's configured correctly - and RetroArch can be confusing to set up if you're new to it - then things like 30Hz flicker in games work as intended. There is no irregular flickering or stutter.
I even use RetroArch on my CRT with its black frame insertion option enabled, so that I can use rates such as 110.035212Hz (55.017606Hz x2) for old arcade games, because my CRT won't sync to anything below 60Hz.
If you're running it in Windowed Full-Screen mode or don't set up the refresh rate, then you do get irregular flickering - which is really bad with BFI - but if you set it up correctly it works as you would expect.
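That x2 trick generalizes: pick the smallest integer multiple of the game's rate that clears the monitor's minimum sync, and let black frame insertion fill the extra frames. A minimal sketch (the function name is mine, not a RetroArch setting):

```python
def bfi_rate(game_hz: float, min_sync_hz: float = 60.0):
    """Smallest integer multiple of the game's refresh rate that the
    display will sync to; BFI shows black on the extra frames."""
    mult = 1
    while game_hz * mult < min_sync_hz:
        mult += 1
    return mult, game_hz * mult

print(bfi_rate(55.017606))  # the arcade rate above: (2, 110.035212)
print(bfi_rate(60.08))      # SNES already clears 60Hz: (1, 60.08)
```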
> Upload a 60fps video of Samurai Shodown II running in Retroarch with regular flickering, because I don't believe you. I know how to configure Retroarch, and even though its rate control gives you results somewhat better than just out of the box MAME at the wrong refresh rate, it is NOT perfect.
It doesn't really seem possible to upload a video to demonstrate that - at least not at 60 FPS.
I did record and upload a video, but got different results on each system I tried playing it on - all of which showed varying degrees of irregular flicker that was not present in the source video. There are several reasons for that:
- The game itself is not running at exactly 60 FPS (RetroArch measured this screen as 60.002399Hz)
- The recording is not going to be perfectly synchronized to that refresh rate (my camera will only shoot 1/125, not 1/120 - and even if it did, it would not be genlocked)
- YouTube playback of that recording on another machine is not going to be perfectly in sync with the video either
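Those mismatches can be quantified as beat frequencies: two free-running rates that differ by some Δf drift apart by one full frame every 1/Δf seconds, which shows up as a duplicated or dropped frame. A sketch using the measurement above (the 59.94Hz camera rate is my own illustrative assumption, not from the posts):

```python
def frame_slip_period(rate_a: float, rate_b: float) -> float:
    """Seconds between one-frame drifts for two unsynchronized rates."""
    return 1.0 / abs(rate_a - rate_b)

# The screen measured above (60.002399Hz) against a nominal 60Hz playback:
print(f"one frame of drift every {frame_slip_period(60.002399, 60.0):.0f}s")
# Against an assumed 59.94Hz camera, the slip is far more frequent:
print(f"one frame of drift every {frame_slip_period(60.002399, 59.94):.1f}s")
```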
Stepping through the recording itself showed that 30Hz flicker was working correctly in RetroArch - not that I needed a recording to tell me that, since black frame insertion would be completely unusable if RA was not synchronizing things correctly.
All the video highlighted was problems with YouTube playback, not with the game.
Your contention is that Retroarch can magically smooth out game updates even when your monitor doesn't run at the game's refresh rate, and your statements just proved that that's incorrect. If Retroarch could truly do that, there's no reason you shouldn't be able to create a 60fps video showing perfect updates, because Retroarch can smooth it all out, right? If I run MAME with syncrefresh, I CAN get perfectly regular updates in a Youtube video. The game will just be running too fast. Alternatively, I could just turn off vsync and record a video and have perfectly smooth updates, but you'd see tear lines.
You can't create a 60fps video in Retroarch without irregular flickering for the same reason Retroarch can't run Samurai Shodown II perfectly: the game isn't running at its native refresh rate, but it's still trying to run the game at the right speed. Retroarch's rate control is NOT perfect. The reason you got different irregularities on each computer is that pretty much every monitor has a slightly different "true" refresh rate.
> Outputting to a CRT monitor at the game's native refresh rate or using a variable refresh monitor are the only ways to get truly perfect, bullet smooth updates in these games.

No, that's the only way to get truly smooth updates at the original speed. If you don't care about a <2% variance - and most people won't - then you can get perfectly smooth updates on any 60Hz display.
> Also, no idea why you keep referring to "30Hz flicker," which is nonsense, because Samurai Shodown II doesn't run at 30hz.

The game runs at 60Hz but the shadows flicker at 30Hz, since they are only on for half the time.
> I made a 60fps video of MAME with syncrefresh on, and as long as your computer/browser doesn't chug, the updates should be perfectly consistent. Browsers are piles of crap, though, so you might have to play it more than once to go all the way through without the browser stuttering. Running this in Chrome might be the best way to go.
> https://www.youtube.com/watch?v=tk65th1w2uE

This demonstrates exactly the problem that I described in my previous post.
> Uhhhh, all you did was restate what I originally said in the first place.

Then I'm not sure what you were disputing before, since I stated that in my very first post and you repeatedly said that I was wrong about RetroArch being able to run games in sync with your refresh rate so that they don't stutter at all, so long as you don't mind what is typically less than a 2% speed error.
> If you do that for Mortal Kombat, you're running the game 10% faster than it should be. That's a terrible solution.

I agree that it's far from ideal for old games which run significantly lower than 60Hz. The Mortal Kombat games are probably the newest titles running on hardware like that, however; it's typically very old games that ran significantly below 60Hz.
> You're completely wrong about G-Sync and FreeSync. There is no "problem" with G-Sync or FreeSync. They allow you to run perfectly smoothly WITHOUT altering the speed the games run at. That's why they're better than the garbage you described. Also, you CAN actually do software blackframe insertion combined with G-Sync to make games run at exactly double their original speed (because most G-Sync monitors are 144hz). There is no downside to G-Sync or FreeSync at all. On top of all that, they have less input latency because there's no v-sync lag.

Black frame insertion on an LCD is terrible. It barely improves motion clarity and tends to wash out the image, since the LCD pixels are so slow to change.
> No, the advantage is that you can actually run games at the right speed smoothly.

A multisync display which can run at 54.706840Hz, or 109.413680Hz if it won't sync below 60Hz, will be exactly as smooth as a VRR display - except it can also have the option of using backlight strobing, since VRR modes must disable that feature to work correctly.
> No, the shadows flicker at 59.1hz, because that's the speed the game updates at. You're confusing what shadows look like and how often they're updated. They change every frame.

If something is only displayed 30 times a second, it's flickering at 30Hz.
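The arithmetic behind the "30Hz flicker" figure, as a quick sketch: an object drawn on alternate frames completes one visible/hidden cycle every two frames, so it flickers at half the frame rate, whatever that rate is.

```python
def flicker_hz(frame_rate: float, frames_on: int = 1, frames_off: int = 1) -> float:
    """Rate at which an object shown for frames_on frames and hidden for
    frames_off frames completes a full visible/hidden cycle."""
    return frame_rate / (frames_on + frames_off)

print(flicker_hz(60.0))       # alternate-frame shadows at 60Hz: 30.0
print(flicker_hz(59.185606))  # at the Neo-Geo rate quoted earlier: ~29.59
```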
> Finally, I want to point out that strobed backlights and black frame insertion are more problematic than G-Sync. They destroy brightness and color quality. They dramatically reduce the quality of the image.

Black frame insertion does ruin the image on an LCD.
> Combined with the fact that you're going to be using some CRT shader for things like this, and scanlines already reduce image brightness, the combined effect is pretty brutal.

I don't think it should be assumed that someone is going to be using CRT shaders. Frankly, I think most of them look terrible.
> Whereas I can say that G-Sync and FreeSync are objectively better than standard v-sync in every way with no tradeoffs, strobing and black frame insertion are objectively just tradeoffs. They're not a clear win across the board.

Not being able to use backlight strobing is a significant disadvantage when the content you're viewing does not benefit from VRR support (only multisync support) and is the type of content where motion blur is most easily seen.
> Really, to me, motion blur on modern 144hz 1ms monitors (which isn't even that bad) is preferable to sucking down the gray haze of backlight strobing.

On a flicker-free display, which VRR requires, your motion blur is directly related to your framerate.
I think it's only LightBoost monitors that showed problems with color rendering when it was enabled.
[insults removed]

[insults removed]
> most PC CRTs will not sync to anything below 60Hz

Almost all the CRT monitors I've used, including the one I currently use, support 50Hz fine. I remember one monitor having some issues.
Black frame insertion isn't the same as interpolation, and it works without any strobing. So many clueless people on this site. I'm outta here. Enjoy your ignorance!