The Slowing Growth of VRAM in Games

We're in a bit of a strange transition period: graphics card performance requirements are increasing at a steady pace, but actual on-card VRAM quantities are stagnating. Capacity is tied to the memory bus width each chip is paired with, and GPU designers limit bus width on lower-end cards to avoid larger costs, because the memory controllers/interfaces take up large portions of the GPU silicon die.

Taking into account that APIs like DirectStorage could increase VRAM requirements, along with general increases due to higher graphical fidelity and implementations of ray tracing, there appears to be a movement towards requiring higher VRAM quantities as standard. However, up to and including the current generation of cards, GDDR6 module capacities are limited to 8 Gb (1 GB) and 16 Gb (2 GB).

http://hole-in-my-head.blogspot.com/2023/01/next-gen-pc-gaming-requirements-2022.html?m=1

Now, while this is a potential issue for owners of current and prior generation cards, there is a potential solution on the horizon: GDDR6W. This memory technology allows for module capacities of up to 32 Gb (4 GB), meaning that larger-capacity modules could be paired with the more anaemic memory buses found on lower-end cards.
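To make the module arithmetic above concrete, here's a rough Python sketch. The one-module-per-32-bit-channel layout is how GDDR6 is normally attached; the clamshell option and the specific configurations printed are my own illustrative assumptions:

```python
# Rough VRAM-capacity arithmetic: each GDDR6 module sits on a 32-bit
# slice of the memory bus, so bus width fixes the module count.

def vram_gb(bus_width_bits: int, module_gbit: int, clamshell: bool = False) -> float:
    """Total VRAM in GB for a given bus width and per-module density (Gbit)."""
    modules = bus_width_bits // 32      # one module per 32-bit channel
    if clamshell:
        modules *= 2                    # two modules sharing each channel
    return modules * module_gbit / 8    # 8 Gbit = 1 GB

print(vram_gb(128, 16))  # 8.0  -> 128-bit bus with 16 Gb (2 GB) GDDR6 modules
print(vram_gb(128, 32))  # 16.0 -> the same narrow bus with 32 Gb (4 GB) GDDR6W
```

This is why module density matters so much for the lower-end cards: the narrow bus caps the module count, so only denser modules can raise capacity.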

The issue is that it is not certain which memory technology will be used for the next generation of graphics chips. The high-end dies (used for the most expensive graphics cards) will mostly benefit from higher bandwidth, meaning GDDR7 seems like the more obvious choice for those future releases, since it will increase per-pin speed to up to 36 Gbps from GDDR6's current 16-18 Gbps. (Though Samsung also announced 24 Gbps GDDR6 late last year.) Neither GDDR7 nor this faster GDDR6 is confirmed to come in higher-capacity modules, meaning that if either technology is chosen, the VRAM drought on lower-end cards might not change - unless cards such as the RTX 4050 move up to a 192-bit memory bus which, given that the RTX 4070 Ti is already at a 192-bit bus width, seems unlikely.
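For reference, peak memory bandwidth scales linearly with both bus width and per-pin speed, which is why the high-end dies care more about GDDR7's data rate than its density. A quick sketch of the arithmetic (the 192-bit bus here is just an example configuration):

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin rate (Gbps) / 8
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

print(bandwidth_gb_s(192, 18))  # 432.0 GB/s - 192-bit GDDR6 at 18 Gbps
print(bandwidth_gb_s(192, 36))  # 864.0 GB/s - the same bus with 36 Gbps GDDR7
```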

All in all, I think that memory technology and interfaces look like they will be the most interesting facet of PC gaming over the next few years...
 
I would argue there are more games out there showing that you absolutely don't need 12-16GB of VRAM to have a stellar-looking game that also performs great (God of War, Spider-Man, A Plague Tale: Requiem, etc.). You also don't need to max all settings all the time, especially at 4K or 1440p UW. This is part of the problem, and part of why they'll continue to sell more expensive hardware: people thinking they need 100+ FPS and maximum settings to have a great experience is just another form of FOMO. TLoU is a bad port in its current form and shouldn't be used as any type of indicator, just as Arkham Knight was when it first launched. Ultra is for screenshots, High is for good-looking gameplay, and High/Medium is also good looking with more of an emphasis on performance, imo.
Is TLoU actually a bad port or a sign of things to come?

Probably both, from what I see. Every console generation, the hardware ends up maxed out by developers. When you have a console where 12.5GB can be shared between VRAM and game data, plus fast streaming of data from the SSD, it will be used in games.

Taking that game and transitioning it to the majority of PC gaming hardware, which in this case lacks that much VRAM, means compromises have to be made. If not, those games will have issues - big issues, from shader compilation times to crashes due to lack of resources and so on.

Games taking advantage of the PS5's fast SSD may also have issues when coming to PC.

Yet you have graphics cards that by themselves cost more than the whole console, with limited VRAM, stagnating the normal upgrade path on the PC - with the better gameplay and graphics shifting to the console, compared with the majority of hardware used for PC gaming.

PC ports may in the end not be viable on the PC GPUs currently in use, the majority of them Nvidia designs with less VRAM than what the consoles can use.
 
When you have a console where 12.5GB can be shared between VRAM and game data
12.5GB is about what they have for RAM in total after the OS. They still have to share that with the game itself, probably another 3-4GB or more, which leaves around 8-8.5GB for graphics/VRAM usage.
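As a back-of-the-envelope check of those numbers (the exact split between OS reserve and game data below is an estimate, not an official figure):

```python
# Rough PS5 memory budget based on the figures in this thread (estimates only).
total_ram_gb  = 16.0
os_reserve_gb = 3.5   # leaves ~12.5 GB for the game
game_data_gb  = 4.0   # CPU-side game logic/data, ~3-4 GB or more
vram_like_gb  = total_ram_gb - os_reserve_gb - game_data_gb
print(vram_like_gb)   # 8.5 -> the ~8-8.5 GB graphics budget cited above
```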
 
But but... my max settings that have to be zoomed in 300 percent on a screenshot to see any difference versus mixed high and ultra!
Same with turning on RT, where the majority of folks don't even notice a difference in shadows, or pay attention to the irrelevant small reflections, while it destroys performance. Add in artifacts from upscaling from a low resolution and from frame generation. Most, even with RT hardware, will leave it off.
 
Same with turning on RT, where the majority of folks don't even notice a difference in shadows, or pay attention to the irrelevant small reflections, while it destroys performance. Add in artifacts from upscaling from a low resolution and from frame generation. Most, even with RT hardware, will leave it off.
Artifacts from DLSS and frame gen vary by game but are usually minimal at 4K. Also, it depends on what ray tracing you're talking about. Some of it can make a drastic difference, even reflections (Spider-Man, for example); that's another case-by-case basis. RT global illumination is an obvious and huge benefit, as is RT ambient occlusion. Shadows tend to be the least beneficial. Most Nvidia users leave ray tracing on, just not necessarily at max.
 
12.5GB is about what they have for RAM in total after the OS. They still have to share that with the game itself, probably another 3-4GB or more, which leaves around 8-8.5GB for graphics/VRAM usage.
The PS5 has two RAM pools: 16GB of GDDR6 at high speed for games, and 512MB of DDR4 for OS background tasks and I/O. Plus, the SSD also acts as a pool which can quickly transfer to and from memory.

What the PS5 can do would take a 12GB to 16GB VRAM card to approximate. More data will need to be kept resident on PC cards because they lack the PS5's other features and abilities, like taking roughly 1.5 seconds to switch between two complete game states, rewriting all available memory.

Porting is not as simple as it seems due to the differences in how the hardware works.
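For what it's worth, that "~1.5 seconds to rewrite all available memory" claim implies a transfer rate you can sanity-check. This is my own arithmetic, using the 12.5 GB game-memory figure from earlier in the thread:

```python
# Implied throughput of a ~1.5 s full game-state swap (illustrative only;
# real rates depend on compression ratio and access pattern).
game_ram_gb  = 12.5
swap_seconds = 1.5
print(game_ram_gb / swap_seconds)  # ~8.3 GB/s, roughly in line with the
                                   # PS5's compressed SSD throughput figures
```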
 
Artifacts from DLSS and frame gen vary by game but are usually minimal at 4K. Also, it depends on what ray tracing you're talking about. Some of it can make a drastic difference, even reflections (Spider-Man, for example); that's another case-by-case basis. RT global illumination is an obvious and huge benefit, as is RT ambient occlusion. Shadows tend to be the least beneficial. Most Nvidia users leave ray tracing on, just not necessarily at max.
Agree with most of what you say, except for saying most leave it on. I believe Hardware Unboxed showed 80% of RTX card users leave it off, which makes sense, since most RTX cards are lower end.

There are two ways to put RT into games: so-called real-time RT, or precalculating - baking high-precision light maps, shadow maps, and textures. Each has advantages and disadvantages. Real-time is dynamic, while baking can produce much higher-quality, more accurate solutions, but more static ones. Both are generally used even in RT games, since RT hardware is not yet capable of handling highly complex environments. That's part of the reason people don't notice the difference: with RT off, you still have RT via baking, just not in real time.
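A hypothetical sketch of how an engine might layer the two, just to illustrate that trade-off. All names and values here are invented; no real engine works exactly this way:

```python
# Mixing baked and real-time lighting: the bake is always there, and the
# real-time term is layered on top where it's enabled and needed.
from dataclasses import dataclass

@dataclass
class Surface:
    uv: tuple         # lightmap coordinates
    is_dynamic: bool  # moving objects can't rely on a static bake

def sample_baked_lightmap(uv) -> float:
    # Stand-in for the precalculated bake: offline, high quality, static.
    return 0.6

def trace_realtime_lighting(surface: Surface) -> float:
    # Stand-in for a per-frame ray query: dynamic but expensive.
    return 0.3

def shade(surface: Surface, rt_enabled: bool) -> float:
    color = sample_baked_lightmap(surface.uv)      # present with RT on or off
    if rt_enabled and surface.is_dynamic:
        color += trace_realtime_lighting(surface)  # real-time layered on top
    return color

# With RT off, static scenes still get the baked contribution, which is
# why the difference can be hard to notice.
print(shade(Surface(uv=(0.5, 0.5), is_dynamic=False), rt_enabled=False))  # 0.6
print(shade(Surface(uv=(0.5, 0.5), is_dynamic=True), rt_enabled=True))    # ~0.9
```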
 
Same with turning on RT, where the majority of folks don't even notice a difference in shadows, or pay attention to the irrelevant small reflections, while it destroys performance. Add in artifacts from upscaling from a low resolution and from frame generation. Most, even with RT hardware, will leave it off.

There's a catch with PC gaming that people on forums like this gloss over. The most popular PC video cards are the 3060, 2060, 1060, and 1660. Two of those don't do any of these fancy things at all, and the other two, while they support them, can't actually run them respectably. Likewise, the most common CPUs are six and four cores, the most common RAM amount is 16GB followed by 8GB, and the most common resolution is 1080p at 60Hz. What this means is that most people's gaming PC is far from a "Master Race" item and is objectively slaughtered by a console in every aspect. That's leaving out that console optimization is always going to be better, because it's a static target and you don't have to deal with Windows. It's truly the low-end option, where the upside is cheap games on sale, stealing games, and the ability to cheat like crazy. The higher-end options are so edge-case that it would be lunacy for a developer or publisher to make sure they run well, or even work.

We are still a couple of hardware iterations out before native 4K and good ray tracing are remotely possible on the computers most people actually run, and thus worth spending development time on.

There's also the issue of what kind of game it is. If you are playing a fast-paced online shooter or something like DotA, all these fancy things range from pointless to counterproductive, and nobody is going to turn them on. For immersive single-player games, they matter. But that runs into the same issue again: most PC gamers can't run them and play at lower resolutions, while the money flows to the console side of things.
 
There's a catch with PC gaming that people on forums like this gloss over. The most popular PC video cards are the 3060, 2060, 1060, and 1660. Two of those don't do any of these fancy things at all, and the other two, while they support them, can't actually run them respectably. Likewise, the most common CPUs are six and four cores, the most common RAM amount is 16GB followed by 8GB, and the most common resolution is 1080p at 60Hz. What this means is that most people's gaming PC is far from a "Master Race" item and is objectively slaughtered by a console in every aspect. That's leaving out that console optimization is always going to be better, because it's a static target and you don't have to deal with Windows. It's truly the low-end option, where the upside is cheap games on sale, stealing games, and the ability to cheat like crazy. The higher-end options are so edge-case that it would be lunacy for a developer or publisher to make sure they run well, or even work.

We are still a couple of hardware iterations out before native 4K and good ray tracing are remotely possible on the computers most people actually run, and thus worth spending development time on.

There's also the issue of what kind of game it is. If you are playing a fast-paced online shooter or something like DotA, all these fancy things range from pointless to counterproductive, and nobody is going to turn them on. For immersive single-player games, they matter. But that runs into the same issue again: most PC gamers can't run them and play at lower resolutions, while the money flows to the console side of things.
I'd say this generation is a clear choice if you're on a budget: the console wins. Not only from a price-to-performance perspective, but from the simple fact that lots of the PC versions/ports have been riddled with problems, and we have to wait weeks or months for them to be fixed.
 
I'd say this generation is a clear choice if you're on a budget: the console wins. Not only from a price-to-performance perspective, but from the simple fact that lots of the PC versions/ports have been riddled with problems, and we have to wait weeks or months for them to be fixed.

Sells fewer copies, makes less money, is a PITA to optimize for, and most people on PC run machines that are worse than the console. It's the low-end trash tier of gaming, in reality.
 
Sells fewer copies, makes less money, is a PITA to optimize for, and most people on PC run machines that are worse than the console. It's the low-end trash tier of gaming, in reality.
lol, games are definitely not less expensive on console, unless you have a Game Pass of some sort, which also exists on PC. PC also has the better experience for certain genres (strategy, MOBA, FPS, etc.). It depends on your budget: console is still the best budget gaming option without a doubt, and it has gotten better for consoles this generation so far, but that's typically how it goes in the earlier stages of a console's lifecycle; later on, PC has by far the better hardware available. Even now, if you have the money, you can build a PC that's significantly more powerful than any console. It's a matter of budget and preference.
 
Sells fewer copies, makes less money, is a PITA to optimize for, and most people on PC run machines that are worse than the console. It's the low-end trash tier of gaming, in reality.
If 8GB cards had been limited to less than $300 over the last couple of generations, 12GB cards to $450, and so on up the chain, this would not have been as big an issue, I think. GPU-wise, the 3060 and above are way more powerful than what the console has. Maybe the 12GB cards will be enough for ports of games that push the consoles, and definitely anything above that.
 
Is TLoU actually a bad port or a sign of things to come?

Probably both, from what I see. Every console generation, the hardware ends up maxed out by developers. When you have a console where 12.5GB can be shared between VRAM and game data, plus fast streaming of data from the SSD, it will be used in games.

Taking that game and transitioning it to the majority of PC gaming hardware, which in this case lacks that much VRAM, means compromises have to be made. If not, those games will have issues - big issues, from shader compilation times to crashes due to lack of resources and so on.

Games taking advantage of the PS5's fast SSD may also have issues when coming to PC.

Yet you have graphics cards that by themselves cost more than the whole console, with limited VRAM, stagnating the normal upgrade path on the PC - with the better gameplay and graphics shifting to the console, compared with the majority of hardware used for PC gaming.

PC ports may in the end not be viable on the PC GPUs currently in use, the majority of them Nvidia designs with less VRAM than what the consoles can use.
I agree overall. New game releases in the next few years will most likely require around 16GB of VRAM at 4K resolution, but the majority of PC gamers aren't on 4K yet. I personally prefer 1440 UW; I think it's the sweet spot for gaming right now, combining fidelity and performance, and I think 12GB of VRAM will be enough for this resolution if the game is properly optimized. AMD is most likely in the know, as they provide the consoles' CPU + GPU now; this is most likely why their cards have been equipped with more VRAM as of late. They may be recognizing the VRAM tax of gaming at 4K better than Nvidia right now, because new-gen consoles are pretty much all connected to 4K TVs. It will be interesting for sure, and it does suck for anyone who dropped a good amount of money on a mid-to-high-tier 30 series card. I'd argue TLoU is just a bad port on release; I've had multiple crashes just waiting for shaders to compile, so I'm deciding to wait for it to be fixed. Not a huge deal, since I've already played through the game a few times on PS4.
 
I think by the time 16GB becomes more of a requirement the Nvidia 5000 series will be released...I can hang on to my 3080 10GB VRAM GPU for another year (especially since I game at 1440p with a G-Sync display)

I wonder how much VRAM the upcoming Cyberpunk 2077 Overdrive advanced ray-tracing patch will need
 
I think by the time 16GB becomes more of a requirement the Nvidia 5000 series will be released...I can hang on to my 3080 10GB VRAM GPU for another year (especially since I game at 1440p with a G-Sync display)

I wonder how much VRAM the upcoming Cyberpunk 2077 Overdrive advanced ray-tracing patch will need
If I am not mistaken (it's been about a year since I played), I know I was using over 10GB of memory playing at 4K with maxed-out settings. I cannot remember how much, though. Maybe at 1440p 10GB was fine?

With this new ray tracing patch, I am expecting it to be quite a bit more now.
 
If I am not mistaken (it's been about a year since I played), I know I was using over 10GB of memory playing at 4K with maxed-out settings. I cannot remember how much, though. Maybe at 1440p 10GB was fine?

With this new ray tracing patch, I am expecting it to be quite a bit more now.

with 'normal' RT maxed out at 1440p, a 10GB 3080 is fine...but with the new Overdrive patch I don't think it'll be playable on a 3080
 
with 'normal' RT maxed out at 1440p, a 10GB 3080 is fine...but with the new Overdrive patch I don't think it'll be playable on a 3080
Max out RT now and see what you get at 1440p. Then know that it will be even worse next week, lol.

Even with all the VRAM in the world, I've got a feeling it won't be all that playable for a lot of people, imo.
 
I think by the time 16GB becomes more of a requirement the Nvidia 5000 series will be released...I can hang on to my 3080 10GB VRAM GPU for another year (especially since I game at 1440p with a G-Sync display)

I wonder how much VRAM the upcoming Cyberpunk 2077 Overdrive advanced ray-tracing patch will need
Knowing how launch went, I don't think there's enough VRAM in the world to be able to run that patch...
 
Even with all the VRAM in the world, I've got a feeling it won't be all that playable for a lot of people, imo.

agreed...that's why they're calling it a 'Tech Preview'...I have my doubts that even a 4090 can handle it at 4K at respectable frame rates (even with DLSS 3)...a fully path traced modern AAA game?...I'm sure it's going to require a ton of patches/optimization
 
agreed...that's why they're calling it a 'Tech Preview'...I have my doubts that even a 4090 can handle it at 4K at respectable frame rates (even with DLSS 3)...I'm sure it's going to require a ton of patches/optimization
At least the screenshots will look pretty lol
 
I think by the time 16GB becomes more of a requirement the Nvidia 5000 series will be released...I can hang on to my 3080 10GB VRAM GPU for another year (especially since I game at 1440p with a G-Sync display)

I wonder how much VRAM the upcoming Cyberpunk 2077 Overdrive advanced ray-tracing patch will need
Pretty sure I watched a video where they stated a 4080 at minimum, preferably a 4090, at least for 4K with the new patch.
 
I mean hey, nobody said it's wrong if you have to decrease settings. It's just a shame some cards have to do it earlier than others, especially considering the prices paid for them.
 
We're in a bit of a strange transition period: graphics card performance requirements are increasing at a steady pace, but actual on-card VRAM quantities are stagnating. Capacity is tied to the memory bus width each chip is paired with, and GPU designers limit bus width on lower-end cards to avoid larger costs, because the memory controllers/interfaces take up large portions of the GPU silicon die.

Taking into account that APIs like DirectStorage could increase VRAM requirements, along with general increases due to higher graphical fidelity and implementations of ray tracing, there appears to be a movement towards requiring higher VRAM quantities as standard. However, up to and including the current generation of cards, GDDR6 module capacities are limited to 8 Gb (1 GB) and 16 Gb (2 GB).

http://hole-in-my-head.blogspot.com/2023/01/next-gen-pc-gaming-requirements-2022.html?m=1

Spot on. I've been saying this about GDDR6 for a while, especially since the higher-end cards came down to 256-bit. Things will make a lot more sense once the next gen of VRAM drops (GDDR7?). Mid-range 128-bit cards will get a healthy 16 GB, while higher-end 192-bit cards will see 24 GB and maybe 12 GB options. The very high xx90/xx80 Ti end will likely be 256-bit with 32 GB.
 
Spot on. I've been saying this about GDDR6 for a while, especially since the higher-end cards came down to 256-bit. Things will make a lot more sense once the next gen of VRAM drops (GDDR7?). Mid-range 128-bit cards will get a healthy 16 GB, while higher-end 192-bit cards will see 24 GB and maybe 12 GB options. The very high xx90/xx80 Ti end will likely be 256-bit with 32 GB.
Yep. GDDR7 should be a major boon. It may translate to lower costs too.
 
Spot on. I've been saying this about GDDR6 for a while, especially since the higher-end cards came down to 256-bit. Things will make a lot more sense once the next gen of VRAM drops (GDDR7?). Mid-range 128-bit cards will get a healthy 16 GB, while higher-end 192-bit cards will see 24 GB and maybe 12 GB options. The very high xx90/xx80 Ti end will likely be 256-bit with 32 GB.

rumors are the Nvidia 5000 series will be using GDDR7
 
If I am not mistaken (it's been about a year since I played), I know I was using over 10GB of memory playing at 4K with maxed-out settings. I cannot remember how much, though. Maybe at 1440p 10GB was fine?

With this new ray tracing patch, I am expecting it to be quite a bit more now.
The verdict is in: the Overdrive patch brings even the 4090 to unplayable frame rates (without DLSS and Frame Gen). It looks like they're crippling GPUs with software these days - anything for the sake of infinite growth/sales. Things will most likely get worse before they get better, as Nvidia has essentially cornered the market for ray tracing; unfortunately, it looks amazing when it's implemented correctly, so they'll continue to get away with their pricing.
 
While the port does have issues, it does run fine on a lot of higher-end systems. Usually the people who can't run it are running lower-end PCs.

That's why I ignore Steam reviews. A lot of those people bitching are running 1060/2060/3050 cards expecting it to run at high settings.
It's not fine. I'm running a 9900KS @ 5GHz, a 2070S 8GB, and 32GB of RAM at 3440 x 1440 on the Medium preset, and it has crashed 3-4 times. I'm waiting for a fix; it's simply poorly optimized.
 
It's not fine. I'm running a 9900KS @ 5GHz, a 2070S 8GB, and 32GB of RAM at 3440 x 1440 on the Medium preset, and it has crashed 3-4 times. I'm waiting for a fix; it's simply poorly optimized.
No, it sounds like you need more VRAM to play at 1440p. Go down to 1080p and you should be fine.

Also, the latest Nvidia drivers were for the 3000 series only. It's possible they haven't patched for the 2000 series, since it's now 2 generations old.

You can't say it's unoptimized when there are too many variables in your setup.
 
No, it sounds like you need more VRAM to play at 1440p. Go down to 1080p and you should be fine.

Also, the latest Nvidia drivers were for the 3000 series only. It's possible they haven't patched for the 2000 series, since it's now 2 generations old.

You can't say it's unoptimized when there are too many variables in your setup.
I'm not dropping to 1080p. I just got the AW3423DWF last week; I shouldn't have to play at low settings for the game to run without crashes, even with a 2070S. I know I need a GPU upgrade. I'm waiting to see how the 4070s perform when they release on the 13th, although I may bite the bullet on a 4070 Ti in the hope it can last 3-4 years at this resolution; I have no interest in gaming at 4K. At some point I'll do a full upgrade, most likely to a 13600K, but the 9900KS should still be fine for this monitor.
 
I'm not dropping to 1080p. I just got the AW3423DWF last week; I shouldn't have to play at low settings for the game to run without crashes, even with a 2070S. I know I need a GPU upgrade. I'm waiting to see how the 4070s perform when they release on the 13th, although I may bite the bullet on a 4070 Ti in the hope it can last 3-4 years at this resolution; I have no interest in gaming at 4K. At some point I'll do a full upgrade, most likely to a 13600K, but the 9900KS should still be fine for this monitor.
And with the 4070 Ti's 12GB, you might run into issues very quickly. At your resolution I would not get a card under 16GB, or you will be running into the same issues you are having now.

Three recent major game releases are making 8GB cards look bad. It's not that the GPU can't handle the performance; it's just running out of VRAM.

IMO you will be in the same boat if you get a 4070 Ti. Your best bet is a 6950 XT, an RTX 4080, or a 7900 XT/XTX.
 
And with the 4070 Ti's 12GB, you might run into issues very quickly. At your resolution I would not get a card under 16GB, or you will be running into the same issues you are having now.

Three recent major game releases are making 8GB cards look bad. It's not that the GPU can't handle the performance; it's just running out of VRAM.

IMO you will be in the same boat if you get a 4070 Ti. Your best bet is a 6950 XT, an RTX 4080, or a 7900 XT/XTX.
If that is going to be the case, I'd just say screw Nvidia and grab a 7900 XT, even if it means missing out on RT until UE5 becomes more mainstream. I'm going to wait a bit and see how things pan out over the next couple of months; forget spending $1,500 on a GPU, lol. I'm still able to run the vast majority of games just fine on the 2070S.
 
No, it sounds like you need more VRAM to play at 1440p. Go down to 1080p and you should be fine.

Also, the latest Nvidia drivers were for the 3000 series only. It's possible they haven't patched for the 2000 series, since it's now 2 generations old.

You can't say it's unoptimized when there are too many variables in your setup.
People were saying similar things when Arkham Knight was ported to PC. It most likely needs to be patched quite a bit, imo, and there are a lot of variables in any PC build; that comes with the territory.
 
And with the 4070 Ti's 12GB, you might run into issues very quickly. At your resolution I would not get a card under 16GB, or you will be running into the same issues you are having now.

Three recent major game releases are making 8GB cards look bad. It's not that the GPU can't handle the performance; it's just running out of VRAM.

IMO you will be in the same boat if you get a 4070 Ti. Your best bet is a 6950 XT, an RTX 4080, or a 7900 XT/XTX.
Well, 12GB of GDDR6X is a big difference compared to 8GB of GDDR6, plus the L2 cache on the 4070 Ti is far larger than on a 2070S. The 192-bit bus is a big head-scratcher for sure, since it makes the overall bandwidth barely more than my 2070S's. I'm not quite sure how the L2 cache really affects VRAM performance.
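On the bandwidth point, the raw numbers do back that up. A quick comparison using the commonly listed memory specs for both cards (the L2 effect isn't captured by this arithmetic):

```python
# Raw memory bandwidth: GB/s = bus width (bits) * per-pin rate (Gbps) / 8.
def bw_gb_s(bus_bits: int, gbps: float) -> float:
    return bus_bits * gbps / 8

print(bw_gb_s(256, 14))  # 448.0 GB/s - RTX 2070 Super (256-bit GDDR6 @ 14 Gbps)
print(bw_gb_s(192, 21))  # 504.0 GB/s - RTX 4070 Ti (192-bit GDDR6X @ 21 Gbps)
```

The much larger L2 cache on the 4070 Ti means fewer of its memory requests ever reach VRAM at all, which is how it gets away with the narrower bus despite the similar raw bandwidth.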
 
And with the 4070 Ti's 12GB, you might run into issues very quickly. At your resolution I would not get a card under 16GB, or you will be running into the same issues you are having now.

Three recent major game releases are making 8GB cards look bad. It's not that the GPU can't handle the performance; it's just running out of VRAM.

IMO you will be in the same boat if you get a 4070 Ti. Your best bet is a 6950 XT, an RTX 4080, or a 7900 XT/XTX.

Let's not get ahead of ourselves and sacrifice huge performance differences in the name of VRAM panic. 6 GB first became a real issue at the high end around early 2020, with games like Wolfenstein, Doom Eternal, Cyberpunk, and a couple of others. That was about 3 years ago, and only now is 8 GB seeing a handful of cases where VRAM holds back an 8 GB card.

We should be fine with 12 GB cards for a long while, especially at the performance level of current 12 GB cards.

REMEMBER: it's not how much VRAM a card has, but how much VRAM a card has in relation to its performance. I.e., nobody with an RX 580 or GTX 1070 needs to worry about VRAM just because those cards have "only" 8 GB.
 
Let's not get ahead of ourselves and sacrifice huge performance differences in the name of VRAM panic. 6 GB first became a real issue at the high end around early 2020, with games like Wolfenstein, Doom Eternal, Cyberpunk, and a couple of others. That was about 3 years ago, and only now is 8 GB seeing a handful of cases where VRAM holds back an 8 GB card.

We should be fine with 12 GB cards for a long while, especially at the performance level of current 12 GB cards.

REMEMBER: it's not how much VRAM a card has, but how much VRAM a card has in relation to its performance. I.e., nobody with an RX 580 or GTX 1070 needs to worry about VRAM just because those cards have "only" 8 GB.
That is correct, because by today's standards those cards are for playing games at low/medium settings, where 8GB of VRAM is enough.

Again, with the newer-generation cards, it's not the GPU performance that's the issue; it's the VRAM needed to play at higher-fidelity settings. You aren't trying to max out game settings with a 580/1070.
 
That is correct, because by today's standards those cards are for playing games at low/medium settings, where 8GB of VRAM is enough.

Again, with the newer-generation cards, it's not the GPU performance that's the issue; it's the VRAM needed to play at higher-fidelity settings. You aren't trying to max out game settings with a 580/1070.
12GB is plenty for the foreseeable lifespan, performance-wise, of the 4070/4070 Ti.
 
That is correct, because by today's standards those cards are for playing games at low/medium settings, where 8GB of VRAM is enough.

Again, with the newer-generation cards, it's not the GPU performance that's the issue; it's the VRAM needed to play at higher-fidelity settings. You aren't trying to max out game settings with a 580/1070.
We established that 8GB is not enough for today's high-end cards. I was just responding to you saying that 12GB might not be enough in a high-end card, to the point of dismissing the 4070 Ti.
 
We established that 8GB is not enough for today's high-end cards. I was just responding to you saying that 12GB might not be enough in a high-end card, to the point of dismissing the 4070 Ti.
Do you think it will be sufficient at 3440 x 1440, especially considering it can run ray tracing, DLSS 3, and Frame Gen for casual single-player titles? Ideally I would wait for the 50 series and GDDR7, but that won't be for at least 18-24 months, maybe longer for mid-to-high-tier gaming cards.
 