How does an LG OLED solve the 4:3 sizing problem?
This illustrates it very well. See how the width of a rotated 42" 16:9 monitor is almost the same as the width of a non-rotated 25" 4:3 CRT.
http://www.displaywars.com/42-inch-9x16-vs-25-inch-4x3
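Quick sanity check of that comparison with simple geometry: for a diagonal d and aspect ratio w:h, the panel width is d·w/√(w² + h²). A rough Python sketch (my own arithmetic, not numbers from the linked page):

```python
import math

def width_inches(diagonal: float, w: int, h: int) -> float:
    """Width of a display with the given diagonal and w:h aspect ratio."""
    return diagonal * w / math.hypot(w, h)

crt_4x3 = width_inches(25, 4, 3)     # non-rotated 25" 4:3 CRT
oled_9x16 = width_inches(42, 9, 16)  # 42" 16:9 panel rotated to portrait (9:16)

print(f"25in 4:3 width:          {crt_4x3:.1f} in")    # 20.0 in
print(f"42in rotated 16:9 width: {oled_9x16:.1f} in")  # ~20.6 in
```

So the rotated 42" panel is only about half an inch wider than the old 25" 4:3 tube.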
Ah, so you run it in portrait, letterboxed?
> I'd buy it.

Back in the day the Abit boards were known for this. They were light on features, but fast and pretty reliable.
In just about everything and anything. I'll pay more money for quality/reliability/sturdiness, but I will always opt for the product with the least number of "features", as they always cause problems.
There ought to be enough of us that there is at least a market for these products.
But it is true. In general the consumer is the worst enemy of good product design. The consumer/customer is the lowest common denominator.
Interesting trivia: if you use the Windows built-in desktop rotation feature, it actually doesn't work with G-Sync. Good job, Nvidia/Microsoft. You have to rely on the software's own rotation.
> Back in the day the Abit boards were known for this. They were light on features, but fast and pretty reliable.

Abit was my favorite brand back then.
> I don't get what you mean. Inventing what problems? What did you do with your 4830 and 7870?

Not needing HDMI 2.1.
> This illustrates it very well. See how the width of a rotated 42" 16:9 monitor is almost the same as the width of a non-rotated 25" 4:3 CRT.
> http://www.displaywars.com/42-inch-9x16-vs-25-inch-4x3

My 32" LCD used vertically approximates the width of a 20" 4:3 CRT in my cabinet. It works very well, as it also doubles as a pinball machine with the marquee/backboard on a second horizontal monitor. So many of the Golden Age games run vertical and IMO look pathetic on a horizontal 16:9.
> Welcome to why I've had a GeForce 710 for years. One GPU I can move around as necessary, but generally it hangs out in the box, not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.

Are they really missing anything here though? They aren't making it an APU; it sounded like a very basic iGPU, and I think it's part of the I/O die, so are you really missing anything here considering the I/O feature set?
> Welcome to why I've had a GeForce 710 for years. One GPU I can move around as necessary, but generally it hangs out in the box, not drawing idle power and not wasting transistors on my CPU. The extra die space on the CPU could, if nothing else, be better used for cache.

It's on the I/O die, completely separate from the CPU; its presence doesn't detract at all from the processor. Depending on how AMD leverages it, it could be a benefit: there are a number of I/O operations that could benefit from GPU acceleration that it could handle independently and ahead of the CPU cores.
> Because the 42" mounted vertically in a cabinet just happens to fit 25" 4:3 cabinets perfectly and ends up being roughly the same size when you rotate the display of the game.

Bingo!
> Are they really missing anything here though? They aren't making it an APU; it sounded like a very basic iGPU, and I think it's part of the I/O die, so are you really missing anything here considering the I/O feature set?

It's a feature I'll literally never use, and I'll disable it as far as possible to save power. I'm simply going to pair a high-end processor with a high-end video card. Hopefully this new iGPU is on its own power plane so it can be completely powered down.
I guess we can agree to disagree.
> Again, if they didn't include it, that doesn't mean you would get more CPU functions. Having an iGPU doesn't hurt anyone in any way. I normally don't have a spare GPU, and an iGPU is always helpful for troubleshooting.

I would say it's detrimental if it uses any power when I have a discrete card installed.
> I would say it's detrimental if it uses any power when I have a discrete card installed.
> Again: it's a feature I'll literally never use. It's dead transistor space *at best*. At worst, it's dead transistor space that uses power and increases heat. Nearly any other use for that transistor count would be better.

Unless the BIOS/UEFI and/or APU architecture are crap, the iGPU should be able to be fully disabled in BIOS/UEFI.
> I would say it's detrimental if it uses any power when I have a discrete card installed.

The new I/O die and iGPU likely use less power and generate less heat than the previous I/O controller alone.
> This will be the first Ryzen CPU with an iGPU; they obviously decided it was a good decision to include it. I'm sure it will use less power than your RGB bling, or a single fan, which nobody seems to concern themselves with regarding power consumption.

Some of us do actually concern ourselves with power usage, down to tenths of a watt…
> Some of us do actually concern ourselves with power usage, down to tenths of a watt…
> https://hardforum.com/threads/distributed-computing-on-raspberry-pi.1997998/#post-1044639489

On a 170-230 W part?
> Some of us do actually concern ourselves with power usage, down to tenths of a watt…
> https://hardforum.com/threads/distributed-computing-on-raspberry-pi.1997998/#post-1044639489

I get it, but if 0.1 watts really matters, I'm not sure x86 is the right place to be doing your computing.
> Something just occurred to me here for the eventual APU releases.
> (Warning: it's late and this may be stupid.)
> When they want to release a series with beefed-up graphics, do you think they will just change out the I/O die?
> I have to imagine it's easier to build a GPU and then add on the I/O logic than it is to design a CPU and also cram a GPU up inside it.
> Just a random 1am thought.

I always assumed they'd do a GPU chiplet and have it communicate with the CPU through the I/O die. The GPU portion of things tends to be pretty big; I'm not sure putting it in the I/O die would yield well enough.
I've wished for an igpu several times when having issues. I'm glad AMD are including it.
> On a 170-230 W part?

Yes. I power limit my 5950s via eco mode, power limit my video cards via nvidia-smi, and disable as much of the junk that comes with motherboards as possible. Points per watt is the key performance measurement when you're running distributed computing 24x7x365.
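For anyone curious, the points-per-watt bookkeeping is a one-liner; the PPD and wattage figures below are made-up placeholders, not measurements from my rigs:

```python
def points_per_watt(points_per_day: float, avg_watts: float) -> float:
    """Efficiency metric for 24/7 distributed computing: output per watt drawn."""
    return points_per_day / avg_watts

# Hypothetical example: a power-limited CPU loses some throughput but
# draws far less, so its efficiency goes up.
stock = points_per_watt(points_per_day=600_000, avg_watts=220)  # stock limits
eco = points_per_watt(points_per_day=520_000, avg_watts=105)    # eco mode

print(f"stock: {stock:.0f} pts/W, eco: {eco:.0f} pts/W")
```

A modest throughput loss can still be a large win on this metric, which is why the power limiting is worth the trouble.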
> I get it, but if 0.1 watts really matters, I'm not sure x86 is the right place to be doing your computing.

I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here: getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.
> I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here: getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.
> The issue with the Raspberry Pis (as well as most consumer ARM hardware) is that they use 28nm for those chips. ARM is pretty awesome at efficiency, but a well-tweaked 5950 is more efficient in points per watt than a Raspberry Pi is. If I could buy a few of these (https://d1o0i0v5q5lp8h.cloudfront.n...uments/Altra_Max_Rev_A1_DS_v1.00_20220331.pdf) like I can a 5950, I would likely be running ARM.
> Edit: Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards. If the transistor count can't be used for something useful, I'd prefer they just left it out. That simple GPU in the new Ryzens is certainly a multi-million-transistor expansion - it would be so much better as something generally useful, even if it was some DRAM for the I/O. If it can't be something useful, just leave it off - an iGPU is NOT useful on a high-end CPU.

It is useful. It's just not useful to you.
> I always assumed they'd do a GPU chiplet and have it communicate with the CPU through the I/O die. The GPU portion of things tends to be pretty big; I'm not sure putting it in the I/O die would yield well enough.

But the I/O die is now pretty small, and now on the same process as the 6x50-series GPUs. So I just thought, well, I mean, it's there; just slap the I/O on a GPU and call it done.
> Now imagine you're doing this over and over again - disabling iGPUs, disabling onboard sound, disabling disco lights, disabling USB, disabling SATA, etc, etc, etc. You might start to get a little bit bitter at all the literal crap the manufacturers include that would be better left as expansion cards.

One can imagine that, but one would have a hard time imagining doing it on a desktop PC. How many kWh a year are we talking?
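The arithmetic there is simple enough to sketch: a constant draw in watts times 8760 hours per year, divided by 1000 (the draw values below are just examples, not measurements of any particular component):

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Energy used over a year by a constant draw of `watts` watts."""
    return watts * HOURS_PER_YEAR / 1000.0

# Example draws: a tenth of a watt, an idle LED strip, an idle controller.
for draw_w in (0.1, 1.0, 5.0):
    print(f"{draw_w:>4} W constant -> {annual_kwh(draw_w):6.2f} kWh/year")
```

So 0.1 W running 24/7 is under 1 kWh a year, while a handful of 5 W leftovers adds up to tens of kWh; whether that matters is exactly what's being argued here.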
> I mean, that link WAS to power-tweaking Raspberry Pis, and it was doing the same basic thing that I'm talking about here: getting rid of the junk. Disable as much of the iGPU as possible. Turn off the LEDs. Reduce networking to 100 Mbit. Etc.

There are a lot of compute tasks where a GPU will be more efficient than a CPU. In a compute environment such as that, why not leverage them and actually save on overall consumption?
> It is useful. It's just not useful to you.

I haven't used SATA in most of my systems for years now. The only system that has SATA drives is my FreeNAS; I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…
Also, why would I disable SATA, USB, and sound? Should those all be expansion cards? What is this, the '90s and '00s still?
> There are a lot of compute tasks where a GPU will be more efficient than a CPU. In a compute environment such as that, why not leverage them and actually save on overall consumption?

I already have an X080 card in my system anyway, and I fully support offloading to those when it makes sense. For example, I run Folding@home on my GPU because it's way more efficient than running it on the CPU.
> I haven't used SATA in most of my systems for years now. The only system that has SATA drives is my FreeNAS; I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…
> Sound-wise, I have never used integrated sound in any system I've had - and that goes back to my Pentium 2s. Back then I used Sound Blasters; now I just use my headset, which has it integrated already. Integrated sound is, and has always been, dead transistors.
> As for USB, I disable it in all my Pis and any headless system, as I do management remotely anyway. In the box that I also use for gaming, I only need 2 USB ports. When the MB allows it, I'll disable additional controllers.

OK, well I feel like you are an edge case and don't speak for the majority here. Not sure why anybody would otherwise give two shits about a tenth of a watt for their gaming PC. Not saying you're wrong or anything, just saying I guess we'll agree to disagree on what's useful or not.
> I haven't used SATA in most of my systems for years now. The only system that has SATA drives is my FreeNAS; I put exclusively NVMe in everything else, so SATA gets disabled and has become dead transistors. It wouldn't bother me in the slightest to have to buy a SATA controller for my NAS, or buy a specific NAS MB. Wanting SATA in every MB at this point would be similar to wanting a built-in IDE controller on your MB…

All of that is effectively built into the CPU - even if you don't want it, it's on the CPU die, because a lot of folks ~do~ - and the majority of users aren't the folks on [H] either way.
> OK, well I feel like you are an edge case and don't speak for the majority here. Not sure why anybody would otherwise give two shits about a tenth of a watt for their gaming PC. Not saying you're wrong or anything, just saying I guess we'll agree to disagree on what's useful or not.

This. Folding/WCG is still only a niche use case (and I was a member of the team here for a while). He has an edge case - I have the opposite edge case.