Removing the ability of a CPU/GPU to multiply FP16, int8, etc. matrices while still being good at raster will be hard to do; I'm not sure what it would even mean, every PC ever sold can do it, no?
https://calsci.com/Benchmarks.html
An old benchmark of training a neural network on 8086 through Pentium II CPUs.
The first one was the forest and nature one:
https://www.youtube.com/watch?v=rTX1wtw606s
This one, the city, seems to have stepped it up.
I would not expect them to match Rockstar or anything (or deliver incredible performance for what they render, like the Call of Duty teams), but for a small studio, for a...
The graphics may not look like a big jump, but I suspect there is twice the complexity (the city footage, for example, seems a big jump in complexity), and the first one held up really well over time anyway, with hardware evolution catching up so it could run well.
And there is something about looking...
But how can a brightness level that is fine in a sunny room or outside also be fine in low-light indoor conditions? One or the other would be too bright or not bright enough; it is impossible for that not to be the case. Are you sure you do not have "Automatically adjust brightness" set to on, that...
I would imagine a patch is needed for the 30 or 60 fps modes; on a free-floating performance mode, when one exists, maybe it will be automatic, but the 30/60 modes will have it hardcoded.
https://www.theverge.com/2024/4/16/24131799/sony-ps5-pro-enhanced-requirements-ultra-boost-mode
Developers...
They already do quiet revisions like that for newer PS5 models over time; they have shipped a smaller TSMC N6 SoC in lighter consoles for a while, I think:
https://press-start.com.au/news/playstation/2022/08/29/new-ps5-model-cfi-1202b-cfi-1202a/
Launch: Disc: 4.5kg / Digital: 3.9kg
2021 Revision...
I am not much of a laptop user, but wouldn't a brightness level that works well outside be too bright inside? The laptop I am using to type this right now is almost at its minimum brightness, since there is no strong light around. 50% is probably an easy setting for everyone to...
But the question was about that point that was made:
a company claims they can't provide raises, that union(izing) demands are exorbitant, that they HAVE to raise prices, lower support, outsource, do some other worker- or customer-unfriendly thing... but it was found out that money they could...
There is a competitive push to hide stuff from the competition while wanting to flash that they are the best; I imagine they often exaggerate by cherry-picking to look more advanced than they are (or even the people in suits)....
With a stock buyback, the money goes out to the same people, no?
How would any of it be different if they had spent the same money, going to the same people, as a dividend instead of a buyback?
I doubt that is the case for a realistic day's workload (heavy load would be quite niche; who runs all-core tasks all day on battery?). Obviously, if you find a way to use the same amount of watts on two computers with similar batteries, you end up with similar battery life. Apple will shine in how much...
The amount of difference people see between a company doing a stock buyback vs. a dividend was always quite strange to me.
Comcast gives 4-5B a year in dividends as well... why do the others look specially at buybacks here? Outside the option of timing your gains as a stockholder, which can have tax...
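The equivalence point above can be sketched with toy numbers (all hypothetical: a frictionless market, taxes ignored, a $1,000 company with 100 shares returning $100 to shareholders either way):

```python
def dividend(market_cap, shares, payout):
    # Every share gets payout/shares in cash; the firm is worth that much less.
    price_after = (market_cap - payout) / shares
    # Shareholder value = the (cheaper) share plus the cash received.
    return price_after + payout / shares

def buyback(market_cap, shares, payout):
    # The firm repurchases payout/price shares at the pre-payout price.
    price = market_cap / shares
    repurchased = payout / price
    # Remaining holders keep a share worth the smaller firm / fewer shares.
    return (market_cap - payout) / (shares - repurchased)

print(dividend(1000, 100, 100))  # 10.0  ($9 share + $1 cash)
print(buyback(1000, 100, 100))   # 10.0  (share still worth $10)
```

Either way a shareholder ends up with $10 of value per original share; the visible differences are mostly about who realizes cash when, which is where the tax-timing angle comes in.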
It's compilation in general; Chromium, the Linux kernel, etc. are just common source trees for them to build, like the BMW scene is for Blender.
It is not something many people do, but among the people interested in a 16-core CPU, yes: software rendering, compression and other tasks that scale well.
Turn...
But are there many workloads that use 16 cores that would not use 24? I.e., if more than 12 has any use... if we see a reason to go beyond 12, the same reason would apply to 24.
People interested in the 16-core are doing stuff that scales when going from 12 to 16 (or they would have bought the 12-core):
And...
More than 8 threads are common now (my cheap Acer laptop has 12 cores / 16 threads); 12 will tend to be the frequent minimum, and consoles have had 16 threads since 2020.
And if your code can take advantage of 10 threads but not 18, it was probably quite the challenge to reach 10 already, and it stopping at...
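The diminishing-returns intuition above can be sketched with Amdahl's law (the 90% parallel fraction is a made-up figure for illustration, not a measurement of any real program):

```python
def speedup(p, n):
    # Amdahl's law: a fraction p of the work parallelizes across n threads,
    # the remaining (1 - p) stays serial.
    return 1.0 / ((1.0 - p) + p / n)

# A hypothetical program that is 90% parallel:
for n in (10, 18, 24):
    print(n, round(speedup(0.9, n), 2))
# 10 threads -> ~5.26x, 18 -> ~6.67x, 24 -> ~7.27x:
# each extra core past ~10 buys noticeably less than the one before it.
```

Which is the point: a workload hard-capped at exactly 10-but-not-18 threads is unusual; code either parallelizes well (and keeps scaling to 24) or it stalled well before 10.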
Maybe, but isn't AVX-512 virtually unused in something like Cinebench 2024? And in a long list (most?) of things. Were the gains coming from there, they would not show up much.
I think pre-delay they would have been close to the latest process (had N3 happened in 2026 and had N4 launched in 2024 as planned, instead of maybe 2025):
N4-2025 instead of 2024
N3-2028 instead of 2026
Still an advanced node, and obviously only the previous one, but not latest-Apple level.
The screens come from another laptop and a tablet.
If you ssh (or similar) into your work computer/server session, infinite? If you write text or HTML; Final Cut/Premiere are able to do quite a bit on 8GB, especially if you do not mind skipping 4K.
Could be. Considering it is purely a "choice" that will be decided by supply capacity, AMD, etc., it is very well possible, especially since the gap between the top popular card and the xx80 is quite big this time around; it would be easy to make a new 5070 that beats the 4080 by a good amount to be super...
Yes, exactly; we see xx70-class cards matching the 3090 Ti and beating the 3090... even at 4K? It was the same with the 2080 Ti vs the 3070: in some scenarios the 2080 Ti was still a bit faster, but those two cards were in general in the same performance class (the kind where you need to see numbers to spot a small...
Have you set the single monitor you want (the middle one) as the main monitor?
If your monitors in the real world are 1-2-3, I am not sure why you would want to rearrange anything...? They are already in the right order. Could you take a screenshot of your Display setup and tell us which order they...
The title ;). Even in the small summary from the source talking about it, they do not say that:
Insider Gaming, the first to report the full PS5 Pro specs, suggests a potential release during the 2024 holiday period.
https://insider-gaming.com/ps5-pro-specs-2024/
Insider Gaming can...
I think there is some use case here: people who like the Apple ecosystem and want something similar to a Chromebook or an iPad with a keyboard for typing, or who will remote-connect to their (work) computer for anything heavy. Imagine you are a Google employee; many teams work mostly on remote...
Shuffling them around should not change the monitor numbers (maybe setting one as primary does that, and/or changing the port you connect them to).
I just mean: if you shuffle them to be, say, 3-1-2, when you click Identify, is the number that pops up on the left screen the number 3...
For a while there was a demo, and it had interesting differences from many typical games of the genre.
One thing those games tend not to get right is how common working from home was back then: people lived at their bakery, they did not commute to work (which I imagine could be for gameplay reasons, making...
Well, if we say that calling something Xbox 5 when there never was an Xbox 4 is not stupid and not confusing: was the Xbox 5 released before or after the Xbox X (or the PlayStation 6...)? Which one is better... impossible to know.
They could bite the bullet and start counting correctly from there at least, I agree.
I think they do the moment they go to manually configure one on Dell, but:
https://www.dell.com/en-us/shop/dell-laptops/xps-15-laptop/spd/xps-15-9530-laptop/usexchcto9530rpl07?ref=variantstack
going from 16 to 32 GB is $200, a giant amount for 16 GB of DDR5-4800.
Apple charges double: $200 per 8 GB...
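For the arithmetic, a quick sketch using only the upgrade prices quoted above:

```python
def per_gb(price_usd, gigabytes):
    # Upgrade price divided by the extra memory it buys.
    return price_usd / gigabytes

dell = per_gb(200, 16)    # Dell: $200 for +16 GB -> $12.50/GB
apple = per_gb(200, 8)    # Apple: $200 for +8 GB -> $25.00/GB
print(dell, apple, apple / dell)  # 12.5 25.0 2.0
```

So at these list prices Apple's upgrade is exactly 2x Dell's per gigabyte, even though Dell's own $200 markup is already far above the retail cost of a DDR5 SO-DIMM.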
I do not imagine DX still has any marketing value; it became seen as normal to have some 3D API going on. We will have been on DX12 for a decade by next summer, so the "DirectX box" origin is a bit meaningless now, and being from Microsoft was never a big plus (for devs, Best Buy and publishers, a giant one...
Even at 4K, the 1080 was 70% faster than the 980, and Pascal is a legendary release.
How much Lovelace gained over Ampere could have set really high expectations (the 3070 Ti uses all of a 392 mm² GA104 with a 256-bit bus; a 4080 Super has a 379 mm² die on 256 bits, both running at 300 W; the 4080 Super is...
It can make the "same-gen mid-cycle" release challenging to distinguish by name from the "new gen", if they want such a notion to still exist.
An "Xbox 2025 X" that releases in 2028 sounds strange.
A very simple thing I do, if it matches your use case and you are on Windows:
1) Have the monitor that would be the only one in the 1-screen scenario set as your main monitor.
2) Windows + P, "PC screen only" will put you in 1-monitor mode.
3) Windows + P, "Extend" will put you back to your 3-monitor like...
Not sure if that was for technical reasons rather than an aesthetic choice (you can always make night look as much like day as you want in your shaders). Night can hardly be too dark if there is no strong moon; it is normal to see nothing at all on a moonless night and not be able to do tasks, especially if there is any...
This sounds almost impossible.
Xbox Series X 2 sounds a bit stupid, and anything that follows the Xbox Series X that is not Xbox Series X 2 will, as always, be confusing.
Xbox
Xbox 360
Xbox One, One S and One X
Xbox Series X and Series S
What could the next generation name be that would not be...
Maybe on training, but on inference, those cards being good at it will be a selling point for games and not something you necessarily want to gimp. Whatever the Tensor cores on the 5000 series turn out to be could open interesting DLSS and other use cases, like their AI-run game demo; they could want the 5060 to...