Non-Gamer, Non-Miner uses for GPU?

DarkSideA8

I've read a bit about folding, where researchers use distributed networks of GPUs to run simulations.

What other uses are there for GPUs or GPU ASICs? i.e. what are they better for than traditional CPUs?
 
Are we just ignoring the whole field of GPU compute?
Wellll - no... but I'm wondering whether a whole lot of cards/ASICs got bought up not for mining, but rather for corporate/military espionage.

I know they're being used in AI and machine learning.
 
GPU compute is a *really* big field. Everything from protein folding and fluid simulation to disease modeling and subsurface seismic wave migration are ongoing, complicated topics of research with non-spooky applications. I don't even wanna know what some government agencies are up to with CUDA and ROCm behind security clearances.
 
I've read a bit about folding, where researchers use distributed networks of GPUs to run simulations.

What other uses are there for GPUs or GPU ASICs? i.e. what are they better for than traditional CPUs?
Render farms would be another obvious and, by now, traditional one.

You can take a look at this list:
https://en.wikipedia.org/wiki/List_of_OpenCL_applications

For something that is maybe what you had in mind:
https://en.wikipedia.org/wiki/Hashcat

Previously, two variants of hashcat existed:

  • hashcat - CPU-based password recovery tool
  • oclHashcat/cudaHashcat - GPU-accelerated tool (OpenCL or CUDA)
With the release of hashcat v3.00, the GPU and CPU tools were merged into a single tool called hashcat. The CPU-only version became hashcat-legacy.[4] Both CPU and GPU now require OpenCL.

From what I understand of GPUs and my limited coding on them, they become incredibly better than CPUs when you can apply the same operation to a lot of different values at the same time.

GPUs are built around taking every vertex in a scene and applying the same transformation matrix to all of them, to work out where they need to be for the camera position/orientation. They do not tend to handle if conditions, branch prediction, or small numbers of operations that have to wait on other results very well; what they love is doing a giant amount of very similar operations in brute parallel at the same time.
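To make that concrete, here is a minimal CUDA sketch of that exact pattern (the names, sizes, and the transform are made up purely for illustration): one thread per vertex, every thread applying the same matrix, no branching and no waiting on any other thread's result.

// vertex_transform.cu -- toy illustration of the "same operation on many values" pattern.
// Build (assuming a CUDA toolkit is installed): nvcc -O2 vertex_transform.cu -o vertex_transform
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// The one shared 4x4 transform (row-major), visible to every thread.
__constant__ float M[16];

// One thread per vertex; every thread performs the identical arithmetic.
__global__ void transformVertices(const float4* in, float4* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float4 v = in[i], r;
    r.x = M[0]*v.x  + M[1]*v.y  + M[2]*v.z  + M[3]*v.w;
    r.y = M[4]*v.x  + M[5]*v.y  + M[6]*v.z  + M[7]*v.w;
    r.z = M[8]*v.x  + M[9]*v.y  + M[10]*v.z + M[11]*v.w;
    r.w = M[12]*v.x + M[13]*v.y + M[14]*v.z + M[15]*v.w;
    out[i] = r;
}

int main()
{
    const int n = 1 << 20;                          // ~1M vertices
    float4 p; p.x = 1.f; p.y = 2.f; p.z = 3.f; p.w = 1.f;
    std::vector<float4> h(n, p);

    // A simple translation: move every vertex by (10, 0, 0).
    float hM[16] = {1,0,0,10, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    cudaMemcpyToSymbol(M, hM, sizeof(hM));

    float4 *d_in, *d_out;
    cudaMalloc(&d_in,  n * sizeof(float4));
    cudaMalloc(&d_out, n * sizeof(float4));
    cudaMemcpy(d_in, h.data(), n * sizeof(float4), cudaMemcpyHostToDevice);

    transformVertices<<<(n + 255) / 256, 256>>>(d_in, d_out, n);

    cudaMemcpy(h.data(), d_out, n * sizeof(float4), cudaMemcpyDeviceToHost);
    printf("vertex 0 -> (%.1f, %.1f, %.1f)\n", h[0].x, h[0].y, h[0].z);  // expect (11.0, 2.0, 3.0)

    cudaFree(d_in); cudaFree(d_out);
    return 0;
}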
 
GPU compute is a *really* big field. Everything from protein folding and fluid simulation to disease modeling and subsurface seismic wave migration are ongoing, complicated topics of research with non-spooky applications. I don't even wanna know what some government agencies are up to with CUDA and ROCm behind security clearances.
Accelerated analysis of something as simple as which coupons generate the most complementary sales - e.g., how much mayo/jelly/butter/bologna/peanut butter/etc. a $0.99 loaf of bread sells - at scale.

You'd want to know how different regions buy before you sign contracts for the condiments promoted through your loss-leader in-store bakery. Don't want to forget ketchup if that ends up being a major condiment.
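As a rough sketch of how that kind of co-purchase tally could map onto a GPU (the item IDs, basket layout, and sizes below are entirely made up): each basket gets its own thread, and the counts accumulate with atomic adds.

// copurchase.cu -- toy sketch: count how often each item co-occurs with a promoted item.
#include <cstdio>
#include <cstdlib>
#include <vector>
#include <cuda_runtime.h>

constexpr int NUM_ITEMS  = 1024;   // size of the (toy) item catalogue
constexpr int PROMO_ITEM = 42;     // e.g. the $0.99 loaf of bread

// Each thread scans one basket: if it contains the promoted item,
// atomically bump the counter of every other item in that basket.
__global__ void coPurchase(const int* items, const int* offsets, int numBaskets, int* counts)
{
    int b = blockIdx.x * blockDim.x + threadIdx.x;
    if (b >= numBaskets) return;

    int start = offsets[b], end = offsets[b + 1];
    bool hasPromo = false;
    for (int i = start; i < end; ++i)
        if (items[i] == PROMO_ITEM) { hasPromo = true; break; }
    if (!hasPromo) return;

    for (int i = start; i < end; ++i)
        if (items[i] != PROMO_ITEM)
            atomicAdd(&counts[items[i]], 1);
}

int main()
{
    // Fake data: ~1M baskets of 4 random items each, stored flat with an offsets array.
    const int numBaskets = 1 << 20, perBasket = 4;
    std::vector<int> items(numBaskets * perBasket), offsets(numBaskets + 1);
    for (int b = 0; b <= numBaskets; ++b) offsets[b] = b * perBasket;
    for (size_t i = 0; i < items.size(); ++i) items[i] = rand() % NUM_ITEMS;

    int *d_items, *d_offsets, *d_counts;
    cudaMalloc(&d_items,   items.size()   * sizeof(int));
    cudaMalloc(&d_offsets, offsets.size() * sizeof(int));
    cudaMalloc(&d_counts,  NUM_ITEMS      * sizeof(int));
    cudaMemcpy(d_items,   items.data(),   items.size()   * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(d_offsets, offsets.data(), offsets.size() * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemset(d_counts, 0, NUM_ITEMS * sizeof(int));

    coPurchase<<<(numBaskets + 255) / 256, 256>>>(d_items, d_offsets, numBaskets, d_counts);

    std::vector<int> counts(NUM_ITEMS);
    cudaMemcpy(counts.data(), d_counts, NUM_ITEMS * sizeof(int), cudaMemcpyDeviceToHost);

    // Report the item most often bought alongside the promoted item.
    int best = 0;
    for (int i = 1; i < NUM_ITEMS; ++i) if (counts[i] > counts[best]) best = i;
    printf("item %d was co-purchased %d times with item %d\n", best, counts[best], PROMO_ITEM);

    cudaFree(d_items); cudaFree(d_offsets); cudaFree(d_counts);
    return 0;
}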

You can also do promos like https://hotpocketsforbits.com/ and get better geotargeting, along with associating the product with new things like streaming. Sounds funny, but Hot Pockets has to retarget or slip into a Boomer-only, no-sales state like Malt-O-Meal, Miracle Whip, or Steak-umm.
 
The noise canceling in Nvidia Broadcast (requires an RTX GPU) is pretty dang good.
 
Computational machine learning and artificial intelligence. GPUs are very effective at training deep neural networks on all sorts of datasets, as well as many other techniques people use such as for instance Gaussian processes. There's lots of ongoing work in various areas like computer vision, natural language processing, reinforcement learning and robotics.

Everybody except Google and a small number of industry players who can afford ASICs (Google calls their in-house ones tensor processing units) does their computationally intensive work on GPUs. I'm in the process of trying to buy an RTX 3090 for a new workstation which will be used for academic work in this space.
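For a sense of why GPUs dominate this space: most of the arithmetic in training a network boils down to very large matrix multiplications, which split naturally into thousands of independent threads. Below is a deliberately naive CUDA sketch of that core operation (toy sizes, no tiling; real frameworks hand this off to cuBLAS/cuDNN and tensor cores rather than writing it by hand).

// matmul.cu -- naive GEMM, the workhorse operation behind neural-network training.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// C = A (MxK) * B (KxN); one thread computes one output element.
__global__ void matmul(const float* A, const float* B, float* C, int M, int K, int N)
{
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row >= M || col >= N) return;

    float acc = 0.f;
    for (int k = 0; k < K; ++k)
        acc += A[row * K + k] * B[k * N + col];
    C[row * N + col] = acc;
}

int main()
{
    const int M = 1024, K = 1024, N = 1024;      // a million independent output elements
    std::vector<float> hA(M * K, 1.f), hB(K * N, 2.f), hC(M * N);

    float *dA, *dB, *dC;
    cudaMalloc(&dA, hA.size() * sizeof(float));
    cudaMalloc(&dB, hB.size() * sizeof(float));
    cudaMalloc(&dC, hC.size() * sizeof(float));
    cudaMemcpy(dA, hA.data(), hA.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), hB.size() * sizeof(float), cudaMemcpyHostToDevice);

    dim3 block(16, 16);
    dim3 grid((N + 15) / 16, (M + 15) / 16);
    matmul<<<grid, block>>>(dA, dB, dC, M, K, N);

    cudaMemcpy(hC.data(), dC, hC.size() * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0] = %.1f (expect %.1f)\n", hC[0], 2.f * K);   // every element is 1*2 summed K times

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}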
 
I use mine for Blender rendering as well as transcoding sometimes... I'm sure there are other things, but those are my most common uses.
 
From my colleagues' experience dealing with DNA sequencing, there is a HELL of a lot of GPU use on that front. When you're dealing with literally terabytes of ATCG data, GPUs handle it far better than their 64-core+ workstations ever managed.

They have queued backlogs for weeks.
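A toy sketch of why sequence data parallelizes so well (the read length and data layout here are made up, and real pipelines do far more than this, e.g. alignment and base calling): give every read its own thread and let it churn through its bases independently, here just counting G/C content.

// gc_content.cu -- toy example: per-read GC counting over a flat buffer of A/C/G/T characters.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

constexpr int READ_LEN = 150;   // fixed read length for the toy example

// One thread per read: count how many of its bases are G or C.
__global__ void gcCount(const char* bases, int numReads, int* gc)
{
    int r = blockIdx.x * blockDim.x + threadIdx.x;
    if (r >= numReads) return;

    const char* read = bases + (size_t)r * READ_LEN;
    int count = 0;
    for (int i = 0; i < READ_LEN; ++i)
        count += (read[i] == 'G' || read[i] == 'C');
    gc[r] = count;
}

int main()
{
    const int numReads = 1 << 20;                       // ~1M reads (real datasets: billions)
    std::vector<char> h((size_t)numReads * READ_LEN);
    const char alphabet[4] = {'A', 'C', 'G', 'T'};
    for (size_t i = 0; i < h.size(); ++i) h[i] = alphabet[i % 4];   // fake bases

    char* d_bases; int* d_gc;
    cudaMalloc(&d_bases, h.size());
    cudaMalloc(&d_gc, numReads * sizeof(int));
    cudaMemcpy(d_bases, h.data(), h.size(), cudaMemcpyHostToDevice);

    gcCount<<<(numReads + 255) / 256, 256>>>(d_bases, numReads, d_gc);

    std::vector<int> gc(numReads);
    cudaMemcpy(gc.data(), d_gc, numReads * sizeof(int), cudaMemcpyDeviceToHost);
    printf("read 0: %d/%d bases are G or C\n", gc[0], READ_LEN);

    cudaFree(d_bases); cudaFree(d_gc);
    return 0;
}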
 
I mean the biggest uses would be other 3D stuff. It isn't like games are the only thing we use 3D for. Be it architecture, or machine parts, or fluid dynamics, or anything else, we use computers to model it in 3D and GPUs are a necessity for that.

As for non-graphics uses? Any kind of computation that is very parallel, and doesn't involve an excessive amount of branching. Physics simulations are a big one, programs like Comsol, Ansys, and such all know how to use GPUs to accelerate their calculations because they run well on them.
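For flavor on the physics-sim side (not how COMSOL or Ansys actually implement their solvers - just a toy sketch): an explicit finite-difference heat-diffusion step, where every grid point is updated independently from its neighbors' previous values, so it maps to one thread per point.

// heat2d.cu -- toy explicit finite-difference step for 2D heat diffusion.
// Purely illustrative; commercial solvers use far more sophisticated methods,
// but the "one thread per grid point" shape is typical of GPU physics codes.
#include <cstdio>
#include <utility>
#include <vector>
#include <cuda_runtime.h>

constexpr int NX = 1024, NY = 1024;

// Each thread updates one interior grid point from its four neighbours' old values.
__global__ void heatStep(const float* cur, float* next)
{
    const float alpha = 0.2f;   // diffusion coefficient * dt / dx^2 (toy value, stable below 0.25)
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x <= 0 || y <= 0 || x >= NX - 1 || y >= NY - 1) return;

    int i = y * NX + x;
    next[i] = cur[i] + alpha * (cur[i - 1] + cur[i + 1] +
                                cur[i - NX] + cur[i + NX] - 4.f * cur[i]);
}

int main()
{
    std::vector<float> h(NX * NY, 0.f);
    h[(NY / 2) * NX + NX / 2] = 1000.f;              // a hot spot in the middle

    float *d_a, *d_b;
    cudaMalloc(&d_a, h.size() * sizeof(float));
    cudaMalloc(&d_b, h.size() * sizeof(float));
    cudaMemcpy(d_a, h.data(), h.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, d_a, h.size() * sizeof(float), cudaMemcpyDeviceToDevice);

    dim3 block(16, 16), grid((NX + 15) / 16, (NY + 15) / 16);
    for (int step = 0; step < 1000; ++step) {        // ping-pong between the two buffers
        heatStep<<<grid, block>>>(d_a, d_b);
        std::swap(d_a, d_b);
    }

    cudaMemcpy(h.data(), d_a, h.size() * sizeof(float), cudaMemcpyDeviceToHost);
    printf("centre temperature after 1000 steps: %.3f\n", h[(NY / 2) * NX + NX / 2]);

    cudaFree(d_a); cudaFree(d_b);
    return 0;
}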

nVidia will, unsurprisingly, give you a big list of all the things that are being done with it.

As for people buying up the new cards for those areas? Probably not a ton. Most of the time you want a Quadro or Tesla. This shortage seems to be largely driven by consumer demand. Notice that it isn't just GPUs that are in short supply. The new consoles are sold out everywhere, the new iPhone is backordered to hell and gone. There is a ton of pent up consumer demand caused in no small part by COVID and people are going nuts purchasing new toys.

In the professional arena it is usually a much more orderly purchase process. If you need a big computer with lots of GPU power, you go out for bids, find someone, and get an order in for a system that gets built and delivered many months in the future. For that matter, they had Ampere to play with before consumers did. nVidia released their big A100 accelerator back in May.
 
I've read a bit about folding, where researchers use distributed networks of GPUs to run simulations.

What other uses are there for GPUs or GPU ASICs? i.e. what are they better for than traditional CPUs?


GPU video decode can be a necessity for certain video codecs. GPU compute can greatly help with video and photo editing, CAD/CAM work, and scientific work as well.
 