I'm reading the Rules & Policies section of the Folding@home website, in particular the Best Practices ("Folding@home (FAH) is a major scientific endeavor, but is also a kind of contest for some donors to see who can donate the most points.") regarding the project and Work Units (WUs). Maybe this is hardware dependent, but in my case the rewards for GPU WUs are about 20-fold those for CPU WUs: usually 50k points for a 2-3 hour GPU WU vs. 3k for a 2-5 hour CPU WU. So for me, not only do CPU WUs grant fewer points, they also take longer to complete. At this point in time, the ratio of GPUs to CPUs donated to the project is about 1 to 4 (see the OS stats).
At the same time, there is sometimes a shortage of WUs for GPUs. I also saw at least one donor donating only GPU time, which yields lots of points (and some sense of entitlement, if not necessarily more WUs completed, which is what this is about: WUs for medical research and science against illnesses that kill or maim people, such as COVID-19. AFAIK these points are not redeemable for a t-shirt, a discount on a game, or a gift card enabling disrespectful behavior/rants/trolling in the forum; and donations are more meaningful collectively than individually, really).
In that context, especially given the 1-to-4 ratio where the 1 still represents close to 750k GPUs (i.e. a lot), which might be more than 6 months ago, etc.: what is the rationale, again, for CPU WUs being "worth" so much less than GPU WUs? Is it just my old CPU hardware (i5-2500K, 3.3 GHz) that skews my perception, and does that rationale still hold today?
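To make the gap concrete, the points-per-hour ratio implied by those figures can be worked out as follows (using the midpoints of the quoted durations; all numbers are my own observations, not official FAH values):

```python
# Rough points-per-hour comparison from the figures quoted above.
# These are observed values, not official Folding@home numbers.

gpu_points, gpu_hours = 50_000, 2.5   # typical GPU WU (midpoint of 2-3 h)
cpu_points, cpu_hours = 3_000, 3.5    # typical CPU WU (midpoint of 2-5 h)

gpu_pph = gpu_points / gpu_hours      # 20000.0 points/hour
cpu_pph = cpu_points / cpu_hours      # ~857 points/hour

print(f"GPU: {gpu_pph:.0f} pts/h, CPU: {cpu_pph:.0f} pts/h")
print(f"Ratio: {gpu_pph / cpu_pph:.1f}x")  # ~23.3x
```

So per hour of wall-clock time, the gap on my hardware is even a bit larger than the 20x per-WU figure.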
Rationale for GPU WUs granting 20x more points than CPUs?
Moderators: Site Moderators, FAHC Science Team
Intel Core i5-2500K CPU @ 3.30GHz @ 98%, 4 cores
Asus ROG STRIX RX480 8GB
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon [email protected], 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon [email protected], 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: [email protected], 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21 - Location: UK
Re: Rationale for GPU WUs granting 20x more points than CPUs
viewtopic.php?f=72&t=33496&hilit=qrb+difference has some discussion of this ... it comes down to the simple fact that GPUs process science that much quicker than CPUs.
The fastest CPUs atm can complete WUs at high 100k's of points a day which competes with the middle to lower end GPUs but not with the higher end GPUs (the low 1m's) ... and on "Points per watt" GPUs tend to smash CPUs out of the park ... it is for this reason that the "mining community" use GPU farms not CPU farms.
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)
Re: Rationale for GPU WUs granting 20x more points than CPUs
GPUs earn more points, but they are less flexible. Folding on a GPU can affect your user experience (if you use that GPU for your monitor), because GPU folding is an all-or-nothing kind of operation. CPU jobs, on the other hand, are unobtrusive: they will never prevent you from using your hardware when you need it (FAH runs at low priority).
Re: Rationale for GPU WUs granting 20x more points than CPUs
@Neil-B Thank you, that discussion was very interesting and I understand better. Imho some material should be "sticky" in the New Donors thread, such as a post about the GPUs.txt and this, for instance. Search doesn't allow for too common/short words...
@ajm Unobtrusiveness is a good point too, had forgotten about that, thanks!
Intel Core i5-2500K CPU @ 3.30GHz @ 98%, 4 cores
Asus ROG STRIX RX480 8GB
- Posts: 12
- Joined: Sun Apr 26, 2020 6:59 pm
Re: Rationale for GPU WUs granting 20x more points than CPUs
Personally, I've just set up my hardware to do as many C-19 jobs as possible, and I don't really care much about points. Yes, the combined GPU + CPU on my son's 2018 PC (6 cores × 2 threads) is shellacking my 2012 i7 iMac (4 cores × 2 threads, no compatible GPU) by a wide margin... but that's beside the point next to the hope that we might find a vital clue to protecting us against SARS-CoV-2. In a nutshell: GPUs should get more points per hour; they are doing far more.
- Posts: 2522
- Joined: Mon Feb 16, 2009 4:12 am
- Location: Greenwood MS USA
Re: Rationale for GPU WUs granting 20x more points than CPUs
"Welcome to Whose Line is it Anyway, the show where everything's made up and the points don't matter"
I use points, they are a quick way to tell if all my PCs are running.
I use points, I can judge is this strategy yielding more points than that strategy. Points can approximate the value of the science I am doing.
I use points, I have for a month been judging how swamped the servers are/were, by the Points Per Day I can do.
However, in any big picture, the points don't matter. If you start thinking they do matter, you drift toward cheating, and no one wins.
CPUs and GPUs: I have an Intel i5 with 4 cores and 8-way single-precision SIMD running at 3 GHz, so it does 32 × 3 = 96 JimboFLOPS (a made-up measure of Floating Point Operations Per Second).
I also have an Nvidia GTX 1060 with 1260 cores running at 1.5 GHz, or 1890 JimboFLOPS, so my graphics card should get about 20 times what my CPU does; but the Quick Return Bonus favors quick, so it does even better.
The GPU really does 'brute force' the CPU's ability. The CPU does certain subtle things better, so still has value.
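The back-of-the-envelope "JimboFLOPS" arithmetic above can be written out as a tiny sketch (JimboFLOPS is a made-up unit; the core counts and clocks are the poster's own figures, not benchmarks):

```python
# "JimboFLOPS": a made-up throughput unit from the post above.

# CPU: 4 cores x 8-wide single-precision SIMD = 32 lanes, at 3 GHz
cpu_jimboflops = 4 * 8 * 3        # 96

# GPU: GTX 1060 with 1260 cores at 1.5 GHz
gpu_jimboflops = 1260 * 1.5       # 1890.0

print(gpu_jimboflops / cpu_jimboflops)  # ~19.7, roughly the 20x points gap
```

The raw-throughput ratio lands right around 20x, which is why the points gap looks the way it does even before the Quick Return Bonus widens it further.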
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
- Posts: 35
- Joined: Wed Apr 01, 2020 4:41 am
- Hardware configuration: Machine 1: Gigabyte Z160MX-Gaming 5, Core i5-6500, eVGA 600 W PSU, Thermaltake Water 3.0 120 cooling, 16GB DDR4, GPU 0: GT1030, GPU 1: GTX1050 Ti, GPU 2: GTX1660
Machine 2: Core i3 laptop (dell inspiron 1464)
Retired/spare: GT 750
Unusable: Intel HD 530
Re: Rationale for GPU WUs granting 20x more points than CPUs
I happen to have one machine that does GPU-only folding, because it's a quad core and each GPU needs one CPU core. That, plus the machine's remaining tasks, puts the processor at 90%, and adding the last core would invite a bottleneck.
From what I understand about how base points are calculated, it is on an equal-points-for-equal-work basis: WUs, both CPU and GPU, are benchmarked on the CPU of one of FAH's own machines, a Core i5, if the documentation is current.
EDIT:
From the FAQ https://foldingathome.org/support/faq/points/
"Note that GPU projects are now being benchmarked on the same machine, but using that machine’s CPU. By using the same hardware, we want to preserve our goal of “equal pay for equal work”. Our GPU methods have advanced to the point such that, with GPU FAHCore 17, we can run any computation that we can do on the CPU on the GPU. Therefore we’ve unified the benchmarking scheme so that both GPU and CPU projects use the same “yardstick”, which is our i5 benchmark CPU"
Re: Rationale for GPU WUs granting 20x more points than CPUs
It's a problem that has been brought up multiple times already, and FAH is currently sticking with the system.
The benchmark procedure made a lot of sense back in the days of CPU folding.
You get points for a job, and you get bonus points for finishing the job faster.
The issue is, a modern CPU with 6-8 cores is no match for a GPU with 2,000 to 4,000 shaders (kind of like half-cores, in a way).
That kind of processing power allows WUs to be returned so quickly that the majority of the points now earned are no longer base points, but QRB (Quick Return Bonus) points.
While some find this unfair, and it is unfair, it is the rewards system that FAH chose to apply and stick with.
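For illustration, here is a minimal sketch of how a square-root quick-return bonus of this kind behaves. The function name, the constant k, and the deadline are assumptions made up for the example, not FAH's official formula or parameters; the point is only to show why fast-returning hardware pulls so far ahead:

```python
import math

def final_points(base_points, k, deadline_h, elapsed_h):
    """Illustrative quick-return bonus in the spirit of FAH's QRB:
    the faster a WU is returned relative to its deadline, the larger
    the square-root bonus multiplier (never below 1). All constants
    here are hypothetical, not official FAH parameters."""
    bonus = max(1.0, math.sqrt(k * deadline_h / elapsed_h))
    return base_points * bonus

# Same base points, same hypothetical project (k = 26, 3-day deadline):
slow = final_points(3_000, 26, 72, 4.0)   # CPU-speed return, ~65k points
fast = final_points(3_000, 26, 72, 0.5)   # GPU-speed return, ~184k points
```

Under this kind of rule, returning the same WU 8x faster earns sqrt(8) ≈ 2.8x the points per WU, and the fast machine also completes 8x as many WUs per day, which is how bonus points come to dominate base points.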
Re: Rationale for GPU WUs granting 20x more points than CPUs
Thanks all, I'm satisfied that it's been discussed and that there are opinions and a rationale; the topic is really interesting, imho. I don't remember the last time I bought computer parts, it might have been a PIII, haha; my rig is leftovers from a gamer friend, so his donation makes mine possible. The fact that with every WU I can contribute to some researcher "hitting the jackpot" gives me great satisfaction irrespective of points. But hey, I like points too!
Intel Core i5-2500K CPU @ 3.30GHz @ 98%, 4 cores
Asus ROG STRIX RX480 8GB