Project 13424 (Moonshot) very low PPD

Moderators: Site Moderators, FAHC Science Team

PantherX
Site Moderator
Posts: 6986
Joined: Wed Dec 23, 2009 9:33 am
Hardware configuration: V7.6.21 -> Multi-purpose 24/7
Windows 10 64-bit
CPU:2/3/4/6 -> Intel i7-6700K
GPU:1 -> Nvidia GTX 1080 Ti
§
Retired:
2x Nvidia GTX 1070
Nvidia GTX 675M
Nvidia GTX 660 Ti
Nvidia GTX 650 SC
Nvidia GTX 260 896 MB SOC
Nvidia 9600GT 1 GB OC
Nvidia 9500M GS
Nvidia 8800GTS 320 MB

Intel Core i7-860
Intel Core i7-3840QM
Intel i3-3240
Intel Core 2 Duo E8200
Intel Core 2 Duo E6550
Intel Core 2 Duo T8300
Intel Pentium E5500
Intel Pentium E5400
Location: Land Of The Long White Cloud
Contact:

Re: Project 13424 (Moonshot) very low PPD

Post by PantherX »

cine.chris wrote:...User/Donor abandonment is already high, this could accelerate that rate...
If you're looking at the EOC stats, those who have folded for several years know it's a common seasonal trend for our numbers to dip during the summer and rise again in the winter. Plus, given the economic situation, some donors may want to save more or have to look for employment elsewhere, so a drop in the figures is expected.
cine.chris wrote:...I'm still seeing normal levels of power consumption & GPU utilization, both a red flag to me that these WU are consuming resources (unlike some others I whined about) at high levels...
I am not sure what you mean by "seeing normal levels" and then "red flag" later on. Would it be possible for you to elaborate a bit more?
cine.chris wrote:...Points don't cost anyone anything... except disgruntled & frustrated donors that abandon a project as that's the only metric & compensation they see.
F@H has a policy of ensuring that points reflect the correct scientific value. There have been occasions where bonus points were allocated to incentivize donors to upgrade clients, as the new clients did offer a significant boost in productivity. However, that's the exception, not the norm :)
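For anyone curious how points relate to return speed, here is a minimal sketch of the published quick-return bonus idea, credit = base credit × max(1, √(k × deadline / completion time)); the numbers below are purely illustrative placeholders, not the settings of any particular project:

Code: Select all

import math

def qrb_credit(base_credit, k_factor, deadline_days, completion_days):
    """Sketch of the quick-return bonus formula:
    credit = base_credit * max(1, sqrt(k * deadline / completion_time)).
    All values are illustrative; real ones come from each project's settings."""
    bonus = math.sqrt(k_factor * deadline_days / completion_days)
    return base_credit * max(1.0, bonus)

def qrb_ppd(base_credit, k_factor, deadline_days, completion_days):
    """Estimated points per day if identical WUs were returned back to back."""
    return qrb_credit(base_credit, k_factor, deadline_days, completion_days) / completion_days

# Hypothetical WU: 40,000 base points, k = 0.75, 3-day deadline, finished in 6 hours.
print(qrb_ppd(base_credit=40000, k_factor=0.75, deadline_days=3.0, completion_days=0.25))

The takeaway is that, for a fixed base credit, faster returns scale PPD faster than linearly, which is why identical hardware can show quite different PPD on projects with different base credit and k-factor.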
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time

Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
gunnarre
Posts: 559
Joined: Sun May 24, 2020 7:23 pm
Location: Norway

Re: Project 13424 (Moonshot) very low PPD

Post by gunnarre »

Yeah, I don't see how a return to normal PPD would make people stop folding. The large variance made people worry that there was something wrong with their gear or FAH, and this should be a return to normal base points.
Online: GTX 1660 Super + occasional CPU folding in the cold.
Offline: Radeon HD 7770, GTX 1050 Ti 4G OC, RX580
HaloJones
Posts: 906
Joined: Thu Jul 24, 2008 10:16 am

Re: Project 13424 (Moonshot) very low PPD

Post by HaloJones »

Let's double the points! No, triple them!
single 1070

Neil-B
Posts: 1996
Joined: Sun Mar 22, 2020 5:52 pm
Hardware configuration: 1: 2x Xeon [email protected], 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon [email protected], 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: [email protected], 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21
Location: UK

Re: Project 13424 (Moonshot) very low PPD

Post by Neil-B »

... or folders could just choose their own points allocations?
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070

(Green/Bold = Active)
PantherX
Site Moderator
Posts: 6986
Joined: Wed Dec 23, 2009 9:33 am
Hardware configuration: V7.6.21 -> Multi-purpose 24/7
Windows 10 64-bit
CPU:2/3/4/6 -> Intel i7-6700K
GPU:1 -> Nvidia GTX 1080 Ti
§
Retired:
2x Nvidia GTX 1070
Nvidia GTX 675M
Nvidia GTX 660 Ti
Nvidia GTX 650 SC
Nvidia GTX 260 896 MB SOC
Nvidia 9600GT 1 GB OC
Nvidia 9500M GS
Nvidia 8800GTS 320 MB

Intel Core i7-860
Intel Core i7-3840QM
Intel i3-3240
Intel Core 2 Duo E8200
Intel Core 2 Duo E6550
Intel Core 2 Duo T8300
Intel Pentium E5500
Intel Pentium E5400
Location: Land Of The Long White Cloud
Contact:

Re: Project 13424 (Moonshot) very low PPD

Post by PantherX »

Must resist, must not post a meme... don't do it! Too late :lol:
Meme explanation for anyone not aware: https://knowyourmeme.com/memes/oprahs-you-get-a-car

Okie dokie, we have a clear reason (see JohnChodera's post above) behind the normal points being assigned to Project 13424/13425, so any further discussion about "low PPD for Project 13424/13425" would not be very meaningful and could be off-topic. However, if anyone would like to propose a new points system, feel free to read the previous topics and then start a new topic that addresses the issues raised there :eugeek:
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time

Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
Ichbin3
Posts: 96
Joined: Thu May 28, 2020 8:06 am
Hardware configuration: MSI H81M, G3240, RTX 2080Ti_Rev-A@220W, Ubuntu 18.04
Location: Germany

Re: Project 13424 (Moonshot) very low PPD

Post by Ichbin3 »

Just my 2 cents:
I compared some older WUs whose PPD has not been raised.

On my 2080 Ti @ 200 W I got:
13414 - 3.4 Mill PPD
16600 - 3.8 Mill PPD
14465 - 3.5 Mill PPD
vs
13424 - 3.2 Mill PPD

This is not a complaint :mrgreen:
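For context, a quick back-of-the-envelope comparison of the figures above (my arithmetic, using only the numbers quoted in this post) puts 13424 roughly 10% below the average of those older projects:

Code: Select all

# PPD figures quoted above for a 2080 Ti @ 200 W, in millions of points per day.
older_projects = {"13414": 3.4, "16600": 3.8, "14465": 3.5}
moonshot_13424 = 3.2

baseline = sum(older_projects.values()) / len(older_projects)   # ~3.57M PPD
drop_pct = (baseline - moonshot_13424) / baseline * 100

print(f"Average of older projects: {baseline:.2f}M PPD")
print(f"Project 13424 is about {drop_pct:.0f}% below that average")   # ~10%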
MSI H81M, G3240, RTX 2080Ti_Rev-A@220W, Ubuntu 18.04
cine.chris
Posts: 78
Joined: Sun Apr 26, 2020 1:29 pm

Re: Project 13424 (Moonshot) very low PPD

Post by cine.chris »

OT- PantherX, which island?
I spent 3 wks cycling the South Island & Otago Trail.

Thx for the response.
PantherX wrote:
cine.chris wrote:...User/Donor abandonment is already high, this could accelerate that rate...
If you're looking at the EOC stats, those who have folded for several years know it's a common seasonal trend for our numbers to dip during the summer and rise again in the winter. Plus, given the economic situation, some donors may want to save more or have to look for employment elsewhere, so a drop in the figures is expected.
Yes, 2020 has many challenges, including the mocking culture that the F@H forums have a reputation for. However, consistency is a good thing, and supporting the Folding effort should be the focus for the community. Admittedly, looking at the EOC page en route to my August 300M target & 800M overall, walking in the Graveyard of Abandonment did come to mind as I was passing over more greys than actives. Hence my concern over abandonment of the Folding project.
PantherX wrote:
cine.chris wrote:...I'm still seeing normal levels of power consumption & GPU utilization, both a red flag to me that these WU are consuming resources (unlike some others I whined about) at high levels...
I am not sure what you mean by "seeing normal levels" and then "red flag" later on. Would it be possible for you to elaborate a bit more?
The system with the power meter did show a 13% decrease, while Afterburner was continuing to show utilization figures >90%.
PantherX wrote:F@H has a policy of ensuring that points reflect the correct scientific value. There have been occasions where bonus points were allocated to incentivize donors to upgrade clients, as the new clients did offer a significant boost in productivity. However, that's the exception, not the norm :)
Policy is too often ignored. In closing, my reason for writing is to reinforce & encourage a policy that helps an important project with a huge potential to help society.
Kjetil
Posts: 175
Joined: Sat Apr 14, 2012 5:56 pm
Location: Stavanger Norway

Re: Project 13424 (Moonshot) very low PPD

Post by Kjetil »

Ichbin3 wrote:Just my 2 cents:
I compared some older WUs whose PPD has not been raised.

On my 2080 Ti @ 200 W I got:
13414 - 3.4 Mill PPD
16600 - 3.8 Mill PPD
14465 - 3.5 Mill PPD
vs
13424 - 3.2 Mill PPD

This is not a complaint :mrgreen:
On Win10 it's down from project 13422 to 13424 :D
https://folding.extremeoverclocking.com ... s=&u=94811
Neil-B
Posts: 1996
Joined: Sun Mar 22, 2020 5:52 pm
Hardware configuration: 1: 2x Xeon [email protected], 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon [email protected], 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: [email protected], 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21
Location: UK

Re: Project 13424 (Moonshot) very low PPD

Post by Neil-B »

As explained earlier in the topic, 13422 gave higher than usual points to make up (and for most folders in a very generous manner) for some WUs that ran slow ... For 13424 the slow WUs should not exist, so the project has been baselined back to where it should be ... So yes, you will have seen a drop in PPD when comparing 13424 to 13422 - but this isn't because 13424 is low, just that 13422 was overly generous.
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070

(Green/Bold = Active)
cine.chris
Posts: 78
Joined: Sun Apr 26, 2020 1:29 pm

Re: Project 13424 (Moonshot) very low PPD

Post by cine.chris »

Apologies, as this veers somewhat off-topic, but it's related.
Ichbin3 wrote: On my 2080 Ti @ 200 W I got
13414 - 3.4 Mill PPD
16600 - 3.8 Mill PPD
14465 - 3.5 Mill PPD
vs
13424 - 3.2 Mill PPD
I-am-#3, how are you logging your data?
I didn't see any features or available extensions to log WU & slot data.

I borrowed a 2080 Ti for a few weeks during July-August and was sustaining ~3.6M PPD with aggressive WU management. I don't have one now. Converting that to what I'm seeing from other RTX cards for 13424, I'd expect to see ~2.95M PPD. My 2070 Super is on an X570 mobo with a Ryzen 5 3600 and Win10, and I seldom fold on CPUs. My 2060/2070 Super combo would consistently match the 2080 Ti's PPD level. So, your 3.2M PPD for 13424 is very good in comparison!

I'm retired, so I have more time available than most people to watch, especially with quarantine, but logging the results from each WU would be much better & remove any subjective bias.

Update: after writing the above, I checked the 2060/2070 system and it was at 3.21M PPD, so today's numbers are in rough agreement with what Ichbin3 shared. More importantly to me, it also validates that a 2060/2070 Super combo has Folding performance comparable to the 2080 Ti. Unfortunately, I neglected to make an OpenCL-benchmarked power consumption measurement on the 2080 Ti.
bollix47
Posts: 2963
Joined: Sun Dec 02, 2007 5:04 am
Location: Canada

Re: Project 13424 (Moonshot) very low PPD

Post by bollix47 »

cine.chris wrote:I didn't see any features or available extensions to log WU & slot data.
Perhaps HFM ...
If you're using Windows it's easy; for Linux it will take a bit of work, but it does work using a current version of Mono and the zip files of HFM.
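If you'd rather roll your own logging instead of (or alongside) HFM, a minimal sketch is to poll the FAHClient command interface and append the raw queue-info output to a file. This assumes the default local command port 36330 is enabled and reachable; parsing and summarizing the PyON replies is left for later:

Code: Select all

import socket
import time

HOST, PORT = "127.0.0.1", 36330   # default FAHClient command port; adjust if remote access is configured

def fetch_queue_info():
    """Ask the local FAHClient for 'queue-info' and return the raw PyON reply."""
    with socket.create_connection((HOST, PORT), timeout=10) as s:
        s.recv(4096)                 # discard the welcome banner
        s.sendall(b"queue-info\n")
        time.sleep(1)                # crude: give the client a moment to answer
        return s.recv(65536).decode(errors="replace")

if __name__ == "__main__":
    # Append a timestamped snapshot every 10 minutes.
    while True:
        with open("fah_queue_log.txt", "a") as log:
            log.write(time.strftime("%Y-%m-%d %H:%M:%S") + "\n")
            log.write(fetch_queue_info() + "\n")
        time.sleep(600)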
Neil-B
Posts: 1996
Joined: Sun Mar 22, 2020 5:52 pm
Hardware configuration: 1: 2x Xeon [email protected], 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon [email protected], 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: [email protected], 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21
Location: UK

Re: Project 13424 (Moonshot) very low PPD

Post by Neil-B »

I'm intrigued ... What is "aggressive WU management"? ... Is this tweaking GPU performance or something else?
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070

(Green/Bold = Active)
cine.chris
Posts: 78
Joined: Sun Apr 26, 2020 1:29 pm

Re: Project 13424 (Moonshot) very low PPD

Post by cine.chris »

@bollix47 Thx, that was easy. Installed, and my 2 dedicated folders connected.
Impressive F@H history, #64!
Looks like you took a hit yesterday too.
bollix47
Posts: 2963
Joined: Sun Dec 02, 2007 5:04 am
Location: Canada

Re: Project 13424 (Moonshot) very low PPD

Post by bollix47 »

Yes, but I expected that to happen once the slowdown problem was resolved. I'm getting what I was getting before the Moonshot and that's fine. A little bonus for a while was nice but now it's back to normal.
cine.chris
Posts: 78
Joined: Sun Apr 26, 2020 1:29 pm

Re: Project 13424 (Moonshot) very low PPD

Post by cine.chris »

Neil-B wrote:I'm intrigued ... What is "aggressive WU management"? ... Is this tweaking GPU performance or something else?
Something else...
Not something I'm suggesting for others.
======================================
the WU vortex story LOL
I had a dual 2060s dedicated Folder that got locked into a ToxicWU server whirlpool.
(my graphic description of being locked into a LowPPD trap)
In frustration I deleted a slot, only to find the host latched onto the same WU because of the matching GPUs!
I was trapped; the server had me.
Even deleting both slots, I'd be trapped again.
So, I decided matching GPUs are bad juju.
Setting my two dedicated Folders to finish, I shuffled GPUs & squished the toxic Siamese pair.
Until yesterday, these two systems were delivering ~3&4M PPD with beautiful consistency.
They're still consistent, but ~2.5&3M PPD. LOL
Aggressive... now, if the cluster PPD dips out of my tolerance range, I pull slots if caught early.
Often it self-repairs if I wait.
When I had a borrowed 2080Ti, the difference overall after tweaking could be in the 4-5M PPD range.
I like setting goals & with 5M pts by tomorrow I'll have 300M for August, which is good for an old retired guy quarantined in the basement.
cluster: I've had up to six systems folding for a 'sprint'
note: my real Folding secret weapon is the 12,000 BTU 19 SEER mini-split Heat Pump that I DIY installed in the basement :D