Own hardware vs. cloud computing
Moderators: Site Moderators, FAHC Science Team
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Own hardware vs. cloud computing
If I want to contribute as much compute as possible for a given amount of money, which is more cost-effective: buying hardware and running it myself, or just buying cloud GPU compute time?
Which cloud GPU providers are popular, and which are the most cost-effective?
-
- Site Moderator
- Posts: 6986
- Joined: Wed Dec 23, 2009 9:33 am
- Hardware configuration: V7.6.21 -> Multi-purpose 24/7
Windows 10 64-bit
CPU:2/3/4/6 -> Intel i7-6700K
GPU:1 -> Nvidia GTX 1080 Ti
§
Retired:
2x Nvidia GTX 1070
Nvidia GTX 675M
Nvidia GTX 660 Ti
Nvidia GTX 650 SC
Nvidia GTX 260 896 MB SOC
Nvidia 9600GT 1 GB OC
Nvidia 9500M GS
Nvidia 8800GTS 320 MB
Intel Core i7-860
Intel Core i7-3840QM
Intel i3-3240
Intel Core 2 Duo E8200
Intel Core 2 Duo E6550
Intel Core 2 Duo T8300
Intel Pentium E5500
Intel Pentium E5400
- Location: Land Of The Long White Cloud
- Contact:
Re: Own hardware vs. cloud computing
I believe that AWS, Google Compute Engine and Microsoft Azure are not cost-effective at all (from a personal perspective).
I remember reading somewhere that you can rent GPU "mining" boxes and fold on them instead of mining, at a reasonable price. I can't remember what the site is, but hopefully the person who mentioned it will see this topic and post it.
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time
Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Re: Own hardware vs. cloud computing
Ah, I found that one such platform is vast.ai.
Can someone explain how they can be so affordable? I am seeing a system with a Xeon E5-2620 v3 and 4x RTX 2080 Ti, rentable for $0.722/hour.
How on earth is that possible? That's 75.7 TFLOPS of GPU compute; I could get over 10 million FAH points for only ~$20.
And would someone who has tried folding on vast.ai mind giving a short guide on how to set it up?
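For what it's worth, here is a quick back-of-the-envelope check of those numbers. Only the $0.722/h rate and the ~$20 for ~10M points estimate above go in; the implied PPD is derived arithmetic, not a measured benchmark.
Code:
# Rough sanity check of the vast.ai 4x RTX 2080 Ti listing quoted above.
hourly_rate = 0.722        # USD/hour for the 4x RTX 2080 Ti machine
budget = 20.0              # USD
points_target = 10_000_000

hours = budget / hourly_rate                  # ~27.7 hours of rental
implied_ppd = points_target / (hours / 24)    # ~8.7M PPD across 4 GPUs
points_per_dollar = points_target / budget    # ~500k points per USD

print(f"{hours:.1f} h of rental for ${budget:.2f}")
print(f"implied ~{implied_ppd / 1e6:.1f}M PPD across the 4 GPUs "
      f"(~{implied_ppd / 4 / 1e6:.1f}M PPD per card)")
print(f"~{points_per_dollar:,.0f} points per dollar")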
-
- Posts: 390
- Joined: Sun Dec 02, 2007 4:53 am
- Hardware configuration: FX8320e (6 cores enabled) @ stock,
- 16GB DDR3,
- Zotac GTX 1050Ti @ Stock.
- Gigabyte GTX 970 @ Stock
Debian 9.
Running GPU since it came out, CPU since client version 3.
Folding since Folding began (~2000) and ran Genome@Home for a while too.
Ran Seti@Home prior to that.
- Location: UK
- Contact:
Re: Own hardware vs. cloud computing
It would be interesting to see a case study of what could be run from an 8 kW solar system as part of a household as well. If the power were generated for free and you maximised output per watt by using low-wattage cards like those 75 W mining 1060s, perhaps in a low-power 12-24 V setup using a PicoPSU, how viable would that be as a long-term, 'fire and forget' folding solution? Connect the cards directly to the battery bank rather than via the inverter.
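As a very rough feasibility sketch: the 8 kW array size and the 75 W card figure come from the post above, but the capacity factor and the per-card host overhead are assumptions that will vary a lot by location and build.
Code:
# Very rough sketch of how many low-wattage cards an 8 kW array might sustain.
array_kw = 8.0            # nameplate size from the post
capacity_factor = 0.15    # ASSUMPTION: average output as a fraction of nameplate
card_w = 75.0             # low-wattage mining-style GTX 1060, from the post
host_overhead_w = 25.0    # ASSUMPTION: per-card share of board/CPU/PSU losses

avg_output_w = array_kw * 1000 * capacity_factor   # ~1200 W average
w_per_card = card_w + host_overhead_w              # ~100 W per card
cards_24_7 = avg_output_w / w_per_card             # cards runnable around the clock

print(f"average array output ~{avg_output_w:.0f} W")
print(f"roughly {cards_24_7:.0f} cards could fold 24/7 on average production alone")
print("(battery sizing, winter output and household load would cut this further)")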
-
- Posts: 52
- Joined: Sat Mar 28, 2020 1:22 am
Re: Own hardware vs. cloud computing
v00d00 wrote: It would be interesting to see a case study of what could be run from an 8 kW solar system as part of a household as well. If the power were generated for free and you maximised output per watt by using low-wattage cards like those 75 W mining 1060s, perhaps in a low-power 12-24 V setup using a PicoPSU, how viable would that be as a long-term, 'fire and forget' folding solution? Connect the cards directly to the battery bank rather than via the inverter.
I have a 9 kW solar system on my home, and on average I produce a 6 kWh/day surplus while also covering all my household usage, folding with a 1080 Ti, and running Rosetta at home on 4 older systems (Q9650, A10-5800K, A10-7870K, i3-370M).
I have an EVGA 1660 Super on the way to experiment with the best PPD per watt; depending on what I find, I may step up to a 2060 Super.
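A minimal sketch of the PPD-per-watt comparison being planned here. The PPD and wattage figures below are hypothetical placeholders rather than measurements from this thread, so swap in your own numbers from the client and a wall meter.
Code:
# PPD-per-watt comparison; values are HYPOTHETICAL placeholders, not measurements.
cards = {
    # name: (points_per_day, watts_at_the_wall)
    "GTX 1080 Ti": (1_000_000, 250),
    "GTX 1660 Super": (600_000, 125),
}

for name, (ppd, watts) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name}: {ppd / watts:,.0f} PPD per watt")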
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Re: Own hardware vs. cloud computing
How much did your solar system cost?
-
- Posts: 52
- Joined: Sat Mar 28, 2020 1:22 am
Re: Own hardware vs. cloud computing
About $20k after incentives, which means the payments are similar to my average power bill.
-
- Posts: 49
- Joined: Tue Mar 24, 2020 11:24 am
- Location: Finland
Re: Own hardware vs. cloud computing
iceman1992 wrote: How on earth is that possible? That's 75.7 TFLOPS of GPU compute; I could get over 10 million FAH points for only ~$20.
You can make a lot more than 10M points with $20 if you go for the interruptible instances. I made 2M points with just $0.97 out of the free $1 trial.
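Putting the two anecdotal data points from this thread side by side (the ~$20 for ~10M on-demand estimate and the $0.97 for 2M interruptible result); treat the ratio as indicative only, since both are single samples.
Code:
# Points per dollar from the two figures quoted in this thread.
on_demand = 10_000_000 / 20.0        # ~500k points per dollar (on-demand estimate)
interruptible = 2_000_000 / 0.97     # ~2.06M points per dollar (trial result)

print(f"on-demand:     ~{on_demand:,.0f} points per dollar")
print(f"interruptible: ~{interruptible:,.0f} points per dollar")
print(f"interruptible looks roughly {interruptible / on_demand:.1f}x better per dollar")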
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Re: Own hardware vs. cloud computing
Jorgeminator wrote: You can make a lot more than 10M points with $20 if you go for the interruptible instances. I made 2M points with just $0.97 out of the free $1 trial.
I haven't really looked into it; how do those work? So it's better to go for interruptible instances? How do I set up F@H on it?
-
- Posts: 49
- Joined: Tue Mar 24, 2020 11:24 am
- Location: Finland
Re: Own hardware vs. cloud computing
I don't know the technology behind it, but you rent an instance and SSH into it. Then install F@H like you would on Linux.
An interruptible instance remains up for bid by other people while you're renting it. That means you can get outbid, and your instance will be paused until the higher bidder has finished using it. For example, I rented two instances, an RTX 2080 Ti and a 2x GTX 1080 Ti, for around 5 hours at about $0.10/h each. They were never outbid in that time.
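Since several people have asked how to set this up, here is a minimal sketch of the "SSH in and install F@H like you would on Linux" step described above, written as a small Python script you would run as root on the rented Ubuntu instance. The package URL/version and the identity values are placeholders and assumptions, not verified against the current release; the general flow is just wget, dpkg, then run FAHClient in the foreground.
Code:
# Minimal sketch of installing and starting FAHClient on a rented Ubuntu instance.
# The .deb URL/version and the user/team/passkey values are ASSUMPTIONS/placeholders;
# get the current Linux package from the official Folding@Home download page.
import os
import subprocess

FAHCLIENT_DEB = ("https://download.foldingathome.org/releases/public/release/"
                 "fahclient/debian-stable-64bit/v7.6/fahclient_7.6.21_amd64.deb")

def run(cmd):
    print("+", " ".join(cmd))
    # Non-interactive so dpkg's configuration prompts don't block the script.
    subprocess.run(cmd, check=True,
                   env=dict(os.environ, DEBIAN_FRONTEND="noninteractive"))

# Download and install the client package.
run(["wget", "-O", "/tmp/fahclient.deb", FAHCLIENT_DEB])
run(["dpkg", "-i", "--force-depends", "/tmp/fahclient.deb"])

# Inside a rented container there is usually no init system, so run the client
# in the foreground. Replace the placeholder identity values with your own.
run(["FAHClient",
     "--user=YourName",                              # placeholder
     "--team=0",                                     # placeholder
     "--passkey=0123456789abcdef0123456789abcdef",   # placeholder (optional)
     "--gpu=true",
     "--smp=true"])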
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Re: Own hardware vs. cloud computing
That simple? I read somewhere that you need to set up Docker on it?
-
- Posts: 49
- Joined: Tue Mar 24, 2020 11:24 am
- Location: Finland
Re: Own hardware vs. cloud computing
It's that simple. I used the nvidia/opencl:devel-ubuntu18.04 image for the instances.
-
- Posts: 523
- Joined: Fri Mar 23, 2012 5:16 pm
Re: Own hardware vs. cloud computing
So you didn't use Docker yourself? Okay, I guess I'll try it out after they sort out the server overload. No use renting machines if they'll just sit idle.
-
- Site Moderator
- Posts: 6986
- Joined: Wed Dec 23, 2009 9:33 am
- Hardware configuration: V7.6.21 -> Multi-purpose 24/7
Windows 10 64-bit
CPU:2/3/4/6 -> Intel i7-6700K
GPU:1 -> Nvidia GTX 1080 Ti
§
Retired:
2x Nvidia GTX 1070
Nvidia GTX 675M
Nvidia GTX 660 Ti
Nvidia GTX 650 SC
Nvidia GTX 260 896 MB SOC
Nvidia 9600GT 1 GB OC
Nvidia 9500M GS
Nvidia 8800GTS 320 MB
Intel Core i7-860
Intel Core i7-3840QM
Intel i3-3240
Intel Core 2 Duo E8200
Intel Core 2 Duo E6550
Intel Core 2 Duo T8300
Intel Pentium E5500
Intel Pentium E5400
- Location: Land Of The Long White Cloud
- Contact:
Re: Own hardware vs. cloud computing
I would be keen for someone to document the exact instructions to get this running once GPU WUs are back to being reliably served.
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time
Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
-
- Posts: 1996
- Joined: Sun Mar 22, 2020 5:52 pm
- Hardware configuration: 1: 2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, Win10 Ent 20H2, Quadro K420 1GB, FAH 7.6.21
2: Xeon E3-1505Mv5, 32GB DDR4, NVME, Win10 Pro 20H2, Quadro M1000M 2GB, FAH 7.6.21 (actually have two of these)
3: i7-960, 12GB DDR3, SSD, Win10 Pro 20H2, GTX 750Ti 2GB, GTX 1080Ti 11GB, FAH 7.6.21
- Location: UK
Re: Own hardware vs. cloud computing
Given my kit is strictly CPU, I would be interested in a "Fool's Guide" (being one of said individuals) so I could give it a try… Thanks in advance.
2x Xeon E5-2697v3, 512GB DDR4 LRDIMM, SSD Raid, W10-Ent, Quadro K420
Xeon E3-1505Mv5, 32GB DDR4, NVME, W10-Pro, Quadro M1000M
i7-960, 12GB DDR3, SSD, W10-Pro, GTX1080Ti
i9-10850K, 64GB DDR4, NVME, W11-Pro, RTX3070
(Green/Bold = Active)