RTX 2080ti vs 2060, which one to buy?

A forum for discussing FAH-related hardware choices and info on actual products (not speculation).

Moderator: Site Moderators

Forum rules
Please read the forum rules before posting.
Theodore
Posts: 117
Joined: Sun Feb 10, 2019 2:07 pm

RTX 2080ti vs 2060, which one to buy?

Post by Theodore »

Trying to choose between the RTX 2080 Ti and the RTX 2060.

I'm thinking of either getting the 2060 and selling it later on, when better stuff is on the market, or getting the 2080 Ti now to keep for a year or two.
I'm thinking of buying from EVGA, because I can get the cards at a better deal.
Do you think it's feasible for an EVGA 2080 Ti to run for 1 to 2 years straight, nearly 24/7?

I did read an article about contamination in the cards, and am a bit hesitant to spend a larger amount of money on a card that might end up being bad.
https://www.extremetech.com/computing/2 ... gpu-wafers
foldy
Posts: 2040
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: RTX 2080ti vs 2060, which one to buy?

Post by foldy »

Have a good power supply and enough cooling in the case, and clean it of dust every year. I guess we will not see better graphics cards this year, and prices will stay high. But in late 2020 Nvidia will release the RTX on 7nm with 16GB, and AMD will release Navi GPUs. If noise is a problem for you, think about quiet hardware parts. If you run 24/7, also calculate your power usage and price. If you can afford it, then go for the biggest GPU.
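A rough way to do that running-cost calculation, as a sketch; the 250W draw and $0.12/kWh electricity price here are assumed placeholder figures, so adjust them for your own card and tariff:

```shell
# Estimate yearly electricity cost for a GPU folding 24/7.
# watts=250 and price_per_kwh=0.12 are assumed example values.
awk -v watts=250 -v price_per_kwh=0.12 'BEGIN {
  kwh_per_year = watts / 1000 * 24 * 365;   # convert watts to kWh over a year
  printf "%.0f kWh/year -> $%.2f/year\n", kwh_per_year, kwh_per_year * price_per_kwh;
}'
```

At those example numbers the card burns 2190 kWh a year, which is a meaningful fraction of the purchase price of the GPU itself.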
Nathan_P
Posts: 1164
Joined: Wed Apr 01, 2009 9:22 pm
Hardware configuration: Asus Z8NA D6C, 2 [email protected] Ghz, , 12gb Ram, GTX 980ti, AX650 PSU, win 10 (daily use)

Asus Z87 WS, Xeon E3-1230L v3, 8gb ram, KFA GTX 1080, EVGA 750ti , AX760 PSU, Mint 18.2 OS

Not currently folding
Asus Z9PE- D8 WS, 2 [email protected] Ghz, 16Gb 1.35v Ram, Ubuntu (Fold only)
Asus Z9PA, 2 Ivy 12 core, 16gb Ram, H folding appliance (fold only)
Location: Jersey, Channel islands

Re: RTX 2080ti vs 2060, which one to buy?

Post by Nathan_P »

I've got a Galax/KFA2 1070 purchased in August 2016 that is still going strong. It's had some downtime but routinely runs for 3-4 months between shutdowns. Out of the 30 months I've had the card, it's been on for 24 of those months.

Here's an outlier for you: an RTX 2060 can get 800k-1M PPD depending on project and costs ~$350; a pair of 2060s costs ~$700 and will net 1.6-2M PPD; a 2080 Ti can't be found for less than $1000 and is reported to earn 2.2M PPD. Personally, if I had the space for the 2 cards I would get a pair of 2060s, or if power cost wasn't an issue, a 2060 and a 2070 - nearly the same points for around $200 less.
Theodore
Posts: 117
Joined: Sun Feb 10, 2019 2:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Post by Theodore »

Thanks,

After some consideration, I went with the 2060.
It's the card for the average Joe, like myself.

Had my income been bigger, I would have gone with a pair of 2080s.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Post by gordonbb »

Nathan_P wrote:... Personally, if I had the space for the 2 cards I would get a pair of 2060s or if power cost wasn't an issue a 2060 and a 2070 - nearly the same points for around $200 less.
I have both an EVGA 2060 XC and a 2070 XC and can confirm that they run well together. I'm getting 2.3M PPD with a mild overclock and the power limits dropped to 140 and 160W respectively, which initial testing shows to be much more efficient than running at stock power for folding.
Theodore
Posts: 117
Joined: Sun Feb 10, 2019 2:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Post by Theodore »

gordonbb wrote:I have both an EVGA 2060 XC and 2070 XC and can confirm that they run well together. I’m getting 2.3MPPD with a mild overclock and dropping the power limit ...
I know Nvidia cards don't fully use the power they're rated for while folding.
When they're rated for 160W, I presume they would only use ~140W while folding.
In that case, reducing the power limit from 160 to 140W (a ~12% reduction) won't really lower power consumption on the card.

Which program do you use to undervolt the card (in Linux)?
Nvidia X-server doesn't give me that option.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Post by gordonbb »

Theodore wrote:I know Nvidia cards don't fully use the power they're rated for while folding.
When they're rated for 160W, I presume they would only use ~140Watts while folding.
Reducing the power threshold from 160 to 140 Watts, or reducing it by ~15% in that case scenario, won't really lower power consumption on the card ...
Yes, you can lower the Power Usage of the card directly and easily.

Recent Nvidia Cards have 5 milliOhm Current Shunt resistors on the input legs from the PCIe Bus and each Power Connector. They use these in Boost 3 to monitor the Total Card Power usage and keep it under a defined threshold which is usually the Default Card Power Limit (217W in the case of the EVGA RTX 2060 XC Ultra, I believe).

The actual power draw and the limit can be viewed by running the nvidia-smi utility at a command prompt (terminal):

Code:

nvidia-smi
To change this power-limit:

Code:

nvidia-smi -i <GPU_id> --power-limit=<PWR_Limit>
where <GPU_id> is the ID of the GPU (starting at 0 for the first GPU) and <PWR_Limit> is the new desired Power Limit in watts.

You can query the default Power limits using:

Code:

nvidia-smi -i <GPU_id> -q | grep Power
but I usually just enter a bogus power limit such as 0 or 1000; the command will then throw an error and tell you the allowed range.

But this user-defined Power Limit will not persist across work units (it resets when the driver unloads between runs), so you should first enable persistence mode:

Code:

nvidia-smi -i <GPU_id> -pm 1
Once you change the Target Power Limit and set Persistence mode the GPU will honor the new limit and Boost will adjust the voltages (and hence the GPU Frequencies) to keep the GPU at the new Target.
Theodore wrote:... Which program do you use to undervolt the card (in Linux)?
Nvidia X-server doesn't give me that option.
Undervolting in Windows is usually done by setting a custom Voltage Curve in Precision Pro or Afterburner, neither of which is available in Linux. Using the Nvidia Control Panel under X Windows you can adjust the Graphics (Shader) and Memory clocks, as well as force the fan speed to a specific percentage if a card is running a little hotter than you'd like.

To enable these settings in the PowerMizer and Thermal tabs you first need to add:

Code:

Option "Coolbits" "12"
to your X server config. This can be a bit of a black art, but the best way I've found so far is to edit /etc/X11/xorg.conf.d/nvidia.conf, add it to the "Device" section, and restart your X server.
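For reference, a minimal "Device" section with that option might look like the following sketch (the Identifier string here is just a placeholder; keep whatever identifier your existing config already uses):

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "Coolbits" "12"
EndSection
```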
Theodore
Posts: 117
Joined: Sun Feb 10, 2019 2:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Post by Theodore »

gordonbb wrote:Yes, you can lower the Power Usage of the card directly and easily ...

Once you change the Target Power Limit and set Persistence mode the GPU will honor the new limit and Boost will adjust the voltages (and hence the GPU Frequencies) to keep the GPU at the new Target ...

Undervolting in Windows is usually done by setting a custom Voltage Curve in Precision Pro or Afterburner neither of which are available in Linux ...
What I meant to say:

I see people online posting that they've lowered their card's power consumption by 10% without noticeable performance loss, not taking into consideration that they've only set the power limit to what the card was already using.
Setting the limit ~10-15% below the rated power would usually result in no change in performance or actual power usage.

One can set the power threshold below what the card is actually using, like you mentioned, but this comes at a performance penalty.
Lowering the GPU voltage can lower power draw (and reduce heat) without a performance penalty, when set right.
I couldn't find much information online yet about the equilibrium between core voltage reduction and performance (PPD) on GPUs.
Most information about voltage adjustment found online is about overclocking and increasing voltage for higher performance, mostly in Windows applications; which makes sense if you want to play a game at maximum graphics settings, or want to earn maximum PPD in the shortest amount of time.

It makes more sense to find a way to lower the power consumption (and heat) for continuous use, like folding.
I would need a way to lower the GPU voltage in Linux while monitoring its frequency for any drop caused by the undervolting.
Nvidiux does not currently support my Nvidia cards.

With Coolbits set in Linux, nvidia-settings only allows me to change core clock, memory clock, and fan speed on some of my cards, not all of them.
The cards that have no fan control in nvidia-settings also have no GPU/VRAM frequency adjustment available.
My MSI and Asus cards seem to be supported; the PNY and Zotac are not.

For the issue of not all Nvidia-based graphics cards being supported, one of their forums mentions installing a newer driver (410 to 415).
I have been unsuccessful in my attempts to update the driver beyond 390 on my system.
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Post by gordonbb »

Theodore wrote:For the issue of not accepting all nvidia based graphics cards, it is mentioned in one of their forums to install a newer driver; 410 to 415.
I have been unsuccessful in my attempts to update the driver beyond 390 on my system.
410 is the first driver version to support the RTX 2070, and 415 the first to support the 2060.
To install 410, as it was not available in the PPA repository at the time, I ended up installing the latest CUDA toolkit, which included it.

415 is available for Ubuntu in the PPA repository now.

Which distribution are you using?

Worst case, you could install the binary (.run) files from Nvidia directly, but I find that doing so can often leave a mess that takes some work to clean up once the desired driver version finally becomes available in the distribution's repository.
katakaio
Posts: 25
Joined: Wed Oct 28, 2009 7:31 pm
Hardware configuration: Intel Core i5-6500 @ 3.2 GHz
EVGA GeForce RTX 2060 6GB XC Ultra
EVGA GeForce GTX 960 4GB FTW

Intel Core i5-3550 @ 3.3 GHz
EVGA GeForce GTX 750 Ti 2GB FTW
Location: Florida

Re: RTX 2080ti vs 2060, which one to buy?

Post by katakaio »

+1 to what gordonbb shared. I've installed binaries directly from Nvidia before when I needed a driver with no OpenGL libs, but ppa:graphics-drivers/ppa is a godsend if you're on a Debian-based distro and you just want the latest and greatest stable driver.

Code:

sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
gordonbb
Posts: 511
Joined: Mon May 21, 2018 4:12 pm
Hardware configuration: Ubuntu 22.04.2 LTS; NVidia 525.60.11; 2 x 4070ti; 4070; 4060ti; 3x 3080; 3070ti; 3070
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Post by gordonbb »

There are also issues with fan control from the command line: it does not work in the 410 drivers for the 2070 but is fixed in the 415 driver, while command-line fan control for the 2060 does not work in the 415 driver, so support seems to be off by one driver release.

Similarly, GPU and memory overclocking using nvidia-settings does not work from the command line, but with Coolbits properly set in 415 the GUI can be used for adjusting fans and clocks.

I get what you're saying about the Power Limits. My 2070 has a default limit of 205W but rarely exceeds 190W when pushed, in a case with good airflow and a large GPU clock offset applied.

I usually take this 190W as the absolute power limit, and so set my operating Power Limit to 170W, which Boost then enforces.

I did a fair bit of testing with FAHBench using the default WUs and a then-common WU in the 117xx series, and observed a knee at the top of the performance curve where the last 10 to 20W provided only marginal increases in performance. Of course, the Quick Return Bonus (QRB) will offset some of those diminishing efficiency returns.

Here’s the results for an EVGA 1060 6GB Card:
Image

As you can see, this card caps out at 130W, so I usually run these at a 120W limit, averaging 450kPPD; dropping down to 110W nets an even higher efficiency, averaging 436kPPD.

I now have achieved the goal of my stats testing and have my per-card efficiency (PPD/W) plotted on a Zabbix server, which calculates the efficiency dynamically, so I'm running a baseline at:
  • GTX 1060: 110W
  • GTX 1070 Ti: 150W
  • RTX 2060: 140W
  • RTX 2070: 160W
and after a week I should have a pretty stable average efficiency; then I can change the Power Limits and monitor the changes.
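As a sketch of that PPD/W arithmetic for the 1060 results above (the 436k at 110W and 450k at 120W figures come from this thread; the 455k figure at the 130W cap is an assumed example to illustrate the knee):

```shell
# PPD per watt at three power limits for a GTX 1060.
# The 130W PPD figure is assumed for illustration; the others are from the post above.
awk 'BEGIN {
  n = split("110 120 130", w, " ");
  split("436000 450000 455000", ppd, " ");
  for (i = 1; i <= n; i++)
    printf "%d W: %.0f PPD/W\n", w[i], ppd[i] / w[i];   # efficiency = points / watts
}'
```

Efficiency falls as the limit rises (3964 vs 3750 vs 3500 PPD/W here), which is the knee: the last 10-20W buys very few extra points.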
Theodore
Posts: 117
Joined: Sun Feb 10, 2019 2:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Post by Theodore »

Nvidia X Server Settings doesn't offer overclocking on all cards, but on all cards it offers the options "Auto", "Balanced", and "Performance".
Has anyone tried out whether these settings affect performance or efficiency?
Currently mine are running on Performance, but I might switch to Balanced if that results in slightly lower performance at a lower power draw.
I've played around with it, but didn't notice any immediate difference.
foldy
Posts: 2040
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: RTX 2080ti vs 2060, which one to buy?

Post by foldy »

"Performance" keeps the GPU clock and power usage high even when only a low 3D load is running, e.g. watching a YouTube video in the Chrome browser. As FAH puts a high load on the GPU, the setting doesn't matter: you always get the highest GPU clock and power usage.
HaloJones
Posts: 906
Joined: Thu Jul 24, 2008 10:16 am

Re: RTX 2080ti vs 2060, which one to buy?

Post by HaloJones »

Changing the power limit on Linux is as easy as:

nvidia-smi -i 0 -pl xxx

where 0 is the GPU ID and xxx is the power limit in W that you want.
single 1070

MeeLee
Posts: 1339
Joined: Tue Feb 19, 2019 10:16 pm

Re: RTX 2080ti vs 2060, which one to buy?

Post by MeeLee »

If it is any help:

Running 2 RTX 2060 cards costs you $700 for ~330W of power.
One RTX 2080 Ti card costs you $1175 for ~225W of power.
Both setups net you ~2.2M PPD.

On the 2060 cards you can lower the power limit to 130W each (260W total draw) and still get 2M PPD.
That brings the performance-per-watt difference between the two setups much closer.
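Putting those figures (all taken from this post) into points-per-watt and points-per-dollar form makes the trade-off concrete:

```shell
# Compare the two setups using the numbers quoted above:
# 2x RTX 2060 = $700 / 330W / 2.2M PPD; RTX 2080 Ti = $1175 / 225W / 2.2M PPD;
# 2x 2060 capped at 130W each = 260W / 2M PPD.
awk 'BEGIN {
  printf "2x RTX 2060 : %4.0f PPD/W  %4.0f PPD/$\n", 2200000/330, 2200000/700;
  printf "RTX 2080 Ti : %4.0f PPD/W  %4.0f PPD/$\n", 2200000/225, 2200000/1175;
  printf "2x 2060 at 130W each: %.0f PPD/W\n", 2000000/260;
}'
```

The 2080 Ti wins on points per watt (9778 vs 6667), while the 2060 pair wins on points per dollar of hardware (3143 vs 1872); power-limiting the 2060s narrows the PPD/W gap to 7692.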