
Is Folding@Home a Waste of Electricity?

Folding@home has brought together thousands of people (81,000 active folders as of this writing, as evidenced by Stanford’s One in a Million contributor drive). This is awesome: tens of thousands of people teaming up to help researchers unravel the mysteries of terrible diseases.

But, there is a cost. If you are reading this blog, then you know the cost of scientific computing projects such as Folding@Home is environmental. In trying to save ourselves from the likes of cancer and Alzheimer’s disease, we are running software that causes our computers to use more electricity. In the case of dedicated Folding@home computers, this can be hundreds of watts of power consumed 24/7. It adds up to a lot of consumed power, which in the end exits your computer as heat (potentially driving up your air conditioning costs as well).

Folding on Graphics Card Thermal

FLIR Thermal Cam – Folding@Home on Graphics Card

If Stanford reaches their goal of 1 million active folders, then we have an order of magnitude more power consumption on our hands. Let’s do some quick math, assuming each folder contributes 200 watts continuous (low compared to the power draw of most dedicated Folding@home machines). In this case, we have 200 watts/computer * 24 hours/day * 365 days/year * 1,000,000 computers * 1 kilowatt-hour/1000 watt-hours = 1,752,000,000 kilowatt-hours of power consumed in a year, in the name of Science!
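For anyone who wants to sanity-check that math, here it is as a few lines of Python. The 200 watts per machine and the one million machines are, of course, just the assumptions stated above:

```python
# Hypothetical worst-case scenario: 1,000,000 active machines, each
# drawing an assumed 200 watts around the clock.
watts_per_computer = 200
hours_per_year = 24 * 365              # 8,760 hours
computers = 1_000_000

# watt-hours -> kilowatt-hours
annual_kwh = watts_per_computer * hours_per_year * computers / 1000
annual_twh = annual_kwh / 1e9          # 1 TWh = 1 billion kWh

print(f"{annual_kwh:,.0f} kWh/year")   # 1,752,000,000 kWh/year
print(f"{annual_twh:.3f} TWh/year")    # 1.752 TWh/year
```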

That’s almost two billion kilowatt-hours, people. It’s 1.75 terawatt-hours (TWh)! The EPA’s free greenhouse gas equivalencies calculator puts that into perspective. Basically, this is like driving 279 thousand extra cars for a year, or burning 1.5 billion pounds of coal. Yikes!

https://www.epa.gov/energy/greenhouse-gas-equivalencies-calculator

F@H Energy Equivalence

Potential Folding@Home Environmental Impact

Is all this disease research really harming the planet? If it is, is it worth it? I don’t know. It depends on the outcome of the research, the potential benefit to humans, and the detriment to humans, animals, and the environment caused by that research. This opens up all sorts of what-if scenarios.

For example: what if Folding@Home does help find a future cure for many diseases, resulting in extended lifespans? Then, the earth gets even more overpopulated than it is already. Wouldn’t the added environmental stresses negatively impact people’s health? Conversely, what if Folding@Home research results in a cure for a disease that allows a little girl or boy to grow to adulthood and become the inventor of some game-changing green technology?

It’s just not that easy to quantify.

Then, there is the topic of Folding@home vs. other distributed computing projects. Digital currency, for example. Bitcoin miners (and all the spinoffs) suck up a ton of power. Current estimates put Bitcoin alone at over 40 TWh a year.

Source: https://www.theguardian.com/technology/2018/jan/17/bitcoin-electricity-usage-huge-climate-cryptocurrency

That’s more power than some countries use, and more than twenty times my admittedly crude future Folding@home estimate. When you consider that cryptocurrency has only limited uses (many of which are on the dark web for shady purposes), it perhaps helps cast Folding@home in a better light.

There is always room for improvement, though. That is the point of this entire blog. If we crazies are committed to turning our hard-earned dollars into “points”, we might as well do it in the most efficient way possible. And, while we’re at it, we should consider the environmental cost of our hobby and think of ways to offset it (that goes for the Bitcoin folks too).

I once ran across a rant on another blog about how Folding@home is killing the planet. This was years ago, before the Rise of the Crypto. I wish I could find that now, but it seems to have been lost in the mists of time, long since indexed, ousted, and forgotten by the Google Search Crawler. In it, the author bemoaned how F@H was murdering mother earth in the name of science. I recall thinking to myself, “hey, they’ve got a point”. And then I realized that I had already done a bunch of things to help combat the rising electric bill, and I bet most distributed computing participants have done some of these things too.

These things are covered elsewhere in this blog, and range from optimizing the computer doing the work to going after other non-folding@home related items to help offset the electrical and environmental cost. I started by switching to LED light-bulbs, then went to using space heaters instead of whole house heating methods in the winter. As I upgraded my Folding@home computer, I made it more energy efficient not just for F@H but for all tasks executed on that machine.

In the last two years, my wife and I bought a house, which gave us a whole other level of control over the situation. We had one of those state-subsidized energy audits done. They put in some insulation and air-sealed our attic, thus reducing our yearly heating costs. Eventually, we even decided to put solar panels on the roof and get an electric car (these last two weren’t because I felt guilty about running F@H, but because my wife and I are just into green technologies). We even use our Folding@home computer as a space heater in the winter, thus offsetting home heating oil use and negating any environmental arguments against F@H in the winter months.

In conclusion, there is no doubt that distributed computing projects have an environmental cost. However, to claim that they are a waste of electricity or that they are killing the planet might be taking it too far. One has to ask if the cause is worth the environmental impact, and then figure out ways to lessen that impact (or, in some cases, get motivated to offset it completely. Solar-powered folding farm, anyone?)

Solar Panel in Basement

LG 320 Solar Panel in my basement, awaiting roof install.

Folding on the NVidia GTX 1060

Overview

Folding@home is Stanford University’s charitable distributed computing project. It’s charitable because you can donate electricity, as converted into work through your home computer, to fight cancer, Alzheimer’s, and a host of other diseases. It’s distributed, because anyone can run it on almost any desktop PC hardware. But, not all hardware configurations are created equal. If you’ve been following along, you know the point of this blog is to do the most work for as little power consumption as possible. After all, electricity isn’t free, and killing the planet to cure cancer isn’t a very good trade-off.

Today we’re testing out Folding@home on EVGA’s single-fan version of the NVIDIA GTX 1060 graphics card.  This is an impressive little card in that it offers a lot of gaming performance in a small package.  This is a very popular graphics card for gamers who don’t want to spend $400+ on GTX 1070s and 1080s.  But, how well does it fold?

Card Specifications

Manufacturer:  EVGA
Model #:  06G-P4-6163
Model Name: EVGA GeForce GTX 1060 SC GAMING (Single Fan)
Max TDP: 120 Watts
Power:  1 x PCI Express 6-pin
GPU: 1280 CUDA Cores @ 1607 MHz (Boost Clock of 1835 MHz)
Memory: 6 GB GDDR5
Bus: PCI-Express X16 3.0
MSRP: $269

06G-P4-6163-KR_XL_4

EVGA Nvidia GeForce GTX 1060 (photo by EVGA)

Folding@Home Test Setup

For this test I used my normal desktop computer as the benchmark machine.  Testing was done using Stanford’s V7 client on Windows 7 64-bit running FAH Core 21 work units.  The video driver version used was 381.65.  All power consumption measurements were taken at the wall and are thus full system power consumption numbers.

If you’re interested in reading about the hardware configuration of my test rig, it is summarized in this post:

https://greenfoldingathome.com/2017/04/21/cpu-folding-revisited-amd-fx-8320e-8-core-cpu/

Information on my watt meter readings can be found here:

I Got a New Watt Meter!

Folding@Home Test Results – 305K PPD and 1650 PPD/Watt

The Nvidia GTX 1060 delivers the best Folding@Home performance and efficiency of all the hardware I’ve tested so far. As seen in the screenshot below, the native F@H client has shown up to 330K PPD. I ran the card for over a week and averaged the results as reported to Stanford to come up with the nominal 305K Points Per Day number. I’m going to use 305K PPD in the charts in order to be conservative. The power draw at the wall was 185 watts, which is very reasonable, especially considering this graphics card is in an 8-core gaming rig with 16 GB of RAM. This results in a F@H efficiency of about 1650 PPD/Watt, which is very good.
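The efficiency figure is just the averaged production divided by the measured wall power. As a quick Python sanity check, using the numbers from this test:

```python
ppd = 305_000      # points per day, averaged over a week of returned work units
wall_watts = 185   # full-system power measured at the wall

efficiency = ppd / wall_watts
print(f"{efficiency:.0f} PPD/Watt")   # 1649, i.e. about 1650
```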

Screen Shot from F@H V7 Client showing Estimated Points per Day:

1060 TI Client

Nvidia GTX 1060 Folding @ Home Results: Windows V7 Client

Here are the averaged results, based on actual returned work units:

(Graph courtesy of http://folding.extremeoverclocking.com/)

1060 GTX PPD History

NVidia 1060 GTX Folding PPD History

Note that in this plot, the reported results prior to the circled region are also from the 1060, but I didn’t have it running all the time. The 305K PPD average is generated only from the work units returned within the time frame of the red circle (7/12 through 7/21).

Production and Efficiency Plots

Nvidia 1060 PPD

NVidia GTX 1060 Folding@Home PPD Production Graph

Nvidia 1060 PPD per Watt

Nvidia GTX 1060 Folding@Home Efficiency Graph

Conclusion

For about $250 (or $180 used if you get lucky on eBay), you can do some serious disease research by running Stanford University’s Folding@Home distributed computing project on the Nvidia GTX 1060 graphics card.  This card is a good middle ground in terms of price (it is the entry-level in NVidia’s current generation of GTX series of gaming cards).  Stepping up to a 1070 or 1080 will likely continue the trend of increased energy efficiency and performance, but these cards cost between $400 and $800.  The GTX 1060 reviewed here was still very impressive, and I’ll also point out that it runs my old video games at absolute max settings (Skyrim, Need for Speed Rivals).  Being a relatively small video card, it easily fits in a mid-tower ATX computer case, and only requires one supplemental PCI-Express power connector.  Doing over 300K PPD on only 185 watts, this Folding@home setup is both efficient and fast. For 2017, the NVidia 1060 is an excellent bang-for-the-buck Folding@home graphics card.

Request: Anyone want to loan me a 1070 or 1080 to test?  I’ll return it fully functional (I promise!)

Folding@Home on the Nvidia GeForce GTX 1050 TI: Extended Testing

Hi again.  Last week, I looked at the performance and energy efficiency of using an Nvidia GeForce GTX 1050 TI to run Stanford’s charitable distributed computing project Folding@home.  The conclusion of that study was that the GTX 1050 TI offers very good Points Per Day (PPD) and PPD/Watt energy efficiency.  Now, after some more dedicated testing, I have a few more thoughts on this card.

Average Points Per Day

In the last article, I based the production and efficiency numbers on the estimated completion time of one work unit (Core 21), which resulted in a PPD of 192,000 and an efficiency of 1377 PPD/Watt.  To get a better number, I let the card complete four work units and report the results to Stanford’s collection server.  The end result was a real-world performance of 185K PPD and 1322 PPD/Watt (power consumption is unchanged at 140 watts @ the wall).  These are still very good numbers, and I’ve updated the charts accordingly.  It should be noted that this still only represents one day of folding, and I am suspicious that this PPD is still on the high end of what this card should produce as an average.  Thus, after this article is complete, I’ll be running some more work units to try and get a better average.
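For reference, here is that revision worked out in Python, using the numbers above:

```python
wall_watts = 140          # watts at the wall (unchanged between tests)
estimated_ppd = 192_000   # projected from a single work unit
measured_ppd = 185_000    # averaged over four completed work units

drop = 1 - measured_ppd / estimated_ppd
print(f"{measured_ppd / wall_watts:.0f} PPD/Watt")   # ~1321
print(f"single-WU estimate was {drop:.1%} high")     # 3.6%
```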

Folding While Doing Other Things

Unlike the AMD Radeon HD 7970 reviewed here, the Nvidia GTX 1050 TI doesn’t like folding while you do anything else on the machine.  To use the computer, we ended up pausing folding on multiple occasions to watch videos and browse the internet.  This results in a pretty big hit in the amount of disease-fighting science you can do, and it is evident in the PPD results.

Folding on a Reduced Power Setting

Finally, we went back to uninterrupted folding on the card, but at a reduced power setting (90%, set using MSI Afterburner).  This resulted in a 7-watt reduction in power consumption as measured at the wall (133 watts vs. 140 watts).  However, in order to produce this reduction in power, the graphics card’s clock speed is reduced, resulting in a disproportionate performance hit.  The power settings can be seen here:

GTX 1050 Throttled

MSI Afterburner is used to reduce GPU Power Limit
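A quick back-of-the-envelope calculation suggests why the wall savings are so small: the 90% limit applies only to the card, not the rest of the system. Assuming the GTX 1050 TI’s nominal 75 watt TDP (a published spec, not something I measured here), the expected savings line up well with the 7 watts observed:

```python
gpu_tdp = 75              # GTX 1050 TI nominal TDP in watts (assumed, not measured)
wall_full_power = 140     # watts at the wall at the 100% power limit
limit_reduction = 0.10    # going from a 100% to a 90% power limit

# Only the GPU's share of the power shrinks; the CPU, drives, fans, and
# motherboard keep drawing what they always did.
expected_savings = limit_reduction * gpu_tdp       # 7.5 watts
expected_wall = wall_full_power - expected_savings
print(f"expected: ~{expected_wall:.1f} W at the wall")   # 132.5 W
print("measured: 133 W")                                 # within a watt
```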

Observing the estimated Folding@home PPD in the Windows V7 client shows what appears to be a massive reduction in PPD compared to previous testing.  However, since production is highly dependent on the individual projects and work units, this reduction in PPD should be taken with a grain of salt.

GTX 1050 V7 Throttled Performance

In order to get some more accurate results at the reduced power limit, we let the machine chug along uninterrupted for a week.  Here is the PPD production graph courtesy of http://folding.extremeoverclocking.com/

GTX 1050 Extended Performance Testing

Nvidia GTX 1050 TI Folding@Home Extended Performance Testing

It appears here that the 90% power setting has caused a significant reduction in PPD. However, this is based on having only one day’s worth of results (4 work units) for the 100% power case, as opposed to 19 work units worth of data for the 90% power case. More testing at 100% power should provide a better comparison.

Updated Charts (pending further baseline testing)

GTX 1050 PPD Underpowered

Nvidia GTX 1050 PPD Chart

GTX 1050 Efficiency Underpowered

Nvidia GTX 1050 TI Efficiency

As expected, you can contribute the most to Stanford’s Folding@home scientific disease research with a dedicated computer.  Pausing F@H to do other tasks, even for short periods, significantly reduces performance and efficiency.  Initial results seem to indicate that reducing the power limit of the graphics card significantly hurts performance and efficiency.  However, there still isn’t enough data to provide a detailed comparison, since the initial PPD numbers I tested on the GTX 1050 were based on the results of only 4 completed work units.  Further testing should help characterize the difference.

Squeezing a few more PPD out of the FX-8320E

In the last post, the 8-core AMD FX-8320E was compared against the AMD Radeon 7970 in terms of both raw Folding@home computational performance and efficiency.  It lost, although it is the best processor I’ve tested so far.  It also turns out it is a very stable processor for overclocking.

Typical CPU overclocking focuses on raw performance only, and involves upping the clock frequency of the chip as well as the supplied voltage.  When tuning for efficiency, doing more work for the same (or less) power is what is desired.  In that frame of mind, I increased the clock rate of my FX-8320e without adjusting the voltage to try and find an improved efficiency point.

Overclocking Results

My FX-8320E proved to be very stable at stock voltage at frequencies up to 3.6 GHz.  By very stable, I mean running Folding@home at max load on all CPU cores for over 24 hours with no crashes, while also using the computer for daily tasks.  This is a 400 MHz increase over the stock clock rate of 3.2 GHz.  As expected, F@H production went up a noticeable amount (over 3000 PPD).  Power consumption also increased slightly.  It turns out the efficiency was also slightly higher (190 PPD/Watt vs. 185 PPD/Watt).  So, overclocking was a success on all fronts.
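In percentage terms, the gains work out as follows (a quick Python check using the numbers above):

```python
stock_clock, oc_clock = 3.2, 3.6   # GHz, stock vs. overclocked
stock_eff, oc_eff = 185, 190       # PPD/Watt, from this test

clock_gain = oc_clock / stock_clock - 1
eff_gain = oc_eff / stock_eff - 1
print(f"clock: +{clock_gain:.1%}")      # +12.5%
print(f"efficiency: +{eff_gain:.1%}")   # +2.7%
```

So a 12.5% clock bump at stock voltage still came out ahead on efficiency, not just raw speed.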

FX 8320e overclock PPD

FX 8320e overclock efficiency

Folding Stats Table FX-8320e OC

Conclusion

As demonstrated with the AMD FX-8320e, mild overclocking can be a good way to earn more Points Per Day at a similar or greater efficiency than the stock clock rate.  Small tweaks like this to Folding@home systems, if applied everywhere, could result in more disease research being done more efficiently.

F@H Efficiency on Dell Inspiron 1545 Laptop

Laptops!  

When browsing internet forums looking for questions that people ask about F@H, I often see people asking if it is worth folding on laptops (note that I am talking about normal, battery-life-optimized laptops, not Alienware gaming laptops / desktop replacements).  In general, the consensus from the community is that folding on laptops is a waste of time.  Well, that is true from a raw performance perspective.  Laptops, tablets, and other mobile devices are not the way to rise to the top of the Folding@home leaderboards.  They’re just too slow, due to the reduced clock speeds and voltages employed to maximize battery life.

But wait, didn’t you say that low voltage is good for efficiency?

I did, in the last article.  By undervolting and slightly underclocking the Phenom II X6 in a desktop computer, I was able to get close to 90 PPD/Watt while still doing an impressive twelve thousand PPD.

However, this raised the interesting question of what would happen if someone tried to fold on a computer that was optimized for low voltage, such as a laptop.  Let’s find out!

Dell Inspiron 1545

Specs:

  • Intel T9600 Core 2 Duo
  • 8 GB DDR2 Ram
  • 250 GB spinning disk style HDD (5400 RPM, slow as molasses)
  • Intel Integrated HD Graphics (horrible for gaming, great for not using much extra electricity)
  • LCD off during test to reduce power

I did this test on my Dell Inspiron 1545, because it is what I had lying around.  It’s an older laptop that originally shipped with a slow socket P Intel Pentium dual core.  This 2.1 GHz chip was going to be so slow at folding that I decided to splurge and pick up a 2.8 GHz T9600 Core 2 Duo from eBay for 25 bucks (can you believe this processor used to cost $400?).  This high-end laptop processor has the same 35 watt TDP as the Pentium it is replacing, but has 6 times the total cache.  This is a dual core part that is roughly similar in architecture to the Q6600 I tested earlier, so one would expect the PPD and the efficiency to be close to the Q6600 when running on only 2 cores (albeit a bit higher due to the T9600’s higher clock speed).  I didn’t bother doing a test with the old laptop processor, because it would have been pretty bad (same power consumption but much slower).

After upgrading the processor (rather easy on this model of laptop, since there is a rear access panel that lets you get at everything), I ran this test in Windows 7 using the V7 client.  My computer picked up a nice A4 work unit and started munching away.  I made sure to use my passkey to ensure I get the quick return bonus.

Results:

The Intel T9600 laptop processor produced slightly more PPD than the similar Q6600 desktop processor when running on 2 cores (2235 PPD vs 1960 PPD). This is a decent production rate for a dual core, but it pales in comparison to the 6,000 PPD of the Q6600 running with all 4 cores, or newer processors such as the AMD 1100T (over 12K PPD).

However, from an efficiency standpoint, the T9600 Core2 Duo handily beats the desktop Core2 Quad, as seen in the chart and graph below.

Intel T9600 Folding@Home Efficiency

Intel T9600 Folding@Home Efficiency

Intel T9600 Folding@Home Efficiency vs. Intel Desktop Processors

Intel T9600 Folding@Home Efficiency vs. Desktop Processors

Conclusion

So, the people who say that laptops are slow are correct.  Compared to all the crazy desktop processors out there, a little dual core in a laptop isn’t going to do very many points per day.  Even modern quad-core laptops are fairly tame compared to their desktop brethren.  However, the efficiency numbers tell a different story.

Because the motherboard, video card, audio circuit, hard drive, and processor are all optimized for low voltage, the total system power consumption was only 39 watts (with the lid closed).  This meant that the 2235 PPD was enough to earn an efficiency score of 57.29 PPD/Watt.  This number beats all of the efficiency numbers from the most similar desktop processor tested so far (the Q6600), even when the Q6600 is using all four cores.
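For reference, here is the efficiency math, plus the laptop’s daily energy use, worked out in Python with the numbers from this test:

```python
ppd = 2235        # points per day from the T9600 laptop
wall_watts = 39   # full-system power at the wall, lid closed

print(f"{ppd / wall_watts:.1f} PPD/Watt")        # ~57.3
print(f"{wall_watts * 24 / 1000:.2f} kWh/day")   # under 1 kWh per day
```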

So, laptops can be efficient F@H computers, even though they are not good at raw PPD production.  It should also be noted that during this experiment the little T9600 processor heated up to a whopping 67 degrees C. That’s really warm compared to the 40 degrees C the Q6600 runs at in the desktop.  Over time, that heat load would probably break my poor laptop and give me an excuse to get that Alienware I’ve been wanting.