Donate your spare CPU cycles to World Community Grid

Here’s an interesting way to donate your spare CPU cycles to a good cause. This project has been around for a few years now, but I just started participating.

Running the software reminds me of my college days. There was a time when everyone I knew was running the SETI@Home screensaver (most likely on an AMD K6 or Celeron 300A!).

Clients are available for all major operating systems.

7 thoughts on “Donate your spare CPU cycles to World Community Grid”

  1. For most people, the cost would be minimal. But if the machines are big enough or numerous enough, the expense can be quite high. If you have a hefty machine and live in California, even a single machine can be a non-trivial expense.

    Here is the article I was referring to. The school estimated that it cost $1.5 million in less than a year, directly attributable to the additional electricity for running the machines and for air conditioning.
    http://www.techdirt.com/articles/20091201/0027357144.shtml

  2. This is a great program, and even better, there are a lot of cool companies creating additional incentives to get more people to participate. For example, the EasyNews Usenet service gives you an extra gig of downloads for every seven days of processing time you volunteer your computer toward making the world healthier and happier.

    btw ~ I still haven’t made it out to Microcontroller Monday at the ATX Hacker Space, but I look forward to meeting you there when I do!

  3. One word of caution – before you set up all of the office machines to do this, you might want to check the math and get the boss’s approval.

    Older computers use about the same amount of power under load as they do at idle; however, most anything newer than a Pentium 4 will show a power increase. A system with a current-model processor idles at around 130 watts, and even a modest system can draw an additional 150 watts under load. If you have an enthusiast system with GPU-assisted processing enabled, you can hit 400 watts on a single GPU and 600+ watts on dual GPUs without much effort, and 1000+ watts is possible.

    Consider cheap electricity at 10 cents per kilowatt-hour. A small system drawing 280 watts x 12 hours at night x 365 days a year x $0.10/kWh = about $122 per year in power. This is a low-end estimate, since there will be days when the machine runs at full load for 18-24 hours.

    So, $122 per year for a normal desktop, $250 per year for a gaming machine, and $400+ per year for an enthusiast machine. (Your mileage may vary.) There are several instances of network admins getting fired for doing this at work.

    I do not want to discourage people from this worthy cause – just be aware that your donation of time has an actual price tag.

    1. Duane,

      You’re absolutely right. Power consumption varies drastically with load on modern PCs. It’s easy to understand why BOINC (the software behind WCG) has options to throttle CPU usage – it’s for thermal reasons. Some PCs can’t handle the stress of running at full load for prolonged periods of time.

      I am currently running the software on two computers at home:

      1. My home server, an Intel Atom D510 system, documented here:
      https://mightyohm.com/blog/2010/09/budget-mini-itx-home-server-build/

      I have it configured to use 50% of the available processors. The D510 is dual-core with hyper-threading, so it has 2 real + 2 virtual(?) cores. Using all 4 cores causes slowdown (a peer tells me this is because they all share the same cache), so 50% is a good option. The 2 cores in use are set for 100% CPU utilization. This results in 2 low-priority threads running 24×7 on the machine. Linux is pretty good about managing process priority, so I don’t see any appreciable slowdown except when the system is heavily overloaded anyway. Power consumption is higher, but this system idles at 30W and draws somewhere around 40W at full load, so I doubt I’ll notice the change in my power bill. On the other hand, the Atom sucks at number crunching, so it doesn’t contribute much to the grid.

      2. My work PC, a quad-core i5-760 with a 3D card. I still turn the machine off at night, so BOINC only runs in the background during the day. I have it set to run while the machine is in use, but to only use 50% of the cores at 100% load. Since I still have 2 “real” cores reserved for my day-to-day use, I never notice that BOINC is running in the background. Power consumption for this PC is 85W at idle and 200W at full load, so it’s probably somewhere in between while BOINC is running on 2 of 4 cores and I’m working.

      So between the two machines, I have 10W extra burning on the Atom 24×7 and ~50W on the desktop machine, but only during the day. Based on your math, that should be about $9/yr for the Atom and maybe $40/yr for the desktop? Not too bad. But you’re right, it’s not zero.
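      For anyone who wants to plug in their own numbers, here’s a quick back-of-the-envelope sketch in Python (my own, not part of BOINC). It assumes a flat electricity rate – 10 cents/kWh, as in your example – and takes the extra wattage and hours per day as inputs; adjust the figures for your own hardware and rates.

        # Rough annual electricity cost of the extra load donated to the grid.
        def annual_cost(extra_watts, hours_per_day, rate_per_kwh=0.10, days_per_year=365):
            """Yearly cost in dollars of drawing extra_watts for hours_per_day, every day."""
            kwh_per_year = extra_watts / 1000.0 * hours_per_day * days_per_year
            return kwh_per_year * rate_per_kwh

        # Duane's example: a 280 W system crunching 12 hours a night.
        print(round(annual_cost(280, 12), 2))  # 122.64 -> about $122/year

        # The Atom server above: roughly 10 W extra, 24x7.
        print(round(annual_cost(10, 24), 2))   # 8.76 -> about $9/year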
