My huge thanks goes to @taiganaut for help with the figures.
The claim that Bitcoin is a huge waste of electricity was based on flawed calculations.
Bitcoin isn't wasting electricity. Well, at least not as much as everyone says it is. What it is doing is much harder to figure. But let's try.
The claim that Bitcoin was a huge waste of electricity was based on widely quoted statistics from Blockchain.info (since removed from the site, but still widely cited by commentators such as PandoDaily). These figures are simply wrong, as some simple math would have shown, if anyone had bothered to check.
Bitcoin works by solving cryptographic math puzzles the hard way in order to secure its transaction record. A distributed computer network, made up of every computer that is “mining” Bitcoin, processes individual “hashes” looking for the solution, and when the problem is solved, the network moves on to the next problem. There is no way to fake transactions without having more computing power than the rest of the network combined. The network reports how much computing power is working on the problem via the global hashrate, which is how fast the entire network is churning through hashes. This computing power is what uses the electricity, and the Bitcoins that are generated in the mining process and distributed to the working computers are what make it worth anyone's computing time.
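For the curious, the hashing race can be sketched in a few lines of Python. This is a toy version, not the real protocol (actual miners double-SHA-256 an 80-byte block header against a vastly harder target), but the structure is the same: guess nonces until a hash falls below the target.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Search nonces until the double-SHA-256 of the data falls below a target.
    A miniature version of the puzzle Bitcoin miners race to solve."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # a valid "proof of work" for this difficulty
        nonce += 1

# Each extra difficulty bit doubles the expected number of hashes needed,
# and hence the electricity burned by the network as a whole.
nonce = mine(b"example block header", difficulty_bits=16)
```

Every one of those discarded guesses is a "hash" in the global hashrate, which is why the hashrate is a direct proxy for how hard the network is working.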
On December 16th, when PandoDaily published the article, they recorded a global hashrate of 7,000,000 gigahashes per second (Gh/s). That's a lot of hash. Blockchain.info, when it was still listing the statistics, estimated the electricity consumption this required by using a rate of 650 watts per Gh/s. So, with a little multiplication, we find that means:
the Bitcoin network was supposedly drawing 4.55 gigawatts.
Multiply that by 24 to discover that Bitcoin was purportedly using 109.2 gigawatt-hours per day.
But the actual Blockchain.info stat for energy consumption (which PandoDaily quoted) was 131 gigawatt-hours per day. The math, from the outset, doesn't make any sense.
If there was an extra Bangladesh plugged into the power grid, we might expect someone to notice.
Clearly, the Blockchain.info estimate formula was broken somewhere. But even the assumptions on which it was supposedly based seem terribly flawed. Given the figure of 650 watts per Gh/s, the yearly energy usage would be 39.85 terawatt-hours. This is more electricity than Bangladesh used in 2008. If there were an extra Bangladesh plugged into the power grid, we might expect someone to notice.
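Here is the whole discrepancy in a few lines of Python, using Blockchain.info's own stated inputs:

```python
# Reproducing the Blockchain.info estimate from its own inputs.
hashrate_ghs = 7_000_000   # global hashrate on December 16, in Gh/s
watts_per_ghs = 650        # Blockchain.info's assumed efficiency

power_gw = hashrate_ghs * watts_per_ghs / 1e9
print(round(power_gw, 2))        # 4.55 gigawatts

daily_gwh = power_gw * 24
print(round(daily_gwh, 1))       # 109.2 GWh/day -- not the 131 GWh/day quoted

yearly_twh = daily_gwh * 365 / 1000
print(round(yearly_twh, 2))      # ~39.86 TWh/year -- an extra Bangladesh
```

Even granting the 650 watt assumption, the site's own daily figure doesn't follow from its own inputs.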
How much electricity does it actually take to generate one gigahash? There is no good answer to this question. You can mine Bitcoin with a cell phone, with your laptop, with a graphics card GPU, or with specifically designed ASIC chips installed in rack servers. Mining efficiency varies widely depending on the hardware you use. This site estimates anything from 18,750 watts per Gh/s for a Raspberry Pi to 0.001 watts per Gh/s for an ASIC. This site uses an average rate of 10 watts per Gh/s. The comments on this article cite anecdotal evidence of between 0.5 and 10 watts per Gh/s.
At 10 watts per Gh/s, Bitcoin would use the generating capacity of the Loon Lake hydroelectric plant.
If we take 10 watts per Gh/s, and multiply that by the current global hashrate (as of the afternoon of December 20, 2013) of 8,353,557 Gh/s, that's 83.54 megawatts. So it would use just over 2 gigawatt-hours per day, or 731.8 gigawatt-hours per year. This is just over the capacity of the Loon Lake hydroelectric plant, near Sacramento, California.
What about a rate of 1 watt per Gh/s, which the above sources seemed to think was fairly plausible in the near term? That gives us 8.35 megawatts, given the current global hashrate. This is about the power output of two GE E60C locomotives, which pull Amtrak trains. Is this figure more accurate because it seems more reasonable? Who knows. Calculations of scale are difficult to judge, which is why we're stuck with this myth to begin with.
At 0.1 watts per Gh/s, Bitcoin would use the same electricity as the yearly consumption of 674.5 average American homes.
Let's try the low end, with 0.1 watts per Gh/s, which we might assume a top-of-the-line ASIC mining unit burns. The entire Bitcoin network would then consume 835.4 kilowatts, or 7.31 gigawatt-hours per year. That is the same yearly consumption as 674.5 average American homes.
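All three scenarios are the same multiplication with a different efficiency plugged in. (The per-home comparison implies roughly 10,850 kWh per household per year, which is in line with published US averages.)

```python
# The network's draw at three plausible efficiencies,
# using the December 20, 2013 global hashrate.
hashrate_ghs = 8_353_557
hours_per_year = 24 * 365

results = {}
for watts_per_ghs in (10, 1, 0.1):
    power_mw = hashrate_ghs * watts_per_ghs / 1e6
    yearly_gwh = power_mw * hours_per_year / 1000
    results[watts_per_ghs] = (power_mw, yearly_gwh)
    print(f"{watts_per_ghs} W/Gh/s: {power_mw:.2f} MW, {yearly_gwh:.1f} GWh/year")
# 10 W/Gh/s  -> 83.54 MW, 731.8 GWh/year (roughly a Loon Lake)
# 1 W/Gh/s   ->  8.35 MW (about two locomotives)
# 0.1 W/Gh/s ->  0.84 MW,   7.3 GWh/year (674.5 average homes)
```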
I considered attempting to work out energy usage by studying non-cryptocurrency uses of mining hardware, but at this point in the game that is futile. The ASIC machines specifically designed for mining Bitcoin are much more efficient at hashing than they possibly could be at anything else. You can't use an ASIC to do cancer research; you would have to redesign the machine from the chip up. It's been fun in the past to compare the Bitcoin network to the world's supercomputers. But this is a sketchy estimate as well: it takes the number of FLOPS that used to be necessary to achieve a particular hashrate and multiplies it by the global hashrate to get an estimated FLOP rate (which blows the combined FLOP rate of all the world's supercomputers out of the water). But the machines doing the hashing are much better at it than any supercomputer would be. If you opened up cgminer on a supercluster, you wouldn't be able to mine at nearly so efficient a rate. So the comparison between miners and supercomputers is really no longer meaningful.
But we can look at the price of ASICs. Butterfly Labs, which is actually shipping units (at least some units... there is a lot of vaporware in the ASIC marketplace), sells a 50 Gh/s miner for $2,500. Running a 50 Gh/s miner for one day should get you 4,320,000 Gh, or 0.027648 BTC according to my calculations, or about $17. So it would take you 147 days to make back your initial investment. But you still have to pay for power. Ten cents per kilowatt-hour is a decent price in the United States.
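The payback arithmetic, spelled out (the daily yield is the article's own estimate, which implies a Bitcoin price of roughly $615):

```python
# Payback time for a $2,500, 50 Gh/s Butterfly Labs miner.
price_usd = 2500
hashrate_ghs = 50
seconds_per_day = 86_400

hashes_per_day_gh = hashrate_ghs * seconds_per_day
print(hashes_per_day_gh)       # 4320000 Gh per day

daily_revenue_usd = 17         # 0.027648 BTC at roughly $615/BTC
payback_days = price_usd / daily_revenue_usd
print(round(payback_days))     # ~147 days, before electricity costs
```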
Let's try it with 10 watts per Gh/s.
That means the ASIC draws 500 watts, which will run you 12 kilowatt-hours per day, or $1.20.
What about with 1 watt per Gh/s?
That's 1.2 kilowatt-hours per day, or 12 cents.
At 0.1 watts per Gh/s, that's only 1.2 cents a day.
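The same three efficiencies, run through the 50 Gh/s miner at 10 cents per kilowatt-hour:

```python
# Daily electricity cost for a 50 Gh/s miner at $0.10/kWh.
hashrate_ghs = 50
usd_per_kwh = 0.10

costs = {}
for watts_per_ghs in (10, 1, 0.1):
    kwh_per_day = hashrate_ghs * watts_per_ghs * 24 / 1000
    costs[watts_per_ghs] = kwh_per_day * usd_per_kwh
    print(f"{watts_per_ghs} W/Gh/s: {kwh_per_day:g} kWh/day, ${costs[watts_per_ghs]:.2f}/day")
# 10 W/Gh/s  -> 12 kWh/day,   $1.20/day
# 1 W/Gh/s   -> 1.2 kWh/day,  $0.12/day
# 0.1 W/Gh/s -> 0.12 kWh/day, $0.01/day (1.2 cents)
```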
We can say one thing for sure: electricity is not the expensive part of investing in an ASIC miner. Mining has to be fairly energy efficient, or it simply wouldn't happen. Just for fun, let's look at the limit case.
Take that entire $17 that you would make from the Bitcoins mined in a single day with a 50 Gh/s ASIC.
If you spend all of that $17 on electricity, just breaking even, you would be buying 170 kilowatt-hours per day, running a 7.08 kilowatt load.
This means your ASIC would be using 141.6 watts per Gh/s, certainly not as much as the original 650 watts per Gh/s estimate. And, more importantly, if you were sucking 7 kilowatts through a standard 110-volt household circuit, you would be drawing over 63 amps. Most circuit breakers are set to trip somewhere around 20 amps, so that you don't light your walls on fire.
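The limit case, worked through:

```python
# Spend the entire $17 of daily revenue on power: the break-even ceiling.
daily_revenue_usd = 17
usd_per_kwh = 0.10

kwh_per_day = daily_revenue_usd / usd_per_kwh
load_kw = kwh_per_day / 24
print(round(kwh_per_day), round(load_kw, 2))  # 170 kWh/day, 7.08 kW

worst_watts_per_ghs = load_kw * 1000 / 50     # spread over the 50 Gh/s miner
print(round(worst_watts_per_ghs, 1))          # ~141.7 W/Gh/s

amps_at_110v = load_kw * 1000 / 110
print(round(amps_at_110v, 1))                 # 64.4 amps -- hello, tripped breaker
```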
Stealing one kilowatt-hour earns you as much money as returning two beer cans to the store for the deposit.
Charles Stross seems to think that people will steal electricity to mine Bitcoin. But that's just not the case. Attaching your hardware investment to an easily discoverable crime doesn't make sense to save a few dollars a day, at least not for most miners. Think about it this way: stealing one kilowatt-hour earns you as much money as returning two beer cans to the store for the deposit. Even if you were running as much stolen electricity as you could through your home circuits without your junction box bursting into flames, you probably wouldn't take the risk of being caught when you could make the same money lugging a few boxes of bottles back to the supermarket. To make the crime worth it, you would have to be draining enough energy to run a small factory, and someone would notice.
The forums report that people are mining. They would not be doing so if they were losing money. That is hardly a scientific study of Bitcoin energy expenditures, but the math can give us a pretty good idea of what is plausible. And what the math shows is that most of the panic about Bitcoin energy usage is founded on bad estimates and a poor understanding of scale.
The LCD TVs sold in one quarter of 2012 use more electricity in a year than the Democratic Republic of the Congo.
One last thing, for comparison. I happened to come across this study by the Electric Power Research Institute, which estimated the annual power consumption of a variety of household devices. An average iPad (averaged between models) uses about 9 kilowatt-hours per year. Apple has sold 170 million iPads, so all the iPads of the world use 1.53 terawatt-hours per year. A 42” LCD TV uses about 204 kilowatt-hours per year. In the first quarter of 2012, 43,131,000 LCD TVs were sold. So that is 8.8 terawatt-hours per year, more electricity than the Democratic Republic of the Congo produces, used by the TVs shipped in that quarter alone. There are an estimated 65 million residential dishwashers in the United States, which on average consume 300 kilowatt-hours per year, for a grand total of 19.5 terawatt-hours per year. It would take the entire San Onofre nuclear power plant, running at full capacity all year, to provide that amount of power.
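For the skeptical, the household totals check out with the same multiply-and-convert routine used throughout:

```python
# Aggregate consumption of everyday devices, from the per-unit EPRI figures.
devices = {
    "iPads":       (170_000_000, 9),    # units sold, kWh/year each
    "42-inch TVs": (43_131_000, 204),   # Q1 2012 shipments
    "dishwashers": (65_000_000, 300),   # installed US base
}
totals_twh = {}
for name, (units, kwh_each) in devices.items():
    totals_twh[name] = units * kwh_each / 1e9   # kWh -> TWh
    print(f"{name}: {totals_twh[name]:.2f} TWh/year")
# iPads: 1.53 TWh/year
# 42-inch TVs: 8.80 TWh/year
# dishwashers: 19.50 TWh/year
```

Any one of these lines dwarfs even the pessimistic estimates for Bitcoin above.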
It is difficult for things to seem simple and logical when comparing economies of scale. But reality is what people do. People buy iPads, they run dishwashers, they leave the TV on so the cat isn't lonely when they aren't at home. And they mine Bitcoin. As inadvisable as any of these things are, and as foolishly squandered as our energy resources are around the whole world, it is hard to find any reason to point the finger at Bitcoin specifically.