Will Proof-of-Work Die a Green Death?
On the Cryptography mailing list, John Gilmore recently brought up an interesting point. One of the oft-debated ways to fight spam is to require a form of proof-of-work postage on email.
Spam is an emergent property of the very low cost of email, combined with the fact that most of that cost is pushed onto the receiver, not the sender. The thinking goes that if you can trivially increase the cost to the sender, it disproportionately affects the spammer, and thus tilts the economics back from them to us.
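To make the mechanism concrete, here is a minimal sketch in Python of the general hashcash-style idea. This is an illustration only, not the real Hashcash stamp format (the function names and parameters are mine): the sender burns CPU searching for a nonce, and the receiver verifies with a single hash.

```python
import hashlib
import itertools

def mint_stamp(recipient: str, bits: int = 20) -> str:
    """Burn CPU until SHA-256(recipient:nonce) starts with `bits` zero
    bits -- on average this takes about 2**bits hash attempts."""
    # Simplification: check whole hex digits, so `bits` should be a
    # multiple of 4.
    zeros = "0" * (bits // 4)
    for nonce in itertools.count():
        stamp = f"{recipient}:{nonce}"
        if hashlib.sha256(stamp.encode()).hexdigest().startswith(zeros):
            return stamp

def check_stamp(stamp: str, bits: int = 20) -> bool:
    """The receiver's side costs exactly one hash."""
    zeros = "0" * (bits // 4)
    return hashlib.sha256(stamp.encode()).hexdigest().startswith(zeros)

stamp = mint_stamp("alice@example.com")   # seconds of work for the sender
assert check_stamp(stamp)                 # microseconds for the receiver
```

The asymmetry is the whole point: minting is expensive, checking is nearly free, and the per-message cost is tunable via the difficulty parameter.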
The proposition has always been debatable. Laurie and Clayton wrote a paper in 2004 ("Proof-of-Work" Proves Not to Work) challenging the idea, and I've never seen a full refutation of it. Moreover, the balance may even be tipping further toward the spammer. The major problem with proof-of-work is that legitimate senders are often on limited devices like smartphones, while the spammers are on compromised servers. Frameworks such as OpenCL, which harness the compute power of graphics cards, unbalance the system even more.
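Back-of-envelope arithmetic shows how badly this tilts. The throughput figures below are illustrative assumptions, not measurements, but the shape of the result holds for any realistic gap between a phone CPU and a botnet's GPUs:

```python
# Illustrative figures only -- assumed, not measured.
phone_hashes_per_sec = 2e6     # a smartphone CPU minting stamps
gpu_hashes_per_sec = 400e6     # a compromised desktop GPU via OpenCL

# Tune difficulty so a legitimate phone pays ~10 seconds per message:
work_per_stamp = 10 * phone_hashes_per_sec

print(f"phone:  {work_per_stamp / phone_hashes_per_sec:5.1f} s per stamp")
print(f"botnet: {work_per_stamp / gpu_hashes_per_sec * 1000:5.1f} ms per stamp")
# phone:   10.0 s per stamp
# botnet:  50.0 ms per stamp -- 200x cheaper, and the spammer isn't
# paying for the electricity anyway.
```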
There is also the related problem that the cost of power and cooling (which is another way of saying power) over a computer's life often exceeds the cost of the hardware itself. This has been a huge fly in the ointment for grid computing.
Gilmore, however, says:
Computers are already designed to consume much less electricity when idle than when running full tilt. This trend will continue and extend; some modern chips throttle down to zero MHz and virtually zero watts at idle, waking automatically at the next interrupt.
The last thing we need is to deploy a system designed to burn all available cycles, consuming electricity and generating carbon dioxide, all over the Internet, in order to produce small amounts of bitbux to get emails or spams through.
I think he’s got it spot on, and whatever we do, Proof-of-Work is now in the recycling bin.
People in the econ / monetary world always knew this. The thing is, in the world of money there is a sort of law that says a good money is made of something that has no better use (think gold, or paper). The corollary is that if you "burn" something to use it as money, you are destroying something that is useful; history shows, and economics explains, that this will not make a good money.
A thought experiment that makes this visible is the North American "potlatch" ceremony. There, chiefs from different tribes get together once a year, and to show how powerful they are, they destroy more and more of their possessions. The winner is the one who destroys the most. It's clear, it's simple, it's obvious. It is also decidedly non-economic, and the tribes would likely have massacred any chief who decided to do it more than once per year…
The rest of us will not adopt a process with a positive feedback loop of destruction built in, no matter how elegant it sounds. PoW is as dead as potlatch.
Except Gilmore’s claim is false given today’s computers.
Computers' power consumption does not scale quasi-linearly with use: it is basically a step function. (Okay, less of a pure step, because modern cores can run at several different power levels, but there are basically 2-3 steps.) Disk power is mostly a step. RAM power is mostly a step (you need to keep refreshing those quickly-dissipating bits). Yes, if your computer is completely hibernating, its power consumption can be taken quite low. But on a machine that is switched on anyway, the marginal cost of a proof-of-work computation is basically zero.
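A rough calculation makes the point. The wattages below are assumptions for illustration, not measurements, and real figures vary by machine:

```python
# Assumed wattages for a desktop that is powered on anyway.
idle_watts = 60.0      # machine on, CPU idle
loaded_watts = 90.0    # one core pinned, minting stamps
seconds_per_stamp = 10.0

marginal_joules = (loaded_watts - idle_watts) * seconds_per_stamp
kwh = marginal_joules / 3.6e6
cost_cents = kwh * 10.0    # at $0.10 per kWh

print(f"marginal energy: {marginal_joules:.0f} J per stamp")
print(f"marginal cost:   {cost_cents:.4f} cents per stamp")
# ~300 J, under a thousandth of a cent: the cost of simply being
# powered on dwarfs the per-stamp increment.
```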
That step function, in fact, is the motivation behind Google's "energy-proportional computing": they would love power consumption to scale linearly with utility. But we're a far cry from it.