AlphaGo uses more power than 3000 humans

Update 2017: OpenAI used a single machine to beat a Dota champion → DENDI 1v1 vs BOT AI - TI7 DOTA 2. I may be underestimating the speed of development.

AlphaGo recently defeated the world Go champion. Go was long thought to be out of reach for computers, but machine learning cracked it.

This year, AlphaGo will challenge multiple players, and also support human players, at the Future of Go Summit in Wuzhen, China. It seems humans are completely outpaced.

Mind the energy consumption

But when you think about these games, keep in mind that AlphaGo likely draws several kilowatts of power: according to Wikipedia it ran on around 2000 CPUs and 250 GPUs when it played against Lee Sedol, each likely drawing more than 300 W of electricity. On top of that, it used millions of simulated training games to sharpen its skills.

Humans, on the other hand, run on around 100 W (and can’t easily charge up), with only about 30 W of that available to power the brain. Even when counting only the GPUs, AlphaGo has a higher power consumption than the brains of 3000 humans. That’s more than all professional Go players together (around 1000) consume.
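The arithmetic behind that claim can be sketched in a few lines of Python. The wattages are the rough estimates from the text, not measurements; with GPUs drawing more than the assumed 300 W baseline, the GPU-only figure climbs toward the 3000 brains of the title:

```python
# Rough power estimates from the text above; assumed, not measured.
CPU_W = 300           # assumed draw per CPU
GPU_W = 300           # assumed lower bound per GPU ("more than 300 W")
CPUS, GPUS = 2000, 250

HUMAN_BODY_W = 100    # whole-body consumption of a human
HUMAN_BRAIN_W = 30    # share of that available to the brain

alphago_w = CPUS * CPU_W + GPUS * GPU_W
print(f"AlphaGo total: {alphago_w / 1000:.0f} kW")                  # 675 kW
print(f"GPUs alone: {GPUS * GPU_W // HUMAN_BRAIN_W} human brains")  # 2500
print(f"Everything: {alphago_w // HUMAN_BODY_W} human bodies")      # 6750
```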

Learning from AlphaGo?

Brady Daniels showed on YouTube that he considers the games of AlphaGo pretty easy to understand (but hard to beat), while the article about the Go Summit in Wuzhen quoted players who said that seeing AlphaGo win was liberating: they dared to try new moves.

And this might actually have a pretty plausible1 explanation: unlike humans, AlphaGo could play millions of death matches against itself. It is the equivalent of a street fighter who normally only fights professionals now and then, but could follow up each fight by training endlessly in real battles to see which strategies actually work out. Humans just don’t have that much lifetime.

The interesting part to keep in mind when you try to learn from AlphaGo: if you generalize from its games without testing your understanding in real battles, you might build ceremony which looks like AlphaGo but doesn’t work. Essentially cargo-culting Go.

(It would be interesting to see whether we could parallelize humans well enough to counter AlphaGo’s processing power with our much lower energy consumption.)

»Could every one of us have an AlphaGo?«

To answer this question, we need to look at the energy consumption and the world energy production.

Assuming 2000 CPUs at 300 W each, AlphaGo currently uses 600 kW of power. With around 3 TW of world electricity generation, the world could keep about 5 million AlphaGo computers running, if we did not need the energy for anything else. That’s only one per roughly 1500 people, or one per 150 people if each only runs a few hours per day, and only if they all use the same training, which gets copied over.
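This back-of-the-envelope calculation can be checked directly. The world population figure is my own assumption (roughly 7.5 billion in 2017), not from the text:

```python
# Back-of-the-envelope: how many AlphaGos could the grid power?
ALPHAGO_W = 2000 * 300        # 600 kW, the CPU-only estimate from above
WORLD_GENERATION_W = 3e12     # ~3 TW average world electricity generation
WORLD_POPULATION = 7.5e9      # rough 2017 figure (my assumption)

instances = WORLD_GENERATION_W / ALPHAGO_W
people_per_instance = WORLD_POPULATION / instances
print(f"{instances:,.0f} AlphaGos, one per {people_per_instance:,.0f} people")
```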

(Note that consuming as much energy as 3000 human bodies does not mean that AlphaGo actually costs as much as 3000 skilled human workers, since electrical energy is pretty cheap: at an industrial rate of about 0.10€ per kWh, 600 kW cost only around 60€ per hour, and it’s bound to get cheaper with the switch to renewable energy sources. Food, meanwhile, is only a small part of the cost of living in industrialized countries. The minimum subsistence level in Germany provides only 4.40€ for food per day and person, which translates into roughly 55ç per work hour; but due to rent, heating and other expenses, the minimum hourly salary is about 10€, and the cost of highly skilled labor is about 100€ per hour. One AlphaGo needs as much energy as 3000 humans, but its electricity costs less than one hour of a single skilled worker — a dilemma for industrialized nations, because the economic incentives do not fit the actual resource consumption.)
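The cost comparison, sketched out. The electricity rate is an assumption on my part (the figure in the original text appears garbled), as is the 100€/h for skilled labor taken from the paragraph above:

```python
# Hourly electricity cost vs. labour cost; both rates are assumptions.
ALPHAGO_KW = 600
EUR_PER_KWH = 0.10             # assumed industrial electricity rate
SKILLED_EUR_PER_HOUR = 100     # rough cost of highly skilled labour

electricity_eur_per_hour = ALPHAGO_KW * EUR_PER_KWH
print(f"{electricity_eur_per_hour:.0f}€/h of electricity, equal to "
      f"{electricity_eur_per_hour / SKILLED_EUR_PER_HOUR:.1f} skilled work-hours")
```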

But if the energy trend of Moore’s Law holds (a halving of the energy requirement per computation about every two years, an observation also known as Koomey’s law), an AlphaGo of 2037 should only require around 600 W, and by 2047 your smartphone should be able to run an AlphaGo. The development speed of computer hardware is awesome (but very expensive).
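The projection follows from repeated halving, assuming the two-year cadence holds cleanly for decades (a big assumption):

```python
# Halving the energy per computation every two years (Koomey's law).
POWER_2017_W = 600_000   # the 600 kW estimate from above

def projected_watts(year):
    """Power needed in `year` for AlphaGo-2017-level computation."""
    halvings = (year - 2017) / 2
    return POWER_2017_W / 2 ** halvings

print(round(projected_watts(2037)))   # 10 halvings: ~586 W
print(round(projected_watts(2047)))   # 15 halvings: ~18 W, smartphone scale
```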

Algorithmic improvements might make it available far sooner.

Related: Having the power to run 5 million robots with AlphaGo technology, trained on StarCraft, sounds pretty worrying — and might become reality within a few years. The tech for that is being built right now. In case you have a better use in mind: TensorFlow and Sonnet are available as Free Software (not implying that winning in StarCraft is a bad use by itself — it could also serve as training for disaster aid and similar tasks).


  1. That this sounds plausible does not mean that I’m right. 
