
Yeah, there are ~100B neurons and ~1Q (10^15) synapses, but how much compute is the brain actually using over time?

Some quick googling gives this:

- Generation of an action potential seems to use ~2.5×10^−7 J [0]

- The brain consumes around 20W during normal activity

This seems to imply that there are around 8×10^7, call it 10^8, activations per second [1].
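As a sanity check on that division, here's a throwaway Python sketch; the constants are just the two figures above:

    # Spikes/s implied by a 20 W power budget
    power_w = 20.0              # whole-brain power draw
    joules_per_spike = 2.5e-7   # energy per action potential, from [0]
    print(power_w / joules_per_spike)  # 8e7, call it 1e8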

Apparently, the average neuron has about 1000 synapses. Let's say each synapse requires 10 multiply-accumulate (MAC) operations per activation. Doing that math gives about 10^12 FLOP/s [2].
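Spelled out (the 1000-synapse fan-out and the 10 MACs per activation are the assumptions doing the work here; I'm counting a MAC as one FLOP):

    # Effective FLOP/s under the assumptions above
    spikes_per_s = 1e8               # rounded from the previous step
    synapses_per_neuron = 1000       # assumed average fan-out
    macs_per_activation = 10         # assumed; 1 MAC counted as 1 FLOP
    print(spikes_per_s * synapses_per_neuron * macs_per_activation)  # 1e12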

Integrate that over 18 years, and you get roughly 5.7×10^20 FLOPs [3].
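That's just 10^12 FLOP/s times the number of seconds in 18 years:

    # Lifetime compute through age 18
    flops_per_s = 1e12
    seconds = 18 * 365.25 * 24 * 3600   # ~5.7e8 s
    print(flops_per_s * seconds)        # ~5.7e20 FLOPs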

PaLM required 2.56×10^24 FLOPs to train [4]. So we have (way more than) enough compute; we're just not using it efficiently. We're wasting a lot of FLOPs on dense matrix multiplication.
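For scale, the ratio, using the figures above (the exact PaLM number varies a bit by source):

    # How far the brain-lifetime estimate falls short of PaLM
    brain_flops = 5.7e20
    palm_flops = 2.56e24   # from [4]
    print(palm_flops / brain_flops)   # ~4.5e3, i.e. PaLM used ~4500x more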

There's plenty of wiggle room in these calculations. I've checked the math, but I'd appreciate it if someone would let me know if I've missed something.

    [0]: https://link.springer.com/article/10.1007/s11571-018-9503-3
    [1]: https://www.wolframalpha.com/input?i2d=true&i=Divide%5B20+W%2C2.5%E2%80%89%C3%97%E2%80%89Power%5B10%2C%E2%88%927%5D+Joules%5D
    [2]: https://www.wolframalpha.com/input?i2d=true&i=Power%5B10%2C8%5D+Hz+*+1000+*+10+flop
    [3]: https://www.wolframalpha.com/input?i2d=true&i=Power%5B10%2C12%5D+Divide%5BFLOP%2Cs%5D+*+18+years
    [4]: https://blog.heim.xyz/palm-training-cost/#:~:text=PaLM%20(2022)-,2.5e24,-10x***


There is a long history of connectionist attempts to ballpark brain compute in order to constrain AI timelines, going back to von Neumann, Turing, and Good. The most recent one would be the Open Philanthropy brain-computation report (https://www.openphilanthropy.org/brain-computation-report). You can see in Figure 1 that your 10^12 FLOP/s steady state is at the very low end. If you're interested in seeing where your envelope estimate differs from the others, well, it has the references.



