Chasing Rabbits

AI & The Energy Efficient Brain

The human brain uses 20 watts of power to do its squishy supercomputing.

AI models take a bit more: the infrastructure behind them draws watts in the gigas (giga being fancy-speak for a billion).
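For a rough sense of scale, here's that comparison as back-of-the-envelope code. These are my own assumed figures (a hypothetical 1 GW data center), not numbers from the post:

```python
# Back-of-the-envelope scale comparison (assumed figures, not from the post).
BRAIN_WATTS = 20            # commonly cited estimate for the human brain
DATA_CENTER_WATTS = 1e9     # hypothetical 1 GW AI data center

equivalent_brains = DATA_CENTER_WATTS / BRAIN_WATTS
print(f"A 1 GW data center draws as much power as ~{equivalent_brains:,.0f} brains")
# ~50,000,000 brains' worth of power
```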

Despite improving energy efficiency, AI infrastructure is still using a lot of power and leaving quite a footprint.

From IEEE: bar chart showing increasing carbon emissions from training AI models, 2012–2024.

For context, it seems that generating one of those Miyazaki-style images from the hot new image-gen model uses about as much energy as fully charging your phone.
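Putting a rough number on that claim (the battery figures are my assumptions for a typical modern phone; the per-image cost just takes the equivalence at face value):

```python
# Rough check of the "one image ≈ one phone charge" comparison.
# Battery figures are assumptions for a typical modern phone.
battery_mah = 4000          # assumed capacity
nominal_volts = 3.85        # assumed lithium-ion nominal voltage

full_charge_wh = battery_mah / 1000 * nominal_volts
print(f"Full phone charge ≈ {full_charge_wh:.1f} Wh")   # ~15 Wh

# If the equivalence holds, one generated image costs on the order of 15 Wh.
```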

This makes power consumption an attractor for innovation—on both the supply and demand sides (nuclear power part 2: (clean) electric boogaloo?).

Which makes this "Super-Turing AI" idea pretty cool.

Oh, and there's the fact that it returns to the brain-mimicking roots of neural nets and the like.

From the ScienceDaily post:

"Super-Turing AI," which operates more like the human brain. This new AI integrates certain processes instead of separating them and then migrating huge amounts of data like current systems do.

Like the brain...

Super-Turing AI is revolutionary because it bridges this efficiency gap, so the computer doesn't have to migrate enormous amounts of data from one part of its hardware to another.
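To make "efficiency gap" concrete: on conventional hardware, pulling a value in from DRAM costs far more energy than doing arithmetic on it. These are commonly cited ballpark figures (roughly the old 45 nm estimates), not numbers from the article:

```python
# Ballpark energy costs on conventional hardware (commonly cited ~45 nm figures,
# not from the article). Units: picojoules per 32-bit operation/access.
FP32_ADD_PJ = 0.9
DRAM_READ_PJ = 640.0

print(f"One DRAM read ≈ {DRAM_READ_PJ / FP32_ADD_PJ:.0f}x the energy of a 32-bit add")
# Keeping compute next to (or inside) memory is where the savings come from.
```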

This is impressive...

In a test, a circuit using these components helped a drone navigate a complex environment -- without prior training -- learning and adapting on the fly. This approach was faster, more efficient and used less energy than traditional AI.

The researcher quoted (extensively) in the post specifically calls out backpropagation, the backbone of modern machine learning. Iterating/innovating on that could be "big news, if true," as they say.
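For anyone who hasn't stared at backprop lately, here's a minimal sketch of what it does (a toy one-weight example I wrote for illustration; it has nothing to do with the Super-Turing circuit): errors get pushed backwards through the network via the chain rule, and on conventional hardware that means a lot of weights and activations shuttling between memory and compute.

```python
import numpy as np

# Minimal backprop sketch: one neuron, one weight, squared-error loss.
# Purely illustrative; not related to the Super-Turing hardware in the article.
rng = np.random.default_rng(0)
x, target = 2.0, 1.0
w = rng.normal()
lr = 0.1

for step in range(50):
    y = np.tanh(w * x)                 # forward pass
    loss = (y - target) ** 2
    # backward pass: chain rule from the loss back to the weight
    dloss_dy = 2 * (y - target)
    dy_dw = (1 - y ** 2) * x           # derivative of tanh(w*x) w.r.t. w
    w -= lr * dloss_dy * dy_dw         # gradient step

print(f"final weight {w:.3f}, final loss {loss:.6f}")
```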