What is AGI?

Many conversations center around scale as the missing ingredient. More chips. More parameters. More training data. More power.

More scale gets us closer to human. Maybe, eventually, to human (or even better than human).

I’m not so sure. Does capability equal human?

An airplane can fly, but not like a bird. More capability doesn’t change category. It just makes the tool more capable.

An LLM works on statistics. It asks, “What character/word/number/token comes next?” and answers with probabilities encoded in parameters learned from its training data.
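To make the point concrete, here is a minimal sketch of weighted next-token prediction. It is not how a real LLM works (those learn billions of neural-network weights); it is just a toy bigram counter over a made-up corpus, showing the core idea of “predict what comes next from observed frequencies.”

```python
from collections import Counter, defaultdict

# Toy "training data": a tiny corpus of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Build bigram counts: for each word, tally the words that follow it.
followers = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus."""
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often in this corpus
```

The prediction is purely statistical: the model outputs whatever followed “the” most often in its data, with no notion of cats, mats, or meaning.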

Is that what our brain does? Is that how it works?

An LLM hasn’t experienced anything other than the chat you’ve had with it. It can predict and generate the language of grief, love, awe, guilt, hope, worship, and fear. But it hasn’t experienced any of them.

Human doesn’t just mean capability.

We suffer. We desire. We intend. We love. We believe.

Call it spirit. Call it soul. Call it self-awareness.

Whatever word you choose, it points to something other. Something not explained by scale and capability.

I won’t raise my eyebrow at an LLM until it walks, figuratively or concretely, up to the table, sits down, and says, “Here’s what I wanna do…”
