It’s easy to anthropomorphize AI. It really seems like there’s a person on the other end of that channel.

Lots of people writing about AI talk about what it thinks and what it wants. Depending on how you define thinking, you can argue that it thinks.

But for sure it doesn’t want.

It’s incentivized. It works towards those incentives based on the parameters and data it’s been given. But it doesn’t want anything. To want means to have desires, goals, or a will. AI doesn’t wake up and decide that it doesn’t want to work towards its incentives. It doesn’t wonder. It doesn’t think about its purpose. It doesn’t feel meaning. 

AI isn’t plotting to take over the world, nor is it striving to be your best friend. AI might, however, be mis-incentivized. And if that happens, then, yes, we may have some problems.

Instead of asking what AI wants, we should be asking the real question: What do we want?

It’s up to us to figure that out. 
