I did a fun little experiment today. I posed a question to today's premier AI engine (ChatGPT, powered by the GPT-3.5 language model):
> “Create an essay on why AI won’t take over the world in less than 200 words”

> AI systems are limited by their technological capabilities and the goals and biases of their creators, and do not have the ability to make decisions or take actions based on their own motivations. While AI systems have made significant advances in recent years and have the potential to revolutionize many industries, they are still not capable of replicating many of the complex cognitive abilities that humans possess, such as creativity, empathy, and problem-solving in unstructured environments. Additionally, AI systems are designed and programmed by humans and are therefore subject to the goals and biases of their creators. They can only do what they are designed to do, and their capabilities are limited by the data and instructions that are provided to them. In summary, it is unlikely that AI systems will take over the world due to their limitations and lack of independent decision-making capabilities.
A couple of things struck me:
- OK, that’s pretty cool
- Punctuation isn’t perfect, but pretty good
- Lots of passive voice
- It had to talk about bias
Next, I think I’ll ask SBF why my money is safe in FTX.
Don’t worry, folks. Nothing to see here.