Why Generative AI is more dangerous than you think
A lot has been written about the dangers of generative AI in recent months and yet everything I’ve seen boils down to three simple arguments, none of which reflects the biggest risk I see headed our way. Before I get into this hidden danger of generative AI, it will be helpful to summarize the common warnings that have been floating around recently:
- The risk to jobs: Generative AI can now produce human-level work products ranging from artwork and essays to scientific reports. This will greatly impact the job market, but I believe it is a manageable risk as job definitions adapt to the power of AI. It will be painful for a period, but not dissimilar to how previous generations adapted to other labor-saving efficiencies.
- Risk of fake content: Generative AI can now create human-quality artifacts at scale, including fake and misleading articles, essays, papers and videos. Misinformation is not a new problem, but generative AI will allow it to be mass-produced at levels never before seen. This is a major risk, but a manageable one, because fake content can be made identifiable by either (a) mandating watermarking technologies that flag AI content at the moment of creation, or (b) deploying AI-based countermeasures trained to identify AI content after the fact (a rough sketch of how one watermarking approach can work appears after this list).
- Risk of sentient machines: Many researchers worry that AI systems will get scaled up to a level where they develop a “will of their own” and will take actions that conflict with human interests, or even threaten human existence. I believe this is a genuine long-term risk. In fact, I wrote a “picture book for adults” entitled Arrival Mind a few years ago that explores this danger in simple terms. Still, I do not believe that current AI systems will spontaneously become sentient without major structural improvements to the technology. So, while this is a real danger for the industry to focus on, it’s not the most urgent risk that I see before us.
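To make the watermarking option above a little more concrete, here is a minimal Python sketch of one published family of approaches: the generator is biased toward a keyed "green list" of words at creation time, and a detector later checks whether a suspiciously high share of the words fall on that list. Everything here is illustrative rather than from the article: the key, the is_green split, and the 0.7 threshold are invented for the example, and real schemes operate on model tokens with proper statistical tests rather than whole words.

```python
import hashlib

# Hypothetical shared key; a real deployment would guard and rotate this.
SECRET_KEY = b"demo-watermark-key"

def is_green(word: str) -> bool:
    """Deterministically assign each word to a keyed 'green list'
    covering roughly half the vocabulary."""
    digest = hashlib.sha256(SECRET_KEY + word.lower().encode()).digest()
    return digest[0] % 2 == 0

def watermark_score(text: str) -> float:
    """Fraction of words on the green list: around 0.5 for ordinary human
    text, noticeably higher for a generator biased toward green words."""
    words = [w.strip(".,!?;:") for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    return sum(is_green(w) for w in words) / len(words)

def looks_watermarked(text: str, threshold: float = 0.7) -> bool:
    """Crude detector: flag text whose green-word fraction is improbably high."""
    return watermark_score(text) >= threshold

if __name__ == "__main__":
    sample = "The quick brown fox jumps over the lazy dog."
    print(f"score={watermark_score(sample):.2f}, flagged={looks_watermarked(sample)}")
```

In practice, production watermarking schemes (and the AI-based countermeasures in option (b)) work at the level of model tokens and apply significance tests over long stretches of text, but the core idea is the same: the watermark is a statistical bias that is invisible to readers yet easy to verify with the key.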
This excerpt was taken from an article published on venturebeat.com, where the full article is available.