By allowing companies to sell simulated humans, we leave ourselves open to a new way to be manipulated by the illusion of love and therefore possibly exploited by the processes of addiction.
I was an early adopter of the internet: I got online in 1990, before web browsers even existed, because a friend founded a local service that hosted text-based discussions similar to those now found on Reddit. This came in handy when I began a long-distance relationship. In 1996 I wrote one of the first articles about how addictive connecting virtually can be. But my partner was a man I had met in person — not a bot designed by a corporation to simulate a paramour.

When I learned that artificial intelligence chatbots, now often called companions, were becoming popular and being marketed for therapy or socializing, I was immediately concerned. As someone with a history of heroin addiction who has spent years researching compulsive behavior, I know a lot about what hooks people and how tech companies can manipulate these factors to sustain engagement. And as a person on the autism spectrum, I also know how online worlds can be especially sticky for those who have difficulty with real-world socializing.

As I write in a recent guest essay for Times Opinion, love is essentially the template for addiction. The biological systems that allow us to persist through the difficult parts of relationships can also misdirect us toward harmful compulsions, like seeking drugs or gambling online. While loving relationships expand our worlds, addictions shrink them.

Just like internet use, A.I. companions aren’t all bad — in fact, they might have some therapeutic value, as a way to get extra support or to practice social skills or cognitive techniques for fighting anxiety and depression. But, as I explain in my essay, allowing corporations to sell such products without safety limits is a recipe for disaster — especially in this case, where A.I. companions might be working not only to hook you on their tech but also to change your politics or sell you other products.
It’s hard enough to figure out what real love is and to find it in a world of false fronts without having to fight corporate deceptions as well.