Democrat Shamaine Daniels is running for Congress, eyeing a seat held by Trump-aligned Republican Representative Scott Perry, who played a key role in challenging the 2020 election results.
Daniels, who lost to Perry by less than 10 points last year, hopes a new weapon will help her underdog candidacy: Ashley, an artificial intelligence campaign volunteer.
Ashley is not your typical robocaller; none of her responses are canned or pre-recorded. Her creators, who intend to work mainly with Democratic campaigns and candidates, say she is the first political phone banker powered by generative AI technology similar to OpenAI's ChatGPT. She can hold a virtually unlimited number of customized one-on-one conversations at the same time.
Ashley is one of the first examples of how generative AI is ushering in a new era of political campaigning in which candidates use technology to engage with voters in ways increasingly difficult to track.
To some, it is an exciting new tool for conducting high-quality conversations at scale. Others worry it will worsen disinformation in an already polarized American political landscape that is contending with "deepfakes," realistic but fabricated videos and images created using AI algorithms.
Over the weekend, Ashley called thousands of Pennsylvania voters on behalf of Daniels. Like a seasoned campaign volunteer, Ashley analyzes voters' profiles to tailor conversations around their key issues. Unlike a human, Ashley always shows up for the job, has perfect recall of all of Daniels' positions, and does not feel dejected when she's hung up on.
"This is going to scale fast," said 30-year-old Ilya Mouzykantskii, the London-based CEO of Civox, the company behind Ashley. "We intend to be making tens of thousands of calls a day by the end of the year and into the six digits pretty soon. This is coming for the 2024 election and it's coming in a very big way. ... The future is now."
For Daniels, the tool levels the playing field: as the underdog, she is now armed with another way to understand voters better, reach out in different languages (Ashley is fluent in over 20), and conduct many more "high bandwidth" conversations. But the development worries many, including OpenAI CEO Sam Altman, who testified before Congress in May that he was "nervous" about generative AI's ability to compromise election integrity through "one-on-one interactive disinformation."
The technology, which learns from reams of internet data, has become so good at realistic conversations that in recent months people have fallen in love with, and declared themselves married to, AI-powered chatbots.
Mouzykantskii said he is fully aware of the potential downsides and does not intend to take any venture capital funding, which might entice him to prioritize profits over ethics.
And like OpenAI, he is setting up an unusual governance structure: a committee empowered to force him to publicly disclose anything of concern about the company. Civox has decided to give Ashley a robotic-sounding voice and disclose she is an AI, despite not being legally required to do so.
Mouzykantskii and his co-founder Adam Reis, former computer science students at Stanford and Columbia Universities, respectively, declined to disclose the exact generative AI models they are using. They would say only that they use more than 20 different AI models, some proprietary and some open-source.
Thanks to the latest generative AI technologies, Reis was able to build the product almost entirely on his own, a task that only a few years ago would have required a team of 50 engineers working for years, he said.
LEGAL GRAY AREA
Few legal guardrails regulate this use of AI in political outreach.
"I don't know under what federal law that would be illegal," said Robert Weissman, president of the nonprofit consumer advocacy organization Public Citizen.
Michigan is one of a few states that have passed or are in the process of debating legislation to regulate deepfakes in elections. Pennsylvania, where Daniels is running, has no such legislation.
No rules directly apply to what Civox is doing. Federal Trade Commission regulations ban telemarketers from making robocalls to people on the Do Not Call Registry, but the registry does not cover political calls, and Civox's activity, with its "personalized" messages, does not qualify as robocalling.
The Federal Communications Commission prohibits campaign-related autodialed or prerecorded voice calls, including autodialed live calls to cell phones without the recipient's prior consent. The FCC is also beginning a formal inquiry into how AI technology impacts illegal and unwanted robocalls. The Federal Election Commission has begun looking into whether to regulate AI use in campaigns.
None of these rules apply to the way campaigns are using Mouzykantskii's technology.
Mouzykantskii said he welcomes regulation, noting the technology's potential to spread misinformation. Other companies will likely create AI callers that sound nearly identical to a real human without disclosing that the caller is AI-generated, he said.
"This should provoke thoughts about just how close we are to some version of the future that has previously only been available in sci-fi movies and books," he said. "And that is reason alone for regulators and legislators, in not just the United States but globally, to start paying attention."
David Fish, 63, enjoyed hearing from Ashley even though he could tell instantly that she wasn't human.
"This one kept my attention," he said. "The thing I really liked was that it identified itself as AI and didn't try to fool me."
(Reporting by Anna Tong in San Francisco and Helen Coster in New York; Editing by Kenneth Li and Richard Chang)