Talking with an AI chatbot can successfully convince people to change their votes and could sway the outcome of future elections, according to a new study.
The study, which included 1,530 Canadians, also found that the chatbots had more success convincing Canadians to switch their votes than they did with Americans.
Gordon Pennycook, a Canadian and associate professor at Cornell University, said the study set out to discover how persuasive generative AI could be when it comes to politics.
"The answer is it's very persuasive and more persuasive than traditional forms of political persuasion, which is like ads and things like that," said Pennycook, one of the study's authors.
The study, published in the journal Nature, found that one in 21 respondents in the U.S. who took part in the experiment in the fall of 2024 was convinced after interacting with an AI chatbot to switch their vote to Kamala Harris, while one in 35 was convinced to switch their vote to Donald Trump.
In the Canadian part of the study, which took place in the final week of the federal election in April, participants were asked which of 17 policy issues were the most important to them in deciding who to vote for in the election. All of the interactions were in English and there is no breakdown of where in Canada participants lived.
The study found that interacting with the chatbot did prompt some participants to change their voting intention.
"In Canada, in the pro-Carney condition, it was one in nine who switched, which is a lot of people," said Pennycook. "In the pro-Poilievre condition, where the AI convinced people to vote for Poilievre, it was one in 13 who switched.
"That's a lot of people who are changing their minds … if you were to target that at the particular right constituents of particular districts or ridings, then you could flip an election."
Pennycook said one of the reasons AI chatbots can be effective in political persuasion is that they adapt their arguments to each respondent.
The study also found that the chatbot was more effective in convincing people to change their votes when it was allowed to use facts to do so.
"The persuasive effect was almost three times larger in the Canadian federal election than the effect observed in the U.S. experiment, but depriving the AI of the ability to use facts and evidence reduced the effect by more than half," the authors wrote.
Pennycook pointed out that participants in the study spent six to eight minutes interacting with the AI chatbot, compared with the seconds spent watching a quick ad.
Pennycook said the difference between the U.S. and Canadian impact could be linked to the constant political campaigning in the U.S.
"Americans are inundated with election content non-stop," he said. "And so, it's much harder to switch, to change people's minds."
In its conclusions, the study found that talking with an AI chatbot "can meaningfully impact voter attitudes" but said it remains to be seen how effective the technology will be if it is deployed by political campaigns.
"It seems highly likely that AI-based approaches to persuasion will play an important role in future elections — with potentially profound consequences for democracy," wrote the authors.
While the Canadian experiment was conducted during the federal election and some ridings were won with only a handful of votes, Pennycook doubts it could have had an impact on any of the results.
"There's no real way of knowing, but I think it seems unlikely that this study of a thousand some people would change an election," he said, pointing out that participants came from across Canada.
While Canada has strict guidelines on the use of things like advertising and other tools to persuade voters during the election writ period, Elections Canada says there are few, if any, rules related to the use of AI during an election campaign. However, someone could break the law if they used AI to falsely pretend to be an election official or send out material that falsely purports to be from election officials, a political party or a candidate.
Chief Electoral Officer Stéphane Perrault has made recommendations for changes to the elections law to address potential emerging threats from AI such as requiring that electoral communications generated or manipulated using AI include a transparency marker and that AI chatbots or search functions be required to indicate in their responses where users can find official or authoritative information.
The Office of the Commissioner of Canada Elections, which investigates complaints, said it received some complaints regarding the use of AI in the last election. But in a statement in June, Commissioner Caroline Simard said there was no indication that the use of AI affected the results.
Fenwick McKelvey, associate professor in communication studies at Montreal's Concordia University and co-director of the university's applied AI institute, praised the study, saying it documents how generative AI can affect voting intentions.
"We know that this kind of work can be persuasive," he said.
McKelvey said political parties in other countries such as Mexico have already begun using chatbots as part of their persuasion strategies.
McKelvey said one cause for concern would be if generative AI chatbot technology was combined with the existing databases political parties have built up on Canadian voters — databases that are exempt from Canada's privacy laws.
"With the lack of oversight, those databases and the data they hold can now be used in ways that nobody consented to," he said.
McKelvey said political parties should be subject to Canada's privacy laws and the government should take steps to mitigate potential harms of AI in advertising.