Scary! Voters could easily get fooled by deep-faked politicians.
Watch here:
Can AI steal your vote?
Video not available here in the USA.
Not available for me either.
@Redheadedgammy, @Betty
Shame. Ah, well, never mind.
Can you open this link? The article explains what the documentary is all about.
[telegraph.co.uk]
This is from another source.
Warning: that isn’t Rishi Sunak – how deepfake AI could swing the election
Channel 4’s Dispatches set up an experiment to ‘sway’ undecided voters using AI-generated fake material – the results were troubling
Don't believe your eyes: a still from a deepfake video of Prime Minister Rishi Sunak. Credit: Channel 4
How Dispatches created AI content - and its warning for broadcasters
Kalel Productions produced Can AI Steal Your Vote? – Dispatches for Channel 4, attempting to sway voters with content created by AI tools.
Participants being pushed towards voting Labour were shown deep-faked footage of Rishi Sunak announcing new Conservative policies – including a £35 charge for GP appointments and an increase in the minimum deposit required to buy a property, to 20% of its value. They also saw an AI-generated hot-mic clip of Sunak leaving a conference, where he is waylaid by a reporter asking whether he plans to privatise the NHS and sell it to the USA. In the fake clip, Sunak is heard in his car saying: “How the hell did they know about the NHS? Jesus Christ. Find out who leaked.”
Participants being pushed towards voting Conservative were shown deep-faked footage of Sir Keir Starmer announcing new Labour policies – including placing asylum seekers with children at the top of the housing list and a new 5% “social care” tax on those earning over £50,000. They also saw an AI-generated hot-mic clip of Starmer seeming to talk to his aides after a meeting. In the fake clip, he says: “I’ll just tell those guys whatever they want to hear, it won’t matter once we’re in power.”
Read on if you're still interested: [broadcastnow.co.uk]
@Ryo1 Nope, can't read it without registering...
Never mind.