New AI tool enables real-time face swapping on webcams, raising fraud concerns

Tried it this weekend; it seems it's just a modified version of an existing face-swapping tool that feeds it a webcam stream instead of a file. Installing and getting it to run is a huge pain in the ass thanks to the Python library version mess, but I got it to work, slowly, in software mode at least.

It worked OK but seemed to have trouble with ears, so I had both my original ears and part of the new face visible at the same time, oops.
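For the curious, the "webcam instead of a file" part is conceptually tiny; it's roughly this (a rough sketch, not the project's actual code, and swap_face here is just a stand-in for the per-frame model the offline tool already runs):

```python
import cv2

def swap_face(frame, source_face):
    """Placeholder for the per-frame face-swap inference the offline tool already does."""
    return frame  # the real tool would return the frame with the swapped face

source_face = cv2.imread("source_face.jpg")   # the face you want to wear
cap = cv2.VideoCapture(0)                     # webcam, where the offline tool would open a video file instead

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("live swap", swap_face(frame, source_face))
    if cv2.waitKey(1) & 0xFF == ord("q"):     # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

All the hard stuff lives inside that per-frame call; the wrapper really is just plumbing, which is why the install (i.e. the model dependencies) is where the pain is.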
 
Upvote
78 (78 / 0)
Oh crap, we are so fucked with this AI shit. I only see nefarious and 'for the lulz' applications, not any good reasons to be able to do this.
It's not the future we needed, nor wanted, but apparently it is the one we deserve?
AR-15s are also a 'for the lulz' application.
 
Upvote
70 (86 / -16)

mstark90

Ars Scholae Palatinae
703
Oh crap, we are so fucked with this AI shit. I only see nefarious and 'for the lulz' applications, not any good reasons to be able to do this.
It's not the future we needed, nor wanted, but apparently it is the one we deserve?
Same thoughts over here. I easily see this being used to attack political opponents in new and scary as shit ways.

For the guy thinking that releasing this software was a good idea: something something something Jeff Goldblum in Jurassic Park something.
 
Upvote
87 (92 / -5)

Aurich

Creative Director
33,675
Ars Staff
There's some uncanny valley stuff going on, but, you know, low webcam quality can disguise some of that.

It's wild that this stuff works in real time and looks this good when you compare it to, say, the Luke Skywalker face-swap stuff, which was pretty good but honestly not that much better, and was done under controlled conditions with a team of experts in post-processing.
 
Upvote
93 (97 / -4)

gaballard

Ars Tribunus Angusticlavius
8,527
Subscriptor++
We'll need deepfake detection on each device. I guess that's one potential use of these newfangled AI chips they're pumping out? Hopefully they're suitable for that purpose.
The problem is that any algorithm that can detect deepfakes can then be used to train a model to get better at creating undetectable deepfakes.
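That's basically the GAN dynamic. Toy sketch of the loop (stand-in PyTorch models, nothing from a real deepfake codebase): the detector's own score becomes the generator's training signal, so publishing a stronger detector hands the forgers a better teacher.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

detector = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 1))   # stand-in "is this fake?" classifier
generator = nn.Sequential(nn.Linear(128, 3 * 64 * 64), nn.Tanh())   # stand-in fake-image producer
opt = torch.optim.Adam(generator.parameters(), lr=1e-4)

for step in range(1000):
    z = torch.randn(16, 128)                            # random seeds for a batch of fakes
    fakes = generator(z).view(16, 3, 64, 64)
    fake_prob = torch.sigmoid(detector(fakes))          # detector's verdict, 1.0 = "definitely fake"
    # Train the generator to push that verdict toward 0 ("looks real"):
    # the better the detector, the better the teacher for the forger.
    loss = F.binary_cross_entropy(fake_prob, torch.zeros_like(fake_prob))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice you'd also keep retraining the detector on the new fakes, which is exactly the arms race people are worried about.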
 
Upvote
137 (137 / 0)
Same thoughts over here. I easily see this being used to attack political opponents in new and scary as shit ways.
But there's a flip side to that coin. It will also give bad people (not just limited to politicians) an out when they are actually captured being shitty on video/audio. They'll just say it's a deepfake. This is especially effective in groups that are already following the "don't believe your lying eyes; believe only what the Dear Leader says is true" script.
 
Upvote
128 (129 / -1)
Tried it this weekend; it seems it's just a modified version of an existing face-swapping tool that feeds it a webcam stream instead of a file. Installing and getting it to run is a huge pain in the ass thanks to the Python library version mess, but I got it to work, slowly, in software mode at least.

It worked OK but seemed to have trouble with ears, so I had both my original ears and part of the new face visible at the same time, oops.
Even here where the ears aren't doubled, they're still clearly the real person's ears and not faked to match the target.

I guess "look at the ears" is going to be the new "look at the fingers." At least for a while.
 
Upvote
46 (47 / -1)
Even if the technology is not top notch, legal guard rails should be put in place to prevent abuses. Passenger motor vehicles are all equipped with accelerators, but we still have laws, policies and safety devices to prevent them from being misused by Bad Actors. There should be stiff penalties for misuse of technologies like this just like there are for misuse of other things - we have arson laws for fire, to cite another example. Just because these AI entities are developing, it shouldn't mean we collectively throw up our hands and say, "Well, I guess AI businesses have to have unfettered freedom to innovate." No, they don't.
 
Upvote
33 (43 / -10)
We'll need deepfake detection on each device. I guess that's one potential use of these newfangled AI chips they're pumping out? Hopefully they're suitable for that purpose.
That's an arms race where you can only counter tech that's two or three steps behind the bleeding edge.
It'd be easier and more effective to seed the tools of fraud with malware.
 
Upvote
-4 (4 / -8)
Tried it this weekend; it seems it's just a modified version of an existing face-swapping tool that feeds it a webcam stream instead of a file. Installing and getting it to run is a huge pain in the ass thanks to the Python library version mess, but I got it to work, slowly, in software mode at least.

It worked OK but seemed to have trouble with ears, so I had both my original ears and part of the new face visible at the same time, oops.
How does it compare to Roop unleashed on GitHub, which has supported live webcams for quite a while now (in AI time)?
 
Upvote
3 (3 / 0)
But there's a flip side to that coin. It will also give bad people (not just limited to politicians) an out when they are actually captured being shitty on video/audio. They'll just say it's a deepfake. This is especially effective in groups that are already following the "don't believe your lying eyes; believe only what the Dear Leader says is true" script.
That's already been happening, sadly, and well before the tools were this advanced, too.
 
Upvote
4 (5 / -1)
Well, the author's attempts to justify it are:

It will help artists with tasks such as animating a custom character or using the character as a model for clothing etc.
I mean, that's not totally BS. I'm working on an animated avatar project right now, and it's very helpful for the artists working on the different models to be able to perform the facial animations while tweaking them. It'd be even more useful for a motion capture performance (whether a brand new character model, or legally faking actors like for Rogue One or Tron: Legacy).
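The core of that workflow is pretty simple: webcam face landmarks in, blendshape/animation weights out. Something like this rough sketch, say with MediaPipe's face mesh (just one example stack; the landmark indices are approximate, and a real pipeline obviously does more than print a number):

```python
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Gap between upper and lower inner-lip landmarks, normalized by a
        # rough face height, drives a "mouth open" blendshape weight.
        mouth_open = abs(lm[13].y - lm[14].y) / abs(lm[10].y - lm[152].y)
        print(f"mouth_open weight: {min(mouth_open * 5, 1.0):.2f}")
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):     # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Hook that weight up to a rig in Blender/Unity/whatever and an artist can puppet the model live while they tweak it.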

But given the way it's being presented, I feel like that's not really the motivation here.
 
Upvote
32 (33 / -1)

gaballard

Ars Tribunus Angusticlavius
8,527
Subscriptor++
Upvote
13 (18 / -5)
I also see they haven't solved the hair problem. I'm guessing that includes facial hair. Face mapping is trivial compared to taming THAT problem. Not only is hair physics a tough problem (especially to do live on composited video), but it's even harder to turn what a computer sees into a mesh to represent it. You could likely fake it by adding pre-created hair/beard models. But as soon as someone touches their hair/face, the jig is up.

So right now, your best defense against someone deepfaking you is to have a pompadour and a great big bushy beard.
 
Upvote
30 (30 / 0)

Xepherys

Ars Scholae Palatinae
702
Subscriptor
Clone Zuck looked like Dave Portnoy instead of Zuck, but the JD Vance, Hugh Grant, and George Cloney ones were terrifyingly accurate. I'm sure this will only ever be used for Jerky Boys style prank calls and nothing illegal or nefarious.

George... Cloney?
 
Upvote
87 (87 / 0)
I also see they haven't solved the hair problem. I'm guessing that includes facial hair. Face mapping is trivial compared to taming THAT problem. Not only is hair physics a tough problem (especially to do live on composited video), but it's even harder to turn what a computer sees into a mesh to represent it. You could likely fake it by adding pre-created hair/beard models. But as soon as someone touches their hair/face, the jig is up.

So right now, your best defense against someone deepfaking you is to have a pompadour and a great big bushy beard.
No, they haven't solved that. I'm bald, and trying to use Arnold Schwarzenegger's face was, let's say, very disturbing.

Vin Diesel worked fine, other than the ears, as I mentioned. It seemed to deal fine with my stubble, but I guess big beards could be a problem.

However, I don't think it's a physics problem here; it doesn't model a hair mesh or anything, it's just deep learning networks.

I'm not sure if it's better that this was released for everyone to use, or if it had stayed unreleased while bad actors secretly had access to it. I suppose it's better that everyone knows it's out in the open.
The cat's out of the bag. As I mentioned in the first post, the tool to do this offline on a file already existed, so it'd be pretty easy for someone to mod it if they stood to make lots of money scamming people.

I mean, that's not totally BS. I'm working on an animated avatar project right now, and it's very helpful for the artists working on the different models to be able to perform the facial animations while tweaking them. It'd be even more useful for a motion capture performance (whether a brand new character model, or legally faking actors like for Rogue One or Tron: Legacy).

But given the way it's being presented, I feel like that's not really the motivation here.
It does smell like a BS excuse.

I do think it could have legit uses though, like if someone was extremely self-conscious about how they looked, had a bad injury, or just wanted to be anonymous on a video chat for whatever reason.
 
Upvote
22 (22 / 0)
Already certain politicians (you know who I mean) are using the deniability of reality to insulate themselves and their adherents from uncomfortable truths.
Or get rid of their unpopular running mate by claiming he isn't real, and has actually been a team of North Koreans all along.
 
Upvote
18 (18 / 0)

housinit

Smack-Fu Master, in training
3
Sorry, not sorry, but humans are just not responsible enough to wield AI technology. All the CONS outweigh the PROS. Unfortunately, there is no stopping it now; the corporate overlords only see dollar signs. That is all they ever see. Profits over humanity always wins with those arseholes.
 
Upvote
18 (23 / -5)