[EDIT: I now think this probably wasn't a deepfake, but a recording the attacker made when using the same attack against the previous person. See the comments.]

 

I think someone just tried to phish my Facebook account, including a fake video of an FB friend. Here's the conversation:

Them, via FB Messenger, 9:32am:

Please ,I was trying to login in my instagram page on Facebook my new phone and they ask me to find someone to help me receive a code, Facebook gave me two friends suggestions and you one of them, the other person isn't online. will you Help me receive the code please?

Me:

I'm sorry you're having trouble logging in! Just so I can make sure your account hasn't been hacked, how did we meet?

Them:

[Calls me over FB Messenger, audio isn't working but it does look like them. I'm completely convinced at this point.]

Me:

Audio wasn't working, but I did recognize you
What do you need me to do?

32665, over SMS:

NNNNNNNN is your Facebook password reset code [this number has previously sent me FB resets]

Them:

Send me the code sent to you minute ago

Me:

Hmm, those look like the code to reset the password to my account?
Can we call again?

Me:

[I try to call them back, doesn't go through]

Them:

Nahh it's for my instagram

Them:

Having bad connections here

Them:

Send me the code ?

Me:

sorry, I'm still worried your account has been hacked -- can we do another call?

Them:

[Calls me over FB Messenger, audio is still not working, and the video feels slightly off. Ends quickly on their end. Possibly it's even the same video from last time?]

Me:

We're you able to hear me?

Them:

My connections

I've reported their account as hacked.

Things that made me suspicious:

I don't think FB has any sort of account recovery that looks like this

This is exactly what an attempt to hack my FB account would look like

They messaged at 9:30am my time, even though that's 6:30am where they live

Video call didn't have any audio

They couldn't receive incoming video calls

The texts didn't feel like them, though I don't know them that well.

Here's a screenshot I took during the second video call:

Even with all those things, the video call would normally have been very convincing, and it did briefly convince me. I could easily see it fooling someone who didn't know about deepfake video.

Comment via: facebook


On FB people pointed out that this could just be a recording with the audio removed.

Worth editing the post title for this. Scammers are getting more clever, but not targeted-deepfake clever. I'm sure it'll happen sometime, and reports are useful, but only when they don't overstate the state of the technology.

Edited!

Getting the video recording is trivially achieved by just calling the person.

Faking audio is easier than faking video, so given the audio problems I don't think this is how someone who wanted to create a deepfake would do it.

A mistake like the misplaced comma in "please ,I was" is also unlikely for any attack sophisticated enough to be a deepfake attempt.


I agree with the other points, but not this:  sophistication is not a scalar.  It's quite possible to have access to sophisticated tools (which replicate and scale easily), but be sloppy or bad at English orthography (and not realize it).  

I don't think this is useful evidence for a deepfake scam over a video-replay scam or vice versa, but it could easily be evidence for either scam over an actual request for help. It depends entirely on how out of character such a misplaced comma would be for this particular friend.

At the moment deepfake technology does not replicate and scale easily. The attacks where it gets used are likely either high-stakes espionage or attempts to steal a significant amount of money.

This is one of those "The future is already here — it's just not evenly distributed" situations. Training is hard and expensive. Using is not. Whether you need to retrain for a given target is an architectural decision; per-target retraining does make training harder, but sublinearly in the number of targets.

Faking audio is not easier if all you have to go on is what's public on FB: lots of pictures but not generally recordings of people speaking.

A scammer can look for outliers: if someone wants to target you specifically, there's a good chance they can find a friend of yours who has more audio online than just what's public on FB.