[EDIT: I now think this probably wasn't a deepfake, but a recording the attacker made when using the same attack against the previous person. See the comments.]
I think someone just tried to phish my Facebook account, including a fake video of a FB friend. Here's the conversation:
Them, via FB Messenger, 9:32am:
Please ,I was trying to login in my instagram page on Facebook my new phone and they ask me to find someone to help me receive a code, Facebook gave me two friends suggestions and you one of them, the other person isn't online. will you Help me receive the code please?
Me:
I'm sorry you're having trouble logging in! Just so I can make sure your account hasn't been hacked, how did we meet?
Them:
[Calls me over FB Messenger, audio isn't working but it does look like them. I'm completely convinced at this point.]
Me:
Audio wasn't working, but I did recognize you
What do you need me to do?
32665, over SMS:
NNNNNNNN is your Facebook password reset code [this number has previously sent me FB resets]
Them:
Send me the code sent to you minute ago
Me:
Hmm, those look like the code to reset the password to my account?
Can we call again?
Me:
[I try to call them back, doesn't go through]
Them:
Nahh it's for my instagram
Them:
Having bad connections here
Them:
Send me the code ?
Me:
sorry, I'm still worried your account has been hacked -- can we do another call?
Them:
[Calls me over FB Messenger, audio is still not working, and the video feels slightly off. Ends quickly on their end. Possibly it's even the same video from last time?]
Me:
Were you able to hear me?
Them:
My connections
I've reported their account as hacked.
Things that made me suspicious:
I don't think FB has any sort of account recovery that looks like this
This is exactly what an attempt to hack my FB account would look like
They messaged at 9:30am, even though that makes it 6:30am where they live
Video call didn't have any audio
They couldn't receive incoming video calls
Text didn't feel like them, though I don't know them that well.
Here's a screenshot I took during the second video call:
Even with all those things, the video call would normally have been very convincing, and it did briefly convince me. I could easily see it fooling someone who didn't know about deepfake video.
Comment via: facebook