AI, Fakes, and You

by Karl Denninger Market-Ticker.org

It has long been “de rigueur” to attempt to prank someone in a command position of some sort with a “fake call” on the phone. Historically this is not the easiest thing to pull off, but it has been done, with some of the funniest examples being perpetrated by radio hosts of various sorts.

Today this threat is much more serious; so-called “machine learning” can now be used to “mimic” virtually anyone whose voice and image are out in the public where they can be harvested, and not only over audio but video as well.

This raises extremely serious questions when it comes to both civil and government actions: is the person actually giving that speech or participating in that conversation? If they’re not literally standing in front of you it is no longer reasonably possible to be certain of that unless you have agreed in advance on something that only each of you would know and would use, once, to authenticate such a conversation. You must then negotiate another secret in person, where there is no reasonable possibility of interception, because your first one has now been exposed to the public and could be re-used.

In other words the only way you and anyone else can authenticate any conversation other than in person is by the use of what is called a “one-time pad”, an unbreakable form of cryptography since nobody can possibly discover the secret (other than by pulling your fingernails off), provided the pad is truly random, kept secret, and never used twice.
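The mechanics of a one-time pad are simple enough to sketch. Below is a minimal, illustrative Python example of the idea described above: two parties share a random pad in person, and later one of them proves their identity by sending a challenge phrase encrypted with that pad. The challenge text and pad length here are made up for illustration; the security property holds only if the pad is truly random, as long as the message, kept secret, and used exactly once.

```python
import secrets

def make_pad(length: int) -> bytes:
    # A pad of cryptographically random bytes, agreed upon in person.
    # It must be at least as long as the message and used only once.
    return secrets.token_bytes(length)

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    # One-time-pad encryption and decryption are the same XOR operation.
    if len(pad) < len(data):
        raise ValueError("pad must be at least as long as the message")
    return bytes(d ^ p for d, p in zip(data, pad))

# In-person step: both parties record the same secret pad.
pad = make_pad(64)

# Later, over a phone or video link, one party sends the encrypted
# challenge (a hypothetical pre-agreed phrase)...
challenge = b"the heron flies at midnight"
ciphertext = xor_bytes(challenge, pad)

# ...and only the holder of the identical pad can recover it.
recovered = xor_bytes(ciphertext, pad)
assert recovered == challenge

# The pad is now burned: it has been used and must never be reused,
# which is why a replacement must be negotiated in person.
```

Because the ciphertext is statistically indistinguishable from random noise without the pad, an eavesdropper who records the exchange learns nothing that helps them fake the next one, so long as a fresh pad is used each time.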