Deepfakes: videos that use AI to create a realistic representation of real people saying and doing things they have *not* done. Right now it is mostly face swapping.
Here is a Deseret News article discussing how to detect deepfakes.
Such fraudulent videos are often made by feeding a large number of images to a computer that uses machine learning to recreate a person’s face and natural expressions. Then, the fabricated face is superimposed onto a video of someone else.
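For anyone curious what "feeding a large number of images to a computer" actually looks like, below is a minimal sketch (not from the article) of the shared-encoder / two-decoder autoencoder idea used by early face-swap tools: one encoder learns a common face representation, each decoder learns to redraw one specific person, and the "swap" is decoding person B's expression with person A's decoder. The image size, layer sizes, and PyTorch framing here are illustrative assumptions, not a real tool's code.

```python
# Sketch of the shared-encoder / two-decoder face-swap idea.
# Assumes aligned 64x64 RGB face crops; all sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 512),                           # latent face code
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        h = self.fc(z).view(-1, 128, 16, 16)
        return self.net(h)

encoder = Encoder()
decoder_a = Decoder()   # learns to redraw person A's face
decoder_b = Decoder()   # learns to redraw person B's face

# Training sketch: reconstruct each person's faces through the shared encoder.
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

faces_a = torch.rand(8, 3, 64, 64)   # stand-in for thousands of real face crops of A
faces_b = torch.rand(8, 3, 64, 64)   # stand-in for thousands of real face crops of B

for _ in range(1):  # a real run loops for many epochs
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode B's pose/expression, decode it as A's face.
with torch.no_grad():
    swapped = decoder_a(encoder(faces_b))
```

In a real pipeline the crops come from a face detector and aligner, and the swapped face is blended back onto each video frame; detection research like the work described in the article largely looks for the artifacts that blending step leaves behind.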
Most recently, in August, researchers at UC Berkeley published a new method that goes beyond cutting and pasting a person's face onto a new body and allows for full-body manipulation. A demo video shows how the team was able to project a professional dancer's moves onto another person.
Deepfake technology — originally developed for communication, educational and entertainment applications — has since been exploited to put celebrities in pornographic scenes, and experts worry the convincing videos could create broader damage by undermining political campaigns, scamming people and inciting violence.
Manipulated videos could be used to threaten someone with false evidence of an affair or to sabotage a competitor’s career with fabricated evidence of a racist comment, according to Ovadya. A deepfake video of a politician declaring war might actually start one.
Even if a video can be disproven, it might not matter.
“Very often even when stuff can be decisively refuted, the refutation doesn’t travel as quickly as the initial scandalous claim,” said Sanchez. “In a political campaign context, if something scandalous drops in late October, it might not matter that experts can show it’s a fake two weeks later.”
BYU students have received a grant to work on making an Alexa-based AI speak like a human, so that a person cannot tell the difference. They call her "Eve."

Earlier this year, Google unveiled a new product called Duplex, which can call and make an appointment for you. It says "um" and "mmhmm" and phrases like "gotcha, thanks." It hesitates and pauses like a human would, and as a result, when Google tested it, people on the other end of the line couldn’t tell they were talking to a robot.
https://www.heraldextra.com/news/local/ ... b0ec4.html
https://news.byu.edu/news/alexa-can-we- ... nversation

The BYU team is one of eight teams chosen worldwide to compete in the second annual Alexa Prize Challenge by Amazon. The team received a $250,000 grant to work on Eve, short for Emotive Adversarial Ensembles, and on cracking conversational artificial intelligence.
The team will work on the project into the summer for the chance to win $1 million for the university and $500,000 to split among the team.
The ultimate goal, set by Amazon, is to have Eve be so engaging that a human will continue a conversation with her for 20 minutes.

