The Forum - On Line Opinion's article discussion area

Can you trust that a video of someone is real, not fake?

AI software is now available that enables a video to be made of anyone (if you have enough photos of their face) doing and saying anything.

The results are good at the moment, as you will see from the videos in the article, but give it a few years and they will be astoundingly better.

http://www.abc.net.au/news/2018-09-28/fake-news-how-hard-is-it-to-make-a-deepfake-video/10313906

From the article:
To see what was possible we turned to a program called Deepfakes, machine learning software that made its way into the public domain via the dark corners of the internet, where it was unsurprisingly being used to make fake celebrity porn.

Compared to what's available to researchers, it's pretty limited: it can't recreate a whole scene, but what it can do is recreate a person's face, and put that onto another person.

To use the program you don't really need to know how to code; all you need is a relatively fast computer. It does all the work, though you need to give it the right ingredients. Here's how it works.
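The "ingredients" the article alludes to match the commonly published design of face-swap tools of this kind: one shared encoder that learns pose and expression, plus one decoder per identity that learns to paint those features back as a specific person's face. A minimal conceptual sketch of the swap mechanic, using random stand-in weights (this is an illustration of the general architecture, not the actual Deepfakes code):

```python
import numpy as np

rng = np.random.default_rng(0)
dim_face, dim_latent = 64, 8

# Stand-ins for trained weights (random here, for illustration only).
W_enc = rng.standard_normal((dim_latent, dim_face))    # shared encoder
W_dec_a = rng.standard_normal((dim_face, dim_latent))  # decoder for person A
W_dec_b = rng.standard_normal((dim_face, dim_latent))  # decoder for person B

def encode(face):
    # Compress a face to pose/expression features shared across identities.
    return W_enc @ face

def decode(latent, W_dec):
    # Reconstruct a full face in one specific identity.
    return W_dec @ latent

face_a = rng.standard_normal(dim_face)  # a frame of person A

# Normal reconstruction: A's frame through A's decoder.
recon_a = decode(encode(face_a), W_dec_a)

# The swap: A's expression features through B's decoder yields
# "person B making person A's expression".
swapped = decode(encode(face_a), W_dec_b)

print(recon_a.shape, swapped.shape)
```

In a real system the encoder and decoders are deep networks trained on many face photos of each person, which is why the article stresses needing "enough face photos"; the swap step itself is just routing one person's features through the other person's decoder.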

Down the bottom there is an example of a video where they placed a re-creation of Malcolm Turnbull's current face onto a man who was standing on the steps of Parliament House in 1975 after The Dismissal of Gough Whitlam's government.

Again, it's far from perfect (Malcolm Turnbull was 21 at the time, for starters), but it provides a sense of what's possible. Imagine putting a politician's face into a compromising scene; with a bit of assistance from some video editing software we could go a long way to making it more believable.

There are programs that can do the sound and voice as well, but they are not demonstrated there.
Posted by Philip S, Friday, 28 September 2018 5:27:04 PM
//To see what was possible we turned to a program called Deepfakes, machine learning software that made its way into the public domain via the dark corners of the internet, where it was unsurprisingly being used to make fake celebrity porn.//

I can raise my hand to having watched some Deepfake porn... you can still tell it's fake. The faces are close, but they're not quite right.

As long as you've got an actual human being and not another computer doing the spotting, the technology has some way to go.
Posted by Toni Lavis, Friday, 28 September 2018 6:48:34 PM
All that's required is the face of the person targeted to be maligned and a few very willing and just as evil performers, and the scene is literally in the bag.
Posted by individual, Friday, 28 September 2018 6:56:55 PM
But that is just what is publicly available; what is not available could be far more advanced.

As for the voice, the companies say they have the technology but have not released it; any bet they have, but only to selected organisations.

Taken from the article:
And what about sound?

By now you've probably noticed that none of our videos have any sound. While similar research and technology is happening that can create a digital version of a person's voice in much the same way, there currently aren't any publicly available tools that offer the same ability to use any audio clip to learn a person's voice.

A company called Lyrebird AI uses machine learning to offer a way to create your own digital voice, but has made the decision not to release a version that would allow you to copy anyone's voice.

It released audio of Barack Obama and Donald Trump to show what is possible, but in a statement explained why it is keeping this capability back.

"Imagine that we had decided not to release this technology at all. Others would develop it and who knows if their intentions would be as sincere as ours: they could, for example, only sell the technology to a specific company or an ill-intentioned organisation. By contrast, we are making the technology available to anyone and we are introducing it incrementally so that society can adapt to it, leverage its positive aspects for good, while preventing potentially negative applications."

And these concerns are not irrational: in 2016 Adobe previewed a program called VoCo, which offered this capability, but never released it.
Posted by Philip S, Friday, 28 September 2018 7:12:37 PM
It may be cheaper and more convincing to use a movie double, a look-alike. Real audio can be cut into phrases and customised for the general theme intended, with silent mouthing by the actor. Images can be tested for artificial pasting, and the "oddity" is probably hard to overcome. But at least everyone knows it's possible, both doubles and photo-shop.
Posted by nicknamenick, Friday, 28 September 2018 7:22:23 PM
The problem with using a double is that more people know a double exists, so there is doubt from the start that it is the real person; and the more people who know, the greater the risk that someone will talk.

But if someone can sit in a room and do it all themselves, perfect.
Posted by Philip S, Friday, 28 September 2018 7:51:55 PM
Find out more about this user Recommend this comment for deletion Return to top of page Return to Forum Main Page Copy comment URL to clipboard