Digitally altered ‘deepfake’ videos a growing threat as 2020 election approaches
Category: News & Politics
Via: perrie-halpern • 5 years ago • 28 comments • By: Jake Ward/NBC News
The technology known as ‘deepfakes’ -- deceptive videos of real people altered using artificial intelligence -- is quickly beginning to damage our trust in the truth of video evidence. How will we trust the footage we see, especially with a presidential election approaching? Jake Ward hears from researchers in California teaching computers to tell the difference.
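One detection signal published in early deepfake research was that synthesized faces blinked far less often than real people. The sketch below is a toy illustration of that idea, not the California researchers' actual method: it assumes per-frame eye-openness scores (such as an eye aspect ratio from a landmark detector) have already been extracted, and every function name and threshold here is illustrative.

```python
# Toy sketch of one early deepfake "tell": unnaturally low blink rates.
# Assumes eye-openness scores per frame are already available; the
# thresholds below are made-up illustrative values, not research figures.

def count_blinks(ear_values, closed_threshold=0.2):
    """Count open-to-closed eye transitions across frames."""
    blinks = 0
    eyes_closed = False
    for ear in ear_values:
        if ear < closed_threshold and not eyes_closed:
            blinks += 1
            eyes_closed = True
        elif ear >= closed_threshold:
            eyes_closed = False
    return blinks

def looks_synthetic(ear_values, fps=30, min_blinks_per_minute=5):
    """Flag a clip whose blink rate falls below a plausible human rate."""
    minutes = len(ear_values) / (fps * 60)
    if minutes == 0:
        return False
    rate = count_blinks(ear_values) / minutes
    return rate < min_blinks_per_minute
```

A clip of a real speaker blinking roughly every few seconds would pass, while a minute of footage with no blinks at all would be flagged for closer forensic review. Real systems, of course, combine many such cues with trained classifiers.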
Welcome to the brave new world.
with as much face time as fearless leader has on video, this has some comical possibilities.
at the end of the day, when it happens, I expect cries of "fake news" to erupt from his supporters. it's easier to spell than "digitally altered video" and has the potential to generate much more spittle when said out loud.
Why not? Trump does and says what Trump wants to. Nobody is wearing that face on the networks every day for him. He markets that mug! He wants his mug to be the brand of planet Earth when aliens think about us.
Brave new world indeed. Nearly everybody has been clamoring for the future to be here already. Nearly everybody wants more of the future in their lifetime. And the new world in many ways brings with it the same old deceits, worst-case scenarios, and problems. All the digital electronics arrive and suddenly all our 'records' are subject to being digitally rifled through. We are told to buy more security - that's additional cost. The solution for having more of everything seems to be layers and more layers of security.
I get emails all the time from yet another agency I trusted to be beyond being stolen from, apologizing for 'outing' the personal information of large groups of its clients. And we have people who swear technology is our reasonable, logical outgrowth. Too bad technology can't change the nature and character of people.
I have to agree with your comment. We were warned by so many novels of a time when our technology, not used wisely, could turn on us, and now we're seeing it happen all the time.
This is the saddest truth of all: We are using technology for good. But sometimes "evil-doers" sneak in and sell off the good of everything for wealth, power, and influence. Everything about us is now marketable - something I think I saw in one of the Cambridge Analytica videos I researched. The World Wide Web is a groovy place to be. A timely place to do research. But then my browser has a monitor service that I signed up for to give me the sad, nowadays common refrain:
We have located your email, username, and password on the Dark Web. Please change it now.
Little old peace-loving me on the "dark web." So pathetically sad. Lucky for me, I have so many usernames and passwords that they litter four sides of two pages. I laugh at it sometimes and wonder about how long it took me to build that up.
Technology is "the bomb." I love it. I love it that people who are home-bound can go to schools far away. I love it that I can watch my security cameras remotely. I love it that I can talk to anybody in the country for the same amount as a regular call. Remember those cars that were having the stuck and speeding accelerator pedals? I don't remember how that ended. Technology.
It is just that evildoers have to ruin and f-up everything. Alas, when I think about it soberly, evildoers have to eat and buy toys too, right? I guess.
Now one has to ask: What are hackers going to do with deepfake technology and the millions of uploaded videos across Facebook and the like? People talking and yakking about everything under the sun can be great 'seed' for devilment.
The irony of deepfaking President Donald Trump is that the equal and opposite fake could be a kinder, honest, truth-telling man. Donald Trump would find such a video repugnant. I bring him up because he is staring us down in the frame.
This is very scary, and just bolsters my feeling that the world might have been better off if the internet was never invented. Of course such altered videos could also be played on television. Well, at least the late 1940s were relatively safe from this stuff.
Since the scientists and researchers are so brilliant in coming up with this stuff, I wish they would invent time travel. Beam me back to the late 1940s, Scotty.
Buzz,
I don't blame the internet. That brought information to us. What I do blame is the AI that people underestimate. They even invite surveillance into their homes, under the guise that it will make life easier. Even our cell phones can be used to spy on us. It amazes me that people don't get it.
I blame human beings with malicious intent. Technology enables both good and bad; it depends upon the agent.
I agree with you about humans and that tech is neither good nor bad, but given who we are, the outcome is predictable.
Yup, the outcome is predictable. We must always combat those who misuse modern inventions because malicious people will always exist.
Agreed, but how does one do that? By law or better technology?
Law and enforcement are the key, but engineers should of course build technology with means to best mitigate and detect malicious use (and with fail-safe mechanisms for inadvertent misuse).
Seems like there isn't any solution but to live like a hermit in the pioneer days in order to be safe from it all.
Oh, we get it. We're just so tired after many years of being told we need to "get some pizzazz in our world." I mean, when you live off the grid, at some point you begin to feel 'dusty and gritty,' and maybe even worse, you say "silly" stuff about tech and people go, "Huh?"
The worst, I'd imagine, are those elite tech personnel released or departing from their contracts out into the wilderness. . . . Hungry, or full of resentments. . . .
part of the problem is that law enforcement relies on a lot of video evidence now
Neither, because you cannot eliminate human nature. Even if this very day we created the best laws and systems, within 100 years they would already be corrupted. Human nature will not allow anything else. It's that simple.
Actually, I think the best anyone can do is what is right in their own little postage stamp of a world. Think about how your actions will affect those around you. Don't repay wrong with more wrong. Help those who need it.
I would say both. You are correct about human nature, of course, but the way we fight bad actors is to keep adapting:
As time goes on we will have increasingly sophisticated cyber criminals/trolls and (no doubt) increasingly sophisticated cyber laws and technology to deal with it. It is very likely that this balancing act will continue in perpetuity.
I grew up with TV, so I was taught that what is shown can be deceiving. All this does is highlight that literacy and solid reporting matter. Reporting requires more than just flashing a video on the screen and expressing an opinion.
Actually, the advent of AI-enhanced video changes nothing. Selective editing by splicing film or video has been practiced for decades. Staged events intended to deceive have been around for decades, too. Photography, film, movies, and video have been used to deceive consumers (and the public) for well over half a century. And news organizations have utilized the same techniques. Photoshop was introduced in 1990 and has been used to deceive consumers and the public ever since. Removing human involvement in manipulating images and videos really doesn't significantly change anything; humans have been doing this for decades.
We know when a film is edited, and we can tell if a photo has been photoshopped. These are easily detected. But this is far more convincing, and that makes it far more dangerous. What if some madman appeared on screen as the President and said we had just sent nuclear missiles to Russia? No one could just splice that together. Do you think the Russians would check it out first, or just react?
Determining if a film has been edited requires a somewhat sophisticated forensic analysis. The public generally does not have the capability to do forensic analysis; the public must rely upon news sources to do the investigative journalism. The need for journalists to vet and verify source material really hasn't changed; journalists have only become somewhat lazy since it's so easy to just flash a video on the screen and express opinion. The real complaint here is that journalists are going to need to be more scrupulous in verifying material before using it in reporting or to express opinion.
A bigger concern than fake videos would be someone hacking the GPS system or creating ghost signals of ICBMs in transit. The world doesn't rely upon word of mouth any longer because it's been possible to easily manipulate that for quite a while. AI-enhanced video won't affect national security, since that relies upon verification. Enhanced video creates a problem for news organizations, since anyone can claim that a video is a fake. Reporters will need to do more work to publish a story; reporters will need to verify info just as is required for national security.
A bigger concern would be product endorsements, activist political messaging, and rewriting history. But manipulation of images and audio to support those activities is already being used in a variety of ways to deceive the public. And news organizations really have adopted some of those techniques. The public should already understand that one shouldn't believe everything on TV or the internet.
Yes, AI-enhanced video raises concerns about influencing public opinion through deception. But that isn't new; it's already being done. Using AI to replace the humans performing the manipulation really doesn't change anything.
Yep. As grandpappy said, "don't believe anything you hear, and only half of what you see with your own eyes."
Amazing, isn't it. We go to school to learn the truth, and then we're fooled every day by lies.
no matter how this goes, lawyers will make a fortune... LOL