
Digitally altered ‘deepfake’ videos a growing threat as 2020 election approaches

  

Category:  News & Politics

Via:  perrie-halpern  •  5 years ago  •  28 comments

By:   Jake Ward/NBC News

Digitally altered ‘deepfake’ videos a growing threat as 2020 election approaches
The technology known as ‘deepfakes’ -- deceptive videos of real people altered using artificial intelligence -- is quickly beginning to damage our trust in the truth of video evidence. How will we trust the footage we see, especially with a presidential election approaching? Jake Ward hears from researchers in California teaching computers to tell the difference.

 
Perrie Halpern R.A.
Professor Principal
1  seeder  Perrie Halpern R.A.    5 years ago

Welcome to the brave new world.

 
 
 
devangelical
Professor Principal
1.1  devangelical  replied to  Perrie Halpern R.A. @1    5 years ago

with as much face time as fearless leader has on video, this has some comical possibilities.

 
 
 
devangelical
Professor Principal
1.1.2  devangelical  replied to    5 years ago

at the end of the day, when it happens, I expect cries of "fake news" to erupt from his supporters. it's easier to spell than "digitally altered video" and has the potential to generate much more spittle when said out loud.

 
 
 
CB
Professor Principal
1.1.3  CB  replied to    5 years ago

Why not? Trump does and says what Trump wants to. Nobody is wearing that face on the networks every day for him. He markets that mug! He wants his mug to be the brand of planet Earth when aliens think about us.

 
 
 
CB
Professor Principal
2  CB    5 years ago

Brave new world indeed. Nearly everybody has been clamoring for the future to be here already. Nearly everybody wants more of the future in their lifetime. And the new world in many ways brings with it the same old deceits, worst-case scenarios, and problems. All the digital electronics arrive and suddenly all our 'records' are subject to being digitally rifled through. We are told to buy more security - that's an additional cost. The solution for having more of everything seems to be layers and more layers of security.

I get emails all the time from yet another agency I trusted to be beyond being stolen from, apologizing for 'outing' the personal information of large groups of their clients. And we have people who swear technology is our reasonable, logical outgrowth. Too bad technology can't change the nature and character of people.

 
 
 
Perrie Halpern R.A.
Professor Principal
2.1  seeder  Perrie Halpern R.A.  replied to  CB @2    5 years ago

I have to agree with your comment. We were warned by so many novels of a time when our technology, not used wisely, could turn on us, and now we're seeing it happen all the time. 

 
 
 
CB
Professor Principal
2.1.1  CB  replied to  Perrie Halpern R.A. @2.1    5 years ago

This is the saddest truth of all: We are using technology for good. But sometimes "evil-doers" sneak in and sell off the good of everything for wealth, power, and influence. Everything about us is now marketable - I think I saw that on one of the Cambridge Analytica videos I researched. The World Wide Web is a groovy place to be. A timely place to do research. But then my browser has a monitoring service that I signed up for to give me the sad, nowadays common refrain:

We have located your email, username, and password on the Dark Web. Please change it now.

Little old peace-loving me on the "dark web." So pathetically sad. Lucky for me, I have so many usernames and passwords that they litter four sides of two pages. I laugh at it sometimes and wonder about how long it took me to build that up.

Technology is "the bomb."  I love it. I love it that people who are home-bound can go to schools far away. I love it that I can watch my security cameras remotely. I love it that I can talk to anybody in the country for the same amount as a regular call. Remember those cars that had stuck accelerator pedals and sped out of control? I don't remember how that ended. Technology.

It is just that evildoers have to ruin and f-up everything. Alas, when I think about it soberly, evildoers have to eat and buy toys too, right? I guess.

Now one has to ask: What are hackers going to do with deepfake technology to the millions of uploaded videos across Facebook and the like? People talking and yakking about everything under the sun can be great 'seed' for devilment.

 
 
 
CB
Professor Principal
3  CB    5 years ago

The irony of deepfaking President Donald Trump is that the result could be his equal opposite: a kinder, honest, truth-telling man. Donald Trump would find such a video repugnant. I bring him up because he is staring us down in the frame.

 
 
 
Buzz of the Orient
Professor Expert
4  Buzz of the Orient    5 years ago

This is very scary, and just bolsters my feeling that the world might have been better off if the internet was never invented.  Of course such altered videos could also be played on television.  Well, at least the late 1940s were relatively safe from this stuff.

Since the scientists and researchers are so brilliant in coming up with this stuff, I wish they would invent time travel.  Beam me back to the late 1940s, Scotty.

 
 
 
Perrie Halpern R.A.
Professor Principal
4.1  seeder  Perrie Halpern R.A.  replied to  Buzz of the Orient @4    5 years ago

Buzz,

I don't blame the internet. That brought information to us. What I do blame is the AI that people underestimate. They even invite surveillance into their homes, under the guise that it will make life easier. Even our cell phones can be used to spy on us. It amazes me that people don't get it. 

 
 
 
TᵢG
Professor Principal
4.1.1  TᵢG  replied to  Perrie Halpern R.A. @4.1    5 years ago

I blame human beings with malicious intent.   Technology enables both good and bad; it depends upon the agent.

 
 
 
Perrie Halpern R.A.
Professor Principal
4.1.2  seeder  Perrie Halpern R.A.  replied to  TᵢG @4.1.1    5 years ago

I agree with you about humans and that tech is neither good nor bad, but given who we are, the outcome is predictable. 

 
 
 
TᵢG
Professor Principal
4.1.3  TᵢG  replied to  Perrie Halpern R.A. @4.1.2    5 years ago

Yup, the outcome is predictable.   We must always combat those who misuse modern inventions because malicious people will always exist.

 
 
 
Perrie Halpern R.A.
Professor Principal
4.1.4  seeder  Perrie Halpern R.A.  replied to  TᵢG @4.1.3    5 years ago

Agreed, but how does one do that? By law or better technology? 

 
 
 
TᵢG
Professor Principal
4.1.5  TᵢG  replied to  Perrie Halpern R.A. @4.1.4    5 years ago

Law and enforcement are the key, but engineers should of course build technology with means to best mitigate and detect malicious use (and with fail-safe mechanisms for inadvertent misuse).

 
 
 
Buzz of the Orient
Professor Expert
4.1.6  Buzz of the Orient  replied to  Perrie Halpern R.A. @4.1    5 years ago

Seems like there isn't any solution but to live like a hermit from the pioneer days in order to be safe from it all.

 
 
 
CB
Professor Principal
4.1.7  CB  replied to  Perrie Halpern R.A. @4.1    5 years ago

Oh, we get it. We're just so tired after many years of being told we need to "get some pizzazz in our world." I mean, when you live off the grid, at some point you begin to feel 'dusty and gritty' and, maybe even worse, you say "silly" stuff about tech and people go - "Huh?"

 
 
 
CB
Professor Principal
4.1.8  CB  replied to  Perrie Halpern R.A. @4.1.4    5 years ago

The worst, I'd imagine, are those elite tech personnel released or departing from their contracts out into the wilderness. . . . Hungry, or full of resentments. . . .

 
 
 
charger 383
Professor Silent
4.1.9  charger 383  replied to  TᵢG @4.1.5    5 years ago
Law and enforcement

part of the problem is that law enforcement uses a lot of video evidence now

 
 
 
Drakkonis
Professor Guide
4.1.10  Drakkonis  replied to  Perrie Halpern R.A. @4.1.4    5 years ago
Agreed, but how does one do that? By law or better technology? 

Neither, because you cannot eliminate human nature. Even if this very day we created the best laws and systems, within 100 years they would already be corrupted. Human nature will not allow anything else. It's that simple. 

 
 
 
Drakkonis
Professor Guide
4.1.11  Drakkonis  replied to  Buzz of the Orient @4.1.6    5 years ago
Seems like there isn't any solution but to live a life like a hermit during the pioneer days in order to be safe from it all.

Actually, I think the best anyone can do is do what is right in their little postage stamp of a world. Think about how one's actions will affect those around them. Don't repay wrong with more wrong. Help those who need it. 

 
 
 
TᵢG
Professor Principal
4.1.12  TᵢG  replied to  Drakkonis @4.1.10    5 years ago

I would say both.   You are correct about human nature of course, but the way we fight bad actors is to:

  • ensure we have proper laws to cover the bad actions (e.g. if it is not against the law to digitally slander or misrepresent someone, we have a problem)
  • enable law enforcers with proper technology to mitigate and detect bad actions

As time goes on we will have increasingly sophisticated cyber criminals/trolls and (no doubt) increasingly sophisticated cyber laws and technology to deal with them.   It is very likely that this balancing act will continue in perpetuity.

 
 
 
Nerm_L
Professor Expert
5  Nerm_L    5 years ago

I grew up with TV, so I was taught that what is shown can be deceiving.  All this does is highlight that literacy and solid reporting matter.  Reporting requires more than just flashing a video on the screen and expressing opinion.

Actually, the advent of AI-enhanced video changes nothing.  Selective editing by splicing film or video has been practiced for decades.  Staged events intended to deceive have been around for decades, too.  Photography, film, and video have been used to deceive consumers (and the public) for well over half a century.  And news organizations have utilized the same techniques.  Photoshop was introduced in 1990 and has been used to deceive consumers and the public ever since.  Removing human involvement in manipulating images and videos really doesn't significantly change anything; humans have been doing this for decades.

 
 
 
Perrie Halpern R.A.
Professor Principal
5.1  seeder  Perrie Halpern R.A.  replied to  Nerm_L @5    5 years ago

We know when a film is edited, and we can tell if a photo has been photoshopped. These are easily detected. But this is far more convincing, and that makes it far more dangerous. What if some madman got on the air as the President and said we had just sent nuclear missiles to Russia? No one can just splice that together. Do you think that the Russians would check it out first or just react?

 
 
 
Nerm_L
Professor Expert
5.1.1  Nerm_L  replied to  Perrie Halpern R.A. @5.1    5 years ago
We know when a film is edited, and we can tell if a photo has been photoshopped. These are easily detected. But this is far more convincing, and that makes it far more dangerous. What if some madman got on the air as the President and said we had just sent nuclear missiles to Russia? No one can just splice that together. Do you think that the Russians would check it out first or just react?

Determining if a film has been edited requires a somewhat sophisticated forensic analysis.  The public generally does not have the capability to do forensic analysis; the public must rely upon news sources to do the investigative journalism.  The need for journalists to vet and verify source material really hasn't changed; journalists have only become somewhat lazy since it's so easy to just flash a video on the screen and express opinion.  The real complaint here is that journalists are going to need to be more scrupulous in verifying material before using it in reporting or to express opinion.

A bigger concern than fake videos would be someone hacking the GPS system or creating ghost signals of ICBMs in transit.  The world doesn't rely upon word of mouth any longer because it's been possible to easily manipulate that for quite a while.  AI-enhanced video won't affect national security since that relies upon verification.  Enhanced video creates a problem for news organizations since anyone can claim that a video is a fake.  Reporters will need to do more work to publish a story; reporters will need to verify info just as has been required for national security.

A bigger concern would be product endorsements, activist political messaging, and rewriting history.  But manipulation of images and audio to support those activities is already being used in a variety of ways to deceive the public.  And news organizations really have adopted some of those techniques.  The public should already understand that one shouldn't believe everything on the TV or internet.

Yes, AI-enhanced video raises concerns about influencing public opinion through deception.  But that isn't new; it's already being done.  Using AI to replace humans performing the manipulation really doesn't change anything.

 
 
 
Buzz of the Orient
Professor Expert
5.1.2  Buzz of the Orient  replied to  Nerm_L @5.1.1    5 years ago
"The public should already understand that one shouldn't believe everything on the TV or internet."

Yep.  As grandpappy said, "don't believe anything you hear, and only half of what you see with your own eyes."

Amazing, isn't it?  We go to school to learn the truth, and then we're fooled every day by lies.

 
 
 
The Magic 8 Ball
Masters Quiet
6  The Magic 8 Ball    5 years ago

no matter how this goes, lawyers will make a fortune... LOL

 
 
