
The Far Reaching Effects of AI - Mimicking the Human Voice

  1. Jun 8, 2018 #1


    Staff: Mentor

  3. Jun 8, 2018 #2
    The implications are frightening, right?
  4. Jun 8, 2018 #3


    Staff: Mentor

    Yes, although folks have done social engineering like this before with a not-so-perfect voice and gotten away with it.

    In one case, years ago, a crafty lawyer created a fake company and forged a letter to steal a domain name from another man, claiming to be an employee of the man's company and stating that ownership was being transferred to a new company. The internet registrar processed it, no questions asked, and it took the rightful owner several years and a long court fight to get the domain back, and many more years after that to get paid for the loss. That scheme used official-looking letters rather than a fake voice, but you get the idea of how this technology could be used. (See the case of Kremen v. Cohen and the fight over the **redacted** domain name.)
  5. Jun 8, 2018 #4
    Presumably you can create a video of anybody saying anything you like, and it would be difficult to determine that it was fake. Imagine David Muir (ABC) breaking in and announcing "live" on site an alien invasion (H. G. Wells, "The War of the Worlds"). What will we be able to believe?
  6. Jun 8, 2018 #5


    Staff: Mentor

    Videos can be analyzed and debunked through the various artifacts they contain. Scientific American once ran an article about photo debunking in which the authors examined how shadows were cast; in many fake photos there was a clear discrepancy not obvious to the casual observer. I figure a similar scheme is used to debunk fake videos.


    Lack of resolution causes big problems, though, that are hard to debunk easily. There was a video of cars mysteriously jumping around on a roadway as if selective anti-gravity were at work. The low resolution hid the downed power-line cable, dragged by a street sweeper, that flipped the cars as it pulled taut.
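    The shadow-consistency check mentioned above can be sketched numerically. This is a toy illustration, not actual forensic software: `light_source_residual` is a made-up name, and it assumes you have already hand-marked matching (object point, shadow point) pairs in the image. In a genuine photo lit by a single source, the line through each object point and its shadow converges on roughly one image point; a large fit residual hints that a shadow is inconsistent with the rest of the scene.

```python
import numpy as np

def light_source_residual(pairs):
    """Fit the single image point that all object->shadow lines should share.

    pairs: iterable of (object_point, shadow_point) 2-D image coordinates.
    Returns (best_point, residual). A residual near zero is consistent with
    one light source; a large residual suggests a mismatched (fake) shadow.
    """
    A, b = [], []
    for p, s in pairs:
        p = np.asarray(p, dtype=float)
        s = np.asarray(s, dtype=float)
        d = s - p                       # direction of the shadow line
        n = np.array([-d[1], d[0]])     # unit normal to that line
        n = n / np.linalg.norm(n)
        A.append(n)                     # line constraint: n . x = n . p
        b.append(n @ p)
    A, b = np.array(A), np.array(b)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares intersection
    residual = np.linalg.norm(A @ x - b)
    return x, residual
```

    For example, three shadows all cast from a light point at (50, -100) give a residual near zero, while pasting in one object whose shadow points the wrong way makes the residual jump.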

  7. Jun 8, 2018 #6
    That was 10 years ago. Things may have gotten a little more sophisticated since then.

    Check this out:
  8. Jun 8, 2018 #7


    Staff: Mentor

    This brings up the dilemma of group specialization, where the folks who built a technology kick the ball down the line when it comes to the moral issue of using it. It's similar to gun makers who don't feel morally responsible for how their guns are used, or gun shops that sell the guns... each group refuses to take responsibility, so no one does, and the technology gets used for bad things.

    One inventor I knew loved to invent things he hated. Why? Because then he could patent them and prevent them from being made, at least for a while.

    Perhaps we need something like that for technology.
  9. Jun 10, 2018 #8


    Staff: Mentor

    I heard that discussed on NPR. The expert being interviewed said that the problem is asymmetric warfare. One can create a fake video in an hour but it takes 40 hours of skilled labor to debunk it. In addition, who funds the debunker and how are the debunked conclusions disseminated?

    But I see nothing new here. New technology has always been used for good and bad, and it always will. What else would you expect?
  10. Jun 10, 2018 #9


    Science Advisor
    Homework Helper
    Gold Member
