
The Dangers of AI

Video Manipulation

[Image: AI face-swap examples used to create fake news]
Many of us have heard a little about AI (artificial intelligence), and maybe even about machine learning. These technologies offer countless useful opportunities and can help us advance our software and technology much faster. We already see promising applications in medicine, science, and even video games.

Many people have heard of, and used, the popular social media app Snapchat. The app includes facial detection software that tracks the user's face and morphs it or replaces it with something else. Over time, facial tracking software has been improved by AI and machine learning, and it is now being used to make fake news with surprising results. The photos above show the concept. A well-known example is a University of Washington demonstration featuring Barack Obama, with results that are nearly indistinguishable from real footage. This kind of technology is constantly improving, and I don't believe it will be long before an almost flawless version of the software exists. That is unnerving in a world already filled with plenty of fake news; soon we might not even be able to trust videos of other people.
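To make the face-tracking idea concrete, here is a minimal sketch of the detection step that filters and face-swap tools build on. It uses OpenCV's bundled Haar cascade face detector and simply outlines each detected face in a webcam feed; real Snapchat-style filters and deepfake pipelines use far more sophisticated neural landmark trackers, and details like the webcam index and window name here are just illustrative assumptions.

# Minimal sketch: the face-detection step that filters and face-swap
# tools build on. Uses OpenCV's bundled Haar cascade detector; modern
# apps rely on much more advanced neural face trackers.
import cv2

# Load the pre-trained frontal-face cascade that ships with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(0)  # default webcam (index 0 is an assumption)

while True:
    ok, frame = capture.read()
    if not ok:
        break

    # Haar cascades work on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Mark each detected face; a filter or face-swap would replace the
    # pixels inside this rectangle instead of just outlining it.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("face tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()

Once software can reliably find and track a face like this, the harder (and more troubling) step of swapping or re-animating it becomes a matter of better models and more training data.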
A software company called Lyrebird, based in Canada, has developed AI software that can take a short audio clip of any person and reproduce the sound of their voice saying whatever dialogue the user wants.
[Image: CGI recreation of a deceased actor in a recent Star Wars film]

With software like this used in tandem, it will become harder and harder to detect fake news. In one of the more recent Star Wars movies, the filmmakers were able to bring back Grand Moff Tarkin digitally, even though the actor who played him had passed away. I hope that in the future this software is used for more good than bad: remaking movies with our favorite actors who are no longer with us, and improving the entertainment world in general. It would be sad to see a world in which you could no longer trust any video, audio, photo, or article. Photos and articles are already sometimes made with fake and misleading information.

You can read more about "deepfakes" here.
