

With Deepfakes, Seeing Shouldn’t Be Believing


Machine learning and artificial intelligence have enabled us to accomplish great things with the computers we have access to today. However, it is equally true that these same innovations have also produced a great number of threats. One threat that is particularly dangerous in today’s day and age is the presence, and growing prevalence, of “deepfakes” in the media.

What Are Deepfakes, and How Are They Created?

A deepfake is a fraudulent piece of media that can be used to warp or fabricate someone’s words or actions. Whether it takes the form of a video or a still image, this still-imperfect technology is nevertheless very effective at convincing anyone who doesn’t look too closely.

To create different formats of deepfakes, different combinations of tools and techniques can be used.

Video Deepfakes

With the right software, a video can be scanned so that any vocalized phonemes (the individual sounds that build up into full words) can be identified. Once the phonemes have been identified, they are matched with the mouth shapes used to produce those sounds, which are known as visemes. Using the original video, a 3D model of the subject’s face is also built.

Combining these three elements (the phonemes, the visemes, and the 3D face model) with a new transcript creates new footage that can then be superimposed over the original. As a result, the person appears to say something that they didn’t.
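To make the phoneme-to-viseme step a little more concrete, here is a minimal sketch in Python. The phoneme symbols and viseme groupings below are simplified assumptions for illustration only; real lip-sync tools use far larger inventories and learned mappings rather than a hand-written table.

```python
# A simplified sketch of the phoneme-to-viseme step described above.
# The groupings below are illustrative assumptions, not any real tool's mapping.

PHONEME_TO_VISEME = {
    # bilabial sounds share a closed-lips mouth shape
    "p": "closed_lips", "b": "closed_lips", "m": "closed_lips",
    # labiodental sounds share a lip-on-teeth shape
    "f": "lip_teeth", "v": "lip_teeth",
    # rounded vowels share a rounded-lips shape
    "ow": "rounded", "uw": "rounded",
    # open vowels share an open-mouth shape
    "aa": "open", "ae": "open",
}

def phonemes_to_visemes(phonemes):
    """Convert a phoneme sequence into the viseme (mouth-shape) sequence
    that a 3D face model would be posed through, frame by frame."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

if __name__ == "__main__":
    # "bob" -> b, aa, b : closed lips, open mouth, closed lips
    print(phonemes_to_visemes(["b", "aa", "b"]))
```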

Another method maps the expressions that one person makes in source footage and reapplies them to a second person’s face. This makes it possible to give “life” to old photographs and even paintings.

Still Image Deepfakes

In a very short time - a mere five years, in fact - AI-produced still image deepfakes have transformed from unconvincing, low-quality representations to nearly perfect images that could easily be mistaken for photographs of real people. This was accomplished using something called a generative adversarial network. This network uses one AI to generate millions of images of faces, striving for photo-realism. Because no human being could ever critique that many images, a second AI is tasked with identifying whether each picture is real or faked.

Neither AI is very good at its respective job when the process first begins, but because each one learns from the other’s successes and failures, both steadily improve - and before long, the images produced are nearly indistinguishable from actual photographs.
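For the technically curious, the two competing AIs described above are called a generator and a discriminator. The sketch below is a deliberately tiny, generic example of that adversarial loop written in Python with PyTorch, using toy numeric data in place of face images; it is not NVIDIA’s actual face-generation model, just an illustration of how the two networks train against each other.

```python
# A deliberately tiny generative adversarial network (GAN) sketch.
# Real face generators are vastly larger; this only shows the
# generator-vs-discriminator training loop described above.
import torch
import torch.nn as nn

# Generator: turns random noise into a fake "sample"
# (here, 2 numbers standing in for an image's pixels).
generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(nn.Linear(2, 16), nn.ReLU(),
                              nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    # "Real" data: points drawn from a fixed distribution the
    # generator must learn to imitate.
    real = torch.randn(64, 2) * 0.5 + torch.tensor([2.0, -1.0])
    noise = torch.randn(64, 8)
    fake = generator(noise)

    # 1) Train the discriminator to tell real from fake.
    opt_d.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

# After training, the generator's samples cluster around the "real"
# distribution, just as generated faces come to resemble real photos.
```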

Take, for example, the following pictures: which of these two people is a deepfake image?

The answer: both of these images are computer-generated deepfakes, created by NVIDIA as a part of one of their initiatives.

Clearly, technology like this has the potential to make it far easier for people to spread lies and rumors on the Internet - a dangerous prospect, to be certain.

How Can Deepfakes Be So Dangerous?

Simply put, they can have significant influence over people. For instance, it was just in May of 2019 that social media was flooded with a video of House Speaker Nancy Pelosi, who appeared to be drunkenly making a speech at an event for the Center for American Progress. She was not. Rather, someone had taken the original footage, reduced it to 75 percent of its original speed, and adjusted it to maintain its pitch, making her words sound slurred. This is not the only time this has happened, either. Earlier in May, more footage of Speaker Pelosi was manipulated to make her seem intoxicated as she spoke to the American Road & Transportation Builders Association, with another, similar video having been posted the year before as well.
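For a sense of how little technical skill that particular manipulation requires, here is a minimal Python sketch of a pitch-preserving slowdown using the freely available librosa library. The file names are placeholders, and this is not the exact tool used on the original clip - just an illustration of the general technique.

```python
# A minimal sketch of the kind of edit described above: slow audio to
# 75% of its speed while keeping its pitch, so speech sounds slurred
# rather than deep. File names are placeholders for illustration.
import librosa
import soundfile as sf

audio, sample_rate = librosa.load("original_speech.wav", sr=None)

# rate < 1.0 slows the audio down; a phase-vocoder time stretch keeps
# the pitch the same, unlike simply playing the file back more slowly.
slowed = librosa.effects.time_stretch(audio, rate=0.75)

sf.write("slowed_speech.wav", slowed, sample_rate)
```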

However you may feel politically, the fact still stands that these kinds of efforts are dangerous. While deepfakes have yet to appear in electoral disinformation campaigns, Director of National Intelligence Dan Coats has alerted Congress to the possibility of America’s rivals leveraging them.

Consider this practical demonstration that was presented as part of Google engineer Supasorn Suwajanakorn’s TED Talk. Pulling from footage of former President George W. Bush, Suwajanakorn incorporated facial maps of assorted public figures and celebrities to speak for him. He also demonstrated how a single speech could be used to create four different models of President Barack Obama.

For his current project, Suwajanakorn is developing a browser plugin to help identify deepfake videos and cut back on the spread of false information and fraudulent video.

What Could This Do to Your Business?

You may be wondering what any of this has to do with you and your operations. Sure, your reputation is important to your business, but how likely is it that someone would actually create a deepfake to leverage against you?

Unfortunately, as time passes, it becomes all the more likely.

Let’s consider for a moment the possibility that someone was displeased with your services and stopped doing business with you. Someday (sooner than we’d like to think) deepfake technology could be accessible enough for them to use it to greatly tarnish your company’s reputation. The deepfakes created of Speaker Pelosi are some of the most rudimentary there are, and she has still had to deal with major damage control as a result. Once this technology becomes more widespread, do you really want to give people who wish your business ill the chance to do the same to you?

It’s likely safe to say that you don’t.

Of course, there are also various other threats that could easily do as much damage to your business’ operations or reputation as a doctored video could. Fortunately, SCW has the solutions to help protect you. To learn more about what makes you vulnerable, and how we can help resolve these vulnerabilities, give (509) 534-1530 a call.
