In a World of Widespread Video Sharing, What’s Real and What’s Not?
Policy · Organizations · Dec 22, 2022

A discussion with a video-authentication expert on what it takes to unearth “deepfakes.”

Illustration by Michael Meier

Based on insights from

Nicola Persico

Bertram Lyons

With the rise of the smartphone, the last decade has seen an explosion in the production and distribution of video across the web. Today, people around the world have unprecedented ability to capture and disseminate everything from children’s dance recitals to police violence or war crimes.

But technological advances in AI have also encouraged the proliferation of deepfakes: seemingly realistic videos that contain inauthentic elements. Perhaps unsurprisingly, there is also a burgeoning industry developing around spotting these manipulated videos.

“When you have a digital video in your hand, there are a lot of questions that we can answer about it,” says Bertram Lyons, CEO of Medex Forensics, a software engineering company that supports source detection and authentication of digital video files. “Where’d it come from? Who created it? What was used to create it? What was the process that it went through to come to be as it is at this very moment?”

Lyons sat down with Nicola Persico, a professor of managerial economics and decision sciences at the Kellogg School, to discuss his company’s unique approach to analyzing a video, the industries that rely on video authentication, and the regulatory gray areas social-media platforms face when it comes to deepfakes.

This interview has been edited for length and clarity.


Nicola PERSICO: It seems like video and audio recordings exist along a continuum, from real to selectively edited, to manipulated or doctored, to faked.

Bertram LYONS: I think that’s a good way to put it. It’s definitely a spectrum. And I’ll say that the analysis of a given video object, in order to place some claim on its veracity, also needs to include a spectrum of approaches.

Most tools today are focused on content: looking for faces, looking for changes in shadows, and evaluating pixels. That’s content; there’s no context there. Our tool provides a context-based approach. What we do is a historical provenance analysis that tells you a video’s context before you even watch it.

PERSICO: Can you tell us about that process?

LYONS: If you think about it, the file itself is a piece of evidence that’s been constructed in some way. We take the objects apart within the file, and we look at them from a variety of perspectives. What are they? In what sequence were they put together? When we see a file, we can say, “This file is most similar to a file that’s gone through this or that process.” The goal is to understand every byte in the file.
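Medex's actual analysis is proprietary, but as a rough illustration of what "taking the objects apart within the file" can mean, here is a minimal sketch for the common MP4/ISO base media file format, where a file is a sequence of typed "boxes" (also called atoms). The function name `box_sequence` and its parameters are invented for this example; it reads only the top-level box headers, since the order and presence of boxes such as `ftyp`, `moov`, `free`, and `mdat` vary with the software that wrote the file.

```python
import struct

def box_sequence(path, max_boxes=32):
    """Return the top-level box (atom) types of an ISO BMFF / MP4 file.

    The sequence of box types acts as a coarse structural fingerprint:
    different cameras, editors, and apps lay out these boxes differently.
    """
    boxes = []
    with open(path, "rb") as f:
        while len(boxes) < max_boxes:
            header = f.read(8)
            if len(header) < 8:
                break  # end of file or truncated header
            size, box_type = struct.unpack(">I4s", header)
            boxes.append(box_type.decode("ascii", "replace"))
            if size == 1:
                # 64-bit "largesize" field follows the 8-byte header
                size = struct.unpack(">Q", f.read(8))[0]
                payload = size - 16
            elif size == 0:
                break  # box extends to end of file
            else:
                payload = size - 8
            if payload < 0:
                break  # malformed size field; stop rather than seek backward
            f.seek(payload, 1)  # skip over the box contents
    return boxes
```

A real forensic tool goes far deeper than this, recursing into nested boxes and examining field values and byte-level layout, but even this top-level sequence already distinguishes many encoders from one another.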

PERSICO: You got into this field by working with the FBI. I can see how understanding historical provenance would be particularly useful for government agencies if they are trying to distinguish between, say, those who possess child sexual abuse material (CSAM) and those who actually produce it.

LYONS: Exactly. In the U.S. alone, about 45 million suspected CSAM videos come through national tip lines every year. A county prosecutor’s office is often tasked with investigating and putting charges in place when they discover that an individual has CSAM on one of their devices. Usually what happens in this case is there’s a forensic extraction of the video from the device.

And at that point, most investigators haven’t historically been able to really go much further. So the person gets a possession charge, which is a less hefty charge—unless they were caught in a distribution ring, where the video was captured somewhere else and was tracked back to that person. More difficult still is to find the people who are actually producing this material.

But our software can analyze these video files for the investigators and identify the context of these video files. We can demonstrate that only this screen-capture app on this type of device creates files in this way. Investigators can then see it is likely that this person created the video. So in such cases, we are able to help move a possession charge to a production charge, which is the ultimate goal of all CSAM investigators—to find the producers.
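The idea of matching a file's structure against known generators can be sketched as a simple lookup. Everything below is hypothetical: the signature table is invented for illustration, and a real system would be built from a curated database of reference files rather than three hard-coded entries.

```python
# Hypothetical signature table mapping a top-level box sequence to a
# plausible origin. These entries are illustrative, not real forensic data.
KNOWN_SIGNATURES = {
    ("ftyp", "mdat", "moov"): "consistent with a camera-original recording",
    ("ftyp", "moov", "mdat"): "consistent with a desktop-editor export",
    ("ftyp", "moov", "free", "mdat"): "consistent with re-encoding by a sharing app",
}

def classify(boxes):
    """Map a box sequence to a known generator class, if any."""
    return KNOWN_SIGNATURES.get(tuple(boxes), "unknown provenance")
```

In practice the matching is far richer than an exact tuple lookup, but the design choice is the same: an investigator compares the structure of an evidence file against structures produced by known apps and devices, rather than inspecting the pictures themselves.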

PERSICO: Authenticity looks like a big part of your business model. You talked about law enforcement in the context of distinguishing between production and possession for video. What industries or entities need to ascertain authenticity for video?

LYONS: We work with law enforcement and public safety here too, where questions around digital evidence—its believability and the ability to extract information and investigative leads from it—are an important part of the day-to-day work. Further down that pipeline is the legal field, where evidence is introduced into court to be used as part of adjudication. So lawyers and their supporters need the ability to say, is this particular piece of video evidence authentic? Can it be trusted? Ultimately, is it what it is purported to be?

We work with social-media organizations, where disinformation and misinformation are quite rampant, and video is often a player in that particular conversation. Those larger organizations have strong interest in understanding ways they can prioritize content moderation of video coming onto their platforms.

We also work with investigative journalists who are using digital video evidence as part of the raw data that goes into putting together media pieces. Investigative journalists are very interested in making sure they understand where the video they’re looking at came from so that they can interpret it correctly.


PERSICO: For instance, I know you have been working with The New York Times on verifying videos from the Ukraine conflict.

LYONS: Yes. Early on, we had the issue where a nuclear plant had been under fire by Russian troops and there was a cry from inside to stop the firing on this facility. Some videos had come out from inside of the plant itself, and they looked like something you might see in a 1960s science-fiction film.

So our colleagues on The New York Times Visual Investigations team wanted to document this and write about it. They ran these files through Medex to try to identify context: How did the video get to Telegram? Does this represent what you might expect a camera-original file to look like when filmed within the Telegram app or uploaded to the Telegram app from the library on a particular phone? Or does it look like something that was generated through Adobe Premiere Pro or Sony Vegas and then uploaded through Telegram?

PERSICO: This gets to the question of who the guardians of authenticity should be. If it is the platforms on which videos appear, this would put a lot of risk on the heads of those platforms. To the related question of sharing copyrighted material, the EU recently decided to use a negligence standard, where “best effort” on the platform’s part is a defense for platforms that inadvertently share copyrighted material. In the U.S., the battle over the penalty for sharing copyrighted material was won by Google and Wiki, which favored no liability.

The guardians-of-authenticity discussion extends to deepfakes. How are deepfakes evolving? Has identifying them gotten more complicated over time?

LYONS: They’re evolving in a couple of ways. One, they’re more visually compelling: that’s the evolution that consumers are seeing. From the production perspective, as the technology’s getting more advanced, more people can create them. Five years ago, it was pretty hard to make them. Today, it’s easy to do it cheaply, if poorly. Anybody can put a deepfake app on their phone today and create a mediocre one. But it’s easier than it was to create a good one, too. Instead of it taking a team and many weeks to put something together, now it’s to the point where an individual who has any knowledge of computing can follow a set of instructions on a GitHub repository and have a pretty powerful deepfake tool running.

One good thing about it is that it’s become so widespread that people are aware of deepfakes. People are becoming more literate at the same rate that the videos are getting better and better.

PERSICO: Video authentication, like many issues where innovation touches on social norms, is an area where regulation is late to the party. It reminds me of the issues around Google Glass, where the rights over the distribution of video recordings arose. When a person appears on a video, who has the right to control its distribution: The person featured? The person who shot the video? The platform on which it appears?

It seems likely that conflict over the distribution of videos will grow exponentially with the continued rise of social media. The tweak here is that the issue of who owns a video—and who can restrict its distribution—may depend on whether the video is authentic or not. This raises other questions, like: Should inauthentic videos be less shareable than authentic ones? Who has legal standing to restrict their distribution?

What do you think the rules of the road are going to be going forward?

LYONS: I think for sure there’s a need for regulation. Platforms certainly have a responsibility to evaluate what’s being broadcast by users. The danger with deepfakes is mostly in these broader social platforms, where content can get out and stay longer and be seen and believed longer before it can be debunked.

It’s less of a danger when a video comes out from an individual or an actor’s platform. When it comes from the website of a single organization, the organizations with the greatest technology to analyze that—whether that’s news media or law enforcement—are going to be on it immediately, trying to put an answer in place before it becomes really widespread. But when it comes out of these social-media platforms, it can spread quickly.

So I think the regulations have to be focused on ensuring that platforms make a best effort at identifying potentially harmful content at the time of ingestion into their networks, meaning they evaluate uploaded content prior to re-encoding it for publication. Today, most platforms take a file from a user, upload it, re-encode it—either on the device or on the edge of their network—and then evaluate it in a content-moderation pipeline. We leave a good deal of useful information on the table when platforms re-encode files before evaluating them.

From a digital-media perspective, that is one example of an area where regulations will be helpful to increase access to valuable data points that will be helpful in the larger content-moderation algorithms. The more data we can generate and store for downstream moderation, the more accurate we can become at identifying and reducing harmful media content.

Featured Faculty

Nicola Persico
John L. and Helen Kellogg Professor of Managerial Economics & Decision Sciences; Director of the Center for Mathematical Studies in Economics & Management; Professor of Weinberg Department of Economics (courtesy)

About the Writer

Fred Schmalz is the business and art editor of Kellogg Insight.
