
OSINT.org

Intelligence Matters


Deconstructing Deepfakes: How do they work and what are the risks?

October 21, 2020 · By admin

Last month, Microsoft introduced a new deepfake detection tool. Weeks ago, Intel launched another. As more companies follow suit and concerns about the technology grow, today's WatchBlog looks at how deepfakes work and the policy questions they raise.

What is a deepfake?
A deepfake is a video, photo, or audio recording that seems real but has been manipulated using artificial intelligence (AI). The underlying technology can replace faces, manipulate facial expressions, synthesize faces, and synthesize speech. These tools are used most often to depict people saying or doing something they never said or did.

How do deepfakes work?
Deepfake videos commonly swap faces or manipulate facial expressions. The image below illustrates how this is done. In face swapping, the face on the left is placed on another person’s body. In facial manipulation, the expressions of the face on the left are imitated by the face on the right.

Figure: Face Swapping and Facial Manipulation
Deepfakes rely on artificial neural networks, which are computer systems that recognize patterns in data. Developing a deepfake photo or video typically involves feeding hundreds or thousands of images into the artificial neural network, “training” it to identify and reconstruct patterns—usually faces.
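A common face-swapping architecture trains one shared encoder together with a separate decoder per identity: the encoder learns identity-independent structure (pose, expression, lighting), while each decoder learns to reconstruct one specific person's face. The toy sketch below illustrates that structure only; the `encode`/`make_decoder` functions are hypothetical stand-ins, whereas real systems train deep convolutional networks on thousands of images.

```python
# Toy sketch of the shared-encoder / per-identity-decoder design used by
# many face-swapping deepfake tools. Purely illustrative: a "face" is a
# short list of floats, and no actual learning happens here.

def encode(face):
    """Shared encoder: compress a 'face' into a smaller latent vector.
    (Toy compression: average adjacent pairs of values.)"""
    return [(face[i] + face[i + 1]) / 2 for i in range(0, len(face), 2)]

def make_decoder(identity_bias):
    """Each identity gets its own decoder, trained to reconstruct that
    person's appearance from the shared latent space. Here the
    'identity' is just a numeric bias."""
    def decode(latent):
        out = []
        for v in latent:
            # Expand each latent value back to a pair, shifted by the
            # identity-specific bias.
            out.extend([v + identity_bias, v - identity_bias])
        return out
    return decode

decoder_a = make_decoder(0.1)   # reconstructs person A
decoder_b = make_decoder(0.5)   # reconstructs person B

face_a = [0.2, 0.4, 0.6, 0.8]

# The swap: encode person A's expression, then decode it with person B's
# decoder, so B's face "performs" A's expression.
swapped = decoder_b(encode(face_a))
print(swapped)
```

The key design point is that the encoder is shared across identities, which is why training on many images of both people lets the network transfer an expression from one face onto the other.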

How can you spot a deepfake?
The figure below illustrates some of the ways you can distinguish a deepfake from the real thing. To learn more about how to identify a deepfake, and about the underlying technology, check out GAO's recent Spotlight on this technology.

What are the benefits of these tools?
Voices and likenesses generated with deepfake technology can be used in movies to achieve a creative effect or maintain a cohesive story when the entertainers themselves are not available. For example, in the latest Star Wars movies, the technology was used to recreate characters whose actors had died or to show characters as they appeared in their youth. Retailers have also used it to let customers try on clothing virtually.

What risks do they pose?
Despite such benign and legitimate applications in film and commerce, deepfakes are more commonly used for exploitation. Studies have shown that much of the deepfake content online is pornographic, and that deepfake pornography disproportionately victimizes women.

There is also concern about potential growth in the use of deepfakes for disinformation. Deepfakes could be used to influence elections or incite civil unrest, or as a weapon of psychological warfare. They could also lead to disregard of legitimate evidence of wrongdoing and, more generally, undermine public trust.

What can be done to protect people?
As discussed above, researchers and internet companies, such as Microsoft and Intel, have experimented with several methods to detect deepfakes. These methods typically use AI to analyze videos for digital artifacts or details that deepfakes fail to imitate realistically, such as blinking or facial tics. But even with these interventions by tech companies, there are a number of policy questions about deepfakes that still need to be answered. For example:

  • What can be done to educate the public about deepfakes, to protect them and help them tell real from fake?
  • What rights do individuals have to privacy when it comes to the use of deepfake technology?
  • What First Amendment protections do creators of deepfake videos, photos, and more have?
Deepfakes are powerful tools that can be used for exploitation and disinformation. With advances making them more difficult to detect, these technologies require a deeper look.
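The artifact-based detection idea mentioned above can be sketched with one well-known early heuristic: deepfake subjects often failed to blink at a natural rate. The function names and thresholds below are hypothetical, and real detectors are trained AI models rather than fixed rules, but the sketch shows the flavor of flagging physiologically implausible video.

```python
# Toy illustration of an artifact-based deepfake heuristic: flag clips
# whose blink rate is implausibly low for a human. Thresholds and names
# are assumptions for illustration, not a real detector.

def count_blinks(eye_openness, threshold=0.2):
    """Count closed-eye events in a sequence of per-frame eye-openness
    scores (0.0 = fully closed, 1.0 = fully open)."""
    blinks, closed = 0, False
    for value in eye_openness:
        if value < threshold and not closed:
            blinks += 1          # eye just closed: count one blink
            closed = True
        elif value >= threshold:
            closed = False       # eye reopened
    return blinks

def looks_suspicious(eye_openness, fps=30, min_blinks_per_minute=5):
    """Flag a clip whose blink rate falls below a plausible human rate
    (people typically blink well over 5 times per minute)."""
    minutes = len(eye_openness) / fps / 60
    return count_blinks(eye_openness) < min_blinks_per_minute * minutes

# A 10-second clip (300 frames at 30 fps) with no blinks at all:
no_blinks = [1.0] * 300
print(looks_suspicious(no_blinks))  # True
```

Modern detectors generalize this idea: instead of one hand-coded rule, a neural network is trained to pick up many such artifacts (lighting inconsistencies, boundary blending, facial tics) at once — which is also why detection degrades as generation improves.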

Sources:
Science & Tech Spotlight: Deepfakes
Deepfake Technology, Market Analysis

