
OSINT.org

Intelligence Matters


Deconstructing Deepfakes: How do they work and what are the risks?

October 21, 2020 · By admin

Last month, Microsoft introduced a new deepfake detection tool. Weeks ago, Intel launched another. As more companies follow suit and concerns grow about the use of this technology, today’s WatchBlog takes a look at how it works and the policy questions it raises.

What is a deepfake?
A deepfake is a video, photo, or audio recording that seems real but has been manipulated using artificial intelligence (AI). The underlying technology can replace faces, manipulate facial expressions, synthesize faces, and synthesize speech. These tools are used most often to depict people saying or doing something they never said or did.

How do deepfakes work?
Deepfake videos commonly swap faces or manipulate facial expressions. The image below illustrates how this is done. In face swapping, the face on the left is placed on another person’s body. In facial manipulation, the expressions of the face on the left are imitated by the face on the right.

[Figure: Face Swapping and Facial Manipulation]
Deepfakes rely on artificial neural networks, which are computer systems that recognize patterns in data. Developing a deepfake photo or video typically involves feeding hundreds or thousands of images into the artificial neural network, “training” it to identify and reconstruct patterns—usually faces.
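A common face-swap architecture trains one shared encoder with a separate decoder per identity; decoding person A’s code with person B’s decoder produces the swap. The sketch below is a minimal toy illustration of that idea, assuming a linear autoencoder on random vectors as a stand-in for deep convolutional networks trained on thousands of face crops (all names and dimensions here are illustrative, not from the article):

```python
import numpy as np

# Toy sketch of the shared-encoder / per-identity-decoder idea behind
# many face-swap deepfakes. Random vectors stand in for face images.
rng = np.random.default_rng(0)
DIM, LATENT, N = 64, 8, 200

def train_decoder(encoder, faces, steps=500, lr=0.01):
    """Fit a linear decoder that reconstructs `faces` from encoder codes."""
    decoder = rng.normal(scale=0.1, size=(LATENT, DIM))
    codes = faces @ encoder                  # encode every training face
    for _ in range(steps):
        recon = codes @ decoder              # decode back to "pixels"
        grad = codes.T @ (recon - faces) / len(faces)
        decoder -= lr * grad                 # gradient step on squared error
    return decoder

encoder = rng.normal(scale=0.1, size=(DIM, LATENT))   # shared encoder
faces_a = rng.normal(size=(N, DIM))                   # stand-in: person A
faces_b = rng.normal(size=(N, DIM))                   # stand-in: person B

dec_a = train_decoder(encoder, faces_a)
dec_b = train_decoder(encoder, faces_b)

# The "swap": encode a frame of person A, then decode it with B's
# decoder, yielding B's appearance driven by A's encoded pose/expression.
code = faces_a[0] @ encoder
swapped = code @ dec_b
print(swapped.shape)  # (64,)
```

In real systems the encoder and decoders are deep convolutional networks and the "code" captures pose and expression, which is why the swapped output moves like the source subject.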

How can you spot a deepfake?
The figure below illustrates some of the ways you can distinguish a deepfake from the real thing. To learn more about how to identify a deepfake, and about the underlying technology, check out GAO’s recent Spotlight on this technology.

What are the benefits of these tools?
Voices and likenesses developed using deepfake technology can be used in movies to achieve a creative effect or maintain a cohesive story when the entertainers themselves are not available. For example, in the latest Star Wars movies, this technology was used to replace characters who had died or to show characters as they appeared in their youth. Retailers have also used this technology to allow customers to try on clothing virtually.

What risks do they pose?
Despite such benign and legitimate applications in film and commerce, deepfakes are more commonly used for exploitation. Studies have shown that much of the deepfake content online is pornographic, and that deepfake pornography disproportionately victimizes women.

There is also concern about potential growth in the use of deepfakes for disinformation. Deepfakes could be used to influence elections or incite civil unrest, or as a weapon of psychological warfare. They could also lead to disregard of legitimate evidence of wrongdoing and, more generally, undermine public trust.

What can be done to protect people?
As discussed above, researchers and internet companies, such as Microsoft and Intel, have experimented with several methods to detect deepfakes. These methods typically use AI to analyze videos for digital artifacts or details that deepfakes fail to imitate realistically, such as blinking or facial tics. But even with these interventions by tech companies, there are a number of policy questions about deepfakes that still need to be answered. For example:
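One artifact mentioned above is unrealistic blinking: early deepfake models, trained mostly on open-eyed photos, produced subjects who rarely blink. The sketch below shows that heuristic in miniature, assuming per-frame eye-aspect-ratio (EAR) values have already been extracted from facial landmarks; the EAR series here is synthetic, and the threshold and blink-rate numbers are illustrative assumptions, not values from any named detector:

```python
def count_blinks(ear_series, threshold=0.2):
    """Count dips of the eye-aspect ratio below `threshold` (eyes closed)."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < threshold and not closed:
            blinks += 1          # eye just closed: start of a new blink
            closed = True
        elif ear >= threshold:
            closed = False       # eye reopened
    return blinks

def looks_synthetic(ear_series, fps=30, min_blinks_per_min=5):
    """Flag a clip whose blink rate is implausibly low for a human."""
    minutes = len(ear_series) / fps / 60
    return count_blinks(ear_series) / minutes < min_blinks_per_min

# 60 seconds of "frames": a clip with ~15 blinks vs. one with none.
real = [0.1 if (i // 5) % 24 == 0 else 0.3 for i in range(1800)]
fake = [0.3] * 1800
print(looks_synthetic(real), looks_synthetic(fake))  # False True
```

Production detectors combine many such cues (blinking, facial tics, lighting, compression artifacts) and learn them with neural networks rather than hand-set thresholds, which is also why they degrade as generators improve.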

  • What can be done to educate the public about deepfakes, both to protect them and to help them distinguish real from fake?
  • What rights do individuals have to privacy when it comes to the use of deepfake technology?
  • What First Amendment protections do creators of deepfake videos, photos, and other media have?
Deepfakes are powerful tools that can be used for exploitation and disinformation. With advances making them more difficult to detect, these technologies require a deeper look.

Sources:
  • Science & Tech Spotlight: Deepfakes (GAO)
  • Deepfake Technology, Market Analysis
