AI body-swap video of Stranger Things cast hits 14M views on X
A video made with Kling AI’s Motion Control 2.6 shows creator Eder Xavier swapping into Stranger Things stars, drawing 14 million views on X and prompting new warnings about full-body deepfakes.
A viral video on X, reportedly created with Kling AI’s Motion Control 2.6, features Brazilian creator Eder Xavier replacing his face and body with Millie Bobby Brown, David Harbour and Finn Wolfhard. The post has surpassed 14 million views this week, with copies spreading on other platforms.
The clips show full-body identity swaps, not just facial replacement. Technologists and investors pointed to rising production quality and low cost, noting that such tools can rapidly change how videos are made.
“We’re not prepared for how quickly production pipelines are going to change with AI,” wrote venture investor Justine Moore. “Some of the latest video models have immediate implications for Hollywood. Endless character swaps at a negligible cost.”
Security specialists warned that the same tools create openings for fraud and influence operations. “The floodgates are open. It’s never been easier to steal an individual’s digital likeness, their voice, their face, and now, bring it to life with a single image. No one is safe,” cautioned Emmanuelle Saliba, chief investigative officer at cybersecurity firm GetReal Security. “We will start seeing systemic abuse at every scale, from one-to-one social engineering to coordinated disinformation campaigns to direct attacks on critical businesses and institutions.”
Saliba pointed to thin safeguards and low barriers to entry. “For a few dollars, anyone can now generate full-body videos of a politician, celebrity, CEO, or private individual using a single image,” she added. “There’s no default protection of a person’s digital likeness. No identity assurance.”
Yu Chen, a professor of electrical and computer engineering at Binghamton University, described full-body character swapping as a step up in synthetic media. “These systems must simultaneously handle pose estimation, skeletal tracking, clothing and texture transfer, and natural movement synthesis across the entire human form,” he explained.
Earlier deepfakes often replaced only the face, leaving the rest of the frame intact. Detection systems looked for boundary mismatches or unnatural head motion. Chen noted those cues are less useful when the entire body is generated to match pose and movement, which reduces visible seams and other artifacts.
Beyond fraud and impersonation, Chen flagged other risks. “Non-consensual intimate imagery represents the most immediate harm vector, as these tools lower the technical barrier for creating synthetic explicit content featuring real individuals,” he said.
Both Chen and Saliba also pointed to political deception and corporate espionage. Fabricated “leaked” clips or executive impersonations could be used to bypass controls or harvest credentials. In corporate settings, Saliba warned, “a believable person on video lowers suspicion long enough to gain access inside a critical business.”
Creators have posted similar body-swapped clips of other scenes, including a version of Leonardo DiCaprio in The Wolf of Wall Street.
It remains unclear how studios or the actors depicted will respond. Chen urged developers working on publicly available models to implement safeguards while emphasizing shared responsibility among platforms, policymakers and users to avoid chilling legitimate uses.
Researchers are calling for new detection methods that focus on content-based signals rather than metadata, which is easily stripped when videos are re-uploaded. Chen encouraged teams to build detectors that pick up intrinsic statistical patterns harder to remove, and recommended pairing automated screening with human review and clear escalation paths for high-risk posts. He also pushed for liability rules and disclosure requirements for synthetic content, warning that the rapid spread of these tools will soon test any response system.