Over the past few weeks, YouTube creators have noticed something strange: their videos look different after being uploaded, even though they haven’t changed a thing. Viewers have reported “extra punchy shadows,” “weirdly sharp edges,” and a smoothed-out, plastic-like finish that makes content appear subtly artificial.
According to The Atlantic, YouTubers suspect the company is using AI-driven upscaling techniques to “enhance” video quality without creators’ knowledge or permission. Multimedia artist Mr. Bravo, who deliberately runs his clips through a VCR for an authentic retro feel, wrote on Reddit that the site’s filters ruin the VHS-style aesthetic: “It is ridiculous that YouTube can add features like this that completely change the content.”
Creators spot subtle, unwanted changes
Music YouTubers Rhett Shull and Rick Beato, who command more than 700,000 and five million subscribers respectively, also flagged the issue. Shull warned that the changes could erode trust in his work: “I think it’s gonna lead people to think that I am using AI to create my videos. Or that it’s been deepfaked. Or that I’m cutting corners somehow,” he told his viewers.
Beato described noticing that his own face appeared subtly altered: “I was like ‘man, my hair looks strange’. The closer I looked it almost seemed like I was wearing makeup,” he told the BBC.
YouTube confirms ‘experiment’
YouTube has confirmed it is experimenting with video processing. A company spokesperson told The Atlantic that the platform is “running an experiment on select YouTube Shorts that uses image enhancement technology to sharpen content,” adding that these tweaks are produced not by AI but by “traditional machine learning to unblur, denoise, and improve clarity in videos.”
Rene Ritchie, YouTube’s head of editorial and creator liaison, echoed that explanation in a post on X, comparing the process to what modern smartphones do automatically when recording video.
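To make that comparison concrete, the sketch below shows the kind of conventional denoise-and-sharpen pass that YouTube’s statement describes in broad terms. It is not YouTube’s actual pipeline, and the library (OpenCV) and parameter values are assumptions for illustration only; the point is that in classical enhancement every output pixel is a deterministic function of the input frame, with no new detail synthesized the way a generative model would.

import cv2

def enhance_frame(frame):
    # Non-local means denoising removes sensor and compression noise
    # while preserving edges (illustrative parameter values).
    denoised = cv2.fastNlMeansDenoisingColored(
        frame, None, h=5, hColor=5,
        templateWindowSize=7, searchWindowSize=21)
    # Unsharp mask: subtract a blurred copy to boost local contrast at
    # edges, the kind of effect viewers may perceive as "weirdly sharp".
    blurred = cv2.GaussianBlur(denoised, (0, 0), sigmaX=2)
    sharpened = cv2.addWeighted(denoised, 1.5, blurred, -0.5, 0)
    return sharpened

if __name__ == "__main__":
    frame = cv2.imread("frame.png")  # one extracted video frame
    cv2.imwrite("frame_enhanced.png", enhance_frame(frame))

A pass like this can also explain the “smoothed-out, plastic-like finish” creators describe: aggressive denoising flattens fine texture such as skin and hair before the sharpening step exaggerates the remaining edges.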
Experts question YouTube’s framing
But experts told the BBC that the distinction between “machine learning” and “artificial intelligence” is blurry. Samuel Woolley, a disinformation researcher at the University of Pittsburgh, argued that YouTube’s framing “feels like a misdirection,” noting that machine learning is itself a subset of AI.
“What we have here is a company manipulating content from leading users that is then being distributed to a public audience without the consent of the people who produce the videos,” he said.
The issue highlights a broader concern: AI is increasingly mediating the content people consume, sometimes invisibly. Jill Walker Rettberg, a professor at the Centre for Digital Narrative in Norway, told the BBC that this shift could fundamentally reshape how audiences relate to media: “With algorithms and AI, what does this do to our relationship with reality?”
YouTube has not clarified whether creators will be able to opt out of these modifications. For some, such as Beato, the controversy is secondary to the platform’s overall impact: “YouTube changed my life,” he said. But for others, the creeping presence of AI risks weakening the bond of authenticity between creators and their audiences, an irony for a platform once built on the motto “Broadcast Yourself.”