Breaking
July 23, 2025

Spotify had to pull an AI-generated song that claimed to be from an artist who passed away 36 years ago

By Eric Hal Schwartz

  • AI-generated songs by deceased artists, like Blaze Foley, have been falsely uploaded to Spotify
  • The streaming service is taking them down as they are spotted
  • The tracks slipped past Spotify’s content verification processes through platforms like SoundOn

Last week, a new country song called “Together” appeared on Spotify under the official artist page of Blaze Foley, a country artist shot and killed in 1989. The ballad was unlike his other work, but there it was: cover art, credits, and copyright information – just like any other new single. Except this wasn’t an unearthed track from before his death; it was an AI-generated fake.

After being flagged by fans and Foley’s label, Lost Art Records, and reported on by 404 Media, the track was removed. Another fake song attributed to the late country icon Guy Clark, who passed away in 2016, was also taken down.

The report found that the AI-generated tracks carried copyright tags listing a company named Syntax Error as the owner, although little is known about it. Stumbling across AI-made songs on Spotify isn’t unusual. There are entire playlists of machine-generated lo-fi beats and ambient chillcore that already rake in millions of plays. But those tracks are typically presented under imaginary artist names and usually disclose their origin.

The attribution is what makes the Foley case unusual. An AI-generated song uploaded to the wrong place and falsely linked to real, deceased human beings is many steps beyond simply sharing AI-created sounds.

Synthetic music embedded directly into the legacy of long-dead musicians without permission from their families or labels is an escalation of the long-running debate over AI-generated content. That it happened on a giant platform like Spotify and didn’t get caught by the streamer’s own tools is understandably troubling.

And unlike some cases where AI-generated music is passed off as a tribute or experiment, these were treated as official releases. They appeared in the artists’ discographies. This latest controversy adds the disturbing wrinkle of real artists misrepresented by fakes.

Posthumous AI artists

As for what happened on Spotify’s end, the company attributed the upload to SoundOn, a music distributor owned by TikTok.

“The content in question violates Spotify’s deceptive content policies, which prohibit impersonation intended to mislead, such as replicating another creator’s name, image, or description, or posing as a person, brand, or organization in a deceptive manner,” Spotify said in a statement to 404.

“This is not allowed. We take action against licensors and distributors who fail to police for this kind of fraud, and those who commit repeated or egregious violations can be, and have been, permanently removed from Spotify.”

That it was taken down is good, but the fact that the track appeared at all suggests the platform struggles to flag these problems earlier. Considering Spotify processes tens of thousands of new tracks daily, the need for automation is obvious. But automation also means a track’s origins may never be examined so long as the technical upload requirements are met.

That matters not just for artistic reasons, but as a question of ethics and economics. When generative AI can be used to manufacture fake songs in the name of dead musicians, and there’s no immediate or foolproof mechanism to stop it, then you have to wonder how artists can prove who they are and get the credit and royalties they or their estates have earned.

Apple Music and YouTube have also struggled to filter out deepfake content. And as AI tools like Suno and Udio make it easier than ever to generate songs in seconds, with lyrics and vocals to match, the problem will only grow.

There are verification processes that can be used, as well as building tags and watermarks into AI-generated content. However, platforms that prioritize streamlined uploads may not be fans of the extra time and effort involved.
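The “tags” idea can be as simple as shipping a provenance manifest alongside the audio file and checking it at upload time. The sketch below is purely illustrative (the function names and manifest fields are hypothetical; real-world systems use standards such as C2PA content credentials or inaudible audio watermarks rather than a plain JSON tag), but it shows the basic shape of such a check:

```python
import hashlib
import json

def make_provenance_tag(audio_bytes: bytes, generator: str) -> str:
    """Build a minimal provenance manifest for an AI-generated track.

    Illustrative only: production systems would sign this manifest and
    follow a standard such as C2PA rather than a bare JSON blob.
    """
    manifest = {
        "content_sha256": hashlib.sha256(audio_bytes).hexdigest(),
        "ai_generated": True,
        "generator": generator,
    }
    return json.dumps(manifest, sort_keys=True)

def verify_provenance_tag(audio_bytes: bytes, tag: str) -> bool:
    """Check that the manifest matches the audio it ships with."""
    manifest = json.loads(tag)
    expected = hashlib.sha256(audio_bytes).hexdigest()
    return manifest.get("content_sha256") == expected

# Example: tag a stand-in audio payload, then verify it.
audio = b"\x00\x01fake-pcm-samples"
tag = make_provenance_tag(audio, "example-model")
print(verify_provenance_tag(audio, tag))          # True: untouched audio passes
print(verify_provenance_tag(audio + b"x", tag))   # False: altered audio fails
```

Even this trivial check only works if distributors are required to attach the tag in the first place, which is exactly the extra friction streamlined-upload platforms tend to resist.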

AI can be a great tool for helping produce and enhance music, but that’s using AI as a tool, not as a mask. If an AI generates a track and it’s labeled as such, that’s great. But if someone intentionally passes that work off as part of an artist’s legacy, especially one they can no longer defend, that’s fraud. It may seem a minor aspect of the AI debates, but people care about music and what happens in this industry could have repercussions in every other aspect of AI use.

