Is Your Favorite New Artist Actually an AI Robot?

It took 13 seconds for Suno, a popular AI music generator, to answer my prompt: a song about a robot musician at an open mic, hoping someone will follow him on Instagram. The only guidance: Make the bass funky.

Soon it produced “Digital Dreams,” described as a “bass-driven, hypnotic rhythm with a syncopated bassline.” The lyrics: “My code’s alive, programmed to dream. But am I seen? Or just a machine?” Suno also offered several alternate versions of the song.

If I wanted to, I could pay for a Suno subscription, secure the commercial rights to the song and pay a music distribution service to upload the track to a streaming platform.

Generating — and publishing — AI music really is that easy.

Is it good? Eh, not in my opinion, but I’m a big music geek. Would you know it was made by a robot? Depends on your ear. Does it matter? That’s up to you. (Although human musicians aren’t thrilled.)

For now, there’s little stopping anyone from posting AI tracks, as long as they hold the copyright and follow the platform’s rules. And if the algorithm likes it, your song could land on a playlist for anyone to discover.

AI music is here to stay. But as it quietly populates music streaming platforms, one question lingers: Will your subscription still deliver the same quality?

Streamers’ signal-to-noise dilemma

The major music streaming players are Spotify, Apple Music, YouTube Music and Amazon Music. Spotify leads with a reported 276 million global subscribers in Q2 2025 — up 12% from last year — and 696 million active users. Some of the smaller streaming companies include Tidal; radio-style platforms like Sirius XM and Pandora Premium; SoundCloud, embraced by indie artists; and Bandcamp, a non-traditional streamer that supports direct-to-artist sales. (Scroll to the end for details on how individual platforms approach AI.)

All these platforms now contain AI-generated music, and as these tracks become more ubiquitous, it raises a question: How much AI is too much? If AI tracks flood the services, will listeners start viewing the platforms as content mills and push back?

The explosion of AI could pose legal, reputational and business risks for streaming platforms. Artists and labels, meanwhile, are watching developments with alarm.

In January, I spoke with Franz Nicolay, keyboardist for the indie rock band The Hold Steady, an academic and author of the book “Band People: Life and Work in Popular Music.” He said streaming encourages a passive relationship to music, so many listeners won’t notice the spread of AI tracks, especially if they’re listening to mood-based playlists like ambient, lo-fi hip-hop, soft electronic or instrumental jazz.

“AI makes it cheap to crank out endless tracks, and streaming platforms are already glutted with tens of thousands of uploads a day,” says Nicolay. “If they’re not careful, Spotify and others risk going the way of Facebook — becoming slop hellholes where real artists pull their catalogs because the signal-to-noise ratio is so bad.”

AI is powering imitators

In July, I opened my Spotify app and was recommended a new release by Blaze Foley, an obscure outlaw country artist who was killed in 1989. Posthumous track releases aren’t unusual, but this one felt off.

The thumbnail showed a young punk singing into a microphone against stage lights, appearing nothing like the weathered, soulful Foley. Then I pressed play on the track, titled “Together,” and was met with a slow-rolling, generic country chord progression. The thin, tinny sound quality was as poor as its construction was uninspired.

In other words, it was AI slop.

About a week later, a friend texted me a 404 Media article detailing how Spotify had published an AI-generated Blaze Foley song along with one by a robotic imitator of Guy Clark, another deceased country artist — both without the permission of the rights holders.

I reached out to Craig McDonald, owner of Lost Art Records and manager of Foley’s streaming pages. He says he and his wife saw the imposter track on Foley’s artist page, but had no direct interactions with Spotify. He calls the imposter track a “total insult” to Foley’s memory, his fans and potential new listeners. “I assume after they heard that tune, they would not become Blaze Foley fans, because it’s certainly not up to Blaze’s standards,” he adds.

McDonald says he thinks music streaming platforms should label AI-created tracks. “We think that disclosure’s important. This, however, was a little bit different,” he says. “This was allowing a distributor to impersonate an artist. It’s the impersonation that’s more harmful than just posting AI content.”

In response to my inquiry, a Spotify spokesperson said: “We flagged the issue to SoundOn, the distributor of the content in question, and it was swiftly removed. This violates Spotify’s policies, and is not allowed. We take action against licensors and distributors who fail to police for this kind of fraud and those who commit repeated or egregious violations can and have been permanently removed from Spotify.” SoundOn is owned by ByteDance, the Chinese company that also owns TikTok.

The incident highlights an inherent problem with AI music uploads: There are no filters, no clear guardrails and no disclosure that any of the content on Spotify is AI-generated. Apple Music and Amazon Music also don’t label AI content. YouTube labels all AI tracks created using its own generative AI tools and requires disclosure when music is synthetically generated. Creators are encouraged to voluntarily label their tracks, but YouTube may apply the label for them if it deems it necessary. All of the big streamers require submissions to be owned by rights holders, and all say there’s zero tolerance for impersonators or copyright infringement.

These incidents aren’t flukes. In 2023, a TikTok user uploaded an AI-generated track called “Heart on My Sleeve” to the social media app and music streaming platforms, claiming it was the work of music artists Drake and the Weeknd. Before the “Fake Drake” track was pulled by Universal Music Group for copyright infringement, it generated a reported 600,000 streams on Spotify, 5 million views on TikTok and 275,000 views on YouTube.

Shortly after the Blaze Foley incident, New York City-based jazz saxophonist Chris Ward told me he saw entire albums attributed to a John Scofield impersonator. Unlike Foley or Guy Clark, the prolific guitarist John Scofield is alive and well. The albums credited to him, like “Background Guitarra Jazz,” were clearly designed to feed playlist algorithms. When I notified Spotify of the Scofield imitations, the content was quickly removed — only to reappear weeks later. And there are dozens of these instances being catalogued by Reddit users across all genres.

“I’ve seen cloned AI versions of jazz records on YouTube and Spotify — stuff released under real musicians’ names that they never recorded,” says Ward. “Sometimes it even carries the liner notes of the original album. It’s fake revenue, fake art, and no one knows where the money goes.”

Eric Drott, a music theory professor at the University of Texas at Austin, says that Spotify and other music streaming platforms limit their liability by requiring uploads to go through distributors. “It creates layers of mediation where bad actors can insert themselves,” says Drott. “And because digital reproduction is so cheap — and now purely AI-generated music is cheap — you can imagine scammers saying, ‘This will get taken down eventually, but can we rack up enough streams in the meantime to cash out?’ Multiply that by hundreds of obscure artists, and those numbers start to become significant.”

After the “Fake Drake” track was pulled from streaming platforms, UMG — which holds a financial stake in multiple platforms — warned that AI-generated music puts the industry at a crossroads. The label said this “begs the question as to which side of history all stakeholders in the music ecosystem want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation.”

Jael Holzman, vocalist, lyricist and bassist of Ekko Astral — a punk band from Washington, D.C., whose debut album “pink balloons” was named Pitchfork’s record of the year in 2024 — recounted learning about shady AI-generated tracks masquerading as new music by deceased rappers like Nipsey Hussle and Pop Smoke. The tracks appeared to originate from someone in upstate New York who used AI to create fake rapper personas, complete with multiple alter egos and impersonations of real artists. In one case, rapper Riff Raff’s likeness and verse were used without permission, a misuse confirmed by the rapper’s management, says Holzman, who is also a journalist.

Musical creators are protected under the U.S. Copyright Act of 1976, but the courts haven’t addressed how AI-generated content should be treated, particularly when programs are trained on an artist’s protected works.

In a paper titled “Fake Drake? AI Music Generation Implicates Copyright and the Right of Publicity” by Hope Juzon for the Washington Law Review, the author writes, “If courts hold that the use of copyrighted inputs constitutes infringement, there may be a chilling effect on AI music generation as a whole. Alternatively, if courts find that this use is not infringing, rights holders may lose control over their works and fail to receive compensation for the use of their works.”

Other creative industries face parallel challenges of AI programs being trained on their work, including publishing, visual art, film and digital media. On Sept. 5, Anthropic, the AI company behind the large language model (LLM) Claude, agreed to pay a $1.5 billion settlement in a lawsuit brought by a group of authors and publishers.

The judge ruled that Anthropic had illegally downloaded millions of books to train its LLM. But the ruling wasn’t entirely one-sided: The judge also held that Anthropic’s use of legally acquired books to train its AI can qualify as fair use. The decision could set a precedent for how copyright cases are handled across other creative industries, including music.

Spotify is already a tough gig for most musicians

Emerging after the Wild West days of Napster and Kazaa, legal streaming was a revelation. No longer did you have to blow your budget on individual records or wait days to illegally download an artist’s entire catalog. For roughly the price of a single album, you can access a massive library of music on demand.

On-demand audio streaming is expanding every year, both in total plays and paid subscriptions. In 2024, on-demand audio streams in the U.S. increased by 6.5% to reach around 1.4 trillion, according to Luminate, an entertainment data and insights company. At the same time, paid streaming hit 100 million U.S. subscriptions, up from 75.5 million in 2020, according to the Recording Industry Association of America (RIAA).

There’s no denying that music streamers provide an incredible value proposition to consumers. For artists? Not so much.

Most artists earn very little from streaming because of the complex way platforms calculate revenue and divide royalties.

On Spotify, for example, tracks need at least 1,000 annual streams before generating any payout. Then, Spotify distributes revenue through a pro-rata pool, paying roughly $0.003 to $0.005 per stream. That means artists must rack up millions of streams to earn a substantial income, and even then payouts are divided among rights holders, which include labels, producers, songwriters and performers.
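
To put that math in perspective, here’s a minimal back-of-the-envelope sketch in Python. The per-stream rate, the payout threshold and the artist’s slice of the rights-holder split are illustrative assumptions drawn from the figures above, not Spotify’s actual formula.

```python
# Back-of-the-envelope streaming payout estimate.
# All constants are illustrative assumptions, not Spotify's real formula.

PER_STREAM_RATE = 0.004       # midpoint of the reported $0.003-$0.005 range
MIN_ANNUAL_STREAMS = 1_000    # reported threshold before a track pays anything
ARTIST_SHARE = 0.25           # hypothetical cut left after labels, producers, writers

def estimated_artist_payout(annual_streams: int) -> float:
    """Rough annual payout to the artist, in dollars."""
    if annual_streams < MIN_ANNUAL_STREAMS:
        return 0.0  # below the threshold, the track earns nothing at all
    gross = annual_streams * PER_STREAM_RATE  # what all rights holders split
    return gross * ARTIST_SHARE               # the artist's slice of that

for streams in (900, 10_000, 1_000_000):
    print(f"{streams:>9,} streams -> ${estimated_artist_payout(streams):,.2f}")
# 900 streams -> $0.00 | 10,000 -> $10.00 | 1,000,000 -> $1,000.00
```

Under these assumptions, even a million annual streams nets the artist about $1,000, which happens to match the bar Spotify itself uses for “meaningful income” below.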

Last year, Spotify paid $10 billion to the music industry, according to the company’s “Loud & Clear” report. Global recorded music revenue reached $30 billion in 2024, compared to $13 billion in 2014. Streaming accounts for more than two-thirds of that revenue. Spotify notes that the number of artists earning meaningful income (more than $1,000 per year) has tripled since 2017.

“Somehow we’ve made way more money in the 2020s selling CDs than we have on Spotify,” says Sam Smith, a bassist (and, full disclosure, my ever-patient bass teacher) whose band, Art Thief, hasn’t seen profit from streaming. “And I don’t know anyone who owns a CD player, but everyone I know uses Spotify.”

Thom Dunn, a musician and writer in Boston, puts it plainly: “You could spend your $10 and only listen to my music [on Spotify] and I would not make $10 — I would not even make $5 because your $10 would be going toward the pot, which then also goes to enrich the top artists, then trickles down.”

Taylor Swift, the world’s biggest pop star, once cut ties with Spotify due to its ad-supported tier, which she said undercompensated artists. Swift’s albums returned in 2017, and in 2024, she reportedly earned over $100 million in royalties from the platform.

That’s all well and good for her, but there’s only one Taylor Swift, and she’s already a billionaire. For the overwhelming majority of artists, streaming is not a reliable source of income.

Spotify’s business strategy favors playlists, and Spotify CEO Daniel Ek has suggested AI may soon handle them. “The truth is, as good as we are at recommendations, if you really put your mind to it, you could create a better playlist yourself,” Ek said on a Jan. 24 podcast. “I think five years to 10 years from now, that will not be true.”

With AI playlists, Ek said, “We’re going to introduce you to things that you probably thought, ‘No way in hell am I gonna be interested in this,’ and you’re going to be totally hooked.”

Low-visibility artists face a bleak choice: rely on algorithmic luck alone or accept lower royalties in exchange for track visibility through Spotify’s Discovery Mode tool. “Spotify encourages playlists over albums and artists,” says Dunn. “If I record a song and I don’t get it playlisted ahead of time before the release, the song is dead in the water.”

AI wave makes it even tougher for human artists

AI adds another layer of concern to an existing issue for working musicians. “You already have tons of people who are churning out music that’s simply to make background music,” says Dunn. “Then take it a step further and say, well, we don’t even need to pay this guy to do this — you can just have this AI do the same damn thing.”

As Liz Pelly documents in “Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist,” Spotify has allegedly licensed tracks from Swedish stock and background music producers to populate popular mood-based playlists. These “ghost artists” rapidly deliver content, which lowers royalty payouts for Spotify.

Pelly writes, “The songs were often made by anonymous session musicians — one of them, a jazz musician for hire, told me he would crank out dozens of tracks at a time, assigning them one-off monikers.” The tracks licensed by Spotify were then placed on “lean-back playlists,” Pelly writes, and by 2023, Spotify had filled more than 100 playlists with this type of background music, which it called “perfect fit content,” or PFC.

The “perfect fit content” model could go further if humans were eliminated from the equation altogether. After all, robots can instantaneously produce endless tracks for playlists.

“The problem with fraudulent streaming or AI-generated fakes is that it takes money out of the revenue pool divided among all artists,” says Drott. “Independent artists are probably harmed more directly because they don’t have the same resources as major label artists. And because of minimum thresholds Spotify has instituted, if an AI track siphons enough listening away from a small artist, it could prevent them from hitting the payout threshold altogether, essentially demonetizing them.”

Spotify denies producing music in-house or favoring AI. A spokesperson said in an email, “Spotify does not prioritize AI-generated music, and it doesn’t cost Spotify less. All music on Spotify, including AI-generated music, is created, owned, and uploaded by licensed third parties.”

Holzman of Ekko Astral is “aggressively anti-AI art” in any form and her band refuses to work with anyone who uses AI. She says streaming is already pushing smaller acts to the margins, and AI is bound to exacerbate that. “I cannot believe that the rise of AI is going to actually choke out the unique artists among us who deserve to get acclaim.”

Nicolay adds that AI is likely to replace the bread-and-butter work of session musicians: demo tracks, jingles and filler music — work that once paid the bills. “It’s not replacing artistry so much as undercutting the economic floor musicians relied on.”

Smith says that, because of AI, it seems like any way to make money as a recording artist is on its way out, but it’s the lack of creativity that is most gutting. “I don’t really see what a computer has to express,” says Smith. “The only reason to use AI music is to see music as a product, and it’s like, well, now we can make this product for cheaper, if not free. But the point of music really is supposed to be expression.”

Like it or not, AI-generated music isn’t going anywhere

AI-generated music is more accessible than ever, with a growing number of platforms allowing anyone to quickly create tracks using tools like Suno, Boomy, Beatoven.ai, AIVA and Udio. But some in the industry say AI aids expression, enabling those without musical talent to will their own imagined songs into being, while also providing new tools to skilled musicians.

The producer Timbaland co-launched an AI-focused entertainment company called Stage Zero, which in June announced its first AI “artist,” TaTa. Timbaland told Billboard, “I’m not just producing tracks anymore, I’m producing systems, stories, and stars from scratch. [TaTa] is not an avatar. She is not a character. TaTa is a living, learning, autonomous music artist built with AI. TaTa is the start of something bigger. She’s the first artist of a new generation. A-Pop is the next cultural evolution, and TaTa is its first icon.”

As TaTa demonstrates, it’s not just the music itself that can be crafted with AI — it’s an entire package. AI tools to create graphics, lyrics and videos enable anyone to create a fully realized AI artist.

Reddit users are particularly eagle-eyed at spotting AI tracks. One of the more popular artists mentioned is Aria Sai, a suspected bot artist who has garnered over 100,000 monthly listeners on Spotify and has social media accounts depicting an entirely AI-generated existence.

In June, The Velvet Sundown, a band composed of fake musicians complete with its own bio, band photos and three albums, drew controversy after Reddit users questioned whether the band was real. Media coverage followed, and on July 5, The Velvet Sundown confirmed on social media that it “is a synthetic music project guided by human creative direction, and composed, voiced and visualized with the support of artificial intelligence.” According to its Spotify page, the band draws nearly 350,000 monthly listeners.

“Whatever money Velvet Sundown made from the millions of streams they accrued — probably tens of thousands of dollars — that’s coming out of somebody else’s pockets,” says Drott.

“Independent artists have been unhappy with Spotify’s model since it launched, but now it’s at the point where there’s no meaningful income to lose, and Spotify isn’t necessarily giving them much visibility either,” says Drott.

“The risk Spotify faces is if they’re too permissive in letting the platform get overrun by AI slop, they could enter a death spiral where more and more human artists pull their catalogs,” he adds. “AI can also intensify streaming fraud, since bots can simulate listening.”

Holzman sees one silver lining to the proliferation of AI music: “I think that AI generally means that original and unique content rises to the top and is given a premium, but there are gonna be a lot of people who lose work and money.” She adds, “I am confident that people don’t want this stuff. That’s the bottom line.”

Artists are pushing back, too. Over the summer, a parade of acts including Deerhoof, Hotline TNT, Xiu Xiu, Leah Senior, David Bridie, and King Gizzard & the Lizard Wizard pulled their music from Spotify over CEO Daniel Ek’s investments in AI-focused military technology. On Sept. 8, a coalition of 30 musicians in Seattle signed an open letter announcing they would pull their music from Spotify, criticizing the platform’s business model and its embrace of AI-generated music.

Since most streaming platforms don’t label AI-generated content, it’s up to listeners to discern what’s real and what’s not. My own hot tip: If the quality is low and the track or album title sounds like something you’d type into a search bar, it’s probably made by a robot. One solution is to go directly to an artist’s page instead of shuffled mixes or algorithmically generated playlists.

And there’s always the old-fashioned option: Pay for your music directly from artists.

Compare how platforms address AI-generated music

AI labeling policies

  • Spotify: No labeling.

  • Apple Music: No labeling.

  • Amazon Music: No labeling.

  • YouTube Music: No labels by default, but creators must disclose any AI content that appears realistic so the video can be labeled.

Upload and distribution rules

  • Spotify: Accepts AI content uploads through licensed distributors. Creators must hold copyrights. Any content that violates its policies, including artist imitators, is removed. 

  • Apple Music: Same policy as Spotify. Apple also requires one performer and producer credit for songs submitted through distribution services. 

  • Amazon Music: Same policy as Spotify. 

  • YouTube Music: Creators can upload tracks directly without a distributor. Content must follow copyright and disclosure guidelines. In 2023, YouTube struck a deal with Universal Music Group to set rules around AI content on the platform and launched the “Music AI Incubator” with the recording giant.

AI features for listeners

  • Spotify: Leans into AI personalization, especially through playlist mixes and the AI DJ “X,” a digital assistant that adds voiceovers for playlists.

  • Apple Music: Personalized playlists like its “For You” feature. Beginning this fall, with the release of iOS 26, Apple is adding AutoMix, an AI-powered DJ feature.

  • Amazon Music: Integrates with Alexa. The platform is beta testing a new AI-powered search tool and AI-generated playlist tool. 

  • YouTube Music: AI-driven recommendation algorithms.

Generative AI tools for creators

  • YouTube Music: Hosts a tool called Dream Track that enables users to create AI-generated tracks for Shorts.

How smaller platforms approach AI

  • Sirius XM: AI is used only in advertising and support — not for music.

  • Pandora Premium: Does not label AI tracks. Some users have reported AI content on the platform.

  • Tidal: Does not label AI tracks. Same distribution policies as Spotify, Apple Music and Amazon Music.

  • SoundCloud: One of the few music streamers to integrate AI-generative features for creators into its platform. That includes tools like Fadr for mashups and remixes, Soundful for generating what it calls “studio-quality music” and Voice-Swap for converting vocals into licensed artists’ voices, among others. A representative said in an email that SoundCloud ethically embeds AI into its ecosystem: “In short, SoundCloud uses AI to strengthen the connection between artists and fans, not replace it.” The platform doesn’t build its own generative AI tools, and it has what it calls a “robust content identification system to detect infringing uploads,” targeting impersonation and unverified artists.

  • Bandcamp: No AI policies outside of typical legal ownership standards for content uploads. The platform’s main purpose is to facilitate direct-to-artist sales.

Representatives of Apple Music, Amazon Music, YouTube Music and Tidal did not respond to requests for comment on this story.

(Lead image: PhonlamaiPhoto via iStock)

