
They went viral, amassing more than 1m streams on Spotify in a matter of weeks, but it later emerged that hot new band the Velvet Sundown were AI-generated – right down to their music, promotional images and backstory.
The episode has triggered a debate about authenticity, with music industry insiders saying streaming sites should be legally obliged to tag music created by AI-generated acts so consumers can make informed decisions about what they are listening to.
Initially, the “band”, described as “a synthetic music project guided by human creative direction”, denied being an AI creation. They released two albums in June, Floating On Echoes and Dust And Silence, which recalled the country folk of Crosby, Stills, Nash & Young.
Things became more complicated when someone describing himself as an “adjunct” member told reporters that the Velvet Sundown had used the generative AI platform Suno in the creation of their songs, and that the project was an “art hoax”.
The band’s official social media channels denied this and said the group’s identity was being “hijacked”, before releasing a statement confirming that the group was an AI creation and was “Not quite human. Not quite machine” but living “somewhere in between”.
Several figures told the Guardian that the current situation, in which streaming sites including Spotify are under no legal obligation to identify AI-generated music, leaves consumers unaware of the origins of the songs they are listening to.
Roberto Neri, the chief executive of the Ivors Academy, said: “AI-generated bands like Velvet Sundown that are reaching big audiences without involving human creators raise serious concerns around transparency, authorship and consent.”
Neri added that, if “used ethically”, AI had the potential to enhance songwriting, but said that at present his organisation was concerned by what he called “deeply troubling issues” around the use of AI in music.
Sophie Jones, the chief strategy officer at the music trade body the British Phonographic Industry (BPI), backed calls for clear labelling. “We believe that AI should be used to serve human creativity, not supplant it,” said Jones.
“That’s why we’re calling on the UK government to protect copyright and introduce new transparency obligations for AI companies so that music rights can be licensed and enforced, as well as calling for the clear labelling of content solely generated by AI.”
Liz Pelly, the author of Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist, said independent artists could be exploited by the people behind AI bands, who might generate tracks using models trained on those artists’ music.
She pointed to a song uploaded to TikTok, Spotify and YouTube in 2023 that used AI-generated vocals imitating the Weeknd and Drake. Universal Music Group said the track was “infringing content created with generative AI”, and it was removed shortly after being uploaded.
It is not clear what music the Velvet Sundown’s albums were trained on, and critics say that lack of clarity means independent artists could be losing out on compensation.
Pelly said: “We need to make sure that it’s not just pop stars whose interests are being looked after; all artists should have the ability to know if their work has been exploited in this way.”
For some, the appearance of the Velvet Sundown is the logical next step as music and AI become increasingly intertwined, with legislation struggling to keep pace with a rapidly changing musical ecosystem.
Jones said: “The rise of AI-generated bands and music entering the market points to the fact that tech companies have been training AI models using creative works – largely without authorisation or payment to creators and rights-holders – in order to directly compete with human artistry.”
Neri added that the UK has a chance to lead the world in ethical AI adoption in music but said there needed to be robust legal frameworks that “guarantee consent and fair remuneration for creators, and clear labelling for listeners”.
“Without such safeguards, AI risks repeating the same mistakes seen in streaming, where big tech profits while music creators are left behind,” he added.
Aurélien Hérault, the chief innovation officer at the music streaming service Deezer, said the company uses detection software that identifies AI-generated tracks and tags them.
He said: “For the moment, I think platforms need to be transparent and try to inform users. For a period of time, what I call the ‘naturalisation of AI’, we need to inform users when it’s used or not.”
Hérault did not rule out removing tagging in future if AI-generated music becomes more popular and musicians begin to use it like an “instrument”.
Deezer recently told the Guardian that up to seven out of 10 streams of AI-generated music on the platform are fraudulent.
At present, Spotify does not label music as AI-generated and has previously been criticised for populating some playlists with music by “ghost artists” – fake acts that create stock music.
A spokesperson for the company said Spotify does not prioritise AI-generated music. “All music on Spotify, including AI-generated music, is created, owned and uploaded by licensed third parties,” they said.
