AI Songs Earn $12 Million in Royalty Scam

by John Lister

A man who earned $12 million in royalties after "writing" hundreds of thousands of songs has been charged with fraud. Michael Smith allegedly created the songs with artificial intelligence, then used bots to "listen" to the music on streaming services to generate revenue.

The case against Smith is not that the music itself was "not real" but rather that the listeners were fake. Prosecutors say that not only did he steal money from the streaming sites, but that legitimate songwriters missed out. That's because some streaming sites divide a fixed royalty pool across all streams, so every fraudulent play shrinks the share paid out for genuine ones.
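To see why fraudulent plays hurt legitimate songwriters under that kind of pro-rata model, here's a minimal Python sketch. The pool size and stream counts are invented purely for illustration, and the formula is a simplification rather than any streaming service's actual payout calculation.

# Minimal sketch of a pro-rata royalty pool: a fixed pot is split across
# all streams, so bot-driven streams dilute the per-stream payout that
# legitimate artists receive. All figures below are hypothetical.

def per_stream_rate(royalty_pool: float, total_streams: int) -> float:
    """Payout per stream when a fixed pool is divided pro rata."""
    return royalty_pool / total_streams

pool = 1_000_000.00          # hypothetical monthly royalty pool, in dollars
real_streams = 250_000_000   # hypothetical legitimate streams
fake_streams = 20_000_000    # hypothetical bot-driven streams

honest_rate = per_stream_rate(pool, real_streams)
diluted_rate = per_stream_rate(pool, real_streams + fake_streams)

print(f"Per-stream rate without fraud: ${honest_rate:.6f}")
print(f"Per-stream rate with bot streams: ${diluted_rate:.6f}")
print(f"Royalties diverted to the fraudster: ${fake_streams * diluted_rate:,.2f}")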

Creating a song specifically to take advantage of streaming algorithms is not in itself illegal. For example, one man made a legitimate full-time living creating songs with titles designed to get plays from people using smart speakers. Songs such as "Fart Noises" cashed in on juveniles of all ages saying "Alexa, play fart noises."

Fake Listeners

That's not what Smith was allegedly doing. He's said to have used AI to make "music" whose main purpose was to meet eligibility requirements for appearing on streaming sites. The benefit of AI in this case was clearly quantity over quality, with track titles like "Zyme Bedewing" and artists such as "Calvinistic Dust." (Source: justice.gov)

Prosecutors say Smith then created as many as 10,000 fake listener accounts on the streaming sites and used cloud computing services to run virtual computers. Each machine kept numerous browser tabs open, each logged in to a fake account and "listening" to his music 24 hours a day.

At one stage, his own calculations showed that his music catalog was averaging 661,440 streams a day. A later email claimed he had stepped up operations and had by then made $12 million from more than 4 billion streams, which works out to roughly a third of a cent per stream.

Streaming Sites Change Rules

Smith has now been charged with three counts: wire fraud, wire fraud conspiracy, and money laundering conspiracy. Prosecutors say Smith misrepresented his identity when creating the user accounts and in turn misled the streaming companies into thinking real people had been listening.

The BBC notes that while streaming companies may struggle to automatically verify what counts as "real" music, they are taking other steps to crack down on such schemes. For example, Spotify has increased the minimum number of streams a song must receive in a year before it is eligible for royalties. (Source: bbc.co.uk)

What's Your Opinion?

Is it inevitable that people will pull scams like this? Does the streaming royalties model make sense? Does it matter if streaming sites are full of AI-generated "songs"?

Comments

Chief:

Of course it's inevitable.
Of course it's unethical.

Unfortunately, as everything shifts to AI, this is an inevitable result.

For example, a vehicle was involved in an incident.
The insurance company had me take pictures.
Its AI then determined how much the insurance company was willing to pay.

What could possibly go wrong???