According to current legislation, creating an AI that plagiarises the material of existing artists for profit is totally fine. Robo-Zombie anyone?
The story of artificial intelligence trampling what we created is already a fairly weighty tome. Blue-collar work, white-collar work, contemporary art: all have come under the steel heel of our robot buddies. However, artificial intelligence is still in its infancy, so it behoves us paranoid simpletons to discuss the future ramifications of our creations.
Today, we ponder pop music specifically: what happens when artificial intelligence takes over the artificial nature of popular music? For example, if you engineer an artificial intelligence that mimics the work of Beyoncé, do you owe her a cut of any of the resulting profits? Further, is it even legal to do so?
The above represent a handful of questions The Verge poses in its new story on the topic. It’s certainly worth a read, but there is an important pull-quote from a rather important individual that we should note. Jonathan Bailey, CTO of audio tech company iZotope, summed up the issue thusly:
“I won’t mince words,” he told The Verge. “This is a total legal clusterfuck.”
He sounds fun. Fun in a believing-intimidation-is-comedy kind of way. The type of dude who would slap you on the back at the urinal and laugh after you pissed on his shoe. But the laugh doesn’t reach his eyes. That kind of dude.
Now, the current US copyright laws hark back to 1965, and they don’t really cover who owns what when a computer is a major part of the process.
As it stands, the Queen B AI could crank out an entire album of “Lemonade”-esque tracks (give me that sweet Australian cover, Solo), and as long as none of them sounded too much like a specific flavour of Beyoncé song, the AI-generated music wouldn’t technically be infringing on her copyrights — in fact, the AI’s creator wouldn’t legally owe the artist a penny, per lawyer Meredith Rose in conversation with The Verge.
If we’re talking about the use of copyrighted songs to train an AI, the law is even more hopeless, with no explicit provision to protect, restrain or compensate the original artist. So, theoretically, if you created a Knowlesbot and it took off, the money is all yours, pending the sentience of the AI, as it could arguably kill you and make away with the cash. But that is a problem for another juncture.
The saving grace, of course, is that programmers have yet to come anywhere near creating an AI capable of such a thing, but this is clearly something that we’ll one day have to deal with. “It’s like the future of self-driving cars,” media-focused venture capitalist Leonard Brody told Fortune in October. “Level 1 is an artist using a machine to assist them. Level 2 is where the music is crafted by a machine but performed by a human. Level 3 is where the whole thing is machines.”
In the field of music, it already exists in some form: tech songstress Taryn Southern shared her songwriting credits with an AI on her album “I AM AI”, and Iranian composer Ash Koosha released an album on which he sang work composed by AI-powered software.
Clearly, we’re game on, and the ‘legal clusterfuck’ is just about to befall us. Which is a great name for a band. Like a death metal band, like Metallica, but made of circuits. It could work. Battery writes itself. They’d know, after all.