AI audio technology has finally gotten the music industry’s attention after someone calling themselves “ghostwriter” released a song featuring AI-generated vocals mimicking Drake and The Weeknd. As of right now, YouTube, Tidal, and others have taken the song down after Drake’s label complained, but you can still hear it on Spotify. I downloaded it in case it disappears. But I have questions…
First, are these really AI-generated vocals mimicking Drake and The Weeknd, or is this a publicity stunt by genius self-marketer Drake and his production company to keep him in the limelight? One reason I ask is that the track is arguably the best song Drake (or his doppelgänger) has ever done. Listen to it: https://open.spotify.com/track/2kHtinKgD3zngDYFGmBSmv?si=8538dd04184349d0. That said, it reminds me a lot of Drake’s song “Wants and Needs.”
Second, IF this is indeed a fake Drake, it raises all kinds of questions about the legality of using a computer-generated clone of someone’s voice for profit:
If this becomes illegal (as it stands, it is not unless the laws change; see https://ipwatchdog.com/2020/10/14/voices-copyrighting-deepfakes/id=126232/), how will anyone define what constitutes someone’s official voice? I ask because I doubt a voice can ever be definitively defined. A lot of rappers can sound just like Drake, and others might use only parts of a synthesized voice rather than all of it. What will count as a violation?
How will they police voice cloning? I submit they will only be able to police it on the major platforms, and that is increasingly going to be a problem given the rapid emergence of smaller publishing platforms like Moises (download the app on your phone to check it out). In other words, someone will be able to post an AI-generated vocal on one of the smaller platforms and gain hundreds of thousands, if not millions, of followers before the label even finds it and demands it be taken down.
If nothing else, I find it ironic that technology-loving artists who built their careers by using technology to emulate other people’s voices and sounds are now mad that people are using technology to emulate theirs. Admittedly, though, if I were a big artist and people started using my voice on their raps and it sounded just like me, I think I would have a problem with it too. After all, it’s my voice and thus my intellectual property.
In the end, I do think the laws should be changed to protect voices. It’s the right thing to do. I would also encourage people to develop their own voices and stop copying others. Clone culture is not only ethically dicey but boring, and it is slowly killing music. How about you do you? It’s just a thought.
And if this is just a marketing ploy, well, that raises even more ethical questions than the ones I’ve raised about voice cloning. Then again, big music labels have never been known for their ethics. 😌