From copyright, production and A&R to consumption, sync and advertising, artificial intelligence (AI) and machine-learning are already having an impact on the modern music industry. A panel at AIM’s Music Connected conference in London today explored this from an independent perspective.

The panel included Lydia Gregory from FeedForward AI; Simon Wheeler from Beggars Group; and Sophie Goossens from Reed Smith. Jeremy Silver from Digital Catapult moderated, starting by noting that ‘automation’ has actually been part of the music industry for a long time.

“You’ll remember that in the 1970s people got very exercised about drum machines, and thought they were going to replace every drummer in town,” he said, before citing Brian Eno’s work on ‘generative’ music in the 1990s, the new crop of AI music-creation startups, the role machine learning has played for streaming services like Spotify, and automated mastering from startups like Landr.

Gregory talked about the most promising areas. “Just on the word AI, it’s really good for PR… but actually it’s not a thing. It’s never sentient. It’s a collection of technologies. So in terms of where AI will make the biggest impact on the industry from a commercial perspective, it’s already started. It will be an extension of ideas that have already begun,” she said. “Where it will have an impact is automating bits that creators find boring, or finding other ways to support the creative process.”

Wheeler agreed that it’s hard to talk about AI as just one thing. “It’s just a smarter form of computing. It’s like talking about blockchain: it’s a database! But the interesting ideas [for AI] are around assisting creativity. From being an accompaniment to a performance, to giving creative inspiration when you’ve got writer’s block… something that can introduce something, maybe based on what you’ve already created, or maybe it’s out of the blue,” he said.

“If we see them as tools, it’s very exciting. If you see them as something that’s going to take over the world? Well, you’ve probably been watching too many sci-fi movies!”

Goossens spoke from a legal perspective. “It’s a lot to do with data: reading data, where the data comes from. But when you move into creative AI, a robot composing music, whether it composes music on its own, or it assists someone with the creative process… it’s not just any data, it is also data that can be protected by copyright,” she said.

“And when you mix those two and try to get your head around how it works from a legal point of view: who is the creator, who can reap the benefits from this creation, then you get into a world where you can either have a lot of fun, or you can have a lot of headaches!”

She added that the legal frameworks differ around the world: how this might work in the UK is different from France, from the US, from China… As much as copyright regulation has harmonised to some extent, there are still local differences.

Wheeler said that Beggars is keen to assess how startups are operating in this regard. “It’s a bit early for us to have a position. But when you have a tool being used by people to create things – it may add more or less depending on what the tool is – is it that much different from using a sampler or a sequencer?” he said.

“But when it’s ‘machine creates music’ then I think it’s a different proposition… It’s an area where lawyers can rack up lots of billing hours! There’s some conversations to be had there, but we’re in danger of overthinking this in many, many ways.”

Gregory talked about generative music. “It’s a long way off being able to match exactly what someone wants in a specific situation… more in the short term, this idea of co-creation. In the legal space it will get messier before we understand what the impact is. Perhaps it’s an idea that has been generated by a machine, or in a sync library where you start with a track that has been composed by a human, but then you can adapt that track – the person commissioning it can come in and say ‘I want to change the ending’…” In both cases, the question of who ‘owns’ the creation could be up for discussion. “There is the potential for real disruption,” said Gregory.

Goossens cited the example of Sony’s ‘Flow Machines’ project and the Beatles-esque song ‘Daddy’s Car’, which started with an AI that had been fed only Beatles songs; humans then worked with some of the music it created to fashion it into a track.

“It reminds us that AI software cannot live and compose in a vacuum. It has to be fed material in order to analyse and learn from it, and then produce something new and potentially creative,” she said.

“There’s nothing to stop any of this happening, and if people want to listen to it, that’s what’s going to happen. And if they don’t want to listen to it, it’s just going to live over there in its own little world,” said Wheeler.

Meanwhile, Gregory talked about the way human composition works: “You’ve got everything in your head that you’ve ever been inspired by, and now you have these lawsuits with Pharrell Williams and so on…” She was referring to the ‘Blurred Lines’ lawsuit, which, as she pointed out, hinged on proving whether a songwriter or artist had copied an older track – wittingly or unwittingly, which is the grey area.

“With an AI, all these systems have been trained on very closed datasets. We know what they’ve been trained on. If a lawyer comes in and says ‘this sounds too much like this track’ you can go in and see if it’s been trained on that,” she said. In other words: proving an AI’s plagiarism might be easier than proving a human’s.

The conversation turned to the independent sector, and what the opportunity might be around AI for these music companies. “Just be open-minded to what technology can bring and can offer,” said Wheeler, suggesting that machine learning applied to music discovery may be more impactful than AI creation and co-creation.

“The key application which I can see at the moment is enabling services to get so much deeper into the catalogue, and then understand what our repertoire is, and understanding their customers, and matching that up,” he said. “That could enable them to go into the full depth of the catalogue, and start surfacing that for people. That’s of real value.” This is exactly what Gregory’s company, FeedForward AI, is working on.

Spotify now employs François Pachet, who was behind the Flow Machines project, as its director of ‘creator technology’. Is that an exciting thing for the industry, or a worrying prospect?

Goossens pointed to his past interviews about copyright. “One interesting idea that he had at the time, he said that he believed there is a creative act in choosing the training set… And obviously that would mean having a specific right on the choice made in the training set. Which is from a legal point of view, a very interesting thought,” she said. “Whether it is original or not to decide which songs I’m going to pick.”

Wheeler: “I’m sure that Spotify is looking at all sorts of different ways of trying to serve their customers’ needs. And if they can find ways of creating music to populate playlists… then I’m sure they’re investigating what that looks like… Does that then start to take up some of the pro-rata share [for label, of Spotify’s revenues] and then help Spotify’s bottom line? That’s a possibility.”

Goossens said there is something brewing in Europe at the moment: a debate around whether or not you should ask permission when you are analysing a copyright-protected work for these purposes. “If I decided to train my AI with the entire catalogue of Beggars, should I have to ask you permission, and do you have to give me a licence?” she said to Wheeler. “The idea that you might need permission to train your AI, in order to get access to the data? That’s a really big idea.”

“I think that’s definitely the case. We’ve had this discussion with services for a long time. We give you our music under very specific licence, and if it’s not in that licence, you can’t do it. And that includes things like sending it off for analysis and audio-fingerprinting,” replied Wheeler, referring to discussions outside the scope of AI. He said that labels are open-minded though. “If people want to create work which is based on the best of a certain label’s catalogue or a certain artist’s work, then I think that’s interesting. It’s possible, and we’d like to hear the results.”

Gregory added that following the recent revelations around Cambridge Analytica and Facebook, there will be questions around how any technology company or streaming service is analysing music and pulling out data, whether for AI or other purposes.

“There’s been so much emotional reaction in the press to the recent scandals, and I think quite rightly we need to ask these questions. But I don’t think law has caught up with it,” she said.
