neepheid Posted 6 hours ago
55 minutes ago, edstraker123 said: I really don't, but obviously didn't intend to be insulting. If you get joy from music, crafting songs and performing them with your friends, AI doesn't change that at all. Why should the joy of music creation only come from the way a particular group defines it? The processes are different but the end point is the same.
It's the concept which I find insulting; I didn't mean you were personally being insulting to me directly. Sorry for the ambiguity.
BigRedX Posted 6 hours ago
25 minutes ago, 12stringbassist said: I don't mind the idea of the use of AI to actually help to shape a sound, but when it comes to any actual performance aspect, it should come from a human.
The drum parts and some of the synth parts for when my band plays gigs aren't "performed" by a human. Although I've done the programming, there was very little playing or performance involved. Most of it involved dragging "notes" around on a grid until I got the parts I wanted. Also, most of the programmed synth parts are pretty complex but have lots of repetition. From previous experience we would struggle to find a keyboard player with the required technical ability to play them, regardless of the fact that on their own the parts are rather boring. IMO it's far better to hand these off to a machine that won't complain about them, and doesn't take up lots of additional room on stage or in the band transport.
Al Krow Posted 6 hours ago (edited)
18 minutes ago, neepheid said: Don't do that. The title of this thread is "AI in music", so let's keep it relevant? And no, I have never used ChatGPT. I have no desire to either. I've got a brain, and despite what my frequent word salad spewed around here might suggest, it's reasonably competent at times.
Yeah, but what we're facing is a subset of AI impacting all our lives - the wider point provides the context for this discussion. So, for me, it's entirely relevant, and, as musos, I don't feel we can say "it's OK to be using AI in X, Y and Z because it doesn't impact me, guv, and I'm very happy to use it when it's someone else's job on the line, but not in music", and then expect anyone to give us a fair hearing. At the end of the day our fates are going to be decided by what other humans want to use / pay for / listen to. I think kite-marks of authenticity (e.g. "AI-assisted") have got to be a way forward, and that's a pressure point we can and should be pushing the industry to adopt.
Edited 6 hours ago by Al Krow
neepheid Posted 6 hours ago
1 minute ago, Al Krow said: Yeah but what we're facing is a subset of AI impacting all our lives - the wider point provides the context for this discussion. […]
I know, but I think it was a low blow to take my specific objection to generative AI in the music context and then frame it as some sort of argument about being against doctors getting help to identify cancer. Frankly, I was a little disappointed by that and it made me sad - I think you took that too far.
SimonK Posted 6 hours ago
I suspect the effort it may take to get AI to write a hit single might be similar to the effort of a person writing it from scratch... But for remastering/recording/adding instrumentation I think it is a tool that is here to stay, and sadly it will take people's jobs.
Al Krow Posted 5 hours ago
14 minutes ago, neepheid said: I know, but I think it was a low blow to take my specific objection to Generative AI in the music context […]
Gotcha. But it's being used as a tool to bring healing (medicine) into folks' lives - I'm surrounded by medics in my immediate and wider family(!) - and, in our case, joy (music). In the case of cancer diagnosis and new medicines, generative AI is being used as part of the software. I also personally know software engineers and language translators whose jobs are being impacted by generative AI. We're not alone in this.
neepheid Posted 5 hours ago
1 minute ago, Al Krow said: Gotcha. But it's being used as a tool to bring healing (medicine) […]
Thank you for highlighting the personal angle which informed your argument; I appreciate that, and I understand a bit better now why you took it down that road. But this whole "everyone can be a musician by typing a few words into an AI magic music generator" thing can still get in the effing bin, and no amount of arguments, however framed, is going to dissuade me from that. It's delusional!
MacDaddy Posted 5 hours ago
We've been through this before, in the '70s and '80s, when computers took a lot of people's jobs.
Uncle Rodney Posted 5 hours ago
3 hours ago, TimR said: "And this next one is one that my computer wrote earlier"...
... and now looking for seasoned pros to help me play my songs live on stage. I don't have any money, so I need dedicated musicians who are as passionate about my music as I am. 😂 Drat, I should have used AI to generate a better script!
chris_b Posted 5 hours ago
27 minutes ago, MacDaddy said: We've been through this before in the '70s and '80s when computers took a lot of people's jobs.
I was there, and we put a lot of people out on the streets.
BigRedX Posted 4 hours ago
1 hour ago, MacDaddy said: We've been through this before in the '70s and '80s when computers took a lot of people's jobs.
43 minutes ago, chris_b said: I was there, and we put a lot of people out on the streets.
In my industry (graphic design and artwork), the people who lost their jobs through the introduction of computers were those who couldn't be bothered to learn the new methods. I started off with the traditional ways of working in the 80s, and the core skills I learnt then are what meant I didn't mess up when I transferred over to the computer. For me it was just using a mouse, keyboard and screen instead of drawing pens, cow gum and rulers on a drawing board. It's also those core skills that keep me in business today, where a lot of my work comes from sorting out "designs" done on the computer that can't actually be printed properly.
80Hz Posted 2 hours ago
I think there's some nuance that's being missed here. Trained neural nets absolutely have applications in all sorts of areas, and tools that use this technology are often capable of doing things that were previously unachievable - I'm most familiar with the audio world (see stem splitting, reverb matching, etc.), but we can lump in things like image uprezzing, spotting cancer, etc. There is no "intelligence" at work here; it's very effective pattern recognition. These tools have undoubtedly made my job easier, and the results I provide to my clients are better as a result. I also very much view keeping up with the latest and greatest tools (assuming they are, in fact, the greatest, rather than marketing hype) as worthwhile professional development.
Generative AI, however, is a wholesale power grab of creative outputs by corporate forces (the creative inputs were provided by anyone whose imagery, writing, music or audio was part of the training data, whether they consented to that usage or not). To me that is very different from a "tool" - it's handing over the reins of creative production to the rentier class. Who stands to capture the value of this?
There are reasons to be optimistic, however:
- Most creative people want to work with other creative people. I respect talent, and the creative achievements of my colleagues are what make it worthwhile. So I think the creative industries will persist in a different (perhaps diminished) form. Part of adapting to the new reality will be forming networks of like-minded professionals.
- Despite the constant noise that AI is going to take over everything, some more-intelligent-than-me computer science types believe we're already at the point where models won't get much "better" at stuff. There simply isn't the volume of training data available, or the cost of accessing and processing that data is too high. The future may well be in faster, more efficient, more targeted models that swing back to special-purpose tools.
- AI-generated output is rapidly becoming a marker of low effort, and therefore of low value in the resulting product. Or do you buy all those scammy lifehack products on AI-generated YouTube prerolls? Even with gen AI, not everyone can be an effective art director.
- We're due a massive market correction once the circular accounting between the AI pushers and Nvidia reaches critical mass. As well as wiping a huge chunk off everyone's stock/pension portfolio, it will be the morning after the night before for the AI optimists.
Woodinblack Posted 1 hour ago
5 hours ago, Al Krow said: Do you feel the same about software engineers who have spent years training, getting a loan to see them through uni, and can now be replaced with AI generating code at the press of a button?
Well, actually no, and that is where there's a misunderstanding of how LLMs work, which is quite common really. I am a software engineer, and while I don't have to worry too much about the future, as I will probably be dead within the decade, you can't just generate code at the touch of a button (and actually there is no button; it is just autocomplete). An LLM is very good at "boilerplate" code - stuff you do over and over again; there is a lot of it, and it is good that it farms that out, so it does save time. However, an LLM as discussed here has no intelligence; it just has things it has copied from somewhere else. Its job isn't to solve a problem; its job is to show you what a solution to this problem would look like, and that is a huge difference. It is a language process, not a technical process. It doesn't understand the problem, just the overall look of the problem, which is why it is good at language and music. "What would a country song about a clam sound like?" is an appearance issue; it doesn't have to know what a clam is, or how it feels about anything, or why it cares about its truck breaking down or its dog dying. When I first used it, it made a complex function which seemed to do perfectly what I asked. When I looked closer I realised it would come out with the wrong results, but that is very hard to spot, and AI can't fix it because it doesn't understand how the code works, just how it should look. It's shown really clearly in the "how many Rs in raspberry" problem that ChatGPT had.
AI isn't writing about something; it is writing something that it thinks a song should sound like, and for 95% of music that is enough. It probably will kill a lot of music, simply because people won't be able to have it as an income - for a lot of people that sort of music is enough - meaning ultimately music will go back to being a niche hobby, like it was in the past: something people did for themselves, not for profit, like the guy on the piano in a pub.
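The letter-counting example above is worth unpacking: in ordinary code, counting characters is a trivial, deterministic operation, whereas a token-based LLM never inspects individual letters and can only produce what an answer typically looks like. A minimal sketch of the deterministic version:

```python
# Counting letters is deterministic and trivial in ordinary code -
# there is nothing to "pattern match"; the answer is simply computed.
def count_letter(word: str, letter: str) -> int:
    """Return how many times `letter` occurs in `word`, case-insensitively."""
    return word.lower().count(letter.lower())

print(count_letter("raspberry", "r"))   # 3
print(count_letter("strawberry", "r"))  # 3
```

An LLM, by contrast, works on multi-character tokens and answers from statistical association rather than by inspecting the string, which is why earlier ChatGPT versions could confidently miscount.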
Al Krow Posted 1 hour ago
55 minutes ago, 80Hz said: I think there's some nuance that's being missed here. […]
24 minutes ago, Woodinblack said: Well, actually no, and that is where a misunderstanding as how LLMs work […]
A couple of excellent posts!
Stu-khag Posted 50 minutes ago
5 hours ago, chris_b said: When AI has its own programmes on Radio 4 I'll start getting annoyed.
Not sure if they will replace the presenters, but a program called Descript is making inroads into editing for Radio 4. Essentially you record what you need and feed it into the program, where it transcribes it all; you then turn that document into a paper edit of what you want kept, moved, etc., and it spits out a programme! It can summarise it all, add tags, fade in music where you want, etc. Not so sure how it would work with a really creative programme, and it's still early days, but it's getting better. Thankfully I deal with production management stuff, so my job is safe for now.
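Descript's internals aren't public, but the "paper edit" idea described above - transcribe the audio, edit the text, derive the audio cuts from the edited text - can be sketched with word-level timestamps. The data shapes and function below are illustrative assumptions, not Descript's actual API:

```python
# Sketch of transcript-driven ("paper edit") audio editing.
# Assumption: the transcription step yields one dict per word with
# start/end times in seconds. Deleting words from the paper edit
# then yields the audio ranges to keep.

def keep_ranges(words, deleted_indices):
    """Merge the timestamps of surviving words into contiguous keep-ranges."""
    ranges = []
    for i, w in enumerate(words):
        if i in deleted_indices:
            continue
        if ranges and abs(ranges[-1][1] - w["start"]) < 1e-9:
            ranges[-1][1] = w["end"]               # word abuts current range: extend it
        else:
            ranges.append([w["start"], w["end"]])  # gap before this word: new range
    return ranges

words = [
    {"text": "thanks", "start": 0.0, "end": 0.4},
    {"text": "um",     "start": 0.4, "end": 0.6},
    {"text": "for",    "start": 0.6, "end": 0.8},
    {"text": "tuning", "start": 0.8, "end": 1.1},
    {"text": "in",     "start": 1.1, "end": 1.3},
]

# Deleting "um" (index 1) in the text produces two keep-ranges in the audio.
print(keep_ranges(words, {1}))  # [[0.0, 0.4], [0.6, 1.3]]
```

A real editor would then render the audio by concatenating those ranges, usually with short crossfades to hide the joins.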
ambient Posted 47 minutes ago
On 26/01/2026 at 12:55, EBS_freak said: Of course it should - but at the moment it's largely misunderstood and not regulated properly. For all those against AI, I would wager a lot of them were also eager to watch the Beatles Get Back documentary and listen to "Now and Then", both of which would not be a thing without AI.
That's using it in a positive way. I've used Logic's stem split function, and iZotope's RX, quite a few times to remix and master cassette tape recordings from as far back as the '80s, people. I'm totally against using it to generate music.
80Hz Posted 20 minutes ago
@Woodinblack's post really nails where both the danger and the opportunity lie in industries where LLMs and other gen AI are taking hold. In some ways humans are a lot like LLMs - we use prior experience to apply a quick solution to a novel situation, and it's often "correct enough" to be effective. But our biases - our personal data sets - can misguide us. We're also intrinsically lazy (energy efficient, in my case), so we will tend to take the easy shortcut if one is available. Good enough is good enough. The key, then, is to continue to cultivate the skills that an LLM can't replicate: programming needs critical and logical thinking and the ability to incrementally problem-solve (a non-programmer's take), so we will still certainly need those with a programmer's skill set.
The AI takeover has a feeling of inevitability because we can't see past the neoliberal world we've all been living in for the past 50 years or so. It's catnip for the management class and above, because the "logic" that market forces and efficiencies must rule over everything else goes unquestioned. This is why it's hard for me to separate generative AI from the larger picture of political economy.
80Hz Posted 12 minutes ago
5 hours ago, neepheid said: But this whole "everyone can be a musician" thing by typing a few words into an "AI magic music generator" can still get in the effing bin […]
Wait, you're trying to say I can't become a cancer prompt doctor? Damn, guess I better cancel that custom shop preorder...