NI Insolvency
-
@David-Healey I can live with it. Playing with an ugly sampled violin, then rendering with AI.
-
@hisefilo said in NI Insolvency:
Playing with an ugly sampled violin.
A sampled violin you say... but you just told me ACE studio will replace samples

-
@David-Healey lol you got the point. It's only a matter of time until AI realtime generation reaches every home studio and top-notch producer.
-
@David-Healey said in NI Insolvency:
Would that need to be part of HISE when other tools are already available to do those things?
Yes, just like scriptnode, or the sampler and synth modules in HISE, even though other tools are already available to do those things.
-
@resonant Can this kind of thing already be done using ONNX in scriptnode?
-
@David-Healey good question!!!!!
-
@David-Healey Probably, are there any examples?
-
@resonant Not sure, I haven't explored that side of HISE, but searching the forum will probably bring something up.
-
@David-Healey I see an opportunity in dataset creation. The dataset is everything: models can be copied and improved.
The quality of the final product is strongly driven by dataset quality (e.g. 200 hours of solo clarinet, tagged with the features you want).
-
@hisefilo Maybe an opportunity for a collab.
-
What I'm referring to here isn't creating plugins with AI.
Dan suggested this.
Yes, I believe this will be the intermediate stage, where AI can help anyone make the tools they need to create their music.
The stage after is to simply describe what kind of music (the specific parts) you want to make. That will remove the need for plugins.
Instead of downloading a bunch of different reverb plugins, you just tell the DAW exactly what type of reverb you want and it will create a custom one. Then if you make a really cool reverb you can share it with other people, heck even sell it if it's good enough. Wait... we've come full circle to selling plugins at this point!

Parallel to this is also likely the stage where the end-user just describes what music they want to listen to and it's generated on the fly.
Ahh.. but this is all supposition. Nobody really knows where this will all go.
It might all collapse in on itself and turn out to be a big disappointment. Or it might wipe out humanity entirely.
-
I'll add, with some amount of caution, that in my specific music niche I think there will always be a hard requirement for music made by humans, with tools made by humans, pressed onto vinyl made (supervised?) by humans, and played on arcane gramophones by humans.
-
@dannytaurus said in NI Insolvency:
Parallel to this is also likely the stage where the end-user just describes what music they want to listen to and it's generated on the fly.
We already have this: Suno.
I also think live musicians playing virtual instruments will still be a thing. People like to play.