8 Times more CPU consumption on Aida-X Neural Models
-
@Christoph-Hart This is a feature that many of us want, not just one person, so please don't let the hostile attitudes and communication missteps of a few people stop its development.
This promising feature will positively impact the entire product-development workflow. And as you'll appreciate, with the industry only just starting to use this technology, and with great products now possible that haven't been made before, there's no point in removing it over one argument.
Whenever you're available is fine. We all love your work. Just keep doing what you're doing, like you always have.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
I'd rather remove the entire RT Neural framework from HISE so it doesn't raise false expectations
-
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
Adding NAM support is super helpful for some people as the training scripts are solid and actively maintained. I'll be using RT-Neural on this end for the parameterized support. Still making a few tweaks to the training script I started with, but I’m planning to build around it soon.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
There is also widespread adoption of the NAM format. Darkglass just released a NAM-compatible product:
https://www.darkglass.com/creation/anagram/
https://www.tone3000.com/blog/unveiling-anagram-darkglass-electronics
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
Yes, it definitely works. I use it for a tube preamp (not a guitar amp); it replicates the nonlinearities of the original unit successfully. It's great for guitar amps too.
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
As you know, anything can be used to copy other people's work. Even a simple IR loader or Faust node or Third Party node.
But at least for me and many professional developers here, the main goal is to be able to use the models we trained ourselves.
And I think an effective RT-Neural node would definitely drive a lot of new HISE adoption.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
@Christoph-Hart I'm using aidax in a real project, and would love to have NAM implemented.
-
@Christoph-Hart
If I might throw my humble 2 cents in... and my 2 cents are change left over from a whole career of working in guitar amp modeling software, having spent almost 20 years with the ReValver amp modeling software and another 4 years with the HeadRush pedals and software.
The software market segment of simulating hardware has completely turned from component-level modeling to leveraging NN modeling. Neural network modeling allows us to do in a matter of days what took months or years to accomplish, with better results (most of the time). Beyond guitar products, NN modeling lends itself to preamp, channel strip, etc. modeling, which is why it's such an opportunity for @orange. This has really nothing to do with simply downloading models from tonehunt.org; this is about implementing something that will be "how it's done" from this moment on.
Here's a real-world scenario: I am currently working for a guitar amplifier manufacturer that has a 60+ year history in guitar amps. They would like to release plugin versions of every flagship amp they have ever made AND ship a plugin version with every new amp that they design and release. This is my project, NN modeling is how to do it, and I have to find the solution that lets me get it done.
Having said that, I have really fallen in love with HISE in a short time, but it might not be a realistic solution (at this point) for my project. I would hate to see all this not properly implemented in HISE; it would be such a missed opportunity. Having said all that... we are about to enter the 2nd wave of NN modeling market disruption, which will truly change everything... again:
The first publicly-available parametric Neural Amp Model
Today, I'm releasing ParametricOD, a plugin that uses NAM's parametric modeling capabilities to give you a model of my overdrive pedal that is accurate across the full range of the pedal's knobs and switches. Get it here: Users | Neural Amp Modeler. The plugin is available in VST3 and AU formats for macOS and VST3 for Windows, with similar compatibility to the open-source snapshot plugin ("the NAM plugin", or perhaps just "NAM", to many users). This plugin is intended as a "concep…
Neural Amp Modeler (www.neuralampmodeler.com)
Thanks for letting me get that out....
-
@Christoph-Hart I would like to share with you what I can think of to solve the performance problem:
While compiling the neural node as a DLL, I noticed that it did not embed the neural network into it.
When loading this node into the Hardcoded FX, we need to load that complex, large neural-network JSON file again, otherwise there is no sound. Could this be causing the performance decrease? It must be something obvious.
-
@JulesV nah, that will at most take a bit longer for the models to initialise but it won't cause any runtime performance difference.
No idea what exactly is causing this. I would have to compile the other plugin, profile it, and then compare the build flags, which is a tedious process that I currently have no time for.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
@Christoph-Hart I'm only interested in using models of things I built and trained myself. I plan on using this as part of a hybrid approach where I model certain physical circuits with NAM and then use DSP for other aspects. Not just grabbing stuff off ToneZone and putting a plugin wrapper around a single profile.
As @Dan-Korneff said, NAM has a much more solid and predictable training pipeline, not to mention an easy-to-use, free, fast, cloud-based platform for training. That's what makes it desirable. There really are a lot of options for making really cool stuff.
-
@Dan-Korneff said in 8 Times more CPU consumption on Aida-X Neural Models:
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website, with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
Adding NAM support is super helpful for some people as the training scripts are solid and actively maintained. I'll be using RT-Neural on this end for the parameterized support. Still making a few tweaks to the training script I started with, but I’m planning to build around it soon.
I would love to see an example of a parameterised model up and running. The way I plan on profiling won't require parameters for the most part, but it would be useful every now and then.
-
I also just want to say, @Christoph-Hart, in the face of the really shitty attitude that person above showed: I love HISE so much. I never thought I'd be able to create a simple plugin that just changes the gain, let alone the stuff I can do with HISE as someone with almost zero coding experience. It's such an awesome thing you've created.