8 Times more CPU consumption on Aida-X Neural Models
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@JulesV I pushed the preliminary work I had on my mobile rig but it's not usable as it is. I'll need to do a few more performance tests and cleanup.
Almost 6 months have passed and the problem still persists. Why is the neural node still unusable after all this time?
You need to solve this problem as soon as possible, my friend.
-
@Fortune just don't use it then.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@Fortune just don't use it then.
Wow, what an amazing answer you gave as the creator of HISE :)))
It has nothing to do with me, bro. The question is this:
Will the Developers be able to use Neural networks with HISE or not?
Will your HISE be compatible with the future or not? That is the question.
It is obvious from your answer that you have failed to solve the source of the problem. But by acting like this, all you will do is alienate people and ruin your potential future businesses, nothing else.
-
@Fortune said in 8 Times more CPU consumption on Aida-X Neural Models:
It has nothing to do with me bro,
Then I guess it's not a problem.
HISE is an open source project - you cannot expect any given piece of functionality to get added and to be supported because you want it.
However, you are not stuck with the whims of @Christoph-Hart; if there's something you want and Christoph has not built it, or has not completed it to your satisfaction, then you can go build it yourself.
-
I'd rather remove the entire RT Neural framework from HISE so it doesn't raise false expectations, but it's way down on my priority list to spend any time on that issue and your communication skills just pushed it way further down.
15 upvotes to this post and that shit is gonzo.
-
@Christoph-Hart please don't back away from implementing this, at whatever pace fits your calendar. I understand the frustration of getting such behaviour from frustrated people, but one is not all. We all expect new stuff, fixes, and that "bloody function only I can dream about, even if it takes the rest of your life working like a mule to implement it, and that I will just play with once and say « arrff… I finally don't need it… »"
It is for sure something many of us are counting on, but telling you how to do your job is not correct behaviour.
As @Lindon said, since HISE is open source, one cannot firmly demand a function he cannot implement by himself, but only kindly ask, expect, cross fingers, and raise votes for his party.
Empathy, Hise, and
are the first building blocks for a better world!
-
@ustk yeah that's what we really want; more votes for partying.... that's what you meant, right?
-
@Lindon exactly, it’s my English that is gonzo
-
"gonzo" - word of the day!
gonzo
/ˈɡɒnzəʊ/
adjective, informal, North American
1. relating to or denoting journalism of an exaggerated, subjective, and fictionalized style.
2. bizarre or crazy.
-
@Christoph-Hart This is a feature that many of us want, not just one person. Please don't let the nervous attitudes and communication errors of some people prevent its development.
This promising feature will positively impact the entire product development workflow. And as you will appreciate, when the industry is just starting to use it and it's possible to make great products that haven't been made before, there's no point in removing it just for that reason.
Whenever you are available, it's ok. We all love your work. Just keep doing what you're doing, just like you did before.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
I'd rather remove the entire RT Neural framework from HISE so it doesn't raise false expectations
-
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
@Dan-Korneff alright then it stays, the people have spoken :)
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
Adding NAM support is super helpful for some people as the training scripts are solid and actively maintained. I'll be using RT-Neural on this end for the parameterized support. Still making a few tweaks to the training script I started with, but I’m planning to build around it soon.
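For anyone curious what the RT-Neural side looks like, here's a minimal sketch of loading and running a model with the RTNeural C++ library using its standard JSON loader. The file name is a placeholder, and treating the knob position as an extra conditioning input alongside the audio sample is my assumption about how a parameterized training script conditions the network:
```cpp
#include <RTNeural/RTNeural.h>
#include <fstream>
#include <iostream>

int main()
{
    // Load a network exported to RTNeural's JSON format (placeholder file name).
    std::ifstream jsonStream("parametric_model.json", std::ifstream::binary);
    auto model = RTNeural::json_parser::parseJson<float>(jsonStream);
    model->reset();

    // Per-sample inference: one audio sample plus one conditioning value
    // (e.g. a gain knob). The extra input is an assumption about how a
    // parameterized model is conditioned; a plain snapshot model would
    // take the audio sample only.
    float input[] = { 0.1f /* audio sample */, 0.5f /* knob position */ };
    float out = model->forward(input);

    std::cout << "output sample: " << out << std::endl;
    return 0;
}
```
In a real-time context that forward() call runs once per sample inside the audio callback, which is where the CPU cost discussed in this thread comes from.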
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
There is also widespread adoption of the NAM format. Darkglass just released a NAM-compatible product:
https://www.darkglass.com/creation/anagram/
https://www.tone3000.com/blog/unveiling-anagram-darkglass-electronics
-
@Christoph-Hart said in 8 Times more CPU consumption on Aida-X Neural Models:
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
Yes, it definitely works. I use it for a tube preamp that is not a guitar amp; it works great and replicates the nonlinearities of the original hardware successfully. It is great for guitar amps too.
To be honest: I'm a bit "disincentivized" about fixing the NAM stuff because I don't see a use case except for loading in existing models from that one website with potential infringement of other people's work, but maybe my understanding of the possible use cases here is wrong.
As you know, anything can be used to copy other people's work, even a simple IR loader, a Faust node, or a Third Party node.
But at least for me and many professional developers here, the main goal is to be able to use the models we trained ourselves.
And I think an effective RT-Neural node would definitely boost HISE adoption significantly.
-
Just out of curiosity: who is using this in a real project and does it work there? Like except for the NAM loader?
@Christoph-Hart I'm using Aida-X in a real project, and would love to have NAM implemented.
-
@Christoph-Hart
If I might throw my humble 2 cents in... and my 2 cents are change left over from a whole career of working in guitar amp modeling software, having spent almost 20 years with the ReValver amp modeling software and another 4 years with the HeadRush pedals and software. The software market segment of simulating hardware has completely turned from component-level modeling to leveraging NN modeling. Neural network modeling allows us to do in a matter of days what took months or years to accomplish, with better results (most of the time). Beyond guitar products, NN modeling lends itself to preamp, channel strip, etc. modeling, which is why it's such an opportunity for @orange. This has really nothing to do with simply downloading models from tonehunt.org; this is about implementing something that will be "how it's done" from this moment on.
Here's a real-world scenario: I am currently working for a guitar amplifier manufacturer that has a 60+ year history in guitar amps. They would like to release plugin versions of every flagship amp they have ever made AND ship a plugin version with every new amp that they design and release. This is my project, NN modeling is how to do it, and I have to find the solution that lets me get it done.
Having said that, I have really fallen in love with HISE in a short time, but it might not be a realistic solution (at this point) for my project. I would hate to see all this not properly implemented in HISE. It would be such a missed opportunity.
Having said all that... we are about to enter the 2nd wave of NN modeling market disruption, which will truly change everything... again:
The first publicly-available parametric Neural Amp Model
Today, I'm releasing ParametricOD, a plugin that uses NAM's parametric modeling capabilities to give you a model of my overdrive pedal that is accurate across the full range of the pedal's knobs and switches. Get it here: Users | Neural Amp Modeler. The plugin is available in VST3 and AU formats for macOS and VST3 for Windows with similar compatibility to the open-source snapshot plugin ("the NAM plugin"--or perhaps just "NAM"--to many users). This plugin is intended as a "concep…
Neural Amp Modeler (www.neuralampmodeler.com)
Thanks for letting me get that out....