@Lurch said in Neural Amp Modeler (NAM) in HISE:
EDIT: Ignore the Const Var at the end there, copied by accident, doing a few things at once.
this is funny because you could've used the edit to remove it
@JulesV I haven't gotten around to actually implementing this, but it should work. Instead of just reading the file as an object, I would just paste the data in there.
@Chazrox Console.print(trace(yourArray));
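For example (the array here is a made-up placeholder), trace() turns the object into a JSON-style string so Console.print() can show its full contents:

```javascript
// Hypothetical data, purely for illustration
const var yourArray = [1, 2.5, "three", {"id": 4}];

// trace() converts the array into a readable JSON-style string,
// Console.print() writes that string to the HISE console
Console.print(trace(yourArray));
```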
@rglides You're gonna run into sycophancy and agreement issues even when running a local open source model.
It does not have training data for HISE or its docs, and likely none for the interface screenshots or videos of it, etc.
Even Context7 availability is limited by the model's tool-calling ability, and output quality quickly degrades with context size (except on the latest frontier models).
The real benchmark would be to try and use something like this https://github.com/e-p-armstrong/augmentoolkit to add training data to a good open source LLM like Devstral. Then feed it the docs, the entire forum content, transcripts of David's videos etc.
You would then try to use it with a strong system prompt that discourages glazing and agreeing.
@d-healey how so? aren't you just loading it from the computer in that case?
EDIT: Ah, the IPP license clarifies this.
@d-healey If HISE resorts to using the IPP that's available via NuGet, or any of the versions that can be found with direct links on the Intel website, then you could add it to the installer? There would then have to be an option so that it's not included in the plugin, but found on the PC.
Kind of like how video games of old used to pack C++ redistributables and DirectX versions.
@Gab you can use a Webview and get the best performance that way.
Right click -> Find all occurrences. It will look in all included files.
If you want to search even the non-included ones, use VSCode, open the Scripts directory in a workspace and use Ctrl+Shift+F (or H for replace) to search and/or replace in all of them.
@marcrex Ship ollama or a similar layer with your plugin, and a model of your choice. For this you probably want a good tool calling model, like Devstral. You will need the super light version. Unsloth has a super small quant https://huggingface.co/unsloth/Devstral-Small-2505-GGUF/blob/main/Devstral-Small-2505-UD-IQ1_S.gguf but I have no idea how well it runs. Check Gemma 1b as well.
Your system prompt needs to contain the functions that it has available for use and what each of them does.
Define these functions in HISE.
You'll run it by mounting the model on plugin start (if it isn't mounted already), using a BackgroundTask. These small models should be fast to mount anyway, so it's best to have it unload itself automatically after an hour of not being used.
The user can then write an instruction, and the model is fed system prompt + instruction via the BackgroundTask. The system prompt details what the model can do; the instruction tells it what to do. You want to explicitly tell it to return a "function call" in a clearly specified format (for example doThis(arg1, arg2)) and for that to be the only output, no other text.
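As a rough sketch in HiseScript (the function names, prompt wording, and reply format below are all my own assumptions, not anything fixed):

```javascript
// Hypothetical "tools" the model is allowed to call - define whatever your plugin actually needs
inline function setReverbMix(amount)
{
    // clamp to 0..1 and apply it to a module/parameter of your choice
    local v = amount < 0.0 ? 0.0 : (amount > 1.0 ? 1.0 : amount);
    Console.print("Setting reverb mix to " + v);
};

inline function loadPreset(presetName)
{
    Console.print("Loading preset: " + presetName);
};

// System prompt that describes the tools and the required output format
const var SYSTEM_PROMPT = "You control an audio plugin. You can ONLY reply with a single function call, " +
                          "no other text. Available functions:\n" +
                          "setReverbMix(amount) - amount is 0.0 to 1.0\n" +
                          "loadPreset(presetName) - presetName is a string in quotes\n" +
                          "Example reply: setReverbMix(0.4)";
```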
You should then validate the return value of the BackgroundTask call by checking (with string operations) that it contains one of your functions, and if it validates, call eval() on it. eval() will attempt to execute the string as code; in this case it's a function you already defined, so it will run.
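A minimal sketch of that check, assuming the reply comes in as a plain string; the whitelist idea and the helper name are mine, and the exact string methods available may differ between HISE versions:

```javascript
// Only calls that start with one of these prefixes are allowed to reach eval()
const var ALLOWED_CALLS = ["setReverbMix(", "loadPreset("];

inline function runModelReply(reply)
{
    local cleaned = reply.trim();   // strip whitespace/newlines around the model output
    local ok = false;

    for (prefix in ALLOWED_CALLS)
    {
        // the reply must start with a whitelisted call
        if (cleaned.indexOf(prefix) == 0)
            ok = true;
    }

    if (ok)
        eval(cleaned);   // executes e.g. setReverbMix(0.4) - a function you already defined
    else
        Console.print("Rejected model output: " + cleaned);
};
```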
In the end, the brunt of the work is you designing the tools (functions) that the model can call.
This is a very simple example. If you want to chain tool calls, you'll need to feed the output of the function back into the model along with the previous prompt, so that it can continue the work if needed.
If you want an example of how to use eval(), I did a cool experiment here https://forum.hise.audio/topic/9086/hise-s-best-kept-secret-dynamic-plugins
These are the limits of my current knowledge on the subject; some of these workflows may already be outdated, but hopefully this gives you a good starting point to go and explore.
Before you download ollama and start fiddling with local inference, you can try it out instantly via OpenRouter. https://openrouter.ai/mistralai/devstral-small:free
This free endpoint will of course use any data you send for training, but you're not gonna be sending any sensitive info, just your system prompt and whatever the user types in. For a simpler implementation (though obviously not as rock-solid) you can let the user enter their own OpenRouter API key and use the free endpoint that way, or provide your own via cloud functions, at which point it becomes a service. However, this way everyone can run the proper 24B model; on MacBooks with 32+ GB of RAM it runs fine, but on Windows, VRAM isn't super easy to come by.
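If you try the OpenRouter route from within HISE, a hedged sketch using the Server API could look like this. Note that Server.setHttpHeader() and the exact shape of the JSON response are assumptions on my part; double-check both against the docs and the OpenRouter reference before relying on it:

```javascript
// Assumption: the user has pasted their own OpenRouter API key into your UI somewhere
const var OPENROUTER_KEY = "sk-or-...";   // placeholder only, never hardcode a real key

Server.setBaseURL("https://openrouter.ai");

// Assumption: your HISE build exposes Server.setHttpHeader() for custom headers
Server.setHttpHeader("Authorization: Bearer " + OPENROUTER_KEY);

inline function askModel(userInstruction)
{
    local body = {
        "model": "mistralai/devstral-small:free",
        "messages": [
            { "role": "system", "content": SYSTEM_PROMPT },   // from the earlier sketch
            { "role": "user",   "content": userInstruction }
        ]
    };

    Server.callWithPOST("/api/v1/chat/completions", body, function(status, response)
    {
        if (status == 200)
            runModelReply(response.choices[0].message.content);   // validate + eval, see the sketch above
        else
            Console.print("Request failed with status " + status);
    });
};
```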
@DanH if you need context
no, as you're not using polyphonic FX, which can only be used inside a processor anyway / Not so fast, the effect you describe is how this flag is working wit... (forum.hise.audio)
@DanH it is if your network features reverb, delays etc
@pcs800 The best Faust tutorials are unfortunately the source code of the library effects. Find it on GitHub. Feed the main syntax docs into a good LLM like o3 or Gemini Pro, then ask it questions about the specific library effect that you're reading.
Of course, read the whole syntax reference yourself first.
I use this with a simple 4-node setup. It's pretty good and useful if you also link it up with other compare nodes so you can have dynamic logic that doesn't need explicit parameter setting but can react to various states in the network. Honestly no idea how ScriptNode went as long as it did without the compare node.
HiseSnippet 1719.3oc6X0sTaaDEVByZBljzjNzI8hdgtnWP5jgw17SBSu.yuILECdPTR5ULKRqw6Xoc0HsBvsSuuuBo2z9Zz65iPeD5i.uAomckDVqsKX6DBscJYlLdO+r62d9aO5zHj6Phh3gFlkNrS.wv79H6NLQqMZgoLic1zv7gn53HAIzJgz5cBvQQDWCSyBuTRvb5IMT+c4pqi8vLGRWRFFGwoNjco9TQWpMp8MTOuswtjCo94jdwZ63vYav83w.dJfJaDfcZiOkrGVJ1DHiWgiZYX9UnkptfyhtO2sZ0kVYYG7xjUZha1rxBtKt3xuXkJKrBdwWrLo7RFlE2xkJ3g1BrfDAa55b2N1s3myRNfinQzS7HxEULrgSNgrwFsndtMxLNQFFlnFcMUERLUyhpScoWQuqI6QJFVc0HuQybhqCRUFAHYlCRSl.oGircBoAhtbj3YFzNLvC1DC9l7PIQVCyeGsAGDfIl2G2lrcHr3JEla4xkelE7eO8qaFybDTNyhy1iKH6yl6ok9gRSW5GKY0KqlMGHO4wDx87HgCjsLbH75TbNVr+IjvmYcF1Klbkfv0W2lVb3roNI25bBxY6vnh8CHoq2l64JsUxe2uGvH0rA+5a2YSr.KcJoz.4BHgBpDNlaRNCRCRbQSi1jD0VvCfDg97ePjC2M1CKzCmjIZoL.6glOT5nXQTQm7IhiPLV4qMFaXg3iQMnBmVCFiSL.LBVpaCLllY9.zVMaRbDcA3jnsey3lFVdjSCKklFBmo7vue5RqseSkAl.9SntA6Agj.bH4PdCObm4hv9AdjC.L9LqS73NssoeOo+LjfDHrtTh4bZgYLhWz3jHU7CTwIiC3wBJ6z5XQH8Bn74dw91PccGxFonCnYNgLGJYcY4ZYbgMg4pV7N3uTlUjqMSYVIiYtzu8Hhy4gsUtizeaXNUhsORYjOt4EULVyyie9Fb+.ZZTJ3CTzZv85DzhynNRRIRjgz074wvgjBW3InCwTOYXscbDTnvcelMHr5cOyBGQBiTa7TnxyC+CBy2i6BrJtM1ALrcZfEsj4KxZOPvIIbdmqBR0vZ93jjG.0XK20rmJSLb.p8kVC4CklExU+IIDUtBR.mH4f1I5HIWGrmwQxP.48QsmIlvDHKEcJTB9tA32OZmEkZer1mYkYgtMQcpgV40uWBpqL5v9gHa5oLrGztCwo8M.3rmGLm38C5SlyfWDEPvs0A9L.vCIyqXzOjS03FrsSlGfFcqTeHN7ThHJ8oKFwIIwPYM2wU9dkCjN.UjLZfCgyC5KPRtHZWRSg5LfBTWj852lzn.nt05wP82jpiZTjlClK4hrT3sfpPtfITodVZ8UmiDn4rLcQx.7pPQs4yX2uIpqpiRD3HXkJh.LGzqI5kPUTCs6Cb.Yqfnz5TVZvP22EqiunGZWtpsfDHK7mWN61jySrC4kLIBR4Z5am2jzDG6I5i9ZwBtOfTUfrQN7M0sC9lBc.8zVi..GCL8E05GS.N+awz8QanBPjs0LB.SKuUEALn7VEiAk2pXbKk2VJKjuxcRhqVfduNse8su82VsOmVseoWmlRtqyo0fDR4tpOiru8ZvNsuq1GgH7YP6x4APSTg8Eke4pCST9sVkAUEogDRWtpdkgbA64BtF4Zw4z8VpX7CPQ7lhiOQcz5Q+kPWAn+ut7vUWd3hXuCpKeuZ2Lv51BgpuP8HCsP2mjq4v7B0e.r9lLjcHVnmNDmX35PrGo9DjsOmKZAeYU9hdHyYydMp292S6Dt5n2I7TnMi886ncAmt3idzO+tOn8sq6qFxEqy4s8wpO368dzKOIg4LnWiOinF9o5iw9L05l7PeqWR.6jLBrx0L9y+XXG+YvPO9y8cD.DNLDyhB3QP8x7owDe5gbFIRiZOZTcfZTUu3iHlou0IjzjRZJ1FLEUxm8kQrZdhMvLsMCVqsS0kSCHa4et5VL7IdDaBbyc2OxALL3dpBzH1Kh7ZpqnUk7J1kb07jeENzE7gNZSZpf9PMlb3Fpg9P2xI+.lATw6foC+gYxc2gCWTaxYSmgQapruzsXmQ7fzXEF+TTZg9Lp54g04Ld1va550OfHBomdJQ+IkAcgVSHvvm5eEkYqc.wifixka9k01EpUhCqqJrNV1hQeX9Cze84nD3ZIKdX8u2gBW3+zCEd3y9f5g+y1K9DDTWtpkxUZc23K+XLK4OFmgO1Ijeb5WHIS5umhBbuYoSoptbsUEiy5siVe3Iiicbz2p9Tr53p3BiqhKNtJtz3p3xiqhOebU7E2rhxWzS+bMYtAzlSisRFMhYRGNpzDi+B.BVVDB
Precisely because you can plug more compare nodes into the logic and chain them up, this is more flexible than what Chris could do.
When expanded: [screenshot of the expanded network]
This is what you want to do. You want audio going through HISE. Are you sure you built it with the HISE_BACKEND_AS_FX=1 preprocessor definition?
@Adam_G ignore it, just plug the cables into the inputs
@ustk and compile the HISE plugin with the backend-as-FX flag
Processing Info-App.plist
▸ Compiling include_hi_dsp_library_02.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_dsp_library_01.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling MainComponent.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling Main.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling Exporter.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_lac.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling DialogLibrary.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling BinaryData.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie_1.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie.mm
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie_10.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie_12.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie_13.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
▸ Compiling include_hi_rlottie_3.cpp
❌ clang: error: unsupported option '-mpopcnt' for target 'arm64-apple-macos10.11'
etc. etc. etc. it keeps going
@Chazrox this way you learn some markdown which will undoubtedly come in handy!