Third Party Node + Offline processing?
-
Do third-party nodes, when loaded into hardcoded effect slots, allow offline processing via the regular DSP network calls?
If not, what's the easiest way to process an IR through a third-party node? Something like:
```
const myCoolThirdPartyNode = Synth.getEffect("myCoolThirdPartyNode");
// or: const myCoolThirdPartyNode = Synth.getSlotFX("myCoolThirdPartyNode");

inline function offlineProcess()
{
    local impulseSize = 1024;

    myCoolThirdPartyNode.prepareToPlay(44100.0, impulseSize);

    local buffer = Buffer.create(impulseSize);
    buffer[0] = 1.0; // Dirac delta

    local channels = [buffer, buffer];

    // process through third party node somehow here
    // then save the modified buffers as a stereo IR file or do other processing after
}
```
-
If by offline, you mean doing a heavy process without freezing the program / spiking cpu, you can create a new thread and run the process on that thread / chunk the process into blocks and do a chunk at a time.
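For reference, a minimal HiseScript sketch of the chunking half, assuming a made-up processChunk() standing in for whatever heavy per-block work you need: a timer object eats one block per callback, so no single call hogs the UI thread for long.

```
// Hypothetical heavy job split into fixed-size blocks (all names here are made up).
const var totalSamples = 65536;
const var blockSize = 1024;
const var data = Buffer.create(totalSamples);

reg position = 0;

// Stand-in for the expensive per-block work.
inline function processChunk(buffer, startIndex, numSamples)
{
    local i = 0;

    for (i = startIndex; i < startIndex + numSamples; i++)
        buffer[i] = buffer[i] * 0.5; // placeholder DSP
}

const var chunkTimer = Engine.createTimerObject();

chunkTimer.setTimerCallback(function()
{
    // Do one block per timer tick, then stop when the whole buffer is done
    processChunk(data, position, blockSize);
    position += blockSize;

    if (position >= totalSamples)
        chunkTimer.stopTimer();
});

chunkTimer.startTimer(50); // one chunk every 50 ms keeps the interface responsive
```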
-
@griffinboy Yeh I just mean running an audio file (Dirac delta impulse) through a single hardcoded FX module and saving the output as a .wav (while ignoring any other incoming audio signal).
I've been doing it with an AudioLoopPlayer, a MIDI sequence and `Engine.renderAudio()`, but it's a bit clunky. I could also do it inside the C++ node, but that involves global cable shenanigans and potential race conditions.
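For the .wav part, once the rendered channel buffers exist, the write itself is only a couple of calls from the interface script. A minimal sketch, assuming a pair of Buffers called renderedL / renderedR as placeholders for whatever the render produced:

```
// Sketch: write two already-rendered Buffers as a stereo .wav (buffer names are placeholders).
const var renderedL = Buffer.create(1024);
const var renderedR = Buffer.create(1024);

inline function saveImpulseResponse()
{
    // Target file inside the project's AudioFiles folder
    local irFile = FileSystem.getFolder(FileSystem.AudioFiles).getChildFile("rendered_ir.wav");

    // writeAudioFile takes the channel array, a sample rate and a bit depth
    irFile.writeAudioFile([renderedL, renderedR], 44100.0, 24);
}

saveImpulseResponse();
```
-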
Ah right right. I think this is a question for Christoph then! However he did answer something similar very recently. And I think global cables was the answer he gave
-
A buffer can be sent via cable data or external data (and maybe received via cable data only, as I don't know if external data is bidirectional).
This might allow offline processing.
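On the script side, the receiving end would look something like this. Just a sketch of registering a callback on a global cable, where "ir_done" is a made-up cable ID; whether a whole buffer can ride on it rather than single values is exactly the part I'm unsure about.

```
// Sketch: listen to a global cable from the interface script ("ir_done" is a made-up ID).
const var routingManager = Engine.getGlobalRoutingManager();
const var irCable = routingManager.getCable("ir_done");

irCable.registerCallback(function(value)
{
    // value is whatever the node side pushed onto the cable
    Console.print("Cable update: " + value);
}, AsyncNotification);
```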
-
@ustk You can generate the buffer inside the node without needing to pass anything in; the global cable would be used to pass the export filepath, since C++ nodes don't support strings or file objects.
Best case scenario for me would still be using `DspNetwork.processBlock()`, but I'm not sure if it works in a hardcoded FX module; I think it's a pretty old method.
-
Bumping/updating this: I can't use `Engine.renderAudio()` because it's an FX plugin, and the scriptnode setup from this thread won't compile, and crashes if I replace the Analyzer node with a manual gate. I also can't do it in the third party node, because I also have a Parametric EQ that I'd like to include in the "chain". I need this:
- Embedded audio file (already created, only 1024 samples)
- Pass audio file through 2 modules (third party node & Parametric EQ)
- Render output to .wav
Edit: deleted my old topic on this because apparently I have amnesia
Edit #2: `Enable Sound Generators FX` also doesn't work, since it's still muting the sound generators.
-
@iamlamprey So if I understand correctly, your Dirac impulse can simply be a constexpr array in your node.
Then, once processed, you can send the resulting buffer to the interface script using a cable and create your wav file from there.
Or am I not getting it?
-
@ustk Yeh that would work, but I also have a regular old HISE Parametric EQ as part of the "chain", so I either need something like:

```
dspModuleA.processOffline(buffer);
dspModuleB.processOffline(buffer);
// then save
```

Or I'd have to recreate the parametric EQ in C++ (and a custom panel) which, yuck.
Update: I've gotten a bit further with the scriptnode method. I can compile the network if I remove the analyzer node, and since my audio file has a fixed length I can set the record/play manually. The file player node is very temperamental and starts playback any time I compile, for some reason. I'll keep noodling.
-
@ustk Turns out you're a genius and have already solved my problem 🥳:
https://forum.hise.audio/post/91573
I can replace the Sound Generator with a ScriptFX and use a MIDI container to get the gate information; tested and working in the compiled plugin.
Thank you very much ustk circa 2024
