Getting Audio out of the plugin - by dragging to the DAW..
-
...just like MIDI
Did I see something about this being available?
-
@Christoph-Hart @Lindon Both of these functions are interesting. Is there example documentation on them (dragging audio and MIDI) beyond the reference documentation? (It seems like there are a lot of possible variables in these tasks, especially for audio, and then there are the mechanics of how you expose the data so it can physically be dragged.) I'd really appreciate it, thanks.
-
@clevername27 said in Getting Audio out of the plugin - by dragging to the DAW..:
@Christoph-Hart @Lindon Both of these functions are interesting. Is there example documentation on them (dragging audio and MIDI) beyond the reference documentation? (It seems like there are a lot of possible variables in these tasks, especially for audio, and then there are the mechanics of how you expose the data so it can physically be dragged.) I'd really appreciate it, thanks.
An example snippet, please.
-
@DabDab Thank you for your reply—are you asking me to provide a snippet, or suggesting that someone else do so? If the latter, I can't provide a snippet, as I am not familiar enough with the subject to create a framework.
-
Usually I hold back the things that I implement for a project until some time has passed and the API has stabilized, but that's the case here.
You basically give the Engine.renderAudio() function an array of MessageHolder objects that contain a MIDI sequence, and it will then render the audio on a background thread and continuously call a supplied JS function with a callback object that contains some state.

```javascript
inline function createEventList()
{
    local events = [];

    local on = Engine.createMessageHolder();
    on.setType(1);
    on.setChannel(1);
    on.setVelocity(127);
    on.setNoteNumber(55);
    on.setTimestamp(0);

    local off = Engine.createMessageHolder();
    off.setType(2);
    off.setChannel(1);
    off.setNoteNumber(55);

    // calculate the length you want somehow
    local numSamples = getRenderLengthSamples();

    off.setTimestamp(numSamples/8);

    local on2 = Engine.createMessageHolder();
    on2.setType(1); // note on
    on2.setChannel(1);
    on2.setNoteNumber(56);
    on2.setVelocity(127);
    on2.setTimestamp(numSamples);

    local off2 = Engine.createMessageHolder();
    off2.setType(2); // note off
    off2.setChannel(1);
    off2.setNoteNumber(56);
    off2.setTimestamp(numSamples);

    events.push(on);
    events.push(off);
    events.push(on2);
    events.push(off2);

    return events;
}

// this is being called periodically (~every 500ms) so you can update a progress bar
inline function onRender(obj)
{
    // contains the (full) audio signal that has been rendered so far...
    // an array of Buffer objects for each channel.
    local d = obj.channels;

    // the render progress from 0...1
    local progress = obj.progress;

    // a bool when the render is finished
    local isFinished = obj.finished;
}

Engine.renderAudio(createEventList(), onRender);
```
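As a follow-up, here is a minimal sketch of how the two open ends of the snippet above could be filled in. Note that getRenderLengthSamples() and the "ProgressKnob" component are hypothetical names, not part of the original snippet; the sketch assumes Engine.getHostBpm(), Engine.getSampleRate() and Content.getComponent() behave as described in the scripting API reference, so verify against the documentation.

```javascript
// Hypothetical helper used by createEventList() above: here it returns the
// length of two 4/4 bars at the current host tempo. Adjust as needed.
inline function getRenderLengthSamples()
{
    local bpm = Engine.getHostBpm();                               // host tempo in BPM
    local samplesPerQuarter = Engine.getSampleRate() * 60.0 / bpm; // samples per quarter note
    return Math.round(samplesPerQuarter * 8.0);                    // 8 quarter notes = 2 bars
}

// Hypothetical progress display: "ProgressKnob" is assumed to be a knob
// (range 0...1) that exists on your interface.
const var ProgressKnob = Content.getComponent("ProgressKnob");

inline function onRenderWithProgress(obj)
{
    ProgressKnob.setValue(obj.progress); // update the UI while rendering

    if (obj.finished)
    {
        // obj.channels is an array of Buffer objects (one per channel);
        // this is where you would write them to a file or hand them to
        // whatever drag-export mechanism you use.
        Console.print("Render finished with " + obj.channels.length + " channels");
    }
}

Engine.renderAudio(createEventList(), onRenderWithProgress);
```

The callback fields used here (progress, finished, channels) are taken straight from the original example above.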
-
@Christoph-Hart Thank you, kindly.
-
@clevername27 Yeah, that's what I thought. But we have got a script from @Christoph-Hart.