Getting Audio out of the plugin - by dragging to the DAW..

• Lindon

...just like MIDI

Did I see something about this being available?

• Christoph Hart @Lindon

https://docs.hise.audio/scripting/scripting-api/engine/index.html#renderaudio

• clevername27 @Christoph Hart

@Christoph-Hart @Lindon Both of these functions are interesting. Is there example documentation on them (dragging audio and MIDI) beyond the reference documentation? (There seem to be a lot of possible variables in these tasks, especially audio, and then there are the mechanics of how you expose the data to be physically dragged.) I'd really appreciate it, thanks.

• DabDab @clevername27

@clevername27 said in Getting Audio out of the plugin - by dragging to the DAW..:

    Is there example documentation on them (dragging audio and MIDI) beyond the reference documentation?

An example snippet, please.


• clevername27 @DabDab

@DabDab Thank you for your reply. Are you asking me to provide a snippet, or suggesting that someone else do so? If the latter, I can't provide a snippet, as I am not familiar enough with the subject to create a framework.

• Christoph Hart @clevername27

Usually I hold back the things I implement for a project until some time has passed and the API has stabilized, but that is the case here.

You basically give the Engine.renderAudio() function an array of MessageHolder objects that contain a MIDI sequence. It then renders the audio on a background thread and periodically calls a supplied JS callback with an object that contains the current render state.

inline function createEventList()
{
    local events = [];

    // first note: note number 55, starting at the beginning of the render...
    local on = Engine.createMessageHolder();
    on.setType(1); // note on
    on.setChannel(1);
    on.setVelocity(127);
    on.setNoteNumber(55);
    on.setTimestamp(0);

    // ...and released after an eighth of the render length
    local off = Engine.createMessageHolder();
    off.setType(2); // note off
    off.setChannel(1);
    off.setNoteNumber(55);

    // calculate the length you want somehow
    local numSamples = getRenderLengthSamples();

    off.setTimestamp(numSamples / 8);

    // second note: note number 56, starting halfway through the render
    local on2 = Engine.createMessageHolder();
    on2.setType(1); // note on
    on2.setChannel(1);
    on2.setNoteNumber(56);
    on2.setVelocity(127);
    on2.setTimestamp(numSamples / 2);

    // ...and released at the end of the render
    local off2 = Engine.createMessageHolder();
    off2.setType(2); // note off
    off2.setChannel(1);
    off2.setNoteNumber(56);
    off2.setTimestamp(numSamples);

    events.push(on);
    events.push(off);
    events.push(on2);
    events.push(off2);

    return events;
}

// this is called periodically (~every 500ms) so you can update a progress bar
inline function onRender(obj)
{
    // the (full) audio signal that has been rendered so far:
    // an array of Buffer objects, one per channel
    local d = obj.channels;

    // the render progress from 0...1
    local progress = obj.progress;

    // a bool that turns true when the render is finished
    local isFinished = obj.finished;
}

Engine.renderAudio(createEventList(), onRender);
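
For the "drag it to the DAW" part of the topic, here is a minimal follow-up sketch of what the callback could do with obj.channels once obj.finished is true: write the rendered buffers to a .wav file so there is an actual file to hand over. The target folder, file name, bit depth and the writeAudioFile call are assumptions, so check the FileSystem and File API references for the exact methods in your HISE version.

// sketch only: write the finished render to disk (folder, name and bit depth are placeholders)
inline function writeRenderToDisk(channels)
{
    local folder = FileSystem.getFolder(FileSystem.AudioFiles);
    local target = folder.getChildFile("RenderedOutput.wav");

    // channels is the array of Buffer objects from obj.channels
    target.writeAudioFile(channels, Engine.getSampleRate(), 24);
}

// call it from the render callback, e.g.:
// if (obj.finished)
//     writeRenderToDisk(obj.channels);

Exposing that file for a physical drag is then a separate step (typically a Panel mouse callback that starts an external file drag); check the ScriptPanel API reference for what your HISE version offers there.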
                
• clevername27 @Christoph Hart

@Christoph-Hart Thank you, kindly.

• DabDab @clevername27

@clevername27 Yeah, I thought so. But we've got a script from @Christoph-Hart now.
