Lunacy Audio presents CUBE
-
@nesta99 Thank you! Are you asking who designed the sample editor or how we programmed the sample editor?
-
@iamlamprey Yup! The Orbits system is only timers, and so far we haven't had any stability issues related to it. We're running a fast timer in the `onTimer` callback. The key for us was to really optimize any calls in the timer, because every extra call can bump the processing up substantially: use local variables, don't create new objects, avoid unnecessary API calls, etc. Also, you'll want to make sure you call `Synth.deferCallbacks(false)` so that the timer runs in real time and isn't deferred.
-
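Casey's timer advice above might look something like this in practice (a minimal sketch, not CUBE's actual code; the variable names are made up):

```javascript
// on init — keep all callbacks, including onTimer, on the real-time thread
Synth.deferCallbacks(false);
Synth.startTimer(0.04); // fast timer; the interval is in seconds

// Pre-allocate anything the timer needs; creating objects per tick is costly
reg phase = 0.0;

function onTimer()
{
	// Only cheap work here: local variables, no object creation, minimal API calls
	local t = Engine.getUptime();
	phase = Math.sin(t * 6.2832);
}
```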
@Casey-Kolb I find the deferCallbacks thing confusing. Can someone point me to a situation where you do want it and one where you don't want it? And why :grinning_squinting_face:
(I know there's an example above but how would the timer being deferred actually impact real-world usage)
-
@DanH said in Lunacy Audio presents CUBE:
@Casey-Kolb I find the deferCallbacks thing confusing. Can someone point me to a situation where you do want it and one where you don't want it? And why :grinning_squinting_face:
If you want to update the UI when the user presses a key, but you don't want the update to cause latency, then defer.
If you want to change the velocity of an incoming MIDI note, then don't defer.
Generally your UI script should be deferred.
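As a rough illustration of the two cases (a sketch with hypothetical component names, not from CUBE):

```javascript
// Case 1: UI script — defer, so note-on UI updates can't add audio latency
Synth.deferCallbacks(true);

function onNoteOn()
{
	Content.getComponent("NoteLabel").set("text", "Last note: " + Message.getNoteNumber());
}
```

```javascript
// Case 2: MIDI-processing script — do NOT defer, because it edits the message itself
Synth.deferCallbacks(false); // this is the default

function onNoteOn()
{
	Message.setVelocity(Math.min(Message.getVelocity() + 20, 127));
}
```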
-
@d-healey Thanks David. So any animations I want to leave alone right?
My script is, in the words of the great @Christoph-Hart, " a gargantuan script sausage ".
Assuming I wish to keep it this way (all in oninit) where would I place the deferred command in relation to my animation timers?
-
@DanH said in Lunacy Audio presents CUBE:
So any animations I want to leave alone right?
Generally you don't want animations to cause latency, so you would defer your callbacks. Panel and engine timers are deferred anyway though, so if you're using those it doesn't make a difference. It's only the synth timer that runs on the real-time thread.
Assuming I wish to keep it this way (all in oninit)
Is there another way?
where would I place the deferred command in relation to my animation timers
In the on init callback.
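So in a single on init sausage it might look something like this (a sketch; the panel name is hypothetical):

```javascript
// on init
Synth.deferCallbacks(true); // defer once, near the top, before any timer setup

// Panel timers are deferred regardless, so animations won't touch the audio thread
const var AnimPanel = Content.getComponent("AnimPanel");

AnimPanel.setTimerCallback(function()
{
	this.repaint(); // redraw the animation frame
});

AnimPanel.startTimer(30); // interval in milliseconds
```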
-
@d-healey said in Lunacy Audio presents CUBE:
Is there another way?
@DanH Are you using a text editor like Sublime? Or are you editing everything in one single file within HISE? The `include` file call is your best friend here. You can make your scripts more modular that way.
-
@Casey-Kolb said in Lunacy Audio presents CUBE:
The `include` file call is your best friend here. You can make your scripts more modular that way.
This is the way to go, but everything is still in the on init callback, it just looks pretty and you can work with it like it's all separate.
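For example, the sausage can be split into files that get pulled back into on init (the file names here are made up; the files live in the project's Scripts folder):

```javascript
// on init — each include is pasted in at exactly this point, in order
include("UiDeclarations.js"); // component definitions
include("UiCallbacks.js");    // control callbacks
include("Animations.js");     // panel timers and paint routines
```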
-
What do you guys think about having an indicator in each script to show whether it is deferred or not?
Could it be a good request?
-
@ustk We already have one. You really only need to defer the UI script though, unless for some reason you want to do non-realtime stuff in the real-time callbacks in a script that the user won't see - maybe there's a reason I'm not aware of.
-
@d-healey @Casey-Kolb Yep, need to get my head around it.
I meant everything in on init as in everything splurged out in the on init in HISE, rather than neatly compartmentalised into smaller scripts.
The issue I have is that so many of my UI controls and callbacks seem to end up in other parts of my script. I'm scared to move them around.
If I use the include file call, does it behave like... script script script (sausage part one), oh here's an include call, let's stick that file's contents here in the script, script script script (sausage part deux), oh here's another include call, let's stick that one in here, and so on and so forth?
i.e. it's more or less sequential?
-
@DanH That's what coding standards are for - https://github.com/christophhart/hise_documentation/blob/master/scripting/scripting-in-hise/hise-script-coding-standards.md
-
@d-healey said in Lunacy Audio presents CUBE:
@ustk We already have one.
-
@DanH It's as if the contents of that file were right there in the script, just replaced with that one line of code: `include`. Definitely will make your life WAY easier! In general, it's good to keep UI declaration and logic separated, so you might have different files for declaring different sections of your UI, with other script files for their callbacks or any other logic you might need. Though I'm guilty of breaking this all the time. It's hard to stay organized. I think David probably knows the best ways.
-
@Casey-Kolb Thanks Casey, makes perfect sense :)
-
@Casey-Kolb Yeah please, I want to know how. I'd like to understand the concept of how, and what to do, to create even a simple sample editor. Thanks!
-
@nesta99 There's a few things going on:
- An AudioWaveForm connected to the Sampler Module, with the index set to the correct RR Group (this displays the waveform).
- A slider on top of that waveform that adjusts the Intensity of a Sample Start Modulator connected to the same Sampler Module.
- Various sliders connected to other Parameters and Modulators of the Sampler Module, like the Pan, Gain, and a Constant Pitch Modulator.
- An ADSR section with sliders connected to the AHDSR Envelope added to the Sampler Module.
- A button to control Sample Reverse via `Sampler.setAttribute(Sampler.Reversed, value);`.

You also need `Engine.setAllowDuplicateSamples(false);`, and the Sample Maps need to be in Monolith format.
I'm not sure how to do the Loop/One Shot stuff; I assume you just set the mode using `Sampler.setAttribute(index, value);`.
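A loose sketch of how the reverse button from the list above could be wired up (illustrative only; the module and component names are hypothetical):

```javascript
// on init
Engine.setAllowDuplicateSamples(false); // required for the reverse feature

const var Sampler = Synth.getChildSynth("Sampler1"); // hypothetical module ID

inline function onReverseButtonControl(component, value)
{
	// Toggle sample reverse on the Sampler Module, as described above
	Sampler.setAttribute(Sampler.Reversed, value);
};

Content.getComponent("ReverseButton").setControlCallback(onReverseButtonControl);
```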
I'm doing some similar (albeit much uglier) stuff in my Player:
-
@iamlamprey Wow man, thanks so much. Will dive in today, and try!!!
-
@iamlamprey
https://youtu.be/glcXiUIi470
Hello. I'm trying to add the AudioWaveForm here in the project, but as soon as I select it, HISE crashes... Is there any known issue around this, please? Thanks.
-
@nesta99 Try the develop branch