Haha, the Easter bunny has delivered.
-
@Christoph-Hart Excellent! Will we be able to morph between samples in real-time?
-
@d-healey No, it's just offline file manipulation at the moment, but I plan to add support for wavetable creation and other synthesis features.
AFAIK the realtime module is not compatible with proprietary licenses or am I wrong?
-
AFAIK the realtime module is not compatible with proprietary licenses or am I wrong?
I think you are correct.
-
@d-healey Can you check whether it works on Linux?
- Clone the loris-tools repo
- Build the loris_library dynamic library (both Debug and Release configs). It's completely self-contained now, so you don't need to have loris installed anymore. I've added a default Linux Makefile exporter, but you might have to customise it.
- Copy the .so files into APPDATA/HISE/loris_library/
- Recompile HISE, go to the settings, tick EnableLoris (Development Settings) and restart.
- Open the Loris Toolbox project from the repo. If you're not flooded with error messages in the HISE console, then it works :)
That's the current Toolbox App:
I've implemented a few functions along with some custom algorithms (pitch locking, spectral gain adjustment, time dilation). I'm really surprised by how good this sounds. One thing I realised pretty early on is that you have to subtract the resynthesised version from the original to get the residual noise part, which you can then add back after the processing, because the resynthesis always chokes on the noisy content.
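In pseudo-code (just a sketch of the idea, not the actual Toolbox implementation), the residual extraction boils down to a per-sample subtraction, and the untouched noise part gets mixed back in after the sinusoidal part has been processed:

inline function getResidual(original, resynth)
{
	// original and resynth are assumed to be equally sized arrays of sample values
	local residual = [];
	local i = 0;

	for (i = 0; i < original.length; i++)
		residual.push(original[i] - resynth[i]);

	return residual;
}

inline function addResidual(processed, residual)
{
	// mix the noise part back in after processing the sinusoidal part
	local i = 0;

	for (i = 0; i < processed.length; i++)
		processed[i] = processed[i] + residual[i];

	return processed;
}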
-
Hi,
All compiled and loaded successfully here.
The only thing I had to do was rename the compiled library. The output from the makefile was
libloris_library_debug
and I needed to remove the
lib
part for HISE to pick it up.
I'm going to have fun playing with this now. If you can think of any way to get real-time morphing into HISE without infringing the license (or if it can be added optionally for GPL projects), I will be very happy :D
-
If there's nothing in the buffer preview and I click on it, I get this error:
Interface:! BufferPreview.js (134): API call with undefined parameter 2
If there is something loaded and I click the buffer preview, HISE crashes on mouse-up.
-
Thanks for checking it. I'd rather change the code in HISE that creates the filename - Linux adds the
lib
prefix automatically, so you would have to rename it manually every time.
I also got the crash on Windows, but on macOS it's not crashing for some reason... I'll debug it later, but it looks like an easy fix.
I'm not sure whether the realtime morphing is really necessary, or whether you could just create a bunch of intermediate samples with the morphing algorithm and then use the crossfade from the sampler. I don't think the difference will be very noticeable, and CPU-wise it will still be cheaper to render 8 samples simultaneously than to resynthesise the sample in realtime.
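As a rough sketch of what I mean (the function name and setup here are hypothetical, not an existing HISE API): render a handful of intermediate morph stages offline, map them to crossfade groups, and drive the per-layer gain from a single morph position, e.g. with a simple triangular crossfade:

inline function getLayerGain(layerIndex, morphPosition, numLayers)
{
	// width of one crossfade segment on the 0...1 morph axis
	local segment = 1.0 / (numLayers - 1);

	// position of this layer on the morph axis
	local layerPos = layerIndex * segment;

	// triangular gain: 1.0 at the layer position, fading to 0.0 at the neighbouring layers
	local distance = Math.abs(morphPosition - layerPos) / segment;

	return Math.max(0.0, 1.0 - distance);
}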
-
@Christoph-Hart Hmm that's a good idea. I'll give that a try.
-
@d-healey I think the main point here is to manipulate the samples so that the phases are perfectly aligned - then you can crossfade all you want.
There are multiple other use cases I see for this:
- create round robin variations (the dilate function is super helpful here because you can distort the attack phase and create natural variations like this)
- create sordino / muted variations with the spectral editor (just remove a few lower harmonics - see the sketch after this list)
- create better-sounding pitch variations than just changing the playback ratio (I've added a simple constant pitch modulator to the original preview so you can compare the sound of the repitched Loris output against the "default" repitching done with linear interpolation)
- create perfect loops by morphing the loop crossfade instead of fading.
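A sketch of the sordino idea, using the same per-breakpoint model as the pitch lock function further down (the gain property is an assumption here - only frequency, bandwidth and rootFrequency appear in the actual example):

inline function sordino(obj)
{
	// which harmonic does this breakpoint belong to?
	local harmonicIndex = Math.round(obj.frequency / obj.rootFrequency);

	// attenuate everything below the 4th harmonic (obj.gain is assumed, not confirmed)
	if (harmonicIndex < 4)
		obj.gain *= 0.1;
}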
-
@Christoph-Hart Yeah I can also see a case for morphing between dissimilar samples, like sustain to flutter or sustain to vibrato. Or even between samples that are from completely different instruments or sources.
Would it be worthwhile for me to contact the developer and see if we can get him to release it under MIT, or perhaps dual-license it for HISE?
-
@d-healey My gut feeling is that the performance is not good enough for realtime usage. An 8 second sample takes about 2 seconds to resynthesise, so a single voice already costs roughly 25% of a core (2 seconds of processing per 8 seconds of audio), and playing a chord will eat up your entire CPU.
I think the library was never intended to be used in realtime and the amount of work to make this possible is more or less the same as writing a fast additive synth that uses the analysed partial data from scratch.
-
"a fast additive synth that uses the analysed partial data from scratch...."
-- so that would be a great idea: do all your partial generation in batch and ship the resulting partial data files with an additive sound generator that knows how to read them...
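For what it's worth, the core of such a player would just be a bank of oscillators whose frequency and amplitude get interpolated between the analysed breakpoints. A purely conceptual sketch (the breakpoint fields and data layout here are hypothetical, not the Loris file format):

inline function renderPartialSegment(output, bp1, bp2, sampleRate)
{
	// number of samples between two neighbouring breakpoints of one partial
	local numSamples = Math.round((bp2.time - bp1.time) * sampleRate);
	local invNum = 1.0 / numSamples;
	local phase = 0.0;
	local alpha = 0.0;
	local freq = 0.0;
	local gain = 0.0;
	local i = 0;

	for (i = 0; i < numSamples; i++)
	{
		alpha = i * invNum;

		// linear interpolation of frequency and amplitude between the two breakpoints
		freq = bp1.frequency + alpha * (bp2.frequency - bp1.frequency);
		gain = bp1.gain + alpha * (bp2.gain - bp1.gain);

		// accumulate this partial into the output and advance the phase
		output[i] = output[i] + Math.sin(phase) * gain;
		phase += 2.0 * Math.PI * freq / sampleRate;
	}
}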
-
@Christoph-Hart That makes sense. Ok I'll continue with the phase alignment idea. I'm already working with another developer on this angle too.
Oh actually you might be interested, it's the guy who made this - https://github.com/VirtualAnalogy/Paraphrasis
-
@d-healey said in Haha, the Easter bunny has delivered.:
@Christoph-Hart That makes sense. Ok I'll continue with the phase alignment idea. I'm already working with another developer on this angle too.
Oh actually you might be interested, it's the guy who made this - https://github.com/VirtualAnalogy/Paraphrasis
ooh, err, that'd be good.....
-
Is the save button implemented?
Does the pitch lock use the harmonify class (or whatever it's called)?
-
@d-healey Nope, the save button doesn't do anything yet. I'll try to think of a few ways of batch processing this stuff, as you don't want to click through it for each sample export :)
The pitch lock is a custom script function - that's the great thing about the entire setup: you can just define a Javascript function that processes each breakpoint (read: grain) of each partial (read: harmonic).
The relevant function looks like this:
inline function pitchLock(obj)
{
	// Repitch
	obj.frequency *= pitchFactor;

	// Apply noise gain (bandwidth is the noise amount, 1 = full noise, 0 = pure sinusoidal)
	obj.bandwidth *= noiseGain;

	// calculate the "real" harmonic
	local ratio = Math.round(obj.frequency / obj.rootFrequency);

	// calculate the fully locked frequency
	local lockedFrequency = ratio * obj.rootFrequency;

	// interpolate between the locked and original frequency
	obj.frequency = pitchLockAmount * lockedFrequency + (1.0 - pitchLockAmount) * obj.frequency;
}
As you can see, writing spectral manipulation functions is almost as easy as writing pseudocode, which makes it a super nice tool for playing around.
-
@Christoph-Hart Oh wow, that makes it quite simple to use then. Yes batch preprocessing would be good.
-
I'm thinking that if I pitch lock all the dynamic layers and then align the peaks and troughs, my chorusing issues should be eliminated. Is there a way to overlay waveform views so I can do the lining up in HISE?
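(For reference, the kind of lining up I have in mind is something like trimming each layer to its first upward zero crossing - just a rough sketch of the idea, not an existing HISE function:)

inline function findFirstUpwardZeroCrossing(samples)
{
	// samples is assumed to be an array of sample values for one layer
	local i = 0;

	for (i = 1; i < samples.length; i++)
	{
		// an upward zero crossing: previous sample below zero, current at or above zero
		if (samples[i - 1] < 0.0 && samples[i] >= 0.0)
			return i;
	}

	return 0;
}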
-
What the heck is a Loris?
-