[Feature request] Lossy monoliths
-
Currently monoliths include the entire contents of the wav file, even the bits before and after the start/end markers. I would like an option to create the monolith without these parts.
I always have the original wav files so I can simply revert the sample map save mode if I need to readjust the start/end.
-
@d-healey Could just go ham with lossy and also support OPUS monoliths. OPUS at 128 kbps is basically transparent, so if you exported at 160 or 192 this would be a huge reduction in size.
Though I'm not aware of any realtime implications of this.
Btw, can't you adjust the start/end markers from the interface? That would be impossible if these parts are cut away.
-
@aaronventure said in [Feature request] Lossy monoliths:
Btw, can't you adjust the start/end markers from the interface? That would be impossible if these parts are cut away.
Not sure what you mean. My goal is to make releases from my sustains within HISE without re-encoding the entire sustain sample. It also has other uses.
I don't want to reduce the quality of the audio, I just want to avoid encoding data that will never be heard in the final product. Maybe I should have said destructive rather than lossy.
-
@d-healey Forgive me, I haven't experimented with the sampler that much. What I meant was wouldn't trimming the samples in the monolith break the ability to adjust start and end times within the script (looping etc)?
Also, if you already have sustains loaded, can't you just use that?
-
@aaronventure said in [Feature request] Lossy monoliths:
What I meant was wouldn't trimming the samples in the monolith break the ability to adjust start and end times within the script (looping etc)?
Probably, but I don't need to do that. The loops are already set in the sample map before they are encoded.
I want this as an option, not as the only option or even the default option.
@aaronventure said in [Feature request] Lossy monoliths:
Also, if you already have sustains loaded, can't you just use that?
No, this is the problem. You can share a monolith between samplers, but the data within them must be exactly the same, and I have situations where this isn't the case. See this thread: https://forum.hise.audio/topic/535/different-sample-maps-shared-monoliths
-
@d-healey Actually there is a function in HISE which bakes all properties into a new wave file, replaces the references and encodes the sample map.
This includes not only start/end, but also pitch information and all the sample envelopes. It's available in the HISE menu:
Tools / Apply Samplemap properties to files
BUT
I wrote this function three years ago and never used it, so it would be the twist of the millennium if it still works.
Try it out, but make a backup of your samples. Then make another backup of this backup and if you're at it, backup your entire hard drive. That's how much I trust this function to work :)
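Conceptually, baking the start/end markers into the file amounts to discarding the audio outside the marked region and resetting the markers. This is only a simplified sketch of the idea, not the HISE implementation (`bakeStartEnd` is my own illustrative name):

```cpp
#include <vector>

// Toy model of "baking" the start/end properties into the file itself:
// the audio outside [start, end) is discarded, and the markers are reset
// so they still describe the same region of the new, shorter file.
std::vector<float> bakeStartEnd(const std::vector<float>& audio,
                                int& start, int& end)
{
    std::vector<float> trimmed(audio.begin() + start, audio.begin() + end);
    start = 0;                  // markers now refer to the trimmed file
    end = (int)trimmed.size();
    return trimmed;
}
```

After this, the sample map references the new file with zeroed markers, which is also why readjusting start/end later becomes impossible without the originals.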
-
@Christoph-Hart Ah yes, now you mention it I recall that feature and apparently I did some work on it earlier this year which I have no memory of...
And there is a bug... https://github.com/christophhart/HISE/issues/494
With a possible solution that I need you to confirm doesn't break anything :) https://forum.hise.audio/topic/9081/bug-unwanted-pitch-shifting-when-using-apply-sample-map-properties-to-sample-files/5
-
@Christoph-Hart I've just been testing the apply sample map properties feature again, and it seems the multi-mic bug is still present.
I clicked on my old link to see what fix I'd found, but it takes me to the latest commits and the line of code has moved, so I'm not sure exactly what I was referring to :D
Here's a very simple project you can use for testing. applysampleproperties.zip
-
Ok I think I've fixed this, and it was the simplest fix I've ever done - https://github.com/christophhart/HISE/pull/594
-
@d-healey hmm seems to fail if the sample map contains a duplicate sample, even when using
Engine.setAllowDuplicateSamples(false);
-
I think the way to deal with duplicate samples would be to:
- Check through all the samples for duplicates.
- If there are duplicates, find the earliest start time and latest end time used across them, and trim the sample accordingly.
- Adjust the start/end times in the sample map to work with the newly cut sample.
@Christoph-Hart What do you think?
Edit: Actually this won't work because the other properties won't be applied. We need to create a duplicate wav of each sample with its own properties.
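The trimming idea above could be sketched like this (a hypothetical, simplified model; `MapEntry` and `trimDuplicates` are my own names, not HISE code):

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Simplified view of a sample-map entry: the file it references
// plus its start/end markers, in samples.
struct MapEntry
{
    std::string file;
    int start;
    int end;
};

// For one group of entries that all reference the same file, work out the
// trim region: the earliest start and latest end used by any duplicate.
// The file would be cut to [trimStart, trimEnd), and each entry's markers
// rebased so they still point at the same audio.
void trimDuplicates(std::vector<MapEntry>& group, int& trimStart, int& trimEnd)
{
    trimStart = group.front().start;
    trimEnd = group.front().end;

    for (const auto& e : group)
    {
        trimStart = std::min(trimStart, e.start);
        trimEnd = std::max(trimEnd, e.end);
    }

    // Rebase every duplicate's markers relative to the new file start.
    for (auto& e : group)
    {
        e.start -= trimStart;
        e.end -= trimStart;
    }
}
```

As noted in the edit below this only handles start/end; the other per-entry properties would still need separate files.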
-
I found another bug: applying both sample start and end causes the process to fail. If I apply just start or just end then it works.
-
@Christoph-Hart I think there are two problems.
The first is that, because I moved the property removal to after the apply loop, the start time delta is being repeatedly added. I thought it could be reset with each iteration of the loop. Nope, this isn't required.
The next problem, I think, is caused because in the part where you change the sample start, you swap nb/ob, so the buffer that the sample end is working on is now shorter than the original. I could be misunderstanding this though...
Also selecting the normalisation option results in silence on the other mic positions. Gaaar
-
Well I can't figure it out, so I'm going back to my original request for lossy monoliths.
This would have a couple of advantages over the apply sample properties suggestion.
- All trimmed monoliths would be smaller.
- It's non-destructive: the original sample map isn't affected, so you can easily revert and make adjustments.
-
Ahhhh I decided to have another go at it. I think I've solved it, though the code may be a little messy. I was thinking along the right lines before, I just wasn't following the logic through. The problem was that the start position was being applied multiple times to the end position (once for each mic position), whereas it should only be applied once. Here is the modified function:
```cpp
void apply(const Identifier& id)
{
    auto value = propertyData.getProperty(id);

    if (id == SampleIds::SampleStart)
        addDelta(-int(value), { SampleIds::SampleEnd, SampleIds::LoopStart, SampleIds::LoopEnd });

    for (auto f : sampleFiles)
    {
        apply(id, f);
    }

    propertyData.removeProperty(id, nullptr);
}
```
The `addDelta` part was previously in `void apply(const Identifier& id, File& fileToUse)`.
Hoping that improving my C++ fu will make me more useful to the community :) or at least more useful to myself.
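To make the over-shift concrete, here's a toy model (my own illustration, not the actual HISE code): if the start delta is subtracted from the end marker inside the per-file loop, a three-mic sample's end gets shifted three times instead of once.

```cpp
// Toy model of the suspected bug, not actual HISE code.
// Buggy version: the SampleStart delta is subtracted from the end marker
// once per mic-position file, so multi-mic samples get over-shifted.
int applyEndBuggy(int sampleEnd, int startDelta, int numMicPositions)
{
    for (int i = 0; i < numMicPositions; ++i)
        sampleEnd -= startDelta; // repeated per file: wrong for multi-mic

    return sampleEnd;
}

// Fixed version: the delta is applied once, before the per-file loop,
// mirroring the moved addDelta call in the patched apply().
int applyEndFixed(int sampleEnd, int startDelta)
{
    return sampleEnd - startDelta;
}
```

With a single mic position the two agree, which is presumably why the bug only showed up on multi-mic sample maps.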