• HISE Blog Entries

    Pinned Moved
    1
    1 Votes
    1 Posts
    1k Views
    No one has replied
  • Free Reverse Delay built in RNBO

    8
    10 Votes
    8 Posts
    372 Views
    M

    @treynterrio you can use this code if you have two inputs and two outputs, or use the previous one if you have one input and one output.

    You can use the first code in scriptnode if you need to process, for example, mid and side separately (a sketch of that mid/side wrapping follows the code below).

    import("stdfaust.lib"); MAX_DELAY = 192000; // Phase-shifted phasor for reverse reading // dtime: delay time in samples // phase: phase offset for reading position phasor_phase(dtime, phase) = ((os.lf_rawsaw(dtime) + phase) % dtime) : int; phasor(dtime, phase) = phasor_phase(dtime*2, phase) <: <=(dtime), (*(-1) + dtime*2), _ : select2; // Single channel reverse delay processing reverse_delay = _ <: direct, delayed_and_reversed :> _ with { // Get current sample rate for scaling controls current_sr = ma.SR; // User controls with sample rate scaling delay_base = hslider("Delay Time[style:knob]", 0.5, 0.02, 1, 0.01); delay_time = min(MAX_DELAY-1, delay_base * current_sr); dry_wet = hslider("Dry/Wet[style:knob]", 0.5, 0, 1, 0.01); // Direct path with gain direct = *(1-dry_wet); // Delay into reverse buffer implementation delayed = de.delay(MAX_DELAY, delay_time); // Reverse buffer using rwtable reversed = rwtable(MAX_DELAY, 0.0, phasor(delay_time, 0) : int, // Write index _, // Input signal phasor(delay_time, delay_time/2) : int // Read index ); // Process delayed signal through reverse buffer with gain delayed_and_reversed = delayed : reversed : *(dry_wet); }; // Main stereo processing - apply reverse_delay to each channel independently process = reverse_delay, reverse_delay;
  • Some new on the graphic side!

    7
    1 Votes
    7 Posts
    98 Views
    A

    Don't forget you can just feed the transcript into ChatGPT and ask about it, though the accuracy will depend on the transcript's accuracy, which can suffer when acronyms are used or the speakers' English is not very good. If you feed in the transcript with timestamps, you can just ask for a timestamp for specific claims and check them yourself.

    Here’s a summary of Matt Gonzalez’s talk on JUCE Graphics: Past, Present, and Future:

    Introduction

    Background: Matt Gonzalez is a veteran audio developer with over 20 years of experience, known for contributions to JUCE 8's Direct2D graphics.
    Focus: this talk discusses the evolution, technical details, and future direction of JUCE's graphics system, particularly the move from CPU to GPU rendering.

    The State of JUCE Graphics

    JUCE strengths: cross-platform compatibility; excellent for audio processing.
    Historical weakness: graphics performance has been a longstanding pain point for JUCE users. JUCE has historically relied on a CPU-based software renderer, which limits rendering speed and capability.

    Transition to GPU Rendering

    Direct2D implementation: Matt reworked JUCE's Direct2D renderer during the pandemic, enabling GPU-based rendering for JUCE applications on Windows. Direct2D bypasses the CPU for rasterization tasks, utilizing the GPU for significant performance improvements. Demonstrations showed up to a 10x speedup in certain scenarios (e.g., tiling and transforming images).

    Challenges in Transition

    GPU-specific issues: GPUs are optimized for 3D rendering, so 2D tasks like stroking paths, text rendering, and clipping present challenges.
    Legacy compatibility: ensuring compatibility with existing JUCE components and the software renderer required addressing legacy constraints, for example maintaining support for nested renderers (Direct2D, OpenGL, GDI) and adapting the JUCE image class for GPU-based storage.
    Conceptual shifts: moving graphics tasks to the GPU necessitates new mental models, as GPU rendering involves massively parallel processing.
    Design trade-offs: balancing ease of use for developers with cutting-edge performance required difficult decisions.

    Advancing JUCE Graphics

    New Features via GPU Rendering:

    Mesh gradients: advanced gradients with flexible control over color transitions and shapes.
    Real-time effects: blur, embossing, and shadowing effects executed directly on the GPU.
    Dynamic rendering: procedurally generated UI elements like buttons, eliminating the need for pre-rendered assets.

    Future Goals:

    Replace the low-level graphics context (LLGC) entirely with GPU-based solutions.
    Build a cross-platform GPU renderer that works with Vulkan, Metal, and DirectX 12 on all platforms.
    Shift all rendering logic to shaders for maximum performance and flexibility.

    Prototype Metal Renderer

    Matt demonstrated a Metal-based renderer for macOS, achieving 4-5x speedups over Core Graphics.
    Key advantages: it integrates seamlessly with existing JUCE graphics APIs and opens possibilities for developers to leverage GPU shaders for custom effects and advanced visuals.

    Community Contributions

    MCAL module: a new library developed by Matt, providing additional Direct2D features not included in JUCE 8, such as advanced gradients and image-processing effects.
    Call to action: developers are encouraged to experiment with these tools and provide feedback to shape future development.

    Closing Thoughts

    JUCE's transition to GPU rendering marks a significant leap forward, enabling richer, faster, and more visually dynamic UIs. Challenges remain in balancing legacy compatibility with innovation, but progress in tools like Direct2D and Metal sets the stage for a modern, cross-platform graphics pipeline.

    Resources & Contact

    Recommended resources: Metal by Example and the JUCE forums for further exploration. Matt encourages discussions and questions through Discord for developers eager to delve deeper.

    This talk underscores JUCE’s journey from outdated rendering to modern GPU-based solutions, highlighting exciting new possibilities for developers.

  • [TUTO] C++ Third Party Node - XCode

    6
    6 Votes
    6 Posts
    86 Views
    griffinboy

    @ustk

    Thanks for doing this. I intended to but I've been drowning in work lol

  • Things I've Learned about ML Audio Modelling

    8
    7 Votes
    8 Posts
    222 Views
    C

    @hisefilo I used this as a basis. The important part for now seems to be that your model is in the "keras" format. The aida-x ones are like that by default.

    https://forum.hise.audio//post/89837

  • [Tutorial] How to create a c++ custom node

    23
    15 Votes
    23 Posts
    909 Views
    O

    @griffinboy I'd love to get a look at your process there!

  • Found Another Tutorial Source of HISE Plugins Development

    2
    2 Votes
    2 Posts
    250 Views
    U

    Hello. I took this course. When I purchased it, I received a free course on creating a GUI. I can't say I was very inspired by this course. Yes, it certainly lives up to its name: we can add ready-made FX and load buttons from KnobMan, and that concludes the course. I got most of the information from the documentation, the forum, and videos.

  • HISE on WINE

    6
    1 Votes
    6 Posts
    197 Views
    d.healey

    @modularsamples It's possible I've made some other fixes in my fork that I can't remember, or possibly it just works better on Debian than on Ubuntu. But I don't think that's the case, as it works well on Linux Mint, which is based on Ubuntu.

  • NEATBrain Writeup

    11
    17 Votes
    11 Posts
    607 Views
    iamlamprey

    @orange we could also just make our own with the neural node 😉

  • Thank you Christoph

    24
    12 Votes
    24 Posts
    1k Views
    clevername27

    @obolig I appreciate that @Christoph-Hart has been a lot more active on the forum here. I may be in the minority in thinking that HISE doesn't need any more features – just more information about the ones that already exist. There's more than people realize, with direct access to FFT bins being the universal Turing machine of DSP.

    In terms of companies using HISE, within the context of the ones I work with and advise, there is potentially significant interest in the JavaScript engine (not the sampler elements). From my perspective, the outwardly-facing obstacle is that there simply needs to be more than one person behind it to provide the necessary stability, maintenance of knowledge assets, and overall mitigation of the risk inherent in incorporating external systems into existing commercial ecosystems.

  • i need a shirt 😅

    8
    3 Votes
    8 Posts
    451 Views
    d.healey

    @iamlamprey If you're using the Rhapsody template within HISE then there is no built-in updater, unfortunately. What I do is add the Rhapsody boilerplate as a git submodule to my projects; then I can just update it with git pull.

    Basically you just need to replace the boilerplate files you currently have with the same ones from here - https://codeberg.org/LibreWave/RhapsodyBoilerplate

    Then in on init you need to run Ui.createTemplate("your project name"); and it will update the UI to the latest version. You can delete this function call afterwards.

  • Happy Thanksgiving from the US!!!!

    1
    5 Votes
    1 Posts
    154 Views
    No one has replied
  • Introducing Rhodecase88! A plugin built by the HISE community!

    5
    8 Votes
    5 Posts
    357 Views
    virtuscapeaudio

    @DanH Thank you Dan!!!

  • Wishing HISE Family a Very Happy Diwali

    5
    10 Votes
    5 Posts
    385 Views
    Matt_SF

    @DabDab Thanks, Happy Diwali to you!

  • Automatic Installer for Windows: Inno Setup Script

    8
    3 Votes
    8 Posts
    1k Views
    O

    @bendurso Thank you! This helped me resolve the issue. I wrote the prompt for the LinkWindows file to look inside the samples folder instead of at the samples folder itself. Everything is working great now.

  • New Plug-In Released - MOD-EQ-1

    13
    6 Votes
    13 Posts
    648 Views
    DanH

    @orange Yes, the filter modules are all in scriptnode; the parameters are modulated by the LFOs, and then global cables carry the values out to wherever you need them, which in my case was the EQ module.

    There's no blur actually, I just added a tiny bit on the webpage for general sexiness 😆

  • RNBO Video Tutorial for HISE

    15
    0 Votes
    15 Posts
    994 Views
    DabDab

    @Phelan-Kane has made tutorials on it...

    https://youtu.be/64dTcwnP40o?si=sZ0JiMztyBq3-aSm

    https://youtu.be/-0tkr5ypLT4?si=wY8SXIS-YDT8Gw6-

    If you are interested, please visit his channel to learn more. He is a certified Max MSP instructor.

    https://forum.hise.audio/topic/8204/max-rnbo-hise-tutorial-vids-available

  • NEAT Player Source Code

    14
    5 Votes
    14 Posts
    798 Views
    Adam_G

    @iamlamprey Wow 😲 I had no idea you could do things like warp and stretch the objects with painting. I'll have to dig more for some examples.

  • Complete Guide to Install Virtual Machine for Mac OSX and HISE with FAUST

    24
    0 Votes
    24 Posts
    3k Views
    Dan Korneff

    @d-healey The feature itself should work, but macOS is very picky about GPU specifications. You'll have to look up a GPU compatibility chart to see if your hardware will work.
    https://hackintoshenglish.fandom.com/wiki/GPUs_Compatibility

  • Romplur vs Maize Sampler still HISE Rocks

    13
    2 Votes
    13 Posts
    3k Views
    Frankbeat

    Today I found out about another feature that makes HISE rock for me:
    The decay curve of a HISE modulator is way closer to an ideal logarithmic one than what I know from Kontakt, Battery and many other software instruments. This comes in very handy for me, as I love to model analog synth drum sounds in the digital realm.

    To anyone looking for details on that:

    Battery 4 and Kontakt 5 have a very steep curve, tending to almost linear – which is very different from how a natural sound decays.

    HISE has a much finer curve, mimicking an almost perfectly logarithmic decay, with only a little imperfection: it breaks once it reaches -59.4 dBFS. See my comparison graphs below.

    One thing I was surprised to learn is that the Decay Curve parameter doesn't seem to alter the shape significantly. Turning it up to 100% still looks logarithmic, like at 0%, but seems to extend the actual time value. That's not a problem for me, though, as I rarely ever want a non-logarithmic decay.

    envelope-graphs.gif
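
    A quick aside (mine, not part of the original post), in case it helps when reading the graphs: an exponential amplitude decay is what shows up as a straight line on a dB plot,

    $$ a(t) = a_0\, e^{-t/\tau} \;\Rightarrow\; L(t) = 20\log_{10} a(t) = L_0 - \frac{20}{\tau \ln 10}\, t \approx L_0 - \frac{8.69}{\tau}\, t \;\text{dB}, $$

    and the -59.4 dBFS point where the HISE curve breaks corresponds to an amplitude ratio of roughly 10^(-59.4/20) ≈ 0.0011, i.e. just above the -60 dB level that is commonly treated as effectively silent.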
