HISE Forum

    Help me understand Waveshaping

    General Questions · 12 Posts · 6 Posters · 675 Views
    • C
      ccbl
      last edited by

      Hey folks. I thought I would take a crack at making an amp sim in HISE using the available DSP blocks. I'd love some knowledgeable people to help fill in a few gaps in my knowledge.

      When it comes to the DSP aspect, is waveshaping relative or absolute? What I mean is: does it analyse each wave and then apply the shaping to it as a sort of percentage? Or is it absolute, so that, for instance, if the peak of the wave doesn't meet the threshold, the wave would remain linear? I hope I'm explaining myself properly.

      When it comes to the Shape FX, is there supposed to be a graphical representation of what's happening, or should I get an oscilloscope module to check what's happening to the wave?

      Is clipping actually the better approach here? I guess the issue is that I think only hard clipping is available as a module, so I couldn't set up, for instance, asymmetrical soft clipping?

      • griffinboyG
        griffinboy @ccbl
        last edited by griffinboy

        @ccbl

        Clipping is waveshaping.

        Waveshaping is taking an input sample and mapping it to an output sample.

        Audio samples make up waveforms.
        Samples range from -1 to 1, and an audio file is an array of these numbers. A sine wave for example could start at 0, increase to 1 and then back to 0, then to -1 then to 0, and repeat.

        Waveshaping basically looks at the current sample, say this sample is 0.75, and it maps this to a different sample value and outputs that value instead.
        This is raw digital distortion of the wave. You basically take a wave shape, and transform each point in order to end up at a different wave.

        Often continuous functions are used, so that you get a smooth transformation:
        [image: example of a continuous waveshaping curve]
        Waveshaping is nonlinear but memoryless: the output depends only on the current input sample.

        Waveshaping is also static, so it will generally not sound like a guitar amp: analog amps are nonlinear in a way that depends on past audio, rather than just looking at the current sample and transforming it.
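
        To make that concrete, here is a minimal sketch in plain C++. It isn't the Shape FX or any HISE API, just the idea, and the curves are arbitrary examples; the asymmetric one shows one way to do the asymmetrical soft clipping you asked about, by using a different curve for positive and negative samples.

            #include <cmath>
            #include <cstdio>
            #include <initializer_list>

            // Symmetric soft clip: the same curve for positive and negative samples.
            float softClip(float x)
            {
                return std::tanh(3.0f * x);
            }

            // Asymmetric soft clip (purely illustrative): the positive half is driven
            // harder than the negative half, which skews the wave and adds even harmonics.
            float asymmetricSoftClip(float x)
            {
                return x >= 0.0f ? std::tanh(4.0f * x) : std::tanh(2.0f * x);
            }

            int main()
            {
                // The mapping is "absolute": an input of 0.25 always produces the same
                // output, no matter how loud the rest of the waveform is.
                for (float x : { -1.0f, -0.5f, -0.25f, 0.0f, 0.25f, 0.5f, 1.0f })
                    std::printf("in % .2f  ->  soft % .3f  asym % .3f\n",
                                x, softClip(x), asymmetricSoftClip(x));
            }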

        • d.healeyD
          d.healey @griffinboy
          last edited by

          @griffinboy That is one of the best and easiest to understand descriptions of waveshaping I have read. Well done!

          Libre Wave - Freedom respecting instruments and effects
          My Patreon - HISE tutorials
          YouTube Channel - Public HISE tutorials

          • C
            ccbl @griffinboy
            last edited by

            @griffinboy thanks for the explanation! I will probably look to the RT-Neural implementation once it seems like there's a known pipeline.

            For now I just want to learn some stuff. Maybe I'll try out different ways of modulating the waveshape based on some kind of analysis. RMS maybe?
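
            Something like this rough sketch is what I have in mind (plain C++, and the names and numbers are just placeholders, nothing from HISE):

                #include <cmath>

                // Drive a tanh curve from a smoothed RMS-style level measurement.
                struct RmsDrivenShaper
                {
                    float sampleRate = 48000.0f;
                    float meanSquare = 0.0f;   // smoothed running mean of x*x
                    float baseDrive  = 2.0f;   // drive applied at very low levels
                    float depth      = 4.0f;   // how much the measured level adds to the drive

                    float processSample(float x)
                    {
                        // One-pole smoothing of x*x, roughly a 50 ms RMS window.
                        const float coeff = std::exp(-1.0f / (0.05f * sampleRate));
                        meanSquare = coeff * meanSquare + (1.0f - coeff) * x * x;
                        const float rms = std::sqrt(meanSquare);

                        // The drive now depends on recent signal level, so the shaper is
                        // no longer memoryless: the same input sample can map to different
                        // outputs depending on what came before it.
                        const float drive = baseDrive + depth * rms;
                        return std::tanh(drive * x) / std::tanh(drive);
                    }
                };

            Driving the curve from a smoothed level like that also gives the shaper a bit of memory, which seems to be exactly what plain waveshaping is missing.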

            • C
              ccbl @griffinboy
              last edited by

              @griffinboy Second question I guess. Is this all affected by the sample rate? -1/1 at 24-bit has a much tighter range than -1/1 at 32-bit float, right?

              • LindonL
                Lindon @ccbl
                last edited by

                @ccbl said in Help me understand Waveshaping:

                @griffinboy Second question I guess. Is this all affected by the sample rate? -1/1 at 24-bit has a much tighter range than -1/1 at 32-bit float, right?

                The range is exactly the same no matter what the bit depth or sample rate: -1 to 1.
                Bit depth defines the size of the number used to represent a given sample, so a 32-bit value is longer than a 24-bit one and provides finer granularity. Sample rate just defines how often values are sampled.
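
                A quick standalone C++ check of the granularity point (nothing HISE-specific, just printing step sizes):

                    #include <cmath>
                    #include <cstdio>

                    int main()
                    {
                        // 24-bit fixed point: 2^24 evenly spaced values across the -1..1 range.
                        const double step24 = 2.0 / (1 << 24);

                        // 32-bit float: a 24-bit mantissa, so the spacing between representable
                        // values depends on magnitude - finest near 0, coarsest near +/-1.
                        const double floatStepNearOne  = std::nextafter(1.0f, 2.0f) - 1.0f;
                        const double floatStepNearZero = std::nextafter(0.001f, 1.0f) - 0.001f;

                        std::printf("24-bit step (everywhere): %.3e\n", step24);
                        std::printf("float step near +/-1:     %.3e\n", floatStepNearOne);
                        std::printf("float step near 0.001:    %.3e\n", floatStepNearZero);
                    }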

                HISE Development for hire.
                www.channelrobot.com

                • orangeO
                  orange @ccbl
                  last edited by

                  @ccbl

                  develop Branch / XCode 13.1
                  macOS Monterey / M1 Max

                  • griffinboyG
                    griffinboy @ccbl
                    last edited by

                    @ccbl

                    Yeah, so RT-Neural will be the 'easiest' way to create an analog-style distortion, in terms of the amount of physics you will need to learn.

                    I've not yet messed with machine learning myself, but I can tell you that making a white-box model means reading endless amounts of papers on maths, electronics and physics.

                    • C
                      ccbl @griffinboy
                      last edited by

                      @griffinboy Ha ha, yeah. To be honest I wasn't expecting to make some UAD-level simulation, just to learn the basics of HISE a bit more.

                      I'm already very familiar with AI, though, through making a lot of NAM profiles and being one of the co-founders of ToneHunt.

                      Really looking forward to implementing quite a few ideas once that process is ironed out.

                      • C
                        ccbl @Lindon
                        last edited by ccbl

                        @orange sweet, will check this out.

                        • clevername27C
                          clevername27 @ccbl
                          last edited by

                          @ccbl Chase the best sound, not the technology.

                          Currently, the best way to simulate amplifiers is with nonlinear impulse responses.

                          But a single very-high-quality linear impulse response (as used in HISE) sounds as good as the best amplifiers recorded by the best engineers.

                          How it sounds is far more important than the underlying technology.

                          I would suggest investing in hiring a world-class guitar audio engineer to create conventional IRs; put your resources there. (I can recommend someone if interested, but you gotta be serious.)

                          • C
                            ccbl @clevername27
                            last edited by

                            @clevername27 I appreciate your sentiment, but I'm just a guy with a normal day job with an interest in plugins. Not looking to make any money here, just releasing my work for free as and when inspiration strikes.
