
    Exploring GPT Integration for Parameter Control in HISE?

    General Questions
    7 Posts 6 Posters 94 Views
    • marcrex

      Hi everyone,
      Has anyone considered—or even attempted—integrating a language model like GPT into a HISE-based plugin?
      Something simple like a text input field inside the plugin UI where the user could type natural language prompts like “make this sound warmer” or “create a lush pad,” and the plugin would interpret that and adjust parameters accordingly.
      It would be really interesting if such a feature could be integrated.
      Would love to hear your thoughts!

    • d.healey @marcrex

        @marcrex I think it would only be useful for specific plugins built around the concept. As far as I'm aware no one has done it, but it shouldn't be too difficult using HISE's Server API functions.

        Libre Wave - Freedom respecting instruments and effects
        My Patreon - HISE tutorials
        YouTube Channel - Public HISE tutorials

    • Chazrox @marcrex

          Something simple like...

    • rglides

        I was messing around with text input a while ago. There was no actual AI integration, but it could be somewhat faked to end up feeling like AI if you spent long enough building up the library of usable words and phrases, with enough variations, and tied those to actual parameter changes etc. rather than just showControl as I did here:

            HiseSnippet 2135.3oc2Y0sbbaaElTRz061nVmY5zq5ELp+L6NwVY+Q+4pDmMV1tUIV1a8lllLZTcvRhUDUj.Ljfxdqql1I8Ivu.Yl9.3mfdS6k85dWmouC8MH4..xkfbIWKowdZlPMiVhyue3fyA.DXXDyAGGyhLLa7wSCwFlug0noTt2ddHB0X+6XX1zBg9f8E+Yb6ogn3Xrqgo4x+JAeyFqXHe9eu+sQ9HpCNmjgwmvHN36SBH7bpCG7QDe+6gbweLIPS5MFruCitGymk.XYYqNFgHmSPGie.RH1RVFlW4ttDNKZDGwwwFlqbal6zQdrmPUx+IjXxXernQWiQfgTjuGy2UfXAUi87H9tCy5ywFfUFlGAVVEA9QVGPbIynmGItljgctF5wCykJBukK.ut5vqiF7p.RlZPZEEjdSqQNQjPdNGAd99V6S43nIHHrqCEkrFK8u+gV6w.In70CPmfuWDzXlFs1pSmqaC+q8tMaBg9Xt8onH66iFi86Z+d1YZdLluGKHjQgFsVSwds16tZyU0zxkN9w0qjjqTmhZzagZzqBM5uPM5CZXaqqyulkDiW.xR4WxSJp8dIZ0qRs5+RzZdLNjEt.DJ4VxSBZ8VnF8pPi9KTi4w0CYgw0phfYIe7H7jHbrWspjxetbmPDE6+6XQtBu8rUaZCOooL+R6CWyApeW651qMNhbrGW7laTRvuXLjsm0.QcyZNAEKk4OjPO1GKdCELl.92VHnMHostl4DN55ZdtmzyXJNB5BDGgzNPGwG+TERvnSFiQRGkPcAohXvuhlTbRDaRB8DQCexWjPjjCv9LWvPE7ReoW7vnSmJMaFtPQRs8PQRc4XGOJytLbECVRSkkCKLVper8DzxLRdqLS6lLVhJBkDf70hR49SpbldYwPEgYchr5.gmcw3PMGAsD+BAFlp2kEADgRek6EgIIuvH1wPpQL4z4LtJFwiXgDGERm4CGXhSIEXnvwS7RbBMFK6C9Hh6MFCKfH6PoZOC9yBcphKgG7Ae6q.CKTE3BCmpbCFlvLQ5IWRriLxf74B4lAVUMmLcEwc7jJFgbIRgCf0Oh4PRSfLgILK0wUrdocp+hEK5ZOmQU8eB0kfS8KNhh3hHkBq1vJBxd4oDJGVtT5YLzkEuH0SYTnKeV4BOPN9z5q7ptxohBk7j9JxuuHouWr7rElprnA65G8xGRpaTHMZNKbJBj9vjNiv9XGN1cnX1LHbt1ZR1DpOghsgfiCmvn1Igtv1CjBE2p8pMelvGJOIC7qGC6XPL+YDyu0DjeLVLcYF6dKlc+pXmY9z3b8VHMJ+xDXgNQFiq2Bx.8hYuPyKV0YQ7U+2mIF+IzvDNLPn1whXsnVPt3S4q0dcN69rmfi1CEiak4akRHGQoUowvbKOgEY2RtdEXdsEtZqXmVBMSRBXiN6ZSreWMYOT95Qq6ioGy8.tu8a2NWOMSHdHS.qn5Hu2713PxQsKJeI0EOE6RRE2cdojqoUh9Y4M0dU.Ica9Vh3T6BF3r5FtJfEodkBvUa74htUtACM8ZWvq7njYoXxg5JJW0TNqS.U3UUAynpLpLq6jAfqCSp5CdBJpUVnXs9tR6klMFK.tT+8P99hEpZUxts2sZmmtKpE48lMDgw2R0t4q2IW9NvbK0jqpxZDNuvDD99CKrr4gGsakyMjKTMSN7DfWIISKqqe5fhde8vjXOoczytSKUOq467N23b+7HXkYV.IFG0TqPrn2TSXYeK6Ny2iTAmHoU1G1Q7SgHyAHt25BRvmb1B9TypM2Mr6pi975izYqKq2gZN4n7oOprdKuLNspo55thkTJapUw8.FG+PZq1hBqlm0ztLqISpjWp47wQUxVb5GQKRwVzjfw3nr55LAMLWo34LXU+4LneLHNpoL0DjQ2mR3OLDSq6vQLRmmUblDonBDkKOQheP5IRb6DNmQMHtFlMrRijFRHqeRPF+18uChixLDXSvOg3HNQzELuC9ThCVcTGMrtCN
9DNKD7wrI2MLupxsqlcPHhQYoWupk78dFOU2gS0a7DhK2aFg+wWNvCK9VxYTt1ec.giCxN7oFW4Eu3E+KAibp8Dj+vO7i9uREt.3pqFtF74Clp2nHtFjTFW.kh35q9p+VU354O+4ygqlEwkrtJEWpZL830uWGW+4AhBOgfp8IWJ7YvKCSiaMX7w4fDH70BpHexwz.IXZXAim2GOgaLAF8GQ9iZIGu6.cX2ntv42yRtDlFp+KHMTCMJAyOaNX9YEhlugU+s15larc+dc1tX.E3rSm9c52uSmMMNMqbxb4K.N0yFGv91KN6qGOO8U.N6r8NarwMqBm6zq2l85dgwYCqzMWT2HODduDHcys2.ha6rybHc6a1cqM6rS2stzHs1w9u0gzZG8u3HUV1+epXtoLx0fwqVe9ob+d0Vue5EGg+y+dkHLi74rrQtOyZKue0CqKdgsbqt0VX++eDdEKwlsqaj8yOav4XAmh.7m+reSUqK9SG+S9xR.Tr+i5P3bqQVXOE+rAKZOEQUfQ0xn+XKdzTajsKYxDbj3zs33nfJVHb2AgHAeM.lFozipW4br8qbTa+mJtxdoH6sp.0Fye6SWy5.lahOhW7xvD2.XJCXmqEtAJwsLQiI7o52P3qraH67Bw2zZHg63UMFWpBLB6O80AFSuWwUstKjB3vyA3JV26Se8bIhFOhkvIzigOBKh.4CVOHIXDTW3fAuSEegDPybIwrdp1cDsEQfQXpqrwWCOoL6JZalxraFSi.jSD6wNpuLPbykWURAvDUdosMrNPz1t67ePP.wk7XGmhlZNE6cYUr+kUwMtrJt4kUwstrJt8kUwcd4JJtm6OHgyBTkMFFGL7txOfyz7tTDjAJyVM9Ffl1.iC
            
    • rglides

              Here's a version where text input can set predefined knob values

              HiseSnippet 2560.3ocsZstabbaEdVaOtVpUEN.Asn.8GzpWvtH1J6tRZkRTr6FKa2n3aJVNtsPP0g6Lb0xpYImNCGYqpJzhzm.+BDf9.32g7y1Gh9Nz2fTda14vcmYzZgl0.xCIOW93gmygGNb1MgGPRS4IdMV34mDS7Z7i726DlXz1ivTl2N2yqwh9X7mti5ed28jXbZJIzqQiK+aUi2Xgq3o+8e+M2EGgYAjht77dAmFPdDcLUTz6t8eHMJ5A3PxyoiATuV+cB3rs4Q7LIVtreauXbvQ3CIOAqH6R9dMt58CoBdxdBrfj503J2kGdxdi3uhYn+EzT5fHhpQGu8jBxz8C3QgJDq50a6QznvcymyodRoragE3xFKv66+XZHcR+EVhqqG.UvAzdz3Rtv6xNvqCDdsAvqDH0..oqXfz64uWPBMVTLhBO+P+cXBRxPrzrCghgVuK8u9Y9aykTvDqLFeD4AIxFS3nYu1suIR9mVaszhKsnz3mJPGiSPOBOfD0AcaTNuGRDayGGyYxFMW1L7xSy0S4woUxiZPMGEz+LxvDR5nJYwN9TbcDiOnZn8P0nkvQ2Z4n6LykXLiD863IgpYzoKsHR9a4P1fW1Y4OFs+xAROpkuIZ4AIzCGITOEljM9WOPZ+yafYg4MGhS0z7mxXGFQTOgGOfJ0NRQHRRIBxYQGGbSfl6p0LgQRjS.Zfh5.4zHh7ZCRH3iFPvZEkwBkTkvk+upIijkvGlwNR0Hh9myn5tGSh3gRA4nkU0ZYDAe7IZwliKbhl6Q3DMuBRvHFGMMbQHqn9LdVJwZrr5AMR0WtPJZkK5vrAZTQYzw3HfUpPeZly4K2FZ5XxjvnYiwJjPhAJR1R8+RCC2L6xs.JSYjQ8Jyjdr3D9gR2uT5wyHbiMRjvioAFjNQGAxPYcOxkhfQpGRyXoD8bHBSCu0.YJM8Dxx8D3OwzsKO1Z3hj5Nx.FdrwvEGehQMDYHblVSgzz.skAGITzMArJAYrCAXQvHMiI3Ppl3wxLZoBoSyXsCSbtqSnJCNxpuT01.nYDpY9SYgThUujDFVnrTFrhj4nzyxioLgLAtVyD4TV8flOiPkS4ylJv6UxXtmyUAku.GkQ.gee3GhzAfZRRsvQGH9wnSWVmVP9zlqHSooaol5qs1JsOKG51f0pod0MATWDPWMC85NEC4A8yIhzIFpl1tqBn0l7nZpWec.0kmfoFXM8TedXBN8cVi55rFUjxBJpNccj0FPzmmWqF56styJadxu5z.bwBlgrFdVGZVJRiVCGaB0hMWaMjuFj77zQ0P+pqWgIeUGStI+MTNq55YAmWyrJ6RaGnkVmrtFZaCnUuWQ0ztIDCkteR071qWIlA6dNNFB28dfBrqqDWesoP94yxFeDzvH29nFZWEFJmuEWMz2syrAy0QemYMl0QduMfy1y01ztjLKmKWc6V4Zja9AvtzNd9cbsfPHq2JuFh6B8Ys62WC48fqN5JApg3MfDaJWnNQCWXf0TTCOqsdkVN2v7Ikf.E1Gsl6hGLCio7jZndUnemoFl5jMbxYKzoNgCCXJpFpFN5BWzmpjo4bNOw.pqqxw7YquxIMSa20Zn9UkrTCsN6+ZpTqFp2zwkKubt5XvYGZcMe0AbmPVSgg0PNb6eGClajpsNRnf5MUNQXZTSwl0PcOHJAUjVGKPEXKasN73X0lTaacJ.5jVT.bMbr9FUX7bCVM0K6D12tc0EC.qptNlVsmqKpo1653XcXLddE50wPWnIwTGesHBJ+hp8O2Y9rGFHdTBNkT9QvKuB2RNwbQYakbP22kyw9tcfyZOyXcm5q5iwU3+V0wwrGqZx4pTFwH4gL1iDQBDjvcUuVCo4b4k0CSYQTFAIMNABJmgxhCwBxjWNRZyVKs3oEt0p2gyJoi3uR8lTR3QMGhiRIs1Jebyei3pIHkEmIjZx7thTuyklRi8qEK2ZEA+Q7WQR1Vtz1T8dWJXBGnb2mBjpgoCQMm9jg
6qUwAnabajpZ9gxYRXKzoKt.x9yvp5m1saE4tSZNqRRqnISNcLjGLByNjD1TM+JSfcmOA10Jvt.AtnqDOCQj1w4B5xvlKB.KXCBCq1yMxC4Inl5W7kb0C7FvZYF9zBoqojJWhZuEhh9D.s6qe7fUhHrCEiji9AePqB9.hHeU05mb6Ykw9zCZ4R+Trq945wnYbqYoRe9vo5+rhlfGUPBJyanbCa4HfyPPuRG.XHF.yYhXDIYjIFd6xdMjaCvJnnrfY.DJVROqr3aNyDNlK9f7H8aJS2J8STg6FILalfszxzFNmpdEpZYrMNJRUEWyojcqsJG.1Wp64i.kw8Fl9Px8bxRXSxFL2wDEjetwBZJ0+whP.CslnYaZpnncmryz9GrUogOFBpH1Qkv.Pk0iu5HkBMtRbV5HM+P2hylrtCKB4Vy8umI26iOllRR.BP6dWnXSHM5Nn1yNoL1kDsT1QlL90RCyiwhQqn5ZGlnoby9YE0sPcfShBeK6VEPd1GH7CJVPpzWsHL.rdNqOqq6Xqs.dpOgKHOk0r0hx7xKd1hnoGZ3vRGyJpHRRoCqtxqj5XrIKa7.RRd7PNgdMth6kK4W8kKAu6q.ykd.Hjy1gQEOMlvp5Fw7r2Thmj.KpjOIzWC0O1dMT2MSH3LOZnWiE7sVQOMjkcb07q+yqJ12KhFRRzr+C70WZyDls2FHp+7yb2oY9y568k6bOr.qtKM6rQNCiIIBpx303djioADyMqsf+8HoGI3wRwMwKxqw0LJco76cSEhp04070O2060va57DXiWQCEilzw2908GQTu64I8b8+QepfLN+tNW3pu8su8eqFnn2tpt+7O+g+GqcbtwUG.t5+U8OA1vEW8ylFWxdbw027M+yxv0adyalAWK5hKczrEWlHan85OBw0equJb2qwO0uzh5mxZ5IlF0d2o+fCKvrriuS0KNhdHarFaK3KWdeDYnvanzYXO5eA3q7I8gyhEpx5dUe0tyf4veGClCe0Y8mCT5Za+Um9EkYa+EC94esp6iyCnabYkObUHbF6rie4urec9kIkfQyRwOwWjbBBiBoCGRRTKIxSiNtDq2V8iwpwA.zZofV0qNGIOJPM5u55cLkk8NkfZnx7O2TMEp5aG.MPg8GiAFu+v6jXg1cTR8hc1q2+59OlGlEgEtesApOwB6.xcIbthe003yRohSfeBF+e6SPXdg364uKUDLpbLdoRvnLi72GXz9garj+8kNrAhB.dE+G76+94qzv6Y7LAkcnrPmDpb80+IYi2SFEGPjZmIyan1ktwkTYIMsaqZqr.6QXg5Fem7mcvNp1MrC1IePuw3fD9KCL6Dp9zPtltGIlX5uJlE7erpMZl8O87FSCouLHvUTyvX2KJiqdQYbsKJiqeQYr2EkwMtnLt44yn5CI5SyD7wlvFOuGu680krznw8YXoGn1a06+Akbi92.
              

        This could be really useful, but you can see how long you'd have to spend actually filling out the word and phrase library to make it work properly and feel like AI. Hence the L in LLM, I suppose.

    • HISEnberg @marcrex

        @marcrex I've not done this in HISE, but I've done it in JUCE: a text-to-impulse-response generator. Users could input a short text ("a big empty chamber"), and a server-side model (I used TangoFlux) generated the IR audio file. I used a tagging system to structure the prompts and a local GUI that sends HTTP requests to the model running on a GPU backend (similar to what David mentions).

        In short, it could definitely be done, but you would really need to narrow down specifically what you are trying to accomplish. If you're trying to generate audio from text (e.g., new samples or effects), you'll likely need to offload that to a server. Most users won't have the GPU horsepower for real-time generation locally, especially with modern diffusion models.

        If you want to use GPT to interpret text and adjust plugin parameters (the user says "create a metallic tap", and GPT understands: lower the filter cutoff, shorten the envelope, etc.), that's a little easier. You could probably get GPT to map descriptions to internal parameters using a set of structured rules or fine-tuned examples (so give it a structured breakdown of your parameters and guidelines on mapping them).
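        The "structured breakdown of your parameters" idea could be sketched like this in plain JavaScript (HiseScript is JavaScript-based, so the shape carries over; the parameter names and ranges are invented for illustration, not from any real plugin):

```javascript
// Hypothetical parameter table for a synth plugin (names and ranges invented)
const parameters = [
  { id: "filterCutoff", range: [20, 20000], hint: "lower = darker/warmer, higher = brighter" },
  { id: "envRelease",   range: [0.01, 10],  hint: "longer = pad-like, shorter = percussive" },
  { id: "reverbMix",    range: [0, 1],      hint: "higher = more spacious/lush" }
];

// Turn the table into mapping guidelines for the model's system prompt
function buildSystemPrompt(params) {
  const lines = params.map(p =>
    "- " + p.id + " (range " + p.range[0] + ".." + p.range[1] + "): " + p.hint
  );
  return "You map sound descriptions to plugin parameter changes.\n" +
         "Available parameters:\n" + lines.join("\n") + "\n" +
         'Respond ONLY with JSON like {"filterCutoff": 800}.';
}

const systemPrompt = buildSystemPrompt(parameters);
```

        The point is that the model never sees your internals, only a text description of what each parameter does and which values are legal, so its answers stay constrained to things you can actually apply.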

    • aaronventure @marcrex

        @marcrex Ship ollama or a similar layer with your plugin, along with a model of your choice. For this you probably want a good tool-calling model, like Devstral. You will need the super-light version. Unsloth has a super small quant https://huggingface.co/unsloth/Devstral-Small-2505-GGUF/blob/main/Devstral-Small-2505-UD-IQ1_S.gguf but I have no idea how well it runs. Check Gemma 1B as well.

                  Your system prompt needs to contain the functions that it has available for use and what each of them does.

                  Define these functions in HISE.
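        Those tool functions might look roughly like this (sketched in plain JavaScript; in HISE they would be HiseScript functions writing to your actual plugin parameters, and the names here are invented):

```javascript
// Hypothetical plugin state; in HISE you'd set real component/attribute values instead
const state = { warmth: 0.5, reverbMix: 0.2 };

// Tool functions the model is allowed to call (names invented for illustration)
function setWarmth(value) {
  state.warmth = Math.max(0, Math.min(1, value)); // clamp to the legal range
}
function setReverbMix(value) {
  state.reverbMix = Math.max(0, Math.min(1, value));
}

// The system prompt describes each tool so the model knows what it can do
const toolDescriptions =
  "setWarmth(value: 0..1) - higher = darker, rounder tone\n" +
  "setReverbMix(value: 0..1) - higher = more spacious";
```

        Clamping inside each function matters: the model will occasionally hand you out-of-range numbers, and the tool layer is the right place to make them harmless.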

        You'll run it by mounting the model (if it isn't already mounted) on plugin start, using a BackgroundTask. These small models should be fast to mount anyway, so it's best to have it unload itself automatically after an hour of not being used.

        The user can then write an instruction, and the model is fed the system prompt + instruction via BackgroundTask. The system prompt details what it can do; the instruction tells it what to do. You want to explicitly tell it to return a "function call" in a clearly specified format (for example doThis(arg1, arg2)) and for that to be the only output, no other text.

        You should then validate the return value of the BackgroundTask call by checking that it contains one of your functions, using string operations; if it validates, call eval(return). eval() will attempt to execute the string as code. In this case it's a function that you already defined, so it will run.
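        A minimal sketch of that validate-then-eval step, in plain JavaScript (the whitelist and function names are invented; only calls matching a known name with numeric arguments ever reach eval):

```javascript
// Whitelisted tool function (invented for illustration)
const state = { warmth: 0.5 };
function setWarmth(value) { state.warmth = value; }

// Only known function names with plain numeric arguments are allowed
// (setReverbMix is listed to show a multi-entry whitelist; define it before use)
const allowedCalls = /^(setWarmth|setReverbMix)\((\s*-?\d+(\.\d+)?\s*,?)*\)$/;

// Validate the model's raw return before executing anything
function runModelReturn(raw) {
  const call = raw.trim();
  if (!allowedCalls.test(call)) {
    return false; // reject anything that isn't a whitelisted call
  }
  eval(call); // only whitelisted names with numeric arguments get here
  return true;
}

runModelReturn("setWarmth(0.8)");    // executes; state.warmth becomes 0.8
runModelReturn("deleteAllFiles()");  // rejected by the whitelist, nothing runs
```

        The validation is what makes eval() tolerable here: the model's text is never executed directly, only a string that already matched a pattern you fully control.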

                  In the end, the brunt of the work is you designing the tools (functions) that the model can call.

        This is a very simple example; if you want to chain tool calls, you'll need to feed the output of the function back into the model along with the previous prompt so that it can continue doing the work if needed.
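        That feedback loop can be sketched like this (plain JavaScript; `fakeModel` is a stub standing in for the real model call, which in HISE would go through a BackgroundTask, and the tool names are invented):

```javascript
// Stub model: asks for one tool call, then signals it is finished
function fakeModel(messages) {
  const toolTurns = messages.filter(m => m.role === "tool").length;
  return toolTurns === 0 ? "setWarmth(0.8)" : "DONE";
}

const state = { warmth: 0.5 };
function setWarmth(v) { state.warmth = v; return "warmth set to " + v; }

// Feed each tool result back into the conversation until the model stops
function runChain(instruction) {
  const messages = [
    { role: "system", content: "Tools: setWarmth(0..1). Reply DONE when finished." },
    { role: "user",   content: instruction }
  ];
  for (let i = 0; i < 5; i++) {    // hard cap so a confused model can't loop forever
    const reply = fakeModel(messages);
    if (reply === "DONE") break;
    const result = eval(reply);    // assumes the reply was validated first
    messages.push({ role: "tool", content: String(result) });
  }
  return messages;
}

const transcript = runChain("make it warmer");
```

        The hard iteration cap and an explicit "DONE" sentinel are both worth keeping in a real implementation; small models in particular will sometimes keep emitting calls forever without them.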

                  If you want an example of how to use eval(), I did a cool experiment here https://forum.hise.audio/topic/9086/hise-s-best-kept-secret-dynamic-plugins

                  These are the limits of my current knowledge on the subject, some of these workflows may already be outdated, but hopefully I gave you a good starting point to go and explore.

                  Before you download ollama and start fiddling with local inference, you can try it out instantly via OpenRouter. https://openrouter.ai/mistralai/devstral-small:free

        This free endpoint will of course use any data you send for training, but you're not going to be sending any sensitive info, just your system prompt and whatever the user types in. For a simpler implementation (though obviously not as rock-solid) you can let the user enter their own OpenRouter API key and use the free endpoint that way, or provide your own via cloud functions, at which point it becomes a service. However, this way everyone can run the proper 24B model; on MacBooks with 32+ GB of RAM it runs fine, but on Windows, VRAM isn't super easy to come by.
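        For reference, a request to that endpoint could be assembled like this (plain JavaScript; the payload shape follows OpenRouter's OpenAI-compatible chat completions API, the API key is a placeholder, and nothing is actually sent here):

```javascript
// Build a request for OpenRouter's OpenAI-compatible chat endpoint
function buildRequest(apiKey, systemPrompt, userText) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    headers: {
      "Authorization": "Bearer " + apiKey, // user-supplied or your own key
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      model: "mistralai/devstral-small:free",
      messages: [
        { role: "system", content: systemPrompt },
        { role: "user",   content: userText }
      ]
    })
  };
}

const req = buildRequest("YOUR_API_KEY", "Tools: setWarmth(0..1)", "make it warmer");
```

        In HISE the actual POST would go through the Server API functions David mentioned; the important part is that the same system prompt + instruction structure works unchanged whether the model is local via ollama or remote via OpenRouter.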
