Made an MCP server that lets you design sounds on the Digitone 2 with natural language

How does this work?

You just describe the sound you want to design on the DN2 in plain text, something like "Make a fat, dark bass sound." The tool connects to the Digitone over MIDI and builds the sound for you using Claude Desktop.

This tool uses Claude Desktop; Claude is Anthropic's competitor to ChatGPT.

Project open source here → GitHub - zerubeus/elektron-mcp: MCP server for controlling Elektron devices using LLMs
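If you're curious how that works under the hood, here's a minimal sketch of the idea (simplified, not the actual repo code): an MCP tool that Claude Desktop can call to push a parameter change to the synth over MIDI, using the Python MCP SDK and mido. The port name and the tool shape are just illustrative.

```python
# Minimal sketch, not the real elektron-mcp code: one MCP tool that
# Claude Desktop can call to set a synth parameter over MIDI.
# The MIDI port name is a placeholder -- adjust for your setup.
import mido
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("digitone")
port = mido.open_output("Elektron Digitone II")  # your MIDI port name here

@mcp.tool()
def set_parameter(cc: int, value: int, channel: int = 0) -> str:
    """Send a MIDI CC message to the Digitone to change one parameter."""
    port.send(mido.Message("control_change", channel=channel,
                           control=cc, value=value))
    return f"sent CC {cc} = {value} on channel {channel + 1}"

if __name__ == "__main__":
    mcp.run()  # Claude Desktop talks to this over stdio
```

Claude decides which parameters to set from your description; the server just translates those tool calls into MIDI messages.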

17 Likes

Does this actually connect to the DN2 and manipulate the program?

Exactly. You write the text in natural language and it applies the sound design directly to the DN2; you can see in the demo how the parameters change automatically on the DN2's screen.
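Concretely, "applies it over MIDI" means messages like the hedged sketch below. Many Elektron parameters are addressed via NRPN (two CCs for the address, one for the value); the address and value here are placeholders, not real DN2 numbers — check the MIDI appendix of the manual for the actual ones.

```python
# Hedged sketch: sending one NRPN parameter change with mido.
# CC 99/98 select the parameter address (MSB/LSB), CC 6 carries the value.
# The address and value below are placeholders, not real DN2 numbers.
import mido

def send_nrpn(port, msb, lsb, value, channel=0):
    for control, val in ((99, msb), (98, lsb), (6, value)):
        port.send(mido.Message("control_change", channel=channel,
                               control=control, value=val))

with mido.open_output("Elektron Digitone II") as port:  # adjust port name
    send_nrpn(port, msb=1, lsb=20, value=96)  # placeholder parameter
```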

1 Like

Interesting idea. Asking the obvious question: can it work with the OG Digitone?

2 Likes

This is rad. Coincidentally, I’ve been using quite a few MCP servers at work in my IDE. Impressed!

wow, need to test this! thank you for sharing!

Nope, I don’t have the OG DN to test.

can’t try, but e.g. if you said …

hey claude, make 128 variations of the generic sounds of the impending soulless dystopian future where creativity has been run through a blender

… would it make a good soundpack, not that i’d want to use that (or any soundpack that somebody else made tbh) … i’m genuinely (if somewhat sardonically and rhetorically) just wondering how long it will be before we have bots selling soundpacks on here!

Kudos on your skills in accomplishing your vision. i’m not sure it’s a direction of travel (within this world) that i, for one, can applaud or relish, although i am sure somebody will get happy accidents from it … i personally couldn’t imagine feeling true creative ownership of anything regurgitated by a process like this, unless i had trained it in my own way

Before anyone starts suggesting this is luddite diatribe, i’m no luddite; i have built and trained neural networks in a serious professional context long, long before AI was the latest marketing thing - i just know that what moves me are the witness marks of a human touch - and that includes the creative intellectual leaps to do meaningful good with a method like this

so whilst it’s impressive you can do this, it’s giving me a mildly depressing foretaste of an unpalatable hollow end game, even if only as a proudly defiant spectator to it … ymmv

25 Likes

Truth :pray:

1 Like

Aren’t randomized parameters already built into the Digi devices??

9 Likes

While this certainly demonstrates a comprehensive level of coding, for which I can respect the many hours spent, any effort to remove our humanity from the creative process just makes me profoundly sad.

In all honesty, I see no point to music, art, or literature that is created this way, and don’t want to live in a world wherein this might qualify someone as a musician or music producer.

I can only hope that the technology on display here gets put to better use, in what I see presently as a bleak and soulless future.

Sorry, zebra. I envy your grasp of programming though, and it should serve you well.

Cheers!

4 Likes

I found it a very interesting tool with a lot of potential. I like to think of an LLM as a tool, not a replacement for creativity, the same way a screwdriver helps you remove screws more effectively than trying to use a spoon or a knife.

However, I must say that I’m not really familiar with the technical steps for setting this up on my computer. Maybe a foolproof step-by-step guide would be very useful for those of us not familiar with code?

2 Likes
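Not a full guide, but the general shape: Claude Desktop picks up MCP servers from its claude_desktop_config.json, where each server is an entry with a command to launch it. The command, path, and filename below are placeholders; follow the exact values in the repo's README.

```json
{
  "mcpServers": {
    "elektron": {
      "command": "python",
      "args": ["/path/to/elektron-mcp/server.py"]
    }
  }
}
```

After editing the config, restart Claude Desktop and the server's tools should show up in the chat.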

Appreciate the kind words—thank you! I also can’t stand fully AI-generated art. I built this as a sound design assistant because I’ve been struggling with FM and wanted an LLM to guide me, almost like a copilot on my sound design journey. The main goal is to learn from the happy accidents it creates—letting the computer handle the boring, non-creative stuff while I focus on the creative part.

Thanks again!

3 Likes

Excellent metaphor for the use of AI/LLMs.

2 Likes

But that’s not what @Zerba is doing. He’s using AI as an assistant in the first part of the process, shaping an initial sound.

1 Like

This is super cool! I have an OG Digitone and would love to try this out.

The day when we can describe a pattern/kit/song like the ones mentioned in this thread, have a computer immediately produce it via SysEx, then ask it to play it back and modulate some parameters, is almost here.

We can focus on more important things for our fingers and thumbs like ear wax and tiktok.

4 Likes

Well, from my side, I think this is a well-done example of how to use AI tech as a tool, not a replacement. No matter how much gear you have, if the creativity is not there, it won’t make a difference. Yes, you can just randomize parameters over and over until you find something you like and tweak it from there; if that works for you, all good.

With this tool, however, you can just input whatever idea you have in mind as text, the LLM will show you the result of your prompt, and then, just like with randomize, you tweak it to your taste - or not; maybe it’s nonsense and you try another idea, and so on. I don’t see why not explore that avenue; I definitely would. One thing is for sure: the LLM won’t make you a great artist.

And finally, what’s more experimental than writing down an idea, feeding the text to the machine, and seeing what you get back? Over time you can learn how to prompt the LLM to find the results you like, and like Zerba said, you can learn more about how this fine piece of hardware works. I think this is a really smart use of AI.

I just need a brief tutorial on installation as I don’t know sh+t about coding :rofl:

2 Likes

To me, something like this could be very useful for someone with physical disabilities. But for everyone else, if you already know what to tell this software, why not tweak those knobs yourself?

1 Like