Made an MCP server that lets you design sounds on the Digitone 2 with natural language

happy accidents? why use a sequencer when you can just play it?

3 Likes

Would there be any way to get the program to control other aspects of the device? Use natural language to describe a default project template: set things like the project name and other boring-to-set-up settings, eliminating menu diving. Maybe it could set some random presets on specific tracks, or use the tag system in the browser to pull presets that you've designed and stored.
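The preset part seems like the most tractable bit, since presets can usually be recalled with plain MIDI program change messages. Purely as a rough sketch of what an extra tool on a server like this might look like (the tool, the port matching, and the track-to-channel mapping are all hypothetical, not anything from the actual project):

```python
# Hypothetical sketch: an extra MCP tool that recalls a preset on a given track
# by sending a MIDI program change. Assumes tracks 1-16 map to MIDI channels 1-16,
# which depends entirely on how your Digitone's MIDI channels are configured.
import mido
from fastmcp import FastMCP

mcp = FastMCP("digitone-extras")  # hypothetical server name

@mcp.tool()
def load_preset(track: int, program: int) -> str:
    """Send a program change (0-127) on the MIDI channel assumed for this track."""
    ports = [p for p in mido.get_output_names() if "Digitone" in p]
    if not ports:
        return "No Digitone MIDI output found"
    with mido.open_output(ports[0]) as port:
        # mido channels are 0-based, so track 1 -> channel 0
        port.send(mido.Message("program_change", channel=track - 1, program=program))
    return f"Sent program change {program} on track {track}"

if __name__ == "__main__":
    mcp.run()
```

Whether project-level settings like the project name are even reachable over MIDI is another question, so the template idea would probably need more than this.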

I could see that being pretty useful.

Just leave the sequencer alone, haha

1 Like

the vast majority of people were fine with sampling and synthesizers!

the big deal was generally a legal one (sorry Biz :frowning: )

1 Like

Cool project. Don't let the hand-wringing get to you. Socrates was skeptical of writing because he thought it would make people forgetful. Like almost any technology, AI will undoubtedly change music for the better (and worse)!

I think it’s interesting more from a user interface perspective than an AI perspective.

Especially with FM synthesis parameters that are fairly complex and a bit esoteric.

I mean, why can't there be 128 different words that might describe the 0-127 MIDI value steps of a given parameter?

Why couldn’t there be phrases that describe LFO rate, shape and depth?
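That mapping doesn't even need to be clever to be useful. A toy sketch of the idea (the vocabulary, the scaling, and the parsing are all made up just to show the shape of it, not anything the MCP server actually does):

```python
# Toy sketch: spread a vocabulary of descriptive words evenly across the
# 0-127 MIDI value range, plus a very naive phrase parser for LFO depth.
LFO_DEPTH_WORDS = [
    "off", "barely", "subtle", "gentle", "light", "moderate",
    "noticeable", "strong", "heavy", "deep", "extreme",
]

def word_to_cc(word: str, vocabulary: list[str]) -> int:
    """Map the Nth word of a vocabulary onto the 0-127 range."""
    index = vocabulary.index(word)
    return round(index * 127 / (len(vocabulary) - 1))

def phrase_to_lfo_depth(phrase: str) -> int:
    """Pick out a known depth word from a phrase; default to 'moderate'."""
    words = [w.strip(",.") for w in phrase.lower().split()]
    depth = next((w for w in words if w in LFO_DEPTH_WORDS), "moderate")
    return word_to_cc(depth, LFO_DEPTH_WORDS)

print(phrase_to_lfo_depth("give me a subtle slow wobble"))  # 25
```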

1 Like

Has anyone made something like this for an EaganMatrix synth yet? I feel like it’s going to be particularly powerful for that kind of complexity.

I think it won’t be too long before a lot of sound synthesis is done entirely by smart prompting of different AI systems, generating novel DSP code directly rather than just generating patches or samples as seems to be most common currently.

This doesn't necessitate putting the human operator out of the loop on the interesting creative decisions about how to shape a sound. All most people are seeing right now is short natural language prompts going in one end of an AI and some output coming out the other, but with a little imagination it's easy to envisage a system where the human is much more integrated, shaping the sound in a back-and-forth with the system. I think this should probably involve much better physical interfaces, haptics and so on, and I strongly suspect it would involve something much more elegant than the typical LLMs we have at the moment; people are working on such things.

2 Likes

Exactly what my mind went to. I can teach myself the basics of FM synthesis to the point that I don't want an AI assistant. EaganMatrix, Kyma, M4L though… I could certainly see some help being needed to open those up to more curious people.

1 Like

That's also where I'd like to see AI implementation going, if it becomes more widespread. There are lots of repetitive things you do on Elektrons that are more like organizing office work. I'd be fine with AI helping me do that faster and with fewer errors.

Nobody complains about track swap and how it robs you of the human experience of manually copying tracks from A to B and messing up at some point. If AI could help us, why don't we start there instead of with the creative process?

However, those functions should IMO always be opt-in and optional. My nightmare scenario is a world where all kinds of devices remove the possibility for me to do things a certain way without discussing it with an AI. I see a bit of a bad trend in UI design toward giving you less and less control.

2 Likes

Absolutely

3 Likes

Absolutely. :100:

This is exactly my experience (FWIW) using AI coding assistants: get them to do the drudge work that you find boring and repetitive, so you can apply your skills and experience elsewhere.

4 Likes

In my opinion, nothing compares to learning by just twisting knobs and paying close attention.

2 Likes

Beautiful, very impressive!

great idea, I just tried but I must have made an installation error …

1 - Python 3.10+: OK
2 - uv and uvx installed: OK (uv 0.6.10 (f2a2d982b 2025-03-25))
3 - claude_desktop_config.json modified: OK

In Claude I use the same prompt as in the video, but I just get another window that appears with this:

Digitone 2 MCP Dark Bass Sound Design (Wavetone Mode):

  1. Oscillator Configuration
  • Select Wavetone mode
  • Choose dark, complex wavetables:
    • Wavetable 1: Thick, low-end focused wave
    • Wavetable 2: Saturated/distorted wave
    • Wavetable 3: Sub-harmonic wave
  2. Modulation Strategy
  • Use multiple operators for rich texture
  • Primary oscillator: Deep fundamental
  • Secondary oscillator: Add harmonic complexity
  • Modulation amount: High (60-80%)
  • Crossfade between wavetables for movement
  3. Pitch and Modulation
  • Pitch modulation:
    • Subtle pitch drift
    • Slow LFO modulation (0.1-0.3 Hz)
  • Use micro-pitch variations for depth
  • Slight detuning between oscillators
  4. Envelope Shaping
  • Amplitude envelope:
    • Slow attack (20-50 ms)
    • Medium decay (200-300 ms)
    • Low sustain
    • Medium release (150-250 ms)
  • Filter envelope:
    • Moderate cutoff modulation
    • Resonance for additional character
  5. Filter and Processing
  • Low-pass filter:
    • Cutoff: 100-300 Hz
    • High resonance
    • Slight self-oscillation
  • Saturation/distortion:
    • Soft clipping
    • Minimal drive (20-40%)
  6. Additional Techniques
  • Use unison mode for width
  • Apply subtle chorus effect
  • Add light reverb for space
  • Gentle side-chain compression

Pro Tips:

  • Experiment with wavetable crossfading
  • Layer multiple patches for complexity
  • Use microtuning for organic feel

Didn’t work? Is your Digitone possibly connected to other software via USB MIDI—like Transfer, for example? That could interfere.

You should have a log generated by Claude—if you share it here, I can help debug the issue.

Also, make sure to specify which track (1 to 16) you want Claude to target.

Try something like:

“Use Digitone MCP to design an evolving dark pad using the Wavetone machine on track 1.”

Finally, check the developer section for any errors.

If everything’s set up correctly, it should work and generate some wild sounds.
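If you want to rule out the MIDI side independently of Claude, here's a quick standalone check from Python (this uses mido purely as a separate test; it is not necessarily what the server itself uses):

```python
# Standalone check: is the Digitone visible as a MIDI output, and can the
# port actually be opened, or is another app (e.g. Transfer) holding it?
import mido

ports = mido.get_output_names()
print("MIDI outputs:", ports)

digitone = [p for p in ports if "Digitone" in p]
if not digitone:
    print("No Digitone port found - check the USB connection and close Transfer.")
else:
    try:
        with mido.open_output(digitone[0]):
            print(f"Opened {digitone[0]} successfully.")
    except Exception as exc:  # broad catch is fine for a quick sanity check
        print(f"Port exists but could not be opened (maybe in use elsewhere): {exc}")
```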

1 Like

I just checked that all the libraries were installed, everything is OK

I have just entered developer mode; the main error is:
“spawn uvx ENOENT”

If I understood correctly: the server tries to run "uvx" but does not find it

I have just verified with:
uv --version
python -c "import fastmcp; print(fastmcp.__version__)"

everything is ok

Logs :
2025-03-27T16:07:00.376Z [info] [Digitone 2] Initializing server…
2025-03-27T16:07:00.401Z [error] [Digitone 2] spawn uvx ENOENT
2025-03-27T16:07:00.402Z [error] [Digitone 2] spawn uvx ENOENT
2025-03-27T16:07:00.406Z [info] [Digitone 2] Server transport closed
2025-03-27T16:07:00.406Z [info] [Digitone 2] Client transport closed
2025-03-27T16:07:00.407Z [info] [Digitone 2] Server transport closed unexpectedly, this is likely due to the process exiting early. If you are developing this MCP server you can add output to stderr (i.e. console.error('...') in JavaScript, print('...', file=sys.stderr) in python) and it will appear in this log.
2025-03-27T16:07:00.407Z [error] [Digitone 2] Server disconnected. For troubleshooting guidance, please visit our debugging documentation
2025-03-27T16:07:01.912Z [info] [Digitone 2] Initializing server…
2025-03-27T16:07:01.928Z [error] [Digitone 2] spawn uvx ENOENT
2025-03-27T16:07:01.928Z [error] [Digitone 2] spawn uvx ENOENT
2025-03-27T16:07:01.930Z [info] [Digitone 2] Server transport closed
2025-03-27T16:07:01.930Z [info] [Digitone 2] Client transport closed
2025-03-27T16:07:01.930Z [info] [Digitone 2] Server transport closed unexpectedly, this is likely due to the process exiting early. If you are developing this MCP server you can add output to stderr (i.e. console.error('...') in JavaScript, print('...', file=sys.stderr) in python) and it will appear in this log.

Yes, uvx not found. Are you installing this on Windows?

Try

which uvx

Which uvx gives me this:

Yes, I installed on Windows, and I copied it into this file: claude_desktop_config.json

Sorry, my skill level is low

Please, could you post a tutorial on how to install on a Mac? Thank you very much, great contribution.

You need to have uvx on your PATH; it should be somewhere like "$HOME\.local\bin\uvx.exe"
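The catch on Windows is that uvx can be on your terminal's PATH while Claude Desktop, launched as a GUI app, still doesn't see it, which is exactly what "spawn uvx ENOENT" means. A quick way to check, plus the usual workaround (the paths below are examples only):

```python
# Check whether "uvx" resolves on the PATH this process sees.
# "spawn uvx ENOENT" from Claude means its process could not resolve it.
import shutil

found = shutil.which("uvx")
print("uvx resolves to:", found)

if found is None:
    # Usual workaround: put the absolute path into claude_desktop_config.json,
    # e.g. "command": "C:\\Users\\<you>\\.local\\bin\\uvx.exe" (example path,
    # use whatever `where uvx` prints for you), then fully restart Claude Desktop.
    print("uvx not on PATH here - use the full path in the config instead.")
```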

Did you follow the tutorial from this page? Installation | uv

Yes, I followed it correctly

C:\Users\me>where uvx
C:\Users\me\.local\bin\uvx.exe