Recording MIDI from sequencer via OB (USB)

I’ve discovered that when recording into a DAW, playing notes from the trigs records MIDI live; however, it will not capture a running sequence. It seems that sequences do not transmit MIDI data, and I can’t figure out why users aren’t given that option.

According to this older thread Sequencer Midi receive via usb:

I’m wondering why this is a design choice. It doesn’t seem to serve much utility other than to limit users. If it’s one of those Apple-esque decisions to force users to remain in the ecosystem, that feels a bit silly: I bought the device because I like the synth engines most of all, and the sequencer is great for sketching. But I see no reason not to be able to continue the sequence from there in a DAW for final arrangement finesse, etc.

I understand the workaround of copying a pattern to another and switching all the tracks to MIDI output, but… ugh. If the machine is capable of outputting MIDI, it seems like it should be possible to do it without the extra steps.

I presume the processor can internally deal with 16 tracks of MIDI.

Either they are processed internally or sent externally, due to the throughput of the data.

Everything is a design decision to ensure things don’t start spluttering/glitching and impacting solid timing.

Similar to the PS5 not being an infinite memory/polygon device: they made decisions about the amount of memory, data bandwidth, etc. based on money/cost and how hot the device runs.

Though I’m just guessing; I’ve never designed a synth, a MIDI device, or a PS5… maybe you have and have expert knowledge of data management on this type of hardware platform.


It serves the low-latency goals of Overbridge by not including competing clock polling and overhead, especially when that would cover redundant functionality.

Overbridge provides USB MIDI and audio.

If you want plain class-compliant USB MIDI and audio, there is a mode for that.

If you want Overbridge’s USB MIDI and audio and class-compliant USB MIDI and audio at the same time, that’s extending the system past its design.

It’s not an arbitrary limitation; all OB devices are designed around this prioritization.

It isn’t putting two different hardware drivers in charge of timing.

It is “possible” from a purely technological standpoint, but you’d have the worst of all worlds, in that neither the OB nor the USB MIDI source would be as reliable.

The OB design is a custom driver with additional controls and an app that provides interactive access to parameters and settings. This supersedes the class-compliant USB driver, which only handles the standard MIDI/audio data streams, so it’s really one or the other.

We all have our workflows and possible use cases, but every digital device is going to wrestle in its own manner with potential conflicts of priority and standards, especially with a custom ecosystem.

Access handled their Total Integration controls differently, and with a much different internal prioritization than their Kemper amps, but perhaps it’s useful as a comparison.

Tech is faster/better since OB1: USB chipsets and computers have increased their bandwidth, and the further need to be OS-agnostic adds even more complexity to what they can get away with on Windows, macOS, etc. It’s great that people have coded up a Linux port for OB!

That said, there are plenty of factors we could discuss (I used to be a PM for a PC OEM ages ago and love the excuse to delve deep). The decisions were not made for no reason, and I think they’re worth considering when working out a more realistic workflow.

No instrument does everything! Sometimes we need to simplify even if it’s a pain compared to what could be.


Ah, ok, I wasn’t aware and simply hadn’t considered those details (figuring all was possible with current tech). But that all makes sense. I appreciate you taking the time to explain it, and you did so very clearly. I was otherwise thinking it was arbitrary or, like I said, an Apple-like decision to force users into the ecosystem, but now I understand it much better rather than being told “it is what it is, just because.”

Hopefully one day, as tech improves and bandwidth expands, something like that could be possible. Alternatively, I’d be curious if they could introduce some kind of “print sequence” or “send sequence to DAW” function, where it’s a separate task not requiring the audio and full bandwidth to be running simultaneously; not a real-time recording, but more like data being pulled over. I obviously have no idea what I’m talking about with regard to tech implementation (lol), but as a user I imagine something like that could be helpful/cool.
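Purely to illustrate the “data being pulled over” idea, not anything Elektron actually exposes: a rough Python sketch (using the third-party mido library, with made-up trig data) of what a non-real-time “print sequence” export might boil down to, writing a pattern out as a standard .mid file you could drop into a DAW.

```python
# Hypothetical sketch only: dump a 16-step pattern's trigs into a
# Standard MIDI File instead of streaming MIDI in real time.
# Requires the third-party mido library (pip install mido).

import mido

PPQ = 480            # ticks per quarter note
STEP = PPQ // 4      # one sequencer step = a 16th note
BPM = 120

# Made-up trig data: (step index, MIDI note, velocity, length in steps)
trigs = [
    (0, 48, 100, 1),
    (4, 51, 90, 2),
    (8, 55, 110, 1),
    (12, 60, 80, 4),
]

# Convert trigs to note_on/note_off events with absolute tick times
events = []
for step, note, vel, length in trigs:
    start = step * STEP
    events.append((start, mido.Message('note_on', note=note, velocity=vel)))
    events.append((start + length * STEP, mido.Message('note_off', note=note, velocity=0)))
events.sort(key=lambda e: e[0])

# Build the MIDI file, converting absolute ticks to delta times
mid = mido.MidiFile(ticks_per_beat=PPQ)
track = mido.MidiTrack()
mid.tracks.append(track)
track.append(mido.MetaMessage('set_tempo', tempo=mido.bpm2tempo(BPM)))

now = 0
for abs_time, msg in events:
    msg.time = abs_time - now   # delta ticks since the previous event
    now = abs_time
    track.append(msg)

mid.save('pattern_export.mid')  # drag this into the DAW's timeline
```

The point of the sketch is just that a one-shot file transfer like this wouldn’t need to compete with the OB audio stream for real-time bandwidth.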

Thanks again!


Yes, the OB1 generation has been superseded. OB2 still supports those devices and older platforms; we will see, as USB develops further, how this might be factored in.

Right now, I don’t see much change, but who knows? There are a lot of moving parts beyond Elektron: there are Microsoft and Apple, there is the supplier of the basic native driver, there is the chipset support of the existing Digi2 platform, and there is the extra customization Elektron builds on top of it all (beyond DAW support, VST3, etc.).

No prob! I’m happy to have at least provided some abstract reasoning that clicked. I don’t work for Elektron, so I still have to piece together likelihoods from the outside and from what I know of the history of OB and computing platforms.

My spouse might angle to get a position in Sweden, so maybe I’ll just annoy the front desk at Gothenburg some day :slight_smile:

This might be an interesting workaround, so what is the intended goal? Can you set up a Song, record those MIDI patterns into the DAW, then flip to OB and record the audio/OB automation lanes separately?

It’s an extra step of course.
