
Send with timestamp #45

Open
chris-zen opened this issue Dec 27, 2018 · 21 comments

@chris-zen
Contributor

chris-zen commented Dec 27, 2018

Hi,

I was about to use midir, but realised that it is missing a very important feature: the ability to specify a future timestamp at which the MIDI data should be sent (see this). I think this is especially important when working with audio and MIDI in sync, where maximum precision is needed (implementing the scheduling of MIDI events myself with a thread is not an option, as it would certainly lead to sync problems).

I was wondering if you were considering adding this feature at some point (assuming it is actually possible to find a common model for timestamps and synchronisation across platforms).

@Boscop

Boscop commented Dec 28, 2018

Windows only supports sending msgs "right now", not "at some point t in the future":
https://docs.microsoft.com/en-us/previous-versions/dd798475(v%3Dvs.85)
https://docs.microsoft.com/en-us/previous-versions/dd798481(v%3Dvs.85)
https://docs.microsoft.com/en-us/previous-versions/dd798474(v%3Dvs.85)

So this feature would have to be built on top of midir as a separate crate.
I also implemented my own scheduling of MIDI msgs before they are sent out with midir
(though my MIDI scheduling requirements are different from yours).
But I think it wouldn't be difficult to implement audio/MIDI sync, with latency compensation etc.
Either way, it would be very specific to your application (just as my scheduling is specific to mine).

@Boddlnagg
Owner

Yes, midir unfortunately doesn't support this, because it's not available for all underlying platform APIs ...

@chris-zen
Contributor Author

Thanks for the answers, it seems it won't be a feature :-(
thanks anyway, best.

@chris-zen
Contributor Author

@Boscop @Boddlnagg I am reopening to explore another possibility. Would you consider midir providing an internal scheduler for backends that don't support this natively, while using the native features of those that do? That way, midir would offer a more advanced interface, with a fallback scheduling implementation for the backends without native support. I am mainly interested in Linux and Mac, whose backends (ALSA, CoreMIDI) do support specifying timestamps in the future.

@chris-zen chris-zen reopened this Dec 28, 2018
@Boddlnagg
Owner

@chris-zen Yes, this is a possibility that I've also thought about (which is why I hadn't closed this already).
However, my knowledge of timers etc. on Windows is currently too limited to implement this myself (maybe there's a crate out there that already implements this scheduling functionality?).

Do I understand you correctly that what you need is a variation of MidiOutputConnection::send with a timestamp parameter? This is also what WebMIDI does (see https://webaudio.github.io/web-midi-api/#midioutput-interface, cf. #15). I remember that I once had a look at the WebMIDI implementation in Chromium, and saw that they used platform-specific timer APIs (apart from the MIDI APIs) even for ALSA and CoreMIDI, probably because they needed to integrate their scheduling with the rest of the browser's scheduling functionality. This confirms the point that @Boscop is making, that there are varying requirements as far as scheduling is concerned, and it's another reason why I have so far been hesitant about this.

@chris-zen
Contributor Author

Do I understand you correctly that what you need is a variation of MidiOutputConnection::send with a timestamp parameter?

Exactly that.

I remember that I once had a look at the WebMIDI implementation in Chromium, and saw that they used platform-specific timer APIs

I guess that this is because you need to translate/sync timestamps between different clocks/frameworks. For example, CoreMIDI requires timestamps in mach host time, which might be different from the clock used by Chromium for execution and JS timers.

there are varying requirements as far as scheduling is concerned

On that point, I would just look at existing APIs/frameworks and follow their general contract, which would consist of buffering the MIDI messages to be sent in the future in a heap, repeatedly comparing the head of the heap against the current time (according to some reference clock) in a thread, and sending messages to the output port whenever they are due. Obviously, in practice, things are more complex than what I describe if you want to be precise and efficient, but keep in mind that this is just a fallback for the cases where the native implementation doesn't support this feature.

I have so far been hesitant about this.

And I can fully understand it, it is not a trivial amount of work.

It is sad that the common denominator of all those systems has to be so low. I wonder what kind of applications midir is trying to address; I couldn't figure out how to use it in my own, where high-precision timing is a requirement and where Linux and Mac are the main targets.

It is curious that my main motivation to start the coremidi crate was to help midir become compatible with Mac, and now I can't use it because Windows doesn't support this feature :-P

My knowledge of Windows is limited too (although I remember from ages ago that there was a high-resolution timer in the Win32 API), so I am sorry I can't be more helpful this time.

@chris-zen
Contributor Author

@Boscop by the way, I just took a look at the Windows APIs and saw a set of functions for working with streams of MIDI data, which seem to support sending MIDI data at a future time. I don't know how difficult they would be to use, but they are worth a look.

MIDIEVENT structure See the dwDeltaTime field.
midiStreamOut function
midiStreamOpen function

@Boscop

Boscop commented Dec 30, 2018

I wasn't aware of the stream functions and I'm not sure if they can be easily unified with the coremidi way, e.g.:

Before playing queued MIDIEVENTs, you'll want to set the Timebase for the stream device. This is equivalent to the MIDI File Format's Division. It tells the stream device how to scale the dwDeltaTime field of each MIDIEVENT. (ie, Consider it a SMPTE time in 30 fps, or a time-stamp at 96 PPQN, or a time-stamp at 120 PPQN, etc).

Is it the same in coremidi?

Btw, here is another tutorial.

Anyway, even if midir ends up supporting scheduled message sending, I think the current sending functions should be kept and should still call the direct winapi functions for minimum latency & CPU usage (e.g. my application needs the lowest latency and sends a lot of MIDI messages, multiple kB/s).


It is curious that my main motivation to start the coremidi crate was to help midir become compatible with Mac, and now I can't use it because Windows doesn't support this feature :-P

Hm, if you only need to support Mac, you could just use the coremidi crate directly for now..

@chris-zen
Contributor Author

chris-zen commented Dec 30, 2018

@Boscop every backend will have a different MIDI clock system (with the main characteristics of being monotonic and high resolution). For example, in the case of CoreMIDI it is the mach host time, in WebMIDI it is DOMHighResTimeStamp (the number of milliseconds since navigation to the page started), and in Windows it is what you described, which allows the programmer to choose between different types of clocks (PPQN, frames, ...). The mission of midir would be to provide a unified/consistent clock view and translate timestamps behind the scenes into the native one. Just as an example, you might choose an f64 representing seconds since some arbitrary point in time (e.g. when the computer started; it doesn't matter, as long as it is monotonic and high resolution). midir would also allow getting the current timestamp, so the application can use it as a reference for creating timestamps for the send call. Then midir would convert all timestamps received through the send call into the internal clock units and delegate to the backend implementation.

The difficult part would be to find a unified/consistent clock view such that there is minimal error when translating between the different backends' clocks. For most of the backends (ALSA, CoreMIDI, Jack), the native implementation allows proxying the send calls from midir to the native system with just some timestamp conversion (please correct me if I am wrong). In the case of Windows, it seems like the events received through the send call would be put into a queue so the internal stream process can pick them up whenever the callback is called from the MIDI stream.

Definitely this is not trivial. But it is the feature that makes the difference between merely controlling devices and sequencing music.

Anyway, with my comments I am just trying to help with the brainstorming, in case you are really considering working on this.

In my case, I am mainly interested in Linux and Mac, and will end up implementing my own midir for them (using coremidi and either alsa or jack), but it would be nice to have this in a portable library 😉

@chris-zen
Contributor Author

@Boscop @Boddlnagg I saw that portmidi also supports specifying a timestamp when sending data, and it supports Windows too. After taking a look at their implementation, it doesn't seem like it would be that difficult for midir to support this feature at all.

https://sourceforge.net/p/portmedia/code/HEAD/tree/portmidi/trunk/pm_win/pmwinmm.c#l733

The key seems to be the use of the streaming API.

@Boddlnagg
Owner

Okay ... so the streaming API allows timestamps, or rather, enforces them ... when PortMidi uses the streaming API and wants to send a message immediately (timestamp = 0), it manually computes the correct timestamp. This seems suboptimal, and it's probably also the reason why PortMidi uses both the non-streaming and the streaming API, depending on how the user initializes the PortMidi context (latency == 0 or latency != 0).

Furthermore, in the WinRT/UWP implementation, which midir also has as a possible backend, there seems to be no equivalent to the streaming API. For sending there's only an API that ignores the timestamps (see https://docs.microsoft.com/en-us/uwp/api/windows.devices.midi.imidimessage).

Because of this, I don't think it's a good idea to use the streaming API: we would still need a manual scheduler implementation for the WinRT API (unless building a scheduler with the WinRT APIs themselves would be a lot easier than with the classic WinAPI, but I don't know that).

@Boddlnagg
Owner

Another option is to have an extension API for this functionality that's simply not available on Windows. This is already the case for our virtual ports support.

Would that help anyone here?

@Boscop

Boscop commented Sep 28, 2019

Well, I only use midir on Windows thus far..
Btw, how difficult would it be to implement a virtual port driver in Rust for Windows?

@Boddlnagg
Owner

Boddlnagg commented Sep 30, 2019

Btw, how difficult would it be to implement a virtual port driver in Rust for Windows?

Well, I don't know, since I have never done any driver work (neither in Rust nor in any other language). But the benefit of such an effort seems questionable to me. A driver can't be made part of the midir library directly; it will always need to be installed system-wide before any application can use it. At least that has been my understanding so far, and I don't see why it would be any different when the thing is written in Rust.

@Boscop

Boscop commented Jan 13, 2020

@chris-zen Do you know what the advantage is of using the Windows MIDI streaming API over the normal MIDI out API? With both, I have to do the timing myself (there is no external clock that calls my callback to collect all the MIDI msgs to be sent for that timeframe). The only difference seems to be that with streaming, you queue the MIDI events with a timestamp and then flush the queue[1]. So their relative timing will be more correct, but the latency will be higher (because you're not sending them out immediately), right?
Do you know if there's any way to get an external clock calling my callback that returns the MIDI msgs to be sent (with timestamps representing the offset from the start of the "buffer"), like with audio output?

@chris-zen
Contributor Author

@Boscop I think that the point of those APIs is to avoid callbacks and spare the user from dealing with precise timing. The user just queues the events that need to be sent, with their respective future timestamps, and the API takes care of delivering them accurately on time. That's all. At least that's how CoreMIDI works, and how the portmidi interface is built on top of the Windows streaming API. Those OS libraries have realtime threads that can deal with high-precision timing, which a mortal app cannot. That's why it was so important to me that midir implement this interface; otherwise it is useless for serious applications like DAWs, MIDI players, and so on.
When are those timestamped MIDI messages handed to the API? I usually have an audio thread with a callback that takes care of selecting the MIDI messages that need to be scheduled for that specific buffer period, that's all. midir doesn't need to provide those callbacks, because AFAIK people usually use audio callbacks or other OS features for high-precision timing callbacks.

@Boscop

Boscop commented Jan 21, 2020

@chris-zen Doesn't that queueing add latency, though, compared to sending the msgs out directly instead of queueing them?
Also, in the absence of audio callbacks, which OS features do you recommend for high-precision timing callbacks? I wrote an iterator that uses time::precise_time_ns internally, but when I sleep the thread for the remainder of each frame, the sleep call might wake up at the wrong time; it doesn't have good time resolution.
I need to send low-latency MIDI out. Currently I'm doing it without queueing, and I don't have an audio callback: I'm streaming MIDI from my live performance application into the DAW, which opens the ASIO device, so I can't also open the ASIO device JUST to get a precisely timed callback. Any idea how to get the tightest timing in this scenario? :)

Maybe this would be more accurate than a loop that sleeps for the remainder of the time each frame?: https://docs.rs/crossbeam-channel/0.4.0/crossbeam_channel/fn.tick.html

@chris-zen
Contributor Author

@chris-zen Doesn't that queueing add latency though, compared to sending the msgs out instead of queueing them?

Whether that queuing happens is decided by the low-level OS API (CoreMIDI, ALSA, Windows MIDI stream) depending on the timestamp, so we as users don't need to bother with the details and can just expect it to be done in the most efficient way.

Even if you use a precise timer, your code is not running in a real-time thread, so you will get unexpected delays when the thread wakes up. I am surprised that ASIO doesn't allow two processes to process audio at the same time. But assuming ASIO has that limitation, you still have other audio drivers or devices that you can use for the callbacks. And even if you didn't have any audio device left, depending on the OS there are services that provide precise callbacks (I used one on Windows many years ago, but I don't remember which), though you might need to configure your scheduling thread for real-time priority. You might be interested in this.

Honestly, if portmidi supports specifying the timestamp when sending MIDI on all major platforms (including Windows), I don't see why midir couldn't do it. As for the scheduling that applications need in order to send MIDI events in chunks, you should look into the native multimedia services (or a library that wraps them, for example cpal), or an OS service providing precise callbacks with real-time priority threads.

@faern
Contributor

faern commented Apr 13, 2020

It could be implemented only for the platforms that support it natively, ignoring Windows, via a unix extension trait: midir::os::unix::MidiOutputConnectionExt, providing MidiOutputConnectionExt::send_at, which takes an extra parameter: the timestamp at which the message should be sent.

@Boddlnagg
Owner

@faern I already suggested this basically in #45 (comment), but I'm not sure if it would really help. I'd still like midir to be as portable as possible, and having virtual ports only on non-Windows is already kind of sad ...

@faern
Contributor

faern commented Apr 13, 2020

Sorry. I searched the history for "trait" only, so I missed it. Most of the thread was about how to implement the scheduling so I did not read every message.
