
iOS support #237

Open
melanchall opened this issue Jan 17, 2023 Discussed in #235 · 27 comments
Labels
feature request API changes required
Milestone
Future

Comments

@melanchall
Owner

Discussed in #235

The discussion has been moved to this issue for more convenient conversation. Key points:

  1. For iOS we need a .a file statically linked into the application.
  2. An issue was created on the MAUI repo: DllImport with .a file for iOS in MAUI (dotnet/maui#12675).
  3. This comment – DllImport with .a file for iOS in MAUI dotnet/maui#12675 (comment) – contains instructions on how the .a file should be attached and used.

This issue is for discussing iOS support within the DryWetMIDI library.

@melanchall added the feature request and API changes required labels and removed the question label Jan 17, 2023
@melanchall
Owner Author

Hi @ayaphi,

So I've modified your test solution, including all the sources of my library, and changed all the DllImports to specify __Internal. I've also replaced the fat.a file with the Melanchall_DryWetMidi_Native64.a one, which contains all the native API required by DryWetMIDI. So we are getting closer to the real case.

Here's the solution – TestifyWithDwm.zip. I've written some simple code there:

using System;
using System.Linq;
using System.Text;
using Melanchall.DryWetMidi.Multimedia; // the Devices namespace in older versions

var inputDevices = InputDevice.GetAll();
var outputDevices = OutputDevice.GetAll();

var stringBuilder = new StringBuilder();
stringBuilder.AppendLine("Input devices:");
stringBuilder.AppendLine(string.Join(Environment.NewLine, inputDevices.Select(d => d.Name)));
stringBuilder.AppendLine("Output devices:");
stringBuilder.AppendLine(string.Join(Environment.NewLine, outputDevices.Select(d => d.Name)));

// put stringBuilder.ToString() somewhere to see the result

As for the last line, I really don't know where it would be handy to output stringBuilder.ToString() in the UI so we can see whether the API works or not. It's just simple code that doesn't use many things from the native binary, but we need to start from something small, I think. We'll go further if this code works.

Also I should mention that right now I don't know how to build the real native API for the x86_64 architecture (I get a lot of compilation errors), so Melanchall_DryWetMidi_Native64.a is in fact built for arm64 only. As far as I understand, that's not a problem for you, since you run the simulator on arm64.

Can you please check the new solution on iOS?

Many many thanks!
Max

@ayaphi

ayaphi commented Jan 17, 2023

Hi Max,

I tried your solution, but sadly I got compilation errors:

Error		clang++ exited with code 1:
ld: warning: ignoring file Melanchall_DryWetMidi_Native64.a, file is universal (arm64) but does not contain the x86_64 architecture: Melanchall_DryWetMidi_Native64.a
Undefined symbols for architecture x86_64:
  "_AreInputDevicesEqual", referenced from:
     -u command line option
  "_AreOutputDevicesEqual", referenced from:
     -u command line option
  "_CanCompareDevices", referenced from:
     -u command line option
  "_CloseInputDevice", referenced from:
     -u command line option
  "_CloseOutputDevice", referenced from:
     -u command line option
  "_CloseSession", referenced from:
     -u command line option
  "_CloseVirtualDevice", referenced from:
     -u command line option
  "_ConnectToInputDevice", referenced from:
     -u command line option
  "_DisconnectFromInputDevice", referenced from:
     -u command line option
  "_GetApiType", referenced from:
     -u command line option
  "_GetEventDataFromInputDevice", referenced from:
     -u command line option
  "_GetInputDeviceDriverOwner", referenc	Testify	C:\Program Files\dotnet\packs\Microsoft.iOS.Sdk\16.1.1477\targets\Xamarin.Shared.Sdk.targets

It's related to what you have already mentioned: you did not include a build for x86_64 which, as you rightly said, is practically not needed, as my Mac build host and hence the simulator are both 64-bit ARM.

So this seems to be a build pipeline restriction - but maybe the .NET MAUI team knows a clever workaround, like a compiler/linker switch that can be injected into the project file or the platform-specific build directives?

But this is not the only piece of "bad news": I also tried to find a way to send MIDI messages in a .NET MAUI app, without success. The API that should do it, MidiPort.Send, has been obsolete since iOS 14!

I opened a discussion on the .NET MAUI repo (#12720) and also asked a question on Stack Overflow: https://stackoverflow.com/questions/75151944/net-maui-ios-midiport-send-is-obsolete-since-ios-14-no-other-way-for-send .

+++

This makes me wonder whether the efforts (and precious free time) that especially you have put into getting native playback running on iOS by means & features of DryWetMidi are really worth it at this point in time (i.e. at this development stage of .NET MAUI).

The native APIs that .NET MAUI provides through bindings don't seem to be up to date in all areas - CoreMIDI is the second native library whose binding APIs are out of date; AVFAudio was the first.

It's not about putting blame on the .NET MAUI team, but about seeing clearly the state of the .NET MAUI project - it really seems to still be a work in progress and not finished yet (the term GA, or release version 1.0, has been washed out of its purely technical meaning across our entire industry - like a huge reverb would do - and has taken on other missions, like triggering curiosity and engagement in potential users).

This "GA has no technical meaning one can rely on" seems to be even "more true" (okay, its relative as humans can only have individual perceptions of reality and hence "truth" ;-) for .NET MAUI, the more one gets to "not line of business applications" features and requirements.

I am also quite sure it's only a matter of time until it is production ready, and I don't think it will be a second UWP (which died before it was allowed to walk around) ... but currently we are in another "space of time" ;-).

+++

Secondly, even if .NET MAUI had 100% native platform API coverage, we should review whether it's the right approach to make DryWetMidi responsible for MIDI playback on iOS. On Windows that totally makes sense and works like a charm.

The operating system architecture, in the sense of an "application platform", is very different between a sandbox-enclosed one (iOS) and a free-in-the-wild one (Windows - not UWP); UWP let us experience what restrictions & barriers this can lead to, on Windows too. The same goes for how the MIDI and audio pipelines are established and run underneath & between the apps.

I have not dived deep enough into iOS development yet to claim "overseeing wisdom" ;-), but ...: let Apple's core frameworks do the tasks related to demands of efficiency and near real-time, especially when they offer an explicit solution (classes in Swift or an API set in Objective-C); and if that does not fulfill all needs, use established, specialized frameworks that have gone through all that nebula and pain before and get the job done.

Regarding MIDI playback, Apple offers a sequencer (AVAudioSequencer: https://developer.apple.com/documentation/avfaudio/avaudiosequencer) that works for in-app playback. This means the sequencer cannot send your MIDI to the world outside the app (MIDI output is limited to the border of the app sandbox, or rather the node graph one sets up with an AVAudioEngine), but you can set up instrument nodes inside an AVAudioEngine session that can, e.g., play back SoundFont files. I tested it and it works.

In a nutshell: you can load MIDI files and even add or remove MIDI events on the fly, during playback.

For reaching outside the app sandbox there is the open-source AudioKit framework (https://github.com/AudioKit/AudioKit), which offers a similar sequencer that can send MIDI into the CoreMIDI pipeline. I did a quick test with a Swift Playground, and I could send MIDI from the iPhone over the network to my PC.

+++

What could this mean for a DryWetMidi user: we could use the nativeless package plus some extension methods or a utility class that realizes the bridge to the iOS sequencer APIs, translating DryWetMidi constructs into tuples of values that satisfy the parameters needed for AVMusicEvent and AVMusicTimeStamp instance creation, in order to call e.g. the addEvent function (https://developer.apple.com/documentation/avfaudio/avmusictrack/3929223-addevent) - time representations in the sequencer APIs are beat-based, and AVMusicTimeStamp is just a fractional number of beats.

I have not tried the nativeless package of DryWetMidi yet, but I will in the next days: loading a MIDI file from the assets into DryWetMidi, pouring it into an AVAudioSequencer instance, and having it play back the MIDI tracks with a piano & ambient guitar SoundFont inside an AVAudioEngine node graph.

And for the "world outside the App sandbox" - it might be a solution to build a .NET MAUI binding library for the AudioKit open-source Swift Framework and hence make it accessable in .NET MAUI. This will be quite a lot of work, but maybe it can be restricted to the classes and API that matter. I would not even begin to think about using the Audio API's via C# ;-), but the MIDI parts should work via bindings.

But before I give this "building a binding library for a Swift library" idea a try, .NET MAUI has to mature, and I'd like to see complete .NET MAUI documentation on their website before I try to wrap my head around it (no out-of-date Xamarin iOS docs on this one - the project system has substantially changed).

Those are my thoughts ... and if you nevertheless would like to bring this playback challenge on iOS to an end "against all odds" (greetings to Phil Collins ;-), or at least get it building successfully and theoretically running, I am still on your side and will test it as far as I can ... but as mentioned at the beginning, I have very little hope that we'll be able to do a real-world test by sending MIDI outside the app.

If we delay this until .NET MAUI has matured, or until it really makes sense (for whatever reason), I am totally fine with that as well ... it's up to you (and don't forget to take care of yourself - free time was once created and hard fought for so that we could recover, refuel and relax ... don't waste your precious free time on something that's not worth it! ;-).

have a nice day and cheers
Ayaphi

@melanchall
Owner Author

Hi Ayaphi,

First of all, can you please try a new version of the native library – Melanchall_DryWetMidi_Native64.a? If it doesn't work, well... it looks like I need to take a rest for some time :)

As for the nativeless package - yes, it has extension points. You can manually implement the IInputDevice/IOutputDevice interfaces and use your custom implementations across the DWM API (in Playback, for example). I'm waiting for your response on whether the nativeless API works on iOS or not. I suppose it will, because this API is entirely about internal processing - no resources outside of the library are used, so sandboxing should not affect it. The only point where you can run into trouble is reading a file by path. But you can read a MIDI file from a stream, and then it's up to you how to get the stream for a MIDI file (I suppose iOS has the means for that).
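To illustrate those extension points, here is a minimal sketch of a custom output device. The class name and body are hypothetical, and the members should be checked against the current IOutputDevice definition:

using System;
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Multimedia;

// Hypothetical iOS-side device; Playback can target any IOutputDevice.
public sealed class IosOutputDevice : IOutputDevice
{
    public event EventHandler<MidiEventSentEventArgs> EventSent;

    public void PrepareForEventsSending()
    {
        // Open or warm up the native transport here if needed.
    }

    public void SendEvent(MidiEvent midiEvent)
    {
        // Hand the event over to the platform transport (for example,
        // CoreMIDI bindings); EventSent is left unraised in this sketch.
    }

    public void Dispose()
    {
        // Release the native transport, if any.
    }
}

Reading from a stream is then just MidiFile.Read(stream), with the stream obtained from whatever asset API iOS provides.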

Thank you, Ayaphi! It's becoming a long, long story...
Max

@ayaphi

ayaphi commented Jan 18, 2023

Hi Max,

I tried your new Melanchall_DryWetMidi_Native64.a build, but sadly there are still compiler errors regarding x86_64. The root cause seems to be different this time, though, and related to the linking stage: iOS 16 needed | iOS 11 used.

Error		clang++ exited with code 1:
ld: warning: object file (Melanchall_DryWetMidi_Native64.a(Melanchall_DryWetMidi_Native64_x86_64.o)) was built for newer iOS Simulator version (16.0) than being linked (11.0)

Undefined symbols for architecture x86_64:
  "_GetInputDeviceSysExBufferData", referenced from:
     -u command line option
  "_GetOutputDeviceChannelsMask", referenced from:
     -u command line option
  "_GetOutputDeviceNotesNumber", referenced from:
     -u command line option
  "_GetOutputDeviceOptions", referenced from:
     -u command line option
  "_GetOutputDeviceSysExBufferData", referenced from:
     -u command line option
  "_GetOutputDeviceTechnology", referenced from:
     -u command line option
  "_GetOutputDeviceVoicesNumber", referenced from:
     -u command line option
  "_OpenInputDevice_Win", referenced from:
     -u command line option
  "_OpenOutputDevice_Win", referenced from:
     -u command line option
  "_OpenSession_Win", referenced from:
     -u command line option
  "_RenewInputDeviceSysExBuffer", referenced f	Testify	C:\Program Files\dotnet\packs\Microsoft.iOS.Sdk\16.1.1477\targets\Xamarin.Shared.Sdk.targets	1225

Looks like progress in the right direction - although I can imagine that for you it's more pain than pleasure, as you have to fiddle around with the compiler & linker settings for the native library build, as if in a nebula ... I mean, fog is already bad (though you could at least feel your way around - if you had an iOS device), but messing around in an (astronomic) nebula is hard to take.

So please take a rest if you feel you should ... we can continue this some time later in the future, or maybe even drop the challenge altogether.

For myself, I have already emigrated to planet Swift & SwiftUI ;-), and for other DryWetMidi users I think the nativeless package is a good option - although the .NET MAUI team also has to do "some" homework before apps can send MIDI outside the app to a DAW-like app such as AUM (a pure AUv3 host with no built-in MIDI sequencer - but a really great product & concept) or a real DAW like Cubasis on iOS, in order to let the MIDI ring / trigger / play the really awesome synthesizers & sounds that are available on iOS as AUv3 plugins.

I will give the nativeless package in a .NET MAUI app + a self-implemented bridge to AVAudioSequencer, incl. in-app playback with AVAudioEngine, a try - next week, I think ... and I will post my experiences and feedback here.

And by the way: there are other interesting areas DryWetMidi can expand into that make more sense and have greater value for you & your users - and maybe more fun & joy for you to discover, play around with and bring to digital life ... so no reason to be disappointed or frustrated about this one.

You have done a great job with your library DryWetMidi already ... so enjoy & feel good about what you have made possible - it's awesome and really easy & handy to use (incl. your documentation) from a library user's perspective.

Big thanks for your efforts and work ... and also sorry for getting you into this nebula - maybe not my greatest idea, to ask you for this feature ;-).

cheers
Ayaphi

@melanchall
Owner Author

Hi Ayaphi,

Thanks for your tests! Well, I think I really need to take a rest for some time. And thanks for your understanding. Right now it indeed looks like MAUI is pretty raw, with all those inconveniences of Xamarin. You've mentioned UWP in your previous message. Well, I've just closed the issue on UWP support (#108), since that platform looks dead or something like that.

maybe not my greatest idea, to ask you for this feature

Oh no no, thank you for your issue! Really. How would we know how things work (and don't work) in .NET MAUI without it? :) I had touched MAUI only once before you created the discussion on iOS, so it's good to learn something new about this tech.

I will give the nativeless package in a .NET MAUI app + a self-implemented bridge to AVAudioSequencer, incl. in-app playback with AVAudioEngine, a try - next week, I think ... and I will post my experiences and feedback here.

Waiting for your response, Ayaphi!

Have a nice coding,
Max

@melanchall added this to the Future milestone Jan 19, 2023
@melanchall
Owner Author

Hi @ayaphi,

How is it going with the nativeless package?

Max

@codebeaulieu

I'd like to use this library in MAUI as well. I'll be watching this thread and am willing to lend a hand if need be.

@melanchall
Owner Author

Thanks @codebeaulieu! You need iOS support within MAUI too, right?

@ayaphi

ayaphi commented Feb 10, 2023

Hi @melanchall

sorry for the delay, but I am totally busy and currently have no time for the nativeless package evaluation - but I will do it some day in the upcoming weeks. I cannot give you a concrete date, because it will take more time than just running one of the library tests, as I have to set up a playback pipeline with the iOS audio APIs supplied via bindings by .NET MAUI.

For this to make sense for me and my time (I do not want to rush over it - quick & dirty, try & forget doesn't make that much sense ;-), I have to make some progress in my Swift project first, in order to really know what should work and how. As I mentioned already, some audio APIs are out of date in .NET MAUI (not updated since earlier Xamarin iOS times), and I want to poke around for some time with the "real APIs" on iOS, with Swift and a SwiftUI app, beforehand.

... it will take some time, but I will do it, promised.

+++

Another thing came up quite some time ago when using DryWetMidi on Windows in a Windows SDK app, and it popped up again during my Swift research & evaluations for iOS (and macOS): synchronization with Ableton Link.

In the Windows SDK app (a MIDI-sequencer-like thing) I use the MIDI clock feature to let my app be the master clock and the DAW (digital audio workstation, like Ableton, Bitwig, Cubase, Logic Pro etc.) be the slave.

Via the output device I send a StartEvent (the DAW starts playback, or recording if armed beforehand) before I call Start on the Playback instance; then with every tick of MidiClock I send a TimingClockEvent, and on the Finished event of Playback I send a StopEvent. This works sufficiently well if the DAW runs on the same Windows machine (I did no latency measurements, so I can't say it works "great", and there always has to be a third-party virtual MIDI port in the pipeline as well ;-).
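For reference, a minimal sketch of that wiring (outputDevice and playback are assumed to already exist, MIDI clock runs at the standard 24 pulses per quarter note, and a plain System.Threading.Timer stands in here for the real tick source):

using System;
using System.Threading;
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Multimedia;

const double beatsPerMinute = 120;
// 24 MIDI clock pulses per quarter note => pulse interval in milliseconds.
var pulseIntervalMs = 60000.0 / (beatsPerMinute * 24);

outputDevice.SendEvent(new StartEvent()); // the DAW starts playback (or recording, if armed)
playback.Start();

var clockTimer = new Timer(
    _ => outputDevice.SendEvent(new TimingClockEvent()),
    null,
    TimeSpan.Zero,
    TimeSpan.FromMilliseconds(pulseIntervalMs));

playback.Finished += (_, _) =>
{
    clockTimer.Dispose();                    // stop ticking...
    outputDevice.SendEvent(new StopEvent()); // ...and stop the DAW
};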

But this does not work for all DAWs, as not all of them support running in MIDI clock slave mode.

And as soon as some musical "thingy" from an iOS device comes into play, or rather into the setup (and there are plenty of really awesome and affordable "sound pleasures" available on iOS - at least as AUv3 plugins that can be run in a host like AUM, for example), Ableton Link is THE synchronization technique that most apps support on iOS and that users almost expect.

A description of the concept, and links to the source code (libraries) open-sourced by Ableton for Ableton Link, can be found here: http://ableton.github.io/link/.

It's not really a feature request (for now ;-), as I am currently more "Apple & iOS + macOS" focused, but I would be interested in your thoughts on whether an Ableton Link integration into the Playback API would be possible.

Receiving multicast UDP packets (which Ableton Link is based upon) and making sense of them should not be that hard. Somewhere I also found example calculations for timing the events according to the given sync information (how complete they are I cannot tell, because I only skimmed them - and I did not note down the website / example, but any search engine should bring it back ;-).
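A sketch of that receiving side - the multicast group and port below are the ones I believe Link's open-source code uses for peer discovery (an assumption worth verifying against the repo), and decoding the packets is the actual integration work:

using System.Net;
using System.Net.Sockets;

var linkGroup = IPAddress.Parse("224.76.78.75"); // assumed Link discovery group
const int linkPort = 20808;                      // assumed Link discovery port

using var client = new UdpClient();
client.Client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReuseAddress, true);
client.Client.Bind(new IPEndPoint(IPAddress.Any, linkPort));
client.JoinMulticastGroup(linkGroup);

while (true)
{
    var result = await client.ReceiveAsync();
    // result.Buffer holds a raw Link packet; parsing the session state
    // (tempo, beat origin, peers) out of it is the real work.
}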

And besides the ready-to-use iOS SDK, "the Link repo contains C++ source code implementing the Link protocol and a C++ API for integrating applications" (taken from the linked website) ... so this could be compiled into a native Windows C++ library and used from DWM via P/Invoke, I think.

But what about applying the synchronization inside your Playback API? How big is this challenge? I can imagine there are a lot of edge cases, especially when the app / DWM is told "you are ahead of time" and playback has to roll back in time (notes already played are "out there" - nothing to rewind on that - but what are the implications ... I guess the synchronization offset will be quite small by Link protocol design, so even in bad cases it would end up feeling more like a swing feature).

All the adjustments that have to be made may not be easy - or maybe an intelligent timeline abstraction that works with an offset and a handful of rules for certain cases could do the job, easier than first thought?

What do you think about it?

cheers
Ayaphi

@melanchall
Owner Author

melanchall commented Feb 10, 2023

Hi Ayaphi,

Thanks for the response! As for synchronization inside the playback API - do you mean controlling playback externally? If yes, that's already possible via manual ticking of a playback's internal clock. So, for example, you can get a Playback like this:

var playback = midiFile.GetPlayback(new PlaybackSettings
{
    ClockSettings = new MidiClockSettings
    {
        // returning null turns the built-in tick generator off,
        // so the clock must be ticked manually
        CreateTickGeneratorCallback = () => null
    }
});

and then on each sync pulse/signal received from an external app just call:

playback.TickClock();
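For example, if the external signal is MIDI clock coming from an input device, the wiring could be sketched like this (inputDevice and playback are assumed to exist):

using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Multimedia;

// Tick the playback's clock on every incoming MIDI clock pulse.
inputDevice.EventReceived += (_, e) =>
{
    if (e.Event is TimingClockEvent)
        playback.TickClock();
};
inputDevice.StartEventsListening();
playback.Start();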

Is that what you're talking about? Or do you mean something different?

As for Ableton Link - interesting technology, and a DWM integration looks useful. But it should definitely be a separate project, I suppose. I mean, there should not be any third-party integrations within the library, I think. Just a matter of architecture.

And about the latency of Playback. By default the library uses a 1 ms interval for ticking the internal clock. But neither Windows nor macOS is a realtime operating system, so delays are possible because many other processes and threads are running in the system at the same time. The tick generator on macOS is more precise than the one on Windows, though, since macOS allows specifying "realtime" priority for the thread where I run the tick generator.

Thanks,
Max

@ayaphi

ayaphi commented Feb 13, 2023

Hi Max,

regarding playback synchronization: I did not realize that DWM also supports a "MIDI clock slave mode". If I understand you correctly, calling playback.TickClock already synchronizes the MIDI data inside the playback instance - adjusting the playback times?

That would be great on its own, and also a great starting point for an Ableton Link integration ;-).

To think it a little further along my current use cases - are there "time drift limits", in the sense of latency that can still be handled by your current MIDI-clock-based playback synchronization, and is there a latency threshold (e.g. 2 seconds - just to name a number) at which synchronization will fail?

The reason I ask: I am currently evaluating whether wireless MIDI is a useful, or rather practical, option. Using Apple's MIDI network sessions works sufficiently well (they are also supported on Windows by rtpMIDI, see https://www.tobias-erichsen.de/software/rtpmidi.html) but introduces fluctuating latencies - currently the latency varies from 3 ms up to 32 ms, and it will of course increase with more traffic on the local network.

By the way, Bluetooth MIDI seems to have much lower latencies than MIDI over Wi-Fi, but I am looking for "real network" options and not just a 1:1 virtual MIDI cable solution.

And to answer your question ("Is that what you're talking about? Or do you mean something different?"): conceptually I want my app to be the MIDI master, because of the UI & workflow - the DAW will stay in the background most of the time (typically 80% to 20%), and I want to start the DAW from my app while syncing the playback, which already works as I described in my last post. MIDI clock master mode is fully covered by DWM.

But there are cases (the 20% that make the app complete ;-) when the roles change - the user is in the DAW, working their way through sound design heaven ;-), and wants to start playback in the DAW and also trigger my app to feed MIDI into the DAW.

And this is the case where Ableton Link can solve all the requirements, including getting iOS apps into the musical setup, as it forms a session of equal peers: after peers have joined the session, any peer can start and stop the session, and every peer has to keep up with the synchronization.

+++

Regarding Ableton Link support - for sure, it makes sense to see this as a separate project, and it could make sense to open a separate discussion for it (if you like). But do you see a possible integration into DWM, so that e.g. the Playback API gets an extension in which Ableton Link synchronization is used (instead of MIDI clock synchronization)?

A dependency on the Ableton Link SDK is surely not a nice design decision for DWM - staying decoupled from the Ableton Link SDK while still being integratable into DWM is a nice architecture kind of project ... something like a sibling / child NuGet package that implements its features in a subordinate namespace?

Just another hint from an iPad & PC music app user's perspective (I have not looked deeply into the technical documentation of Ableton Link): since any peer can change the tempo, e.g. from 96 to 132 BPM, and peers can enter and leave the session, I do not know in which cases the "master role" changes, even if the concept of the master is comparable to what we know from MIDI clock - so it might be tricky (your MIDI clock synchronization implementation might need substantial rework / redesign), and one may have to implement a fluid master / slave behavior that can change on the fly.

+++

Another technical question: did you ever look into the creation of virtual MIDI ports on Windows? On macOS you can easily create virtual MIDI ports via Apple's Audio MIDI Setup app.

I know there are several clients for Windows just for the purpose of creating virtual MIDI ports (e.g. loopMIDI: https://www.tobias-erichsen.de/software/loopmidi.html), and there is also a commercial SDK with C# bindings (https://www.tobias-erichsen.de/software/virtualmidi/virtualmidi-sdk.html).

I guess it's more of a hassle and needs a deep dive into Windows driver technology - or do you know a Windows API that can create virtual MIDI ports and can be used via bindings from a C# app?

cheers
Ayaphi

@melanchall
Owner Author

Hi Ayaphi,

First of all, about how Playback works. Playback uses a timer under the hood; in DryWetMIDI this timer is called a tick generator. On every tick of the timer, the playback looks at which objects should be played by the current time, plays them, and advances the position within the objects list, waiting for the next tick.

We can turn off this built-in timer, as I've described above, and use manual ticking. What does that mean? Every MIDI event in the list within Playback has an exact time when it should be played. So, for example, event A should be played 2 seconds after playback starts, event B 3 seconds after playback starts, and so on. Once we start the Playback, the internal time counter starts (it's not a clock!). This counter is used on every tick of the tick generator, or on every manual call of the TickClock method, to decide whether an event should be played at the current time or not.

Now let's imagine we have an external app or device that should manage the playback's ticking. We receive a signal from it and call the TickClock method. At this moment Playback looks at which unplayed events should be played by the current playback time (returned by the time counter) and sends them all to the output device. Now imagine one more thing: the external source sends a signal 2 seconds later than it should. Our app receives the signal and calls TickClock, and the playback collects all the events that should have been played within those 2 seconds and immediately sends them to the device.

So there are no time adjustments in Playback. It just sends events when they should be sent. With big latency from the external source we may get a kind of "jump" in playback.

So that's what I mean by manual ticking and how it works. I think you're talking about something different.


But do you see a possible integration into DWM, so that e.g. the Playback API gets an extension in which Ableton Link synchronization is used (instead of MIDI clock synchronization)?

Yes, extension points in an API are a good practice. Regarding Ableton Link - it's interesting, but obviously I don't have time right now to implement such an integration, sorry.


Another technical question: did you ever look into the creation of virtual MIDI ports on Windows?

Unfortunately, the options you've mentioned are the only ones. We would need to implement a Windows driver to provide an API for creating virtual MIDI ports. Yes, macOS is more flexible, and DryWetMIDI allows creating virtual MIDI ports on macOS. But Windows is out of the game with its MIDI API.

Thanks,
Max

@ayaphi

ayaphi commented Feb 21, 2023

Hi Max,

finally I got to the test of DWM nativeless with .NET MAUI for iOS. It works fine on its own (which is great), but unfortunately the AVFAudio API bindings present in .NET MAUI are incomplete, just as the CoreMIDI API bindings are out of date (as we discovered earlier).

This is really a pity (aka "sad but true", as a famous band shouted ;-), because Apple's APIs are sufficient for MIDI and audio playback. I got it working in a Swift + SwiftUI app on iOS using only Apple APIs - recently using a great CoreMIDI wrapper library (MIDIKit: https://github.com/orchetect/MIDIKit) that makes some things easier, but I also got it working on bare iOS API metal ;-).

Creating virtual MIDI endpoints via CoreMIDI, plus AVAudioEngine + AVAudioSequencer, is all that's needed - this way it's also possible to route MIDI outside the app into the operating system's MIDI pipeline to other apps, or to use a MIDI network session and hence route MIDI to desktop machines. I did not see this routing option at first sight, but it's possible by means of AVMusicTrack.destinationMIDIEndpoint.

So theoretically DWM nativeless would be sufficient, as the playback engine is provided by Apple APIs, as is an audio render engine with SoundFont files ... but unfortunately the .NET MAUI iOS bindings are more than "out of time".

In particular: I cannot create and append a track to an AVAudioSequencer (createAndAppendTrack is missing: https://developer.apple.com/documentation/avfaudio/avaudiosequencer/3929193-createandappendtrack); I see no way to get hold of a MIDIEndpointRef, which is essentially a pointer to a MIDI destination (a MIDI input port for or from other apps), to set the aforementioned AVMusicTrack.destinationMIDIEndpoint; and last but not least, with the current state of the CoreMIDI API bindings I cannot create virtual MIDI endpoints.

So my idea of using DWM's Pattern feature and its music theory capabilities, plus some new utility extension methods of my own, to feed the pattern into the AVMusicTracks of an AVAudioSequencer cannot be implemented at the moment ... which reveals the danger / pitfalls of a "3rd party" API wrapper, which to some extent is what .NET MAUI is - it does a lot more than that, and to be honest in quite an ingenious way, but in some core aspects it is a "3rd party" API wrapper, and in that regard it's dangerous for an app developer, as one has no control over the schedule on which APIs are updated and maintained.

So here's what I could test with DWM nativeless: I read a MIDI file from the app bundle, traversed the track chunks and notes, and printed them to the debug console - all correct.

So you've done a great job already ... and to be honest my initial request was senseless ;-), as everything I asked for is already present in Apple's iOS APIs - so again: sorry for that.

But it makes another thing clear, something I felt before I started learning Swift and SwiftUI: if a .NET MAUI developer wants to, or even has to, drop down to native platform APIs like those of iOS, one has to be experienced with those native platform APIs and with developing for the platform - practical experience is absolutely necessary.

Or in other words: the beauty and practical relevance of .NET MAUI lie in the area above the native platform APIs - as long as the app can be developed with the natural, OS-agnostic .NET MAUI APIs, it makes sense ... let it mature and it will be useful in those areas.

+++

Meanwhile another thing came to my mind regarding DWM on Windows. Is it possible to have an efficient "multi track & multi MIDI port" playback with DWM?

For example: multiple Pattern instances (or TrackChunks) are used to create just as many Playback instances, whereby each Playback instance uses a different OutputDevice (MIDI port), and all of them are driven by only one HighPrecisionTickGenerator.

I think this should be possible, as you have described here - https://melanchall.github.io/drywetmidi/articles/playback/Tick-generator.html - using the PlaybackSettings parameter:

var playback = midiFile.GetPlayback(new PlaybackSettings
{
    ClockSettings = new MidiClockSettings
    {
        CreateTickGeneratorCallback = () => new HighPrecisionTickGenerator()
    }
});

but using a shared HighPrecisionTickGenerator instance for all Playbacks (and using the appropriate existing extension methods for a Pattern or TrackChunk, which also take an IOutputDevice parameter - in PlaybackUtilities).

Or does a shared HighPrecisionTickGenerator have some side effect on the Playback instances, apart from each playback instance being called one after another (a serial chain implies some latency, but I think a marginal amount)?

Is there a DWM structure for grouping Playback instances and controlling them all together, taking care of their time-related jobs and especially their events, like Finished?

Thanks & cheers
Ayaphi

@melanchall
Owner Author

Hi Ayaphi,

Yes, that's the approach I would recommend - use a shared instance of the HighPrecisionTickGenerator. Latency is possible, but I suppose it will be pretty small unless you're going to have 100 Playback objects and use advanced playback features (like NoteCallback).
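A sketch of that shared setup, using the TrackChunk extension methods from PlaybackUtilities you mentioned (track chunks, tempo map and output devices are assumed to exist; check the tick generator article for the exact overloads):

using Melanchall.DryWetMidi.Multimedia;

// One tick generator drives all playbacks.
var sharedTickGenerator = new HighPrecisionTickGenerator();

var playbackSettings = new PlaybackSettings
{
    ClockSettings = new MidiClockSettings
    {
        // Return the shared instance instead of creating one per playback.
        CreateTickGeneratorCallback = () => sharedTickGenerator
    }
};

var playback1 = trackChunk1.GetPlayback(tempoMap, outputDevice1, playbackSettings);
var playback2 = trackChunk2.GetPlayback(tempoMap, outputDevice2, playbackSettings);

playback1.Start();
playback2.Start();

// Since the playbacks don't own the generator, it most likely has to be
// disposed manually once all playbacks are finished.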

As for a kind of container for multiple Playbacks to manage them all at once - no, there is no such API in the library. I have plans for a Playlist API, which is similar to what you want, except I hadn't thought about playing all child playbacks simultaneously.

Thanks,
Max

@akshayabd

Hi @ayaphi - did you ever get DryWetMidi working on iOS? Using DllImport with __Internal?

If so, could you create a GitHub sample project to show how this can be done? I'm stuck trying to make it work as well.

@ayaphi

ayaphi commented Aug 3, 2023

Hi @akshayabd,

since my last post in February I have done no further serious investigation - neither .NET MAUI on iOS nor DryWetMidi on iOS.

I switched to Swift, as I wanted a robust and reliable foundation to build upon. For my needs, .NET MAUI unfortunately did not deliver what I required, and I got the feeling that this field of application was "too low level" for .NET MAUI at that moment, and that .NET MAUI needs time to grow before it can be used for such applications ... speaking from the point of view and know-how I had at the beginning of this year.

So, DryWetMidi was not the "real" problem, but .NET MAUI was.

Cheers
Ayaphi

@akshayabd

@ayaphi Thanks for the info. I guess I'll look into Swift too.

@sonicviz

sonicviz commented Aug 17, 2023

So is it fair to say that if you want to use this library in Unity for full cross-platform support, you can't, because it doesn't work on iOS?
I'm doing a rewrite/update of an old Unity app (https://harpninja.com/) using a custom MIDI lib I wrote ages ago and was thinking of switching to this, but if it's not compatible with iOS/Mac I guess that kills the idea! A pity - it looks like a neat lib otherwise.

Edit: I just checked the description again at https://assetstore.unity.com/packages/tools/audio/drywetmidi-222171#description, and I see it doesn't mention Android either.

So, just to confirm, this is neither iOS nor Android compatible?

@melanchall
Owner Author

@sonicviz Please take a look at the Supported OS article. Right now the Devices API of the library can be used on Windows and macOS only.

But if you're ready to help me a bit, we can try to investigate running on iOS within a Unity project. After some googling I found this article - Building plug-ins for iOS. So can you please try these instructions:

  1. Add the contents of the library (https://github.com/melanchall/drywetmidi/tree/develop/DryWetMidi) to your project (into the Assets folder, I suppose);
  2. Change the library name in all DllImports to __Internal (see the sketch below);
  3. Add the native code source files (https://github.com/melanchall/drywetmidi/blob/develop/Resources/Native/NativeApi-Constants.h and https://github.com/melanchall/drywetmidi/blob/develop/Resources/Native/NativeApi-macOS.c) to the Unity project (I'm not sure which folder to put them in);
  4. Deploy the app to a device and run the game.

It looks like Unity will compile the sources for iOS by itself. As for Android - that requires implementing an entire native layer for it, so I can't give you any dates, unfortunately. macOS and iOS have similar APIs, so there is hope we can handle iOS support with little effort.
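For step 2, the DllImport change could look like this (a sketch; GetApiType is one of the exports visible in the linker logs above, and its signature here is illustrative):

using System.Runtime.InteropServices;

internal static class NativeApi
{
#if UNITY_IOS && !UNITY_EDITOR
    // On iOS, native plug-ins are statically linked into the app binary,
    // so DllImport must reference __Internal instead of a library name.
    private const string LibraryName = "__Internal";
#else
    private const string LibraryName = "Melanchall_DryWetMidi_Native64";
#endif

    [DllImport(LibraryName)]
    internal static extern int GetApiType();
}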

Thanks,
Max

@sonicviz

sonicviz commented Aug 17, 2023

Hey Max,

Unfortunately I need full cross-platform support out of the box, as it's an old app that's always been cross-platform, and Android and iOS are both key. So I'll have to put this one on the shelf for now, but I'll keep it in mind if I do a desktop-only version.

I haven't actually started on the upgrade yet, but when I do I'll reacquaint myself with the code to see what I did last time to get full cross-platform support working. IIRC I didn't do anything too out of the ordinary - I didn't have to write any native layers, and the timing seemed OK for MIDI playback and control.

@melanchall
Owner Author

@akshayabd Can you please try the instructions above?

@akshayabd

Hey @melanchall, I have been trying, but it's difficult.

However, I am able to work with Windows and macOS, so that's great. I will keep trying - thanks for all the instructions - I really appreciate all your help!

@melanchall
Owner Author

Well, I did quick tests some time ago, and regarding Unity we need to make the library IL2CPP-compatible. An issue is open on the subject - #282. But right now I have no idea how to rework my code.

@antiero

antiero commented Nov 7, 2024

The #282 ticket seems to have magically disappeared, @melanchall - are there any updates on this for iOS? I'd be very happy to assist with building and testing!

@melanchall
Owner Author

@antiero It's strange... I've asked GitHub Support to provide details on the missing issue.

Well, as for iOS support - thanks for your response! Can you please say what kind of project you could help with? Unity?

@antiero

antiero commented Nov 9, 2024

Yes, a cross platform Unity project, you guessed it! 🙂

@melanchall
Owner Author

Oh, it's a pretty hard task :-) I'm afraid I can't give any dates for when I'll get back to it... But thank you a lot for your offer of assistance! I'll definitely contact you when I start on the task.

As for issue #282, here's the support answer:

The issues are hidden due to the account status of the user who created the issue
