iOS support #237
Hi @ayaphi,

So I've modified your test solution, including all the sources of my library, and changed all … Here's the solution – TestifyWithDwm.zip. I've written simple code there:

```csharp
var inputDevices = InputDevice.GetAll();
var outputDevices = OutputDevice.GetAll();
var stringBuilder = new StringBuilder();
stringBuilder.AppendLine("Input devices:");
stringBuilder.AppendLine(string.Join(Environment.NewLine, inputDevices.Select(d => d.Name)));
stringBuilder.AppendLine("Output devices:");
stringBuilder.AppendLine(string.Join(Environment.NewLine, outputDevices.Select(d => d.Name)));
// put stringBuilder.ToString() somewhere to see the result
```

As for the last line, I really don't know where it would be handy to output … Also I should mention that right now I don't know how to build a real native API for … Can you please check the new solution on iOS? Many many thanks!
Hi Max,

I tried your solution, but sadly I got compilation errors. It's related to what you have already mentioned: you did not include a build for x86 and x64, which, as you rightly say, is practically not needed, as my Mac build host and hence the simulator are both ARM 64-bit. So this seems to be a build pipeline restriction – but maybe the .NET MAUI team knows a clever workaround, like a compiler/linker switch that can be injected in the project file or the platform-specific build directives?

But this is not the only bad news, because I tried to find a way to send MIDI messages in a .NET MAUI app, without success: the API that should do it (MIDIPort.Send) is obsolete since iOS 14. I opened a discussion on the .NET MAUI repo #12720 and also a question on Stack Overflow: https://stackoverflow.com/questions/75151944/net-maui-ios-midiport-send-is-obsolete-since-ios-14-no-other-way-for-send .

This makes me wonder whether the efforts (and precious free time) that especially you put into getting native playback to run on iOS by means of DryWetMidi's features are really worth it at this point in time, i.e. at this development stage of .NET MAUI. The native APIs that .NET MAUI provides through bindings don't seem to be up to date in all areas – CoreMIDI is the second native library, after AVFAudio, whose binding APIs are not up to date. It's not about putting blame on the .NET MAUI team, but about seeing clearly the state of the .NET MAUI project – it really seems to be still a work in progress and not finished yet (the significance of the term "GA" or "release version 1.0" has been washed out of its pure technical meaning in our entire industry – like a huge reverb would do – and has taken on other missions, like triggering curiosity and engagement in possible users). This "GA has no technical meaning one can rely on" seems to be even more true (okay, it's relative, as humans can only have individual perceptions of reality and hence "truth" ;-) for .NET MAUI, the more one gets to features and requirements beyond line-of-business applications. I am also quite sure that it is only a matter of time until it becomes production ready, and I don't think it will be a second UWP (which died before it was allowed to walk around) ... but currently we are in another space of time ;-).

Secondly, even if .NET MAUI had 100% native platform API coverage, we should review whether it's the right approach to let DryWetMidi be responsible for MIDI playback on iOS. On Windows that totally makes sense and works like a charm. But the operating system architecture, in the sense of an "application platform", is very different – UWP let us experience what restrictions and barriers this can lead to, on Windows too – between a sandbox-enclosed platform (iOS) and one that is "free in the wild" (Windows, not UWP), and also in how pipelines of MIDI and audio are established and run underneath and between the apps. I have not dived deep enough into iOS development yet to claim overseeing wisdom ;-), but: let Apple's core frameworks do tasks that relate to demands of efficiency and near-realtime behavior, especially when they offer an explicit solution (classes in Swift or an API set in Objective-C), and if this does not fulfill all needs, use established specialized frameworks that have gone through all that nebula and pain before and get the job done.

Regarding MIDI playback, Apple offers a sequencer (AVAudioSequencer: https://developer.apple.com/documentation/avfaudio/avaudiosequencer) that works for in-app playback, which means this sequencer cannot send your MIDI to the world outside the app (MIDI output is limited to the border of the app sandbox, respectively the node graph one sets up with an AVAudioEngine), but you can set up instrument nodes inside an AVAudioEngine session that can, e.g., play back SoundFont files. I tested it and it works. In a nutshell: you can load MIDI files and even add or remove MIDI events on the fly and during playback. For reaching outside the app sandbox there is the AudioKit open-source framework (https://github.com/AudioKit/AudioKit), which offers a similar sequencer that can send MIDI into the CoreMIDI pipeline. I did a quick test with a Swift playground and I could send MIDI from the iPhone over the network to my PC.

What could this mean for a DryWetMidi user: we could use the nativeless package and maybe some extension methods or a utility class that realizes the bridge to the iOS sequencer APIs – translate DryWetMidi constructs into tuples of values that supply the parameters needed for AVMusicEvent and AVMusicTimeStamp instance creation, in order to call the addEvent function, e.g. (https://developer.apple.com/documentation/avfaudio/avmusictrack/3929223-addevent) – time representations in the sequencer APIs are beat-based and AVMusicTimeStamp is just a fractional number of beats (see the sketch after this post). I have not tried the nativeless package of DryWetMidi yet, but I will in the next days: loading a MIDI file from the assets into DryWetMidi, then pouring it into an AVAudioSequencer instance and having it play back the MIDI tracks with a piano & ambient guitar SoundFont inside an AVAudioEngine node graph. And for the world outside the app sandbox – it might be a solution to build a .NET MAUI binding library for the AudioKit open-source Swift framework and hence make it accessible in .NET MAUI. This would be quite a lot of work, but maybe it can be restricted to the classes and APIs that matter. I would not even begin to think about using the audio APIs via C# ;-), but the MIDI parts should work via bindings. But before I give this "building a binding library for a Swift library" a try, .NET MAUI has to mature, and I'd like to see full .NET MAUI-related documentation on their website before I try to wrap my head around this (no out-of-date Xamarin iOS docs on this one – as the project system has substantially changed).

That's my thoughts ... and if you still would like to bring this playback challenge on iOS, nevertheless and "against all odds" (greetings to Phil Collins ;-), to an end – at least getting it to build successfully and theoretically running – I am still on your side and will test it as far as I can ... but as mentioned in the beginning, I have very little hope that we are able to make a real-world test by sending MIDI outside the app. If we delay this until .NET MAUI has matured, or until it really makes sense (for whatever reason), I am totally fine with that as well ... it's up to you (and don't forget to take care of yourself – free time was once created and hard fought for, for recovering, refueling and relaxing ... don't waste your precious free time on something that's not worth it! ;-).

Have a nice day and cheers
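To make that tick-to-beat conversion concrete, here is a minimal C# sketch of such a bridge helper on the DryWetMidi side; the `BeatNote` record and the class name are hypothetical, and it assumes the file uses the common ticks-per-quarter-note time division.

```csharp
using System.Linq;
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Interaction;

// Hypothetical helper: turns DryWetMidi notes into the beat-based values an
// AVMusicTrack.addEvent call expects (AVMusicTimeStamp is a fractional
// number of beats).
public record BeatNote(double StartBeats, double DurationBeats, byte NoteNumber, byte Velocity);

public static class AvSequencerBridge
{
    public static BeatNote[] GetBeatNotes(MidiFile midiFile)
    {
        // The file's ticks-per-quarter-note value defines the tick-to-beat conversion.
        var timeDivision = (TicksPerQuarterNoteTimeDivision)midiFile.TimeDivision;
        double ticksPerBeat = timeDivision.TicksPerQuarterNote;

        return midiFile
            .GetNotes()
            .Select(n => new BeatNote(
                n.Time / ticksPerBeat,    // absolute start time, in beats
                n.Length / ticksPerBeat,  // duration, in beats
                (byte)n.NoteNumber,
                (byte)n.Velocity))
            .ToArray();
    }
}
```

On the Swift side, each `BeatNote` would then map onto an event added to an `AVMusicTrack` at the corresponding `AVMusicTimeStamp`.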
Hi Ayaphi,

First of all, can you please try a new version of the native library – Melanchall_DryWetMidi_Native64.a? If it doesn't work, well... it looks like I need to take a rest for some time :)

As for the nativeless package, yes, it has extension points. You can manually implement …

Thank you, Ayaphi! It's becoming a long, long story...
Hi Max,

I tried your new native library, and it looks like progress in the right direction – although I can imagine that for you it's more pain than pleasure, as you have to fiddle around in the compiler & linker settings for the native library build, as if in a nebula ... I mean, fog is already bad (though there you can at least feel and touch around – if you had an iOS device), but messing around in an (astronomic) nebula is hard to take. So please take a rest if you feel you should ... we can continue on this some time later, or maybe even drop this challenge altogether. I myself have already emigrated to planet Swift & SwiftUI ;-), and for other DryWetMidi users I think the nativeless package is a good option, whereby the .NET MAUI team also has to do "some" homework before one can send MIDI outside the app to a DAW-like app such as AUM (a pure AUv3 host with no built-in MIDI sequencer – but a really great product & concept) or a real DAW like Cubasis on iOS, in order to let the MIDI ring / trigger / play the really awesome synthesizers & sounds that are available on iOS as AUv3 plugins.

I will give that nativeless package in a .NET MAUI app a try – plus a self-implemented bridge to AVAudioSequencer, including "in app" playback with AVAudioEngine – next week, I think ... and I will post my experiences and feedback here.

And by the way: there are other interesting areas DryWetMidi can expand into that make more sense and have greater value for you & your users, and maybe more fun & joy for you to discover, play around with and bring into digital life ... so there's no reason to be disappointed or frustrated about this one. You have done a great job with your library DryWetMidi already ... so enjoy & feel good about what you have made possible – it's awesome and really easy & handy to use (incl. your documentation) from a library user's perspective. Big thanks for your efforts and work ... and also sorry for getting you into this nebula – maybe not my greatest idea to ask you for this feature ;-).

Cheers
Hi Ayaphi,

Thanks for your tests! Well, I think I really need to take a rest for some time. And thanks for your understanding. Right now it indeed looks like MAUI is pretty raw, with all those inconveniences of Xamarin. You've mentioned UWP in your previous message. Well, I've just closed the issue on UWP support (#108), since it looks like it's dead or something like that.

Oh no no, thank you for your issue! Really. How would we know how things work (and don't work) in .NET MAUI without it? :) I had touched MAUI only once before you created the discussion on iOS, so it's good to learn something new about this tech.

Waiting for your response, Ayaphi! Have a nice coding,
Hi @ayaphi,

How is it going with the nativeless package?

Max
I'd like to use this library in MAUI as well. I'll be watching this thread, and I'm willing to lend a hand if need be.
Thanks @codebeaulieu! You need iOS support within MAUI too, right?
Hi @melanchall, sorry for the delay, but I am totally busy and currently have no time for the nativeless package evaluation – but I will do it some day in the upcoming weeks. I cannot give you a concrete date, because it will take more time than just doing one of the library tests, as I have to set up a playback pipeline with the iOS audio APIs supplied via bindings by .NET MAUI. For this to make sense for me and my time (I do not want to rush over it – quick and dirty, try & forget does not make that much sense ;-), I have to make some progress in my Swift project in order to really know what should work and how. As I mentioned already, some audio APIs are out of date in .NET MAUI (not updated since earlier Xamarin iOS times) and I want to poke around for some time with the "real APIs" on iOS in Swift and a SwiftUI app beforehand. ... it will take some time, but I will do it, promised.

Another thing that came up quite some time ago, using DryWetMidi on Windows with a Windows SDK app, and popped up again during my Swift research & evaluations for iOS (and macOS): synchronization with Ableton Link. In the Windows SDK app (a MIDI sequencer of sorts) I use the MIDI Clock feature to let my app be the master clock and the DAW (Digital Audio Workstation, like Ableton, Bitwig, Cubase, Logic Pro etc.) be the slave. Via the output device I can send … (see the sketch after this post). But this does not work for all DAWs, as not all of them support working in MIDI Clock slave mode. And as soon as some musical "thingy" from an iOS device comes into play, respectively into the setup (and there are plenty of really awesome and affordable "sound pleasures" available on iOS – at least as AUv3 plugins that can be run in a host like AUM, e.g.), Ableton Link is THE synchronisation technique that most apps on iOS support and users almost expect. A description of the concept and links to the source code (libraries) open-sourced by Ableton for Ableton Link can be found here: http://ableton.github.io/link/. It's not really a feature request (for now ;-), as I am currently more "Apple & iOS + macOS" focused, but I would be interested in your thoughts on whether an Ableton Link integration into the Playback API would be possible. Receiving multicast UDP packets (which Ableton Link is based upon) and making sense of them should not be that hard; somewhere I also found example calculations for timing the events according to the "given sync information" (how complete they are I cannot tell, because I only browsed over them – and I did not note the website / example where this was done, but any search engine should bring it back ;-). And besides the ready-to-use iOS SDK, "the Link repo contains C++ source code implementing the Link protocol and a C++ API for integrating applications" (taken from the linked website) ... so this could be compiled into a native Windows C++ library used via P/Invoke from DWM, I think. But what about applying the synchronisation inside your Playback API? How great is this challenge? I can imagine that there are a lot of edge cases, especially when the app / DWM is told "you are ahead of time" and playback has to roll back in time (notes played are "out there", nothing to rewind on that, but what are the implications ... though I guess the synchronisation offset will be quite small by Link protocol design, so even in bad cases it ends up feeling more like a swing feature).

All the adjustments that have to be done may not be that easy – or could an intelligent timeline abstraction, working like an offset plus a handful of rules for certain cases, do the job more easily than first thought? What do you think about it?

Cheers
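For illustration, here is a minimal sketch of the MIDI Clock master role described above, using DryWetMidi's system real-time events; the device name is just an example, and `System.Threading.Timer` stands in for the high-precision timer a real app would need (MIDI Clock runs at 24 ticks per quarter note, so 120 BPM means one tick roughly every 20.8 ms).

```csharp
using System;
using System.Threading;
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Multimedia;

// Acting as MIDI Clock master: send Start, then timing-clock ticks at 24 PPQN.
using var outputDevice = OutputDevice.GetByName("loopMIDI Port"); // example device name

const double bpm = 120.0;
var tickInterval = TimeSpan.FromMilliseconds(60000.0 / (bpm * 24));

outputDevice.SendEvent(new StartEvent());

using var clockTimer = new Timer(
    _ => outputDevice.SendEvent(new TimingClockEvent()),
    null,
    TimeSpan.Zero,
    tickInterval);

Console.ReadKey(); // keep sending ticks until a key is pressed
outputDevice.SendEvent(new StopEvent());
```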
Hi Ayaphi,

Thanks for the response! As for synchronisation inside the playback API – do you mean controlling playback externally? If yes, that's already possible via manual ticking of a playback's internal clock. So, for example, you can create a playback like this:

```csharp
var playback = midiFile.GetPlayback(new PlaybackSettings
{
    ClockSettings = new MidiClockSettings
    {
        CreateTickGeneratorCallback = () => null
    }
});
```

and then, on each sync pulse/signal received from an external app, just call:

```csharp
playback.TickClock();
```

Is that what you're talking about? Or do you mean something different?

As for Ableton Link – interesting technology, and a DWM integration looks useful. But it should definitely be a separate project, I suppose. I mean, there should not be any third-party integrations within the library, I think. Just a matter of architecture. And about the latency of …

Thanks,
Hi Max,

Regarding playback synchronisation: I did not realize that DWM also supports a "MIDI Clock slave mode". If I understand you correctly, calling `playback.TickClock()` on each received sync pulse lets an external source drive the playback. That would be great on its own, and also a great starting point for an Ableton Link integration ;-). To think it a little bit further along my current use cases – are there "time drift limits", in the sense of latency that can still be handled by your current playback synchronisation implementation based on MIDI Clock ticks, and is there a threshold of latency (e.g. 2 seconds – just to name a time) at which synchronisation will fail? The reason I ask: I am currently evaluating whether wireless MIDI is a useful, respectively practical, option. Using Apple's MIDI network sessions works sufficiently well (it is also supported on Windows by rtpMIDI, see: https://www.tobias-erichsen.de/software/rtpmidi.html), but it introduces fluctuating latencies – currently the latency varies from 3 ms up to 32 ms, and it will increase with more traffic on the local network, of course. By the way, Bluetooth MIDI seems to have much lower latencies than MIDI over WiFi, but I am looking for "real network" options and not only a 1:1 virtual MIDI cable solution.

And to answer your question, "Is that what you're talking about? Or do you mean something different?": conceptually I want my app to be the MIDI master, because of the UI & workflow; the DAW will stay in the background most of the time (typically 80% to 20%), and I want to start the DAW from my app while syncing the playback, which already works as I described in my last post – MIDI Clock master mode is fully covered in DWM. But there are cases (the 20% of cases that make the app complete ;-) when the roles will change: the user will be in the DAW, working their way through sound design heaven ;-), and wants to start playback in the DAW and also trigger my app feeding MIDI into the DAW. And this is the case where Ableton Link can solve all requirements, including getting iOS apps into the musical setup: it forms a session of equal peers – after peers have joined the session, any peer can start and stop the session, and every peer has to keep up with synchronisation.

Regarding Ableton Link support: for sure, it makes sense to see this as a separate project, and it could make sense to open a separate discussion for it (if you like). But do you see a possible integration point in DWM, so that e.g. the Playback API gets an extension in which Ableton Link synchronisation would be used (instead of MIDI Clock synchronisation)? A dependency on the Ableton Link SDK is surely not a nice design decision for DWM – staying decoupled from the Ableton Link SDK while remaining integratable into DWM is a nice architecture kind of project ... something like a sibling / child NuGet package that implements its features in a subordinate namespace? Just another hint from an iPad & PC music app user's perspective (I have not looked deeply into the technical documentation of Ableton Link): as any peer can change the tempo, e.g. from 96 to 132 BPM, and peers can enter and leave the session, I do not know in which cases the "master role" changes, even if the concept of the master is comparable to what we know from MIDI Clock – so it might be tricky (your MIDI Clock synchronisation implementation might need substantial rework / redesign), and one would have to implement a fluid master / slave behavior that might have to change on the fly.

Another technical question: did you ever look into the creation of virtual MIDI ports on Windows? On macOS you can easily create virtual MIDI ports via Apple's Audio MIDI Setup app. I know there are several clients for Windows just for the purpose of creating virtual MIDI ports (e.g. loopMIDI: https://www.tobias-erichsen.de/software/loopmidi.html), and there is also a commercial SDK with C# bindings (https://www.tobias-erichsen.de/software/virtualmidi/virtualmidi-sdk.html). I guess it's more of a hassle and needs a deep dive into Windows driver technology – or do you know a Windows API that can create virtual MIDI ports and can be used via bindings in a C# app?

Cheers
Hi Ayaphi,

First of all, about how manual ticking works. We can turn off the built-in timer as I've described above and use manual ticking. What does that mean? Every MIDI event in the list within … Now let's imagine we have an external app or device that should manage playback ticking. We receive a signal from it and call `TickClock`. So there are no time adjustments in … So that's what I mean by manual ticking and how it works. I think you're talking about something different.
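As an illustration of this manual-ticking approach (a sketch added here, not code from the original post – the file path and device name are placeholders): an input device's MIDI Clock events drive the playback.

```csharp
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Interaction;
using Melanchall.DryWetMidi.Multimedia;

var midiFile = MidiFile.Read("Song.mid"); // placeholder path

// Disable the built-in timer so the playback advances only when ticked manually.
var playback = midiFile.GetPlayback(new PlaybackSettings
{
    ClockSettings = new MidiClockSettings
    {
        CreateTickGeneratorCallback = () => null
    }
});

using var inputDevice = InputDevice.GetByName("Clock Source"); // placeholder name
inputDevice.EventReceived += (_, e) =>
{
    // Advance the playback's internal clock on every received MIDI Clock pulse.
    if (e.Event is TimingClockEvent)
        playback.TickClock();
};

inputDevice.StartEventsListening();
playback.Start();
```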
Yes, extension points in an API are a good practice. Regarding Ableton Link, it's interesting, but obviously I don't have time right now to implement such an integration, sorry.
Unfortunately, the options you've mentioned are the only ones. We would need to implement a Windows driver to provide an API for creating virtual MIDI ports. Yes, macOS is more flexible, and DryWetMIDI allows you to create virtual MIDI ports on macOS. But Windows is out of the game with its MIDI API.

Thanks,
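For reference (an addition to this thread, not part of the original reply), the macOS side looks roughly like this, assuming DryWetMIDI's `VirtualDevice` API in the Multimedia namespace:

```csharp
using System;
using Melanchall.DryWetMidi.Multimedia;

// macOS only: create a virtual MIDI device that other apps can see and connect to.
using var virtualDevice = VirtualDevice.Create("DWM Virtual Port");

// The virtual device exposes an input and an output subdevice.
Console.WriteLine($"Input: {virtualDevice.InputDevice.Name}");
Console.WriteLine($"Output: {virtualDevice.OutputDevice.Name}");
```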
Hi Max,

I finally got to the test with DWM nativeless and .NET MAUI for iOS. DWM works fine on its own (which is great), but unfortunately the … This is really a pity (aka "sad but true", as a famous band shouted ;-), because Apple's APIs are sufficient for MIDI and audio playback. I got it working in a Swift + SwiftUI app on iOS using only Apple APIs, recently using a great CoreMIDI wrapper library (MIDIKit: https://github.com/orchetect/MIDIKit) that makes some things easier – but I got it working on bare iOS API metal too ;-). Creating virtual MIDI endpoints via CoreMIDI plus AVAudioEngine + AVAudioSequencer is all that's needed – this way it's also possible to route MIDI outside the app into the operating system's MIDI pipeline to other apps, or to use a MIDI network session as well and hence route MIDI to desktop machines. So theoretically DWM nativeless would be sufficient, as the playback engine is provided by Apple APIs, as well as an audio render engine with SoundFont files ... but unfortunately the .NET MAUI iOS bindings are more than "out of time". In particular: I cannot create and append a track to … So my idea to use DWM …

So here is what I could test with DWM nativeless: I read a MIDI file from the app bundle, traversed the track chunks and notes and printed them to the debug console – all correct. So you've done a great job already ... and to be honest, my initial request was senseless ;-), as all that I've asked for is already present in Apple's iOS APIs – so again: sorry for that. But it makes another thing clear, something I felt before I started learning Swift and SwiftUI: if a .NET MAUI developer wants to, or even has to, drop down to native platform APIs like those of iOS, one has to be experienced with these native platform APIs and with developing for the platform – practical experience is absolutely necessary. Or in other words: the beauty and practical relevance of .NET MAUI lies in the area ahead of the native platform APIs – as long as the app can be developed with the natural and "OS agnostic" .NET MAUI APIs, it makes sense ... let it mature and it will be useful in these areas.

Meanwhile, another thing came to my mind regarding DWM on Windows. Is it possible to have an efficient "multi-track & multi-MIDI-port" playback with DWM? For example: multiple … I think this should be possible as you have described here: https://melanchall.github.io/drywetmidi/articles/playback/Tick-generator.html, using the … but using a shared … Or does a shared … Is there a DWM structure for grouping …

Thanks & cheers
Hi Ayaphi,

Yes, it's the approach I would recommend to you – use a shared instance of the tick generator. As for a kind of container for multiple …

Thanks,
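A sketch of that shared-clock idea (an illustration added here, not Max's exact recommendation): several playbacks disable their built-in timers and are all ticked from one timer, keeping them in lockstep; in production, a high-precision tick generator would be preferable to `System.Threading.Timer`.

```csharp
using System;
using System.Linq;
using System.Threading;
using Melanchall.DryWetMidi.Core;
using Melanchall.DryWetMidi.Interaction;
using Melanchall.DryWetMidi.Multimedia;

var fileNames = new[] { "TrackA.mid", "TrackB.mid" }; // placeholder files

// One playback per file, each with its built-in timer disabled.
var playbacks = fileNames
    .Select(name => MidiFile.Read(name).GetPlayback(new PlaybackSettings
    {
        ClockSettings = new MidiClockSettings
        {
            CreateTickGeneratorCallback = () => null
        }
    }))
    .ToArray();

foreach (var playback in playbacks)
    playback.Start();

// A single shared timer ticks every playback's clock.
using var sharedClock = new Timer(
    _ =>
    {
        foreach (var playback in playbacks)
            playback.TickClock();
    },
    null,
    TimeSpan.Zero,
    TimeSpan.FromMilliseconds(1));

Console.ReadKey(); // keep playing until a key is pressed
```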
Hi @ayaphi – did you ever get DryWetMidi working on iOS? Using `DllImport("__Internal")`? If so, could you create a GitHub sample project to show how this can be done? I'm stuck trying to make it work as well.
Hi @akshayabd, since my last post in February I have done no further serious investigation – neither .NET MAUI on iOS nor DryWetMidi on iOS. I switched to Swift, as I wanted a robust and reliable foundation to build upon. For my needs, .NET MAUI unfortunately did not deliver what I need, and I got the feeling that this field of application is "too low level" for .NET MAUI at the moment and that .NET MAUI "needs time" to grow before being used for such applications ... speaking from the point of view and know-how I had at the beginning of this year. So, DryWetMidi was not the real problem; .NET MAUI was.

Cheers
@ayaphi Thanks for the info. I guess I'll look into Swift too.
So is it fair to say that if you wanted to use this in Unity for full cross-platform support, you can't, due to it not working on iOS?

Edit: Just checking the description again on https://assetstore.unity.com/packages/tools/audio/drywetmidi-222171#description and I see it also doesn't mention Android. So, just to confirm, this is neither iOS nor Android compatible?
@sonicviz Please take a look at the Supported OS article. Right now the Devices API of the library can be used on Windows and macOS only. But if you are ready to help me a bit, we can try to investigate running on iOS within a Unity project. After some googling I found this article – Building plug-ins for iOS. So can you please try this instruction: …

Looks like Unity will compile the sources for iOS by itself. As for Android – it requires implementing an entire native layer, so I can't give you any dates, unfortunately. macOS and iOS have similar APIs, and there is hope we can handle iOS support with little effort.

Thanks,
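For context (an illustrative addition): native plugin code on iOS is statically linked into the app, so the usual Unity pattern is to put the native sources under `Assets/Plugins/iOS` and P/Invoke them via the special `"__Internal"` library name; the native function below is hypothetical, named only to show the shape of the declaration.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class NativeMidiBridge : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    // On iOS the plugin is statically linked, so "__Internal" replaces
    // the dynamic library name.
    [DllImport("__Internal")]
    private static extern int GetInputDevicesCount(); // hypothetical native function
#else
    // On desktop the same function would live in a dynamic library.
    [DllImport("Melanchall_DryWetMidi_Native64")]
    private static extern int GetInputDevicesCount(); // hypothetical native function
#endif

    private void Start()
    {
        Debug.Log($"Input devices: {GetInputDevicesCount()}");
    }
}
```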
Hey Max,

Unfortunately I need full cross-platform support out of the box, as it's an old app that's always been cross-platform, and Android and iOS are both key. So I'll have to put this one on the shelf for now, but I'll keep it in mind if I do a desktop-only version. I haven't actually started on the upgrade yet, but when I do I'll reacquaint myself with the code to see what I did last time to get full cross-platform support working. IIRC I didn't do anything too out of the ordinary – I didn't have to write any native layers, and the timing seems OK for MIDI playback and control.
@akshayabd Can you please try the instruction above?
Hey @melanchall, I have been trying, but it's difficult. However, I am able to work with Windows and macOS, so that's great. I will keep trying – thanks for all the instructions – I really appreciate all your help!
Well, I did quick tests some time ago, and regarding Unity, we need to make the library IL2CPP-compatible. There's an open issue on the subject – #282. But right now I have no idea how to rework my code.
The #282 ticket seems to have magically disappeared, @melanchall – are there any updates on this for iOS? I'd be very happy to assist with building and testing!
@antiero It's strange... I've asked GitHub Support to provide details on the missing issue. Well, as for iOS support – thanks for your response! Can you please say what kind of project you can help with? Unity?
Yes, a cross-platform Unity project, you guessed it! 🙂
Oh, it's a pretty hard task :-) I'm afraid I can't provide any dates for when I'll get back to it... But thank you a lot for your assistance offer! I'll definitely contact you when I start the task. As for the issue #282, here is the support answer: …
Discussed in #235
The discussion moved to this issue for a more convenient conversation. Key moments: …

This issue is for discussion of iOS support within the DryWetMIDI library.