
Rough-and-ready context lifetime management. #32

Closed
Conversation

@dabrahams (Contributor)

This is a mostly-mechanical change; careful review is required to make sure it's being done right.

@kyouko-taiga (Contributor) commented Feb 29, 2024

So IIUC, the idea is to hold a reference to the context in every data structure that it could manage. I think that will work but probably we need to guarantee more lifetimes than just that of the context. It's very likely that we also need to guarantee the lifetime of the module in which we create functions. In fact, I suppose pretty much every class that has a public destructor in C++ (and therefore a corresponding XXXDispose wrapper in the C API) needs some kind of lifetime management mechanism. So probably instructions would need to be written like this:

protocol InModule: Contextual { ... }
protocol Instruction: InModule { ... }
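To make the reference-holding scheme concrete, here is a hedged sketch of how it might look in Swift. All type names and members below are hypothetical stand-ins (the real SwiftyLLVM wrappers would hold `LLVMContextRef`/`LLVMModuleRef` handles and call the corresponding `XXXDispose` functions in `deinit`); the point is only the ownership shape: every wrapper keeps a strong reference to the object that owns its C-side lifetime, so ARC cannot deallocate a context or module while something created in it is still reachable.

```swift
// Hypothetical sketch of the reference-holding scheme.
final class Context {
  // In SwiftyLLVM this would wrap an LLVMContextRef and call
  // LLVMContextDispose in deinit.
}

protocol Contextual {
  var context: Context { get }
}

final class Module: Contextual {
  // Keeps the context alive for as long as the module exists.
  let context: Context
  init(in context: Context) { self.context = context }
}

protocol InModule: Contextual {
  var module: Module { get }
}

// An instruction only stores its module; the context is reachable
// (and therefore kept alive) through it.
struct SomeInstruction: InModule {
  let module: Module
  var context: Context { module.context }
}
```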

There's a lot of boilerplate involved in adding a reference to all of these data structures and I'm wondering if perhaps we should try a less clever approach. The context is the most annoying object to track because it really doesn't fit nicely into an MVS-oriented wrapper. So my original idea was to hide LLVMContext under the rug. With the benefit of hindsight, however, I think it was a bad idea.

We need to use different contexts if we want to use LLVM in a multithreaded environment, which we'll probably want to do sooner than later (LLVM codegen currently takes ~10% of our compile time). So I guess at that point we'll want to uncouple the context from the module and write the code of a single thread in a lambda like:

queue.addOperation {
  SwiftyLLVM.withContext { (llvm) in
    // Emit some modules ...
  }
}

AFAICT, pretty much everything else is tied to the lifetime of a module. So perhaps we need to pull the same trick:

SwiftyLLVM.withContext { (llvm) in
  llvm.withNewModule("MyModule") { (m) in
    // Emit some functions ...
  }
}
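For illustration, here is a minimal sketch of how these scoped `withXXX` entry points could be implemented. The types below are stand-ins, not SwiftyLLVM's actual API: the real versions would create and dispose the underlying LLVM C objects (e.g. `LLVMContextCreate`/`LLVMContextDispose`), whereas here `dispose()` just flips a flag so the shape of the idiom is visible. The key property is that `defer` guarantees disposal when the closure returns, even if it throws.

```swift
// Hedged sketch of the scoped-access idiom with stand-in types.
struct Module {
  let name: String
  private(set) var isDisposed = false
  mutating func dispose() { isDisposed = true }
}

struct Context {
  private(set) var isDisposed = false
  mutating func dispose() { isDisposed = true }

  // Creates a module tied to this context, runs `body`, then disposes
  // the module on exit (even on a thrown error).
  mutating func withNewModule<R>(
    _ name: String, _ body: (inout Module) throws -> R
  ) rethrows -> R {
    var module = Module(name: name)
    defer { module.dispose() }
    return try body(&module)
  }
}

enum SwiftyLLVMSketch {
  // Creates a context, runs `body`, and guarantees disposal on exit.
  static func withContext<R>(
    _ body: (inout Context) throws -> R
  ) rethrows -> R {
    var context = Context()
    defer { context.dispose() }
    return try body(&context)
  }
}
```

Used exactly as in the examples above, e.g. `SwiftyLLVMSketch.withContext { llvm in llvm.withNewModule("xyz") { m in ... } }`; escaping the context or module from its closure is then the documented UB.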

Then we document that it is UB to use any context/module property outside of the corresponding closure and all existing data structures can remain as they are. In practice it shouldn't degrade the UX because I expect the common use case to look like this:

SwiftyLLVM.withContext { (llvm) in
  llvm.withNewModule("xyz") { (m) in emitModule(&m) }
}

func emitModule(_ m: inout SwiftyLLVM.Module) {
  // Lifetime-extending closure shenanigans have no impact here
}

It is of course disappointing to offer an unsafe API in a safe-by-default language but at the same time I don't think we should make it a goal to create a safe API for LLVM. That's beyond the current purpose of SwiftyLLVM.

@kyouko-taiga (Contributor) left a review comment

As I commented on the PR, I think that's a sound approach but we should also consider the lifetime of other "disposable" LLVM objects.

@dabrahams (Contributor, Author) commented Mar 1, 2024

Yes, the withXXX idiom is how I probably would have done this if I had started coding this from scratch. It's a way to make unsafety manageable with a little care. Lifetimes are probably not the only way we can create UB by working with the LLVM API, so managing them doesn't get us all the way to "safe". We should carefully document all the preconditions.

FWIW, this is how Swift is doing it with C++ interop; not very principled.

@kyouko-taiga (Contributor)

not very principled.

Indeed, it feels "hacky".

@dabrahams (Contributor, Author)

@kyouko-taiga we should close this, then?

@kyouko-taiga (Contributor)

Yes, let's close. I will submit another PR using the withXXX approach, hopefully this afternoon.
