
Don't pass unused arguments to optimizer_argparse #2999

Closed
wants to merge 3 commits into from

Commits on Oct 31, 2024

  1. Remove custom optimizer_argparse case for qKG (facebook#2997)

    Summary:
    
    Context: It appears that using `qKnowledgeGradient` with MBM doesn't work, since [this line](https://github.com/facebook/Ax/blob/535af4edff70cbf20a49c676377f5c8945560d03/ax/models/torch/botorch_modular/acquisition.py#L339) passes the argument `optimizer` to `_argparse_kg`, which errors [here](https://github.com/facebook/Ax/blob/535af4edff70cbf20a49c676377f5c8945560d03/ax/models/torch/botorch_modular/optimizer_argparse.py#L169) because it has now received the argument "optimizer" twice.
    
    We don't really need the `optimizer_argparse` special case for qKG anymore. This existed for two reasons:
    - in order to construct initial conditions, which can be handled by `optimize_acqf`, and
    - to ensure that the optimizer is `optimize_acqf`, because others are not supported
    
    This diff:
    * Modifies the `optimizer_argparse` case for qKG to do nothing except update the optimizer to `optimize_acqf` and then call the base case
    
    Implementation notes:
    * Isn't it nonintuitive to set the optimizer then override it? Yes, a little, but the user can't choose the optimizer, so we're not overriding a user-specified choice. Also, lots of arguments to the `optimizer_argparse` functions get ignored. The "right" thing might be to put the choice of optimizer inside a dispatcher so that it can depend on the acquisition class, but that would be a bigger change.
    * Do we really need this dispatcher anymore, if it is doing so little? Maybe. Third parties may wish to use it to accommodate acquisition functions that are not in Ax. On the other hand, this dispatcher is currently not doing much of anything.
    * Do the `optimizer_argparse` functions still need to support so many arguments, given that some of them seem to exist only for constructing initial conditions? Probably not; I plan to look into that in a follow-up.
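    A minimal sketch of the pattern this commit adopts (hypothetical names and signatures, not the actual Ax implementation): the qKG-specific handler no longer constructs initial conditions itself; it only forces the optimizer to `optimize_acqf` and then delegates to the base case.

    ```python
    # Hypothetical sketch of the simplified dispatcher pattern. The real Ax
    # functions take more arguments; only the control flow is illustrated here.

    def _argparse_base(acqf, optimizer="optimize_acqf", **options):
        # Base case: validate the optimizer and assemble shared keyword arguments.
        supported = {"optimize_acqf", "optimize_acqf_discrete", "optimize_acqf_mixed"}
        if optimizer not in supported:
            raise ValueError(f"optimizer=`{optimizer}` is not supported.")
        return {"optimizer": optimizer, **options}

    def _argparse_kg(acqf, optimizer="optimize_acqf", **options):
        # qKG only supports `optimize_acqf`, so override whatever was passed
        # (the user cannot choose the optimizer, so no user choice is discarded)
        # and fall through to the base case.
        return _argparse_base(acqf, optimizer="optimize_acqf", **options)
    ```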
    
    Reviewed By: saitcakmak
    
    Differential Revision: D65227420
    esantorella authored and facebook-github-bot committed Oct 31, 2024
    Commit 088b79d
  2. Remove dispatching functionality from optimizer_argparse (facebook#2998)
    
    Summary:
    
    Context: This dispatcher's only usage is to raise an exception with qKG. There is no need for it to be a dispatcher.
    
    This diff:
    * makes `optimizer_argparse` no longer a dispatcher
    * Moves the error for qKG into the body of the now-only `optimizer_argparse` function
    * Removes special function for qKG
    * Changes type annotations so that the first argument is always an `AcquisitionFunction`; it was always used that way, with different types used only in tests.
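    The change above can be sketched as follows (hypothetical simplified code, not the actual Ax source): with only one specialized behavior left — raising for qKG — the dispatch machinery collapses into a single plain function with the check inlined.

    ```python
    # Hypothetical sketch: a single-purpose dispatcher becomes a plain function.

    class AcquisitionFunction:
        """Stand-in for botorch's AcquisitionFunction base class."""

    class qKnowledgeGradient(AcquisitionFunction):
        """Stand-in for botorch's qKnowledgeGradient."""

    def optimizer_argparse(
        acqf: AcquisitionFunction,
        optimizer: str = "optimize_acqf",
        **options: object,
    ) -> dict:
        # The qKG error moves from a registered dispatch target into the body
        # of the now-only optimizer_argparse function.
        if isinstance(acqf, qKnowledgeGradient) and optimizer != "optimize_acqf":
            raise ValueError("qKnowledgeGradient requires optimizer='optimize_acqf'.")
        return {"optimizer": optimizer, **options}
    ```

    Annotating the first argument as `AcquisitionFunction` matches how the function was always called; the looser types existed only for tests.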
    
    Reviewed By: saitcakmak, Balandat
    
    Differential Revision: D65231763
    esantorella authored and facebook-github-bot committed Oct 31, 2024
    Commit da80d72
  3. Don't pass unused arguments to optimizer_argparse (facebook#2999)

    Summary:
    
    Context:
    - The only arguments ever passed to `optimizer_argparse` are `acqf`, `optimizer_options`, `optimizer`, `q`, and `bounds`, and the latter two are always ignored.
    - `optimizer_argparse` accepts a bunch of arguments that are never passed to it... and never should be, because, as the docstring explains, `optimizer_options` is the right place to pass those.
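    The resulting call pattern might look like this (a hypothetical sketch, not the actual Ax signature): only the arguments that are actually used are accepted explicitly, and tunables travel inside the `optimizer_options` dict as the docstring recommends.

    ```python
    # Hypothetical sketch of the narrowed signature. Dropped parameters such as
    # q and bounds are no longer accepted; options like num_restarts or
    # raw_samples are passed via optimizer_options instead of dedicated args.

    def optimizer_argparse(acqf, *, optimizer="optimize_acqf", optimizer_options=None):
        # Merge caller-supplied optimizer options into the parsed output.
        options = dict(optimizer_options or {})
        return {"optimizer": optimizer, **options}
    ```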
    
    Reviewed By: saitcakmak, Balandat
    
    Differential Revision: D65233328
    esantorella authored and facebook-github-bot committed Oct 31, 2024
    Commit c0da6fb