
Chimera Objective #455

Draft · wants to merge 3 commits into main

Conversation

@jsmz97 (Collaborator) commented Jan 3, 2025

This PR introduces Chimera, a general-purpose achievement scalarizing function for multi-target optimization that allows users to establish a hierarchy of targets with relative or absolute thresholds for concurrent optimization.

This implementation includes a new objective class, ChimeraObjective, which follows a similar approach to the DesirabilityObjective. It scalarizes multiple targets into a single score, termed the Chimera merit, which is to be minimized.

For further details, please refer to the following publication:
F. Häse, L.M. Roch, and A. Aspuru-Guzik. Chimera: enabling hierarchy-based multi-objective optimization for self-driving laboratories. Chemical Science 2018, 9(39), 7642-7655.


WIP:

  • Further testing and validation of the ChimeraObjective implementation.
  • Consideration of whether the optimization should be adjusted to a maximization approach, similar to the DesirabilityObjective.
  • Additional documentation and examples to illustrate the usage of ChimeraObjective.
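For illustration, a rough usage sketch. Attribute names are taken from this diff and NumericalTarget from the existing BayBE API; the import paths and the exact constructor signature are assumptions and may change:

```python
from baybe.targets import NumericalTarget

# Hypothetical import paths; the final module layout may differ:
from baybe.objectives import ChimeraObjective, ThresholdType

# Two targets in a hierarchy: yield first, cost second.
yield_target = NumericalTarget(name="yield", mode="MAX", bounds=(0, 100))
cost_target = NumericalTarget(name="cost", mode="MIN", bounds=(0, 10))

objective = ChimeraObjective(
    targets=(yield_target, cost_target),
    # Tolerate a 20% degradation of yield; require cost below 5 (absolute):
    targets_threshold_values=(0.2, 5.0),
    targets_threshold_types=(ThresholdType.FRACTION, ThresholdType.ABSOLUTE),
    softness=1e-3,  # regulates the soft Heaviside step
)
```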

@CLAassistant commented

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@Scienfitz Scienfitz changed the title Chimera Extension Chimera Objective Jan 3, 2025
@Scienfitz (Collaborator) commented

@AVHopp @AdrianSosic PR in draft mode, NOT for review

@@ -127,6 +128,12 @@ def to_botorch(
additional_params["best_f"] = (
bo_surrogate.posterior(train_x).mean.max().item()
)
case ChimeraObjective():
Collaborator:

Move this to the block above that utilizes minimization, as discussed.
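Roughly, the suggested move could look like this (a sketch only; the surrounding match statement and the branch grouping are assumptions based on this diff):

```python
# Fragment of to_botorch (sketch). Since Chimera merits are minimized,
# best_f should come from the branch handling minimization, i.e. the
# posterior minimum rather than the maximum used for maximization:
match objective:
    case DesirabilityObjective():  # maximization, as in the existing block
        additional_params["best_f"] = (
            bo_surrogate.posterior(train_x).mean.max().item()
        )
    case ChimeraObjective():  # minimization -> posterior minimum
        additional_params["best_f"] = (
            bo_surrogate.posterior(train_x).mean.min().item()
        )
```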


@define(frozen=True, slots=False)
class ChimeraObjective(Objective):
"""An objective scalarizing multiple targets using desirability values."""
Collaborator:

Wrong docstring, it still mentions desirability. Include a link to the publication; see other code parts (e.g. edbo) for how to properly include links.
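A possible fix (sketch only; the exact reference formatting should follow the existing convention, e.g. in the edbo code):

```python
@define(frozen=True, slots=False)
class ChimeraObjective(Objective):
    """An objective scalarizing multiple targets using the Chimera approach.

    References:
        F. Häse, L.M. Roch, A. Aspuru-Guzik. "Chimera: enabling hierarchy-based
        multi-objective optimization for self-driving laboratories."
        Chemical Science 2018, 9(39), 7642-7655.
        https://doi.org/10.1039/C8SC02239A
    """
```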

)
"The targets considered by the objective."

targets_threshold_values: tuple[float, ...] = field(
Collaborator:

The targets_ prefix in front is not necessary for either of the two threshold attributes.

"""The softness parameter regulating the Heaviside function."""

@targets_threshold_values.default
def _default_targets_threshold_values(self) -> tuple[float, ...]:
Collaborator:

I think there are no reasonable defaults, so these should not have defaults. Apart from that, the defaults you specified do not satisfy the requirement ge(0.0).
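Dropping the default and enforcing the constraint could look roughly like this (attrs sketch, assuming validators are used as elsewhere in the codebase):

```python
from attrs import define, field
from attrs.validators import deep_iterable, ge


@define(frozen=True)
class Sketch:
    # No default: callers must supply the thresholds explicitly,
    # and each value must satisfy ge(0.0).
    threshold_values: tuple[float, ...] = field(
        converter=tuple,
        validator=deep_iterable(member_validator=ge(0.0)),
    )
```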

return default_values

@softness.default # TODO: do we need to add warning here?
def _default_softness(self) -> float:
Collaborator:

This can simply be done by writing field(..., default=1e-3, ...) above, where the attribute is specified. No warning is needed.
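I.e. roughly (sketch):

```python
from attrs import define, field


@define(frozen=True)
class Sketch:
    # Inline default replaces the separate @softness.default method:
    softness: float = field(default=1e-3, converter=float)
```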

if not all(target._is_transform_normalized for target in targets):
raise ValueError(
"All targets must have normalized computational representations to "
"enable the computation of desirability values. This requires having "
Collaborator:

Update the message, it still mentions desirability.

arg = -value / softness
return np.exp(-np.logaddexp(0, arg))

def _hard_heaviside(self, value: float) -> float:
Collaborator:

Is the hard function really needed? I think the soft Heaviside should just recover the hard one for an extreme value of softness. If so, that can simply be used wherever a hard Heaviside is needed.
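A quick numeric check of that claim, using the soft Heaviside from this diff (which is sigmoid(value / softness) in numerically stable form):

```python
import numpy as np


def soft_heaviside(value: float, softness: float) -> float:
    # Numerically stable sigmoid(value / softness), as in the diff above.
    arg = -value / softness
    return np.exp(-np.logaddexp(0, arg))


# For an extreme (tiny) softness, the soft version collapses to the hard step:
for v in (-1.0, -1e-6, 1e-6, 1.0):
    print(v, soft_heaviside(v, softness=1e-12))
# -> 0.0, 0.0, 1.0, 1.0 (and exactly 0.5 at v = 0)
```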


return self._soft_heaviside(value, softness)

def _invert_binary(self, a: float) -> float:
Collaborator:

This seems very small and is only used once, is it really needed?

for target, threshold_value, threshold_type in zip(
self.targets, transformed_threshold_values, self.targets_threshold_types
):
if threshold_type == ThresholdType.FRACTION:
Collaborator:

All of these == checks should instead be done with is. The possible enum values are singletons, so they should be checked via is, just like checking whether something is None.
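For example (ThresholdType members assumed from this diff):

```python
from enum import Enum


class ThresholdType(Enum):
    FRACTION = "fraction"
    ABSOLUTE = "absolute"


threshold_type = ThresholdType.FRACTION

# Enum members are singletons, so identity comparison is the idiomatic check,
# analogous to "x is None":
if threshold_type is ThresholdType.FRACTION:
    print("fraction threshold")
```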

threshold_value, interpolation="linear"
)
elif threshold_type == ThresholdType.ABSOLUTE:
_threshold = threshold_value
Collaborator:

Why are fraction and absolute treated in exactly the same way?
