Direct Hybrid Factor Specification #1805
Merged (36 commits), Sep 20, 2024

Commits
75d4724  remove extra imports in DiscreteBayesNet.cpp (varunagrawal, Aug 22, 2024)
dce5641  minor edits (varunagrawal, Aug 22, 2024)
9e77eba  rename X1 to X0 and X2 to X1 (varunagrawal, Aug 22, 2024)
3fc1019  provide logNormalizers directly to the augment method (varunagrawal, Aug 22, 2024)
cfef6d3  update GaussianMixture::likelihood to compute the logNormalizers (varunagrawal, Aug 22, 2024)
30bf261  Tests which verify direct factor specification works well (varunagrawal, Aug 22, 2024)
bfaff50  remove extra prints (varunagrawal, Aug 22, 2024)
665d755  add docstring and GTSAM_EXPORT for ComputeLogNormalizer (varunagrawal, Aug 22, 2024)
03e61f4  Merge branch 'working-hybrid' into direct-hybrid-fg (varunagrawal, Aug 23, 2024)
07a0088  compute logNormalizers and pass to GaussianMixtureFactor (varunagrawal, Aug 23, 2024)
62b32fa  Merge branch 'working-hybrid' into direct-hybrid-fg (varunagrawal, Aug 25, 2024)
fbffd79  Merge branch 'develop' into direct-hybrid-fg (varunagrawal, Sep 5, 2024)
13193a1  better comments (varunagrawal, Sep 5, 2024)
0bab8ec  Merge branch 'develop' into direct-hybrid-fg (varunagrawal, Sep 5, 2024)
51a2fd5  improved comments (varunagrawal, Sep 6, 2024)
615c04a  some more refactor and remove redundant test (varunagrawal, Sep 6, 2024)
9dc29e0  fix test (varunagrawal, Sep 6, 2024)
05a4b7a  Merge branch 'develop' into direct-hybrid-fg (varunagrawal, Sep 6, 2024)
24ec30e  replace emplace_back with emplace_shared (varunagrawal, Sep 6, 2024)
506cda8  Merge branch 'hybrid-error-scalars' into direct-hybrid-fg (varunagrawal, Sep 15, 2024)
336b494  fixes (varunagrawal, Sep 15, 2024)
de68aec  fix tests (varunagrawal, Sep 15, 2024)
b895e64  Merge branch 'develop' into direct-hybrid-fg (varunagrawal, Sep 18, 2024)
987ecd4  undo accidental rename (varunagrawal, Sep 18, 2024)
80d9a5a  remove duplicate test and focus only on direct specification (varunagrawal, Sep 19, 2024)
717eb7e  relinearization test (varunagrawal, Sep 19, 2024)
f875b86  print nonlinear part of HybridValues (varunagrawal, Sep 19, 2024)
2937533  Merge branch 'develop' into direct-hybrid-fg (varunagrawal, Sep 19, 2024)
9b6facd  add documentation for additive scalar in the error and remove the 0.5… (varunagrawal, Sep 19, 2024)
244661a  rename ComputeLogNormalizer to ComputeLogNormalizerConstant (varunagrawal, Sep 19, 2024)
4f88829  fix docstring for HybridGaussianFactor (varunagrawal, Sep 19, 2024)
d60a253  logNormalizationConstant is now a method for Gaussian noise model (varunagrawal, Sep 19, 2024)
cea0dd5  update tests (varunagrawal, Sep 19, 2024)
1ab82f3  hide sqrt(2*value) so the user doesn't have to premultiply by 2 (varunagrawal, Sep 20, 2024)
364b4b4  logDetR method which leverages noise model for efficiency. Build logD… (varunagrawal, Sep 20, 2024)
67a8b8f  comprehensive unit testing (varunagrawal, Sep 20, 2024)
Changes from commit 1ab82f3: hide sqrt(2*value) so the user doesn't have to premultiply by 2
varunagrawal committed Sep 20, 2024 (commit 1ab82f382cf390704e7321336c174f4cf12dd945)
gtsam/hybrid/HybridGaussianConditional.cpp (1 addition & 2 deletions)

@@ -222,8 +222,7 @@ std::shared_ptr<HybridGaussianFactor> HybridGaussianConditional::likelihood(
     } else {
       // Add a constant to the likelihood in case the noise models
       // are not all equal.
-      double c = 2.0 * Cgm_Kgcm;
-      return {likelihood_m, c};
+      return {likelihood_m, Cgm_Kgcm};
     }
   });
   return std::make_shared<HybridGaussianFactor>(
gtsam/hybrid/HybridGaussianFactor.cpp (11 additions & 8 deletions)

@@ -46,31 +46,34 @@ HybridGaussianFactor::Factors augment(
   AlgebraicDecisionTree<Key> valueTree;
   std::tie(gaussianFactors, valueTree) = unzip(factors);
 
-  // Normalize
+  // Compute minimum value for normalization.
   double min_value = valueTree.min();
-  AlgebraicDecisionTree<Key> values =
-      valueTree.apply([&min_value](double n) { return n - min_value; });
 
   // Finally, update the [A|b] matrices.
-  auto update = [&values](const Assignment<Key> &assignment,
-                          const HybridGaussianFactor::sharedFactor &gf) {
+  auto update = [&min_value](const GaussianFactorValuePair &gfv) {
+    auto [gf, value] = gfv;
+
     auto jf = std::dynamic_pointer_cast<JacobianFactor>(gf);
     if (!jf) return gf;
+
+    double normalized_value = value - min_value;
+
     // If the value is 0, do nothing
-    if (values(assignment) == 0.0) return gf;
+    if (normalized_value == 0.0) return gf;
 
     GaussianFactorGraph gfg;
     gfg.push_back(jf);
 
     Vector c(1);
-    c << std::sqrt(values(assignment));
+    // When hiding c inside the `b` vector, value == 0.5*c^2
+    c << std::sqrt(2.0 * normalized_value);
     auto constantFactor = std::make_shared<JacobianFactor>(c);
 
     gfg.push_back(constantFactor);
     return std::dynamic_pointer_cast<GaussianFactor>(
         std::make_shared<JacobianFactor>(gfg));
   };
-  return gaussianFactors.apply(update);
+  return HybridGaussianFactor::Factors(factors, update);
 }

/* *******************************************************************************/
gtsam/hybrid/tests/testHybridGaussianFactor.cpp (4 additions & 3 deletions)

@@ -770,10 +770,11 @@ static HybridGaussianFactorGraph CreateFactorGraph(
           ->linearize(values);
 
   // Create HybridGaussianFactor
-  // We multiply by -2 since the we want the underlying scalar to be log(|2πΣ|)
+  // We take negative since we want
+  // the underlying scalar to be log(\sqrt(|2πΣ|))
   std::vector<GaussianFactorValuePair> factors{
-      {f0, -2 * model0->logNormalizationConstant()},
-      {f1, -2 * model1->logNormalizationConstant()}};
+      {f0, -model0->logNormalizationConstant()},
+      {f1, -model1->logNormalizationConstant()}};
   HybridGaussianFactor motionFactor({X(0), X(1)}, m1, factors);
 
   HybridGaussianFactorGraph hfg;
gtsam/hybrid/tests/testHybridNonlinearFactorGraph.cpp (4 additions & 3 deletions)

@@ -868,10 +868,11 @@ static HybridNonlinearFactorGraph CreateFactorGraph(
       std::make_shared<BetweenFactor<double>>(X(0), X(1), means[1], model1);
 
   // Create HybridNonlinearFactor
-  // We multiply by -2 since the we want the underlying scalar to be log(|2πΣ|)
+  // We take negative since we want
+  // the underlying scalar to be log(\sqrt(|2πΣ|))
   std::vector<NonlinearFactorValuePair> factors{
-      {f0, -2 * model0->logNormalizationConstant()},
-      {f1, -2 * model1->logNormalizationConstant()}};
+      {f0, -model0->logNormalizationConstant()},
+      {f1, -model1->logNormalizationConstant()}};
 
   HybridNonlinearFactor mixtureFactor({X(0), X(1)}, m1, factors);