
Poly commit refactoring #35

Closed · wants to merge 62 commits
Conversation

@phoinic (Collaborator) commented Dec 9, 2021

This PR contains the phase 2 poly-commit refactoring, based on the refactored ginger-lib.

src/lib.rs: two outdated review threads, resolved
@UlrichHaboeck75 left a comment

As pointed out in my comments, or discussed offline:

  • let us find out whether we can take G: Group as the type parameter for PolynomialCommitment and the related traits. If so, let us declare the domain-extended scheme as a polynomial commitment with the "vector group" as its commitment group (see the sketch after this list).
  • let us have trim() return a PCParameters again, and let us reduce the traits for the key material to a single function max_degree(),
  • let us clarify in comments why the IPA implementation of fn get_hash() in the latter traits "breaks" with the original intention.
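
For reference, a minimal sketch of what the first two points could look like; all names, bounds and signatures here are illustrative assumptions, not the crate's actual API:

/// Stripped-down stand-in for the crate's Group trait; only the pieces
/// needed for this illustration are shown.
pub trait Group: Clone + std::ops::Add<Output = Self> {}

/// Hypothetical polynomial commitment trait templated directly by the
/// commitment group G. `Parameters` stands in for the crate's PCParameters:
/// trim() returns it again, and the key material is reduced to max_degree().
pub trait PolynomialCommitment<G: Group> {
    type Parameters;
    type Commitment;
    type Error;

    /// Single degree query on the key material.
    fn max_degree(params: &Self::Parameters) -> usize;

    /// Trimming returns a (reduced) `Parameters` again.
    fn trim(
        params: &Self::Parameters,
        supported_degree: usize,
    ) -> Result<Self::Parameters, Self::Error>;
}

/// The "vector group": component-wise addition on vectors of G. The
/// domain-extended scheme can then be declared as a PolynomialCommitment
/// with VectorGroup<G> as its commitment group.
#[derive(Clone)]
pub struct VectorGroup<G: Group>(pub Vec<G>);

impl<G: Group> std::ops::Add for VectorGroup<G> {
    type Output = Self;
    fn add(self, other: Self) -> Self {
        VectorGroup(
            self.0
                .into_iter()
                .zip(other.0)
                .map(|(a, b)| a + b)
                .collect(),
        )
    }
}

impl<G: Group> Group for VectorGroup<G> {}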

src/domain_extended/data_structures.rs (resolved)
src/ipa_pc/mod.rs (resolved)
src/domain_extended/mod.rs (outdated, resolved)
@UlrichHaboeck75 left a comment

The problems we faced in the implementation of the succinct verifier gadget (which is partially generic over the concrete scheme) are due to our functional abstraction of the types for commitments and randomnesses, which leaves their concrete form/structure open and hence inaccessible.

To overcome this problem, and also to arrive at a more natural definition of our Group trait, I propose the following:

  • Let us remove the ScalarField and the multiplication by it from the Group trait (and put them back into the Curve trait). Imposing a prime torsion is a structural rather than a functional requirement. (It enforces that the implementor is either a prime-order group or a vectorized version of one.)

  • We return to a polynomial commitment scheme over a Curve, i.e. PolynomialCommitment<G: Curve>, and give both Commitment and Randomness a specific structure by demanding that they are vectors of items which implement G, or at least allow iterating over such items (using IntoIterator).

  • As we no longer have scalar multiplication in the Group trait, we provide separate helper functions (or trait implementations) for addition and scalar multiplication on these associated types.

This allows an implementation for G: EndoMulCurve which uses endomorphism-based scalar multiplication instead of the ordinary one. (Hence we could drop the workaround of letting the Fiat-Shamir primitive output the expanded native scalar representation of the squeezed challenges.)
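
A minimal sketch of this revised proposal, again with hypothetical, heavily simplified traits (the real Group/Curve/EndoMulCurve traits in ginger-lib carry many more bounds and methods):

use std::ops::Add;

/// Simplified stand-ins: Group is purely structural, scalar multiplication
/// lives on Curve, and EndoMulCurve adds endomorphism-based multiplication.
pub trait Group: Clone + Add<Output = Self> {}
pub trait Curve: Group {
    type ScalarField: Clone;
    fn scalar_mul(&self, scalar: &Self::ScalarField) -> Self;
}
pub trait EndoMulCurve: Curve {
    /// Endomorphism-based scalar multiplication on a bit-decomposed challenge.
    fn endo_mul(&self, challenge_bits: &[bool]) -> Self;
}

/// Polynomial commitments over a curve, where the commitment type is required
/// to expose its structure as an iterable collection of curve points
/// (the randomness type would be constrained analogously).
pub trait PolynomialCommitment<G: Curve> {
    type Commitment: IntoIterator<Item = G> + Clone;
    type Randomness: Clone;
    // keys, proofs, open/verify omitted
}

/// Separate helper (instead of a Group method): combine the components of a
/// commitment with challenge scalars by ordinary scalar multiplication.
/// An EndoMulCurve variant could instead use endo_mul on raw challenge bits.
pub fn combine<G: Curve>(
    items: impl IntoIterator<Item = G>,
    scalars: impl IntoIterator<Item = G::ScalarField>,
) -> Option<G> {
    items
        .into_iter()
        .zip(scalars)
        .map(|(c, x)| c.scalar_mul(&x))
        .reduce(|a, b| a + b)
}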

src/lib.rs (resolved)
src/data_structures.rs (outdated, resolved)
src/domain_extended/mod.rs: three outdated review threads, resolved
)
}

/// The verifier for proofs generated by `fn single_point_multi_poly_open()`.


Maybe it is worth commenting that both the values and the labeled commitments follow the same order.

Collaborator

We often pass iterators over references as parameters, and we must guarantee that they have the same lifetime as the function itself.


I am not sure I understand. Is it clear that if one implements IntoIterator for labeled_commitments and values, they iterate in a consistent manner, so that at each iteration step the value and the commitment "belong together"?
If it is not clear, isn't it worth documenting this requirement?

Collaborator

Put an additional comment on this
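
For illustration, a hypothetical helper in the spirit of this discussion: the iterator yields references bound to a lifetime 'a (as noted in the reply above), and values and labeled commitments are consumed pairwise, which is why their order must match. Names and signature are illustrative only, not the crate's API:

/// Hypothetical helper: checks each (commitment, value) pair with a
/// caller-supplied predicate. `labeled_commitments` yields references with
/// lifetime 'a, which therefore must outlive the call.
fn check_pairs<'a, C: 'a, F>(
    labeled_commitments: impl IntoIterator<Item = &'a C>,
    values: impl IntoIterator<Item = F>,
    check: impl Fn(&C, &F) -> bool,
) -> bool {
    // `zip` pairs the i-th commitment with the i-th value: correctness relies
    // on both iterators following the same order.
    labeled_commitments
        .into_iter()
        .zip(values)
        .all(|(c, v)| check(c, &v))
}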

src/ipa_pc/mod.rs (outdated, resolved)
src/data_structures.rs (outdated, resolved)
src/ipa_pc/data_structures.rs (resolved)
src/tests.rs (outdated, resolved)
// If for some `i`, `polynomials[i].is_hiding() == false`, then the
// corresponding randomness is `G::ScalarField::empty()`.
// TODO: move this function back to poly-commit when merging into mono-repo.
fn commit_many<'a>(


Just out of curiosity: what exactly is the reason that we re-introduce commit_many and have it in the trait?

Collaborator

Ah well, it was needed and it doesn't really do any harm.
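
For context, commit_many essentially batches the single-polynomial commitment; a hypothetical free-standing version could look like the sketch below (simplified: the function in the PR also handles the per-polynomial hiding randomness, per the comment quoted above):

/// Hypothetical sketch: commit to many polynomials by calling the
/// single-polynomial commitment function on each of them.
fn commit_many<'a, P: 'a, C, E>(
    polynomials: impl IntoIterator<Item = &'a P>,
    commit_one: impl Fn(&P) -> Result<C, E>,
) -> Result<Vec<C>, E> {
    // Collecting into Result<Vec<_>, _> stops at the first failing commit.
    polynomials.into_iter().map(commit_one).collect()
}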


criterion_group!(
    name = tweedle_open_proof;
    config = Criterion::default().sample_size(10);


Personally, I am not a big fan of such low sample sizes. Why not increase it to at least 30?

Collaborator

Done
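
With the change applied, the group configuration would read roughly as follows; the bench_open_proof_tweedle_dee target name is an assumption (its tweedle-dum counterpart appears in the snippet discussed further below):

use criterion::{criterion_group, Criterion};

criterion_group!(
    name = tweedle_open_proof;
    // Sample size raised from 10 to 30, as requested in the review.
    config = Criterion::default().sample_size(30);
    // Target names assumed from the bench functions discussed below.
    targets = bench_open_proof_tweedle_dee, bench_open_proof_tweedle_dum
);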

Comment on lines +98 to +115
    for n in 16..22 {
        bench_open_proof::<TweedleDee, Blake2s>(
            c,
            "open proof in tweedle-dee, coeffs",
            1 << n
        );
    }
}

fn bench_open_proof_tweedle_dum(c: &mut Criterion) {

    for n in 16..22 {
        bench_open_proof::<TweedleDum, Blake2s>(
            c,
            "open proof in tweedle-dum, coeffs",
            1 << n
        );
    }


I think it is good enough to do the benches up to 2^21. Is there a reason for the 2^22?

Collaborator

The iteration for the upper bound won't be executed, so it really benches only up to 2^21
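
Indeed, Rust ranges are half-open, so 16..22 covers the exponents 16 through 21; a quick check:

fn main() {
    // `16..22` is half-open: the upper bound 22 is excluded.
    let exps: Vec<u32> = (16..22).collect();
    assert_eq!(exps, vec![16, 17, 18, 19, 20, 21]);
    // Hence the largest benchmarked size is 1 << 21, not 1 << 22.
    assert_eq!(1u32 << 21, 2_097_152);
}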

src/lib.rs (outdated)
Comment on lines 87 to 93
pub type QueryMap<'a, F> = BTreeMap<PointLabel, (F, BTreeSet<PolynomialLabel>)>;

/// `Evaluations` is the result of querying a set of labeled polynomials or linear combinations
/// `p` at a `QueryMap` `Q`.
/// It maps each element of `Q` to the resulting evaluation, e.g. `evaluation.get((label, query))`
/// should equal `p[label].evaluate(query)`.
pub type Evaluations<'a, F> = BTreeMap<(PolynomialLabel, PointLabel), F>;
Collaborator

These lifetimes are actually useless.

Collaborator

Removed
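
After the change, the aliases presumably read as follows (assuming PolynomialLabel and PointLabel are the crate's string-like label types, as in the snippet above):

use std::collections::{BTreeMap, BTreeSet};

// Label types as used in the snippet above (assumed to be string aliases).
pub type PolynomialLabel = String;
pub type PointLabel = String;

/// Same as before, with the unused lifetime parameter dropped.
pub type QueryMap<F> = BTreeMap<PointLabel, (F, BTreeSet<PolynomialLabel>)>;

/// Maps each (polynomial label, point label) pair of the query map to the
/// corresponding evaluation.
pub type Evaluations<F> = BTreeMap<(PolynomialLabel, PointLabel), F>;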


Successfully merging this pull request may close these issues.

Degree bounds and LCs Restructuring PCs
4 participants