From 5c459c227455b7b364b210a78cf375fa628d91d1 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 5 Oct 2020 16:48:49 +0100 Subject: [PATCH 01/94] Commented DistributedProver to map closely to Groth's notations --- .../reductions/r1cs_to_qap/R1CStoQAP.java | 12 +++--- .../zkSNARK/DistributedProver.java | 38 +++++++++++++++---- .../zkSNARK/SerialProver.java | 3 ++ .../zkSNARK/objects/ProvingKey.java | 1 + 4 files changed, 41 insertions(+), 13 deletions(-) diff --git a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java index 72e686f..0515889 100755 --- a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java +++ b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java @@ -26,9 +26,9 @@ public class R1CStoQAP { * instance. * *

Namely, given a R1CSRelation constraint system r1cs and a field element x, construct a QAP - * instance (evaluated at t) for which: - At := (A_0(t),A_1(t),...,A_m(t)) - Bt := - * (B_0(t),B_1(t),...,B_m(t)) - Ct := (C_0(t),C_1(t),...,C_m(t)) - Ht := (1,t,t^2,...,t^n) - Zt := - * Z(t) ("vanishing polynomial of a certain set S, evaluated at t") where m = number of variables + * instance (evaluated at t) for which: At := (A_0(t),A_1(t),...,A_m(t)), Bt := + * (B_0(t),B_1(t),...,B_m(t)), Ct := (C_0(t),C_1(t),...,C_m(t)), Ht := (1,t,t^2,...,t^n), Zt := + * Z(t) ("vanishing polynomial of a certain set S, evaluated at t"); where m = number of variables * of the QAP */ public static > @@ -95,9 +95,9 @@ QAPRelation R1CStoQAPRelation(final R1CSRelation r1cs, final Fie * Witness map for the R1CSRelation-to-QAP reduction. * *

More precisely, compute the coefficients of the polynomial H(z) := - * (A(z)*B(z)-C(z))/Z(z) where: - A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z) - B(z) := - * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z) - C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + - * d3 * Z(z) - Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = + * (A(z)*B(z)-C(z))/Z(z) where: A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z), B(z) := + * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + + * d3 * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = * degree of the QAP * *

This is done as follows: (1) compute evaluations of A,B,C on S = {sigma_1,...,sigma_n} (2) diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java index cb14198..202914a 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java @@ -47,6 +47,10 @@ Proof prove( // We are dividing degree 2(d-1) polynomial by degree d polynomial // and not adding a PGHR-style ZK-patch, so our H is degree d-2. final FieldT zero = fieldFactory.zero(); + // 1. Filter the coeffs to only get the 3 coeffs at indices d-2, d-1, d + // 2. Carry out checks on these 3 coeffs, namely: + // - coeff at d-2 can not be 0 (we want a poly of deg d-2) + // - coeffs at d-1, d must be 0 (we want a poly of deg d-2) qapWitness .coefficientsH() .filter(e -> e._1 >= qapWitness.degree() - 2) @@ -58,7 +62,9 @@ Proof prove( assert (coeff._2.equals(zero)); } }); - // Check that the witness satisfies the QAP relation. + // Check that the witness satisfies the QAP relation + // To that end, we pick a random evaluation point t and check + // that the QAP is satisfied (i.e. this random evaluation point is not the one in the SRS) final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final QAPRelationRDD qap = R1CStoQAPRDD.R1CStoQAPRelation(provingKey.r1cs(), t, config); @@ -82,25 +88,40 @@ Proof prove( final G2T deltaG2 = provingKey.deltaG2(); final G1T rsDelta = deltaG1.mul(r.mul(s)); + // Number of partitions per RDD set in the config final int numPartitions = config.numPartitions(); - config.beginRuntime("Proof"); + config.beginRuntime("Generate proof"); config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + // Get an RDD containing all pairs of elements with matching keys in `oneFullAssignment` and `provingKey.queryA()`. The result of this .join will be a (k, (v1, v2)) tuple, where (k, v1) is in `oneFullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. final JavaRDD> computationA = oneFullAssignment.join(provingKey.queryA(), numPartitions).values(); + // `evaluationAt` = \sum_{i=0}^{m} a_i * u_i(x) (in Groth16) + // where m = total number of wires + // a_i = ith wire/variable final G1T evaluationAt = VariableBaseMSM.distributedMSM(computationA); + // Once `queryA` is not useful anymore, mark the RDD as non-persistent, and remove all blocks for it from memory and disk. provingKey.queryA().unpersist(); config.endLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); final JavaRDD>> computationB = oneFullAssignment.join(provingKey.queryB(), numPartitions).values(); + // `evaluationBt` = \sum_{i=0}^{m} a_i * v_i(x) (in Groth16) + // where m = total number of wires + // a_i = ith wire/variable + // Note: B is evaluated in G1 and G2, because B \in G2 but + // the encoded B in G1 also shows up in C (\in G1) final Tuple2 evaluationBt = VariableBaseMSM.distributedDoubleMSM(computationB); provingKey.queryB().unpersist(); config.endLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); - // Compute evaluationABC = a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. 
+ // Compute evaluationABC = variable_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. + // In Groth16 notation, this is used for the computation of C: + // [\sum_{i=l+1}^{m} a_i*((beta*u_i(x) + alpha*v_i(x) + w_i(x))] /delta. + // where m = total number of wires + // a_i = ith wire/variable config.beginLog("Computing evaluation to deltaABC"); final JavaRDD> deltaABCAuxiliary = oneFullAssignment.join(provingKey.deltaABCG1(), numPartitions).values(); @@ -110,11 +131,15 @@ Proof prove( config.endLog("Computing evaluation to deltaABC"); config.beginLog("Computing evaluation to query H"); + // In Groth16 notations, queryH is the encoding in G1 of the vector <(x^i * t(x))/delta>, for i \in [0, n-2] + // As such, the value of `evaluationHtZt` actually is: (h(x)t(x))/delta if we follow Groth's notations final JavaRDD> computationH = qapWitness.coefficientsH().join(provingKey.queryH(), numPartitions).values(); final G1T evaluationHtZt = VariableBaseMSM.distributedMSM(computationH); provingKey.queryH().unpersist(); - evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta + // Add H(t)*Z(t)/delta to `evaluationABC` to get the first term of C, namely (following Groth's notations): + // [\sum_{i = l+1}^{m} a_i * (beta * u_i(x) + alpha * v_i(x) + w_i(x) + h(x)t(x))]/delta + evaluationABC = evaluationABC.add(evaluationHtZt); config.endLog("Computing evaluation to query H"); // A = alpha + sum_i(a_i*A_i(t)) + r*delta @@ -126,11 +151,10 @@ Proof prove( betaG1.add(evaluationBt._1).add(deltaG1.mul(s)), betaG2.add(evaluationBt._2).add(deltaG2.mul(s))); - // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - - // r*s*delta + // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); - config.endRuntime("Proof"); + config.endRuntime("Generate proof"); return new Proof<>(A, B._2, C); } diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java index 89c3d24..8a7fdd2 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java @@ -31,6 +31,7 @@ Proof prove( final Assignment auxiliary, final FieldT fieldFactory, final Configuration config) { + // If the debug flag is set, check up-front that the R1CS is satisfied if (config.debugFlag()) { assert (provingKey.r1cs().isSatisfied(primary, auxiliary)); } @@ -46,7 +47,9 @@ Proof prove( // We are dividing degree 2(d-1) polynomial by degree d polynomial // and not adding a PGHR-style ZK-patch, so our H is degree d-2. final FieldT zero = fieldFactory.zero(); + // 1. Verify that H has a non-zero d-2 coefficient assert (!qapWitness.coefficientsH(qapWitness.degree() - 2).equals(zero)); + // 2. Make sure that coefficients d-1 and d are 0 to make sure that the polynomial hasn't a degree higher than d-2 assert (qapWitness.coefficientsH(qapWitness.degree() - 1).equals(zero)); assert (qapWitness.coefficientsH(qapWitness.degree()).equals(zero)); // Check that the witness satisfies the QAP relation. 
diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java index 7e3b4ab..bf31d3a 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java @@ -31,6 +31,7 @@ public class ProvingKey< private final List queryA; private final List> queryB; private final List queryH; + // The proving key holds the arithmetized relation private final R1CSRelation r1cs; public ProvingKey( From 16a00582b1faccf6f147d5209970ae5541ada06f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 5 Oct 2020 16:55:43 +0100 Subject: [PATCH 02/94] Renamed evaluationHtZt properly to account for division by delta --- .../zk_proof_systems/zkSNARK/DistributedProver.java | 13 ++++++------- 1 file changed, 6 insertions(+), 7 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java index 202914a..6bd325c 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java @@ -94,7 +94,7 @@ Proof prove( config.beginRuntime("Generate proof"); config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); - // Get an RDD containing all pairs of elements with matching keys in `oneFullAssignment` and `provingKey.queryA()`. The result of this .join will be a (k, (v1, v2)) tuple, where (k, v1) is in `oneFullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. + // Get an RDD containing all pairs of elements with **matching keys** in `oneFullAssignment` and `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, v1) is in `oneFullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. final JavaRDD> computationA = oneFullAssignment.join(provingKey.queryA(), numPartitions).values(); // `evaluationAt` = \sum_{i=0}^{m} a_i * u_i(x) (in Groth16) @@ -111,8 +111,7 @@ Proof prove( // `evaluationBt` = \sum_{i=0}^{m} a_i * v_i(x) (in Groth16) // where m = total number of wires // a_i = ith wire/variable - // Note: B is evaluated in G1 and G2, because B \in G2 but - // the encoded B in G1 also shows up in C (\in G1) + // Note: We get an evaluation in G1 and G2, because B \in G2 is formed using this term, and C (\in G1) also uses this term (see below, B is of type `Tuple2` and will actually be computed in both G1 and G2 for this exact purpose). 
final Tuple2 evaluationBt = VariableBaseMSM.distributedDoubleMSM(computationB); provingKey.queryB().unpersist(); config.endLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); @@ -131,15 +130,15 @@ Proof prove( config.endLog("Computing evaluation to deltaABC"); config.beginLog("Computing evaluation to query H"); - // In Groth16 notations, queryH is the encoding in G1 of the vector <(x^i * t(x))/delta>, for i \in [0, n-2] - // As such, the value of `evaluationHtZt` actually is: (h(x)t(x))/delta if we follow Groth's notations + // In Groth16 notations, `queryH` is the encoding in G1 of the vector <(x^i * t(x))/delta>, for i \in [0, n-2] + // As such, the value of `evaluationHtZtOverDelta` actually is: (h(x)t(x))/delta if we follow Groth's notations final JavaRDD> computationH = qapWitness.coefficientsH().join(provingKey.queryH(), numPartitions).values(); - final G1T evaluationHtZt = VariableBaseMSM.distributedMSM(computationH); + final G1T evaluationHtZtOverDelta = VariableBaseMSM.distributedMSM(computationH); provingKey.queryH().unpersist(); // Add H(t)*Z(t)/delta to `evaluationABC` to get the first term of C, namely (following Groth's notations): // [\sum_{i = l+1}^{m} a_i * (beta * u_i(x) + alpha * v_i(x) + w_i(x) + h(x)t(x))]/delta - evaluationABC = evaluationABC.add(evaluationHtZt); + evaluationABC = evaluationABC.add(evaluationHtZtOverDelta); config.endLog("Computing evaluation to query H"); // A = alpha + sum_i(a_i*A_i(t)) + r*delta From d3752ed4fd4faaa88e76d95b6323fdb10d9365f2 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 5 Oct 2020 17:32:18 +0100 Subject: [PATCH 03/94] Added comments to proving key source to map to Groth notations --- .../zkSNARK/objects/ProvingKeyRDD.java | 13 +++++++++++++ 1 file changed, 13 insertions(+) diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java index d92649b..20a4db1 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java @@ -21,15 +21,28 @@ public class ProvingKeyRDD< G2T extends AbstractG2> implements Serializable { + // Below, [x]_1 (resp. [x]_2 and []_T) represents the encoding of x in G1 (resp. G2 and GT) + // We follow the notations in Groth16 (namely, polynomials are denoted u, v, w, h, t instead of A, B, C, H, Z. 
Moreoverm the evaluation point is denoted by x) + // + // [alpha]_1 private final G1T alphaG1; + // [beta]_1 private final G1T betaG1; + // [beta]_2 private final G2T betaG2; + // [delta]_1 private final G1T deltaG1; + // [delta]_2 private final G2T deltaG2; + // {[(beta * u_i(x) + alpha * v_i(x) + w_i(x))/delta]_1} private final JavaPairRDD deltaABCG1; + // {[u_i(x)]_1} private final JavaPairRDD queryA; + // {[v_i(x)]_1} private final JavaPairRDD> queryB; + // {[(x^i * t(x))/delta]_1} private final JavaPairRDD queryH; + // The proving key contains an arithmetized relation private final R1CSRelationRDD r1cs; public ProvingKeyRDD( From a01328bce9f80e8603399061b3b12cf33d1c3104 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 5 Oct 2020 17:33:00 +0100 Subject: [PATCH 04/94] Renamed wire assignment variable --- .../reductions/r1cs_to_qap/R1CStoQAPRDD.java | 22 +++++++++++-------- .../zkSNARK/DistributedProver.java | 21 +++++++++--------- 2 files changed, 24 insertions(+), 19 deletions(-) diff --git a/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java b/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java index 364b78e..1c78261 100755 --- a/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java +++ b/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java @@ -167,9 +167,9 @@ QAPRelationRDD R1CStoQAPRelation( * Witness map for the R1CSRelation-to-QAP reduction. * *

More precisely, compute the coefficients h_0,h_1,...,h_n of the polynomial H(z) := - * (A(z)*B(z)-C(z))/Z(z) where: - A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z) - B(z) := - * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z) - C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + - * d3 * Z(z) - Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = + * (A(z)*B(z)-C(z))/Z(z) where: A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z), B(z) := + * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + + * d3 * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = * degree of the QAP * *

This is done as follows: (1) compute evaluations of A,B,C on S = {sigma_1,...,sigma_n} (2) @@ -183,16 +183,18 @@ QAPRelationRDD R1CStoQAPRelation( QAPWitnessRDD R1CStoQAPWitness( final R1CSRelationRDD r1cs, final Assignment primary, - final JavaPairRDD oneFullAssignment, + final JavaPairRDD fullAssignment, final FieldT fieldFactory, final Configuration config) { if (config.debugFlag()) { - assert (r1cs.isSatisfied(primary, oneFullAssignment)); + assert (r1cs.isSatisfied(primary, fullAssignment)); } final FieldT multiplicativeGenerator = fieldFactory.multiplicativeGenerator(); final FieldT zero = fieldFactory.zero(); + // We do a Radix2-FFT to retrieve the QAP (polynomial form) from the R1CS (matrix form) + // via interpolation on a given domain of size a "big enough" power of 2 final long domainSize = MathUtils.lowestPowerOfTwo(r1cs.numConstraints() + r1cs.numInputs()); config.beginLog("Account for the additional constraints input_i * 0 = 0."); @@ -207,6 +209,8 @@ QAPWitnessRDD R1CStoQAPWitness( // TODO (howardwu): Replace hardcoded popular variable assignment indices with list of // these indices from R1CSRelationRDD. config.beginLog("Compute evaluation of polynomials A, B, and C, on set S."); + // r1cs.constraints() returns a `R1CSConstraintsRDD` which is a set of `JavaPairRDD>`. + // In other, words, A, B, C are all vectors of linear terms (i.e. matrices that will be intepolated on the chosen domain) final JavaPairRDD zeroIndexedA = r1cs.constraints() .A() @@ -237,7 +241,7 @@ QAPWitnessRDD R1CStoQAPWitness( // Swap the constraint and term index positions. return new Tuple2<>(term._2.index(), new Tuple2<>(term._1, term._2.value())); }) - .join(oneFullAssignment, config.numPartitions()) + .join(fullAssignment, config.numPartitions()) .mapToPair( term -> { // Multiply the constraint value by the input value. @@ -256,7 +260,7 @@ QAPWitnessRDD R1CStoQAPWitness( // Swap the constraint and term index positions. return new Tuple2<>(term._2.index(), new Tuple2<>(term._1, term._2.value())); }) - .join(oneFullAssignment, config.numPartitions()) + .join(fullAssignment, config.numPartitions()) .mapToPair( term -> { // Multiply the constraint value by the input value. @@ -274,7 +278,7 @@ QAPWitnessRDD R1CStoQAPWitness( // Swap the constraint and term index positions. return new Tuple2<>(term._2.index(), new Tuple2<>(term._1, term._2.value())); }) - .join(oneFullAssignment, config.numPartitions()) + .join(fullAssignment, config.numPartitions()) .mapToPair( term -> { // Multiply the constraint value by the input value. 
@@ -322,6 +326,6 @@ QAPWitnessRDD R1CStoQAPWitness( config.endLog("Compute coefficients of polynomial H."); return new QAPWitnessRDD<>( - oneFullAssignment, coefficientsH, r1cs.numInputs(), r1cs.numVariables(), domainSize); + fullAssignment, coefficientsH, r1cs.numInputs(), r1cs.numVariables(), domainSize); } } diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java index 6bd325c..daf260f 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java @@ -30,17 +30,18 @@ public class DistributedProver { Proof prove( final ProvingKeyRDD provingKey, final Assignment primary, - final JavaPairRDD oneFullAssignment, + final JavaPairRDD fullAssignment, final FieldT fieldFactory, final Configuration config) { - if (config.debugFlag()) { - assert (provingKey.r1cs().isSatisfied(primary, oneFullAssignment)); - } + // Note: `R1CStoQAPWitness` already checks the value of the configuration `debugFlag`, and already checks that the R1CS is satisfied on input `primary` and `fullAssignment`. No need to do it again it, this is redundant. + //if (config.debugFlag()) { + // assert (provingKey.r1cs().isSatisfied(primary, fullAssignment)); + //} config.beginLog("Computing witness polynomial"); final QAPWitnessRDD qapWitness = R1CStoQAPRDD.R1CStoQAPWitness( - provingKey.r1cs(), primary, oneFullAssignment, fieldFactory, config); + provingKey.r1cs(), primary, fullAssignment, fieldFactory, config); config.endLog("Computing witness polynomial"); if (config.debugFlag()) { @@ -94,9 +95,9 @@ Proof prove( config.beginRuntime("Generate proof"); config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); - // Get an RDD containing all pairs of elements with **matching keys** in `oneFullAssignment` and `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, v1) is in `oneFullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. + // Get an RDD containing all pairs of elements with **matching keys** in `fullAssignment` and `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, v1) is in `fullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. 
final JavaRDD> computationA = - oneFullAssignment.join(provingKey.queryA(), numPartitions).values(); + fullAssignment.join(provingKey.queryA(), numPartitions).values(); // `evaluationAt` = \sum_{i=0}^{m} a_i * u_i(x) (in Groth16) // where m = total number of wires // a_i = ith wire/variable @@ -107,7 +108,7 @@ Proof prove( config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); final JavaRDD>> computationB = - oneFullAssignment.join(provingKey.queryB(), numPartitions).values(); + fullAssignment.join(provingKey.queryB(), numPartitions).values(); // `evaluationBt` = \sum_{i=0}^{m} a_i * v_i(x) (in Groth16) // where m = total number of wires // a_i = ith wire/variable @@ -123,10 +124,10 @@ Proof prove( // a_i = ith wire/variable config.beginLog("Computing evaluation to deltaABC"); final JavaRDD> deltaABCAuxiliary = - oneFullAssignment.join(provingKey.deltaABCG1(), numPartitions).values(); + fullAssignment.join(provingKey.deltaABCG1(), numPartitions).values(); G1T evaluationABC = VariableBaseMSM.distributedMSM(deltaABCAuxiliary); provingKey.deltaABCG1().unpersist(); - oneFullAssignment.unpersist(); + fullAssignment.unpersist(); config.endLog("Computing evaluation to deltaABC"); config.beginLog("Computing evaluation to query H"); From 02f0785a8e56f52ae2626aa56f8f4a6f45959189 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 10:46:55 +0100 Subject: [PATCH 05/94] Added extra comments --- src/main/java/relations/objects/LinearTerm.java | 2 ++ src/main/java/relations/objects/R1CSConstraint.java | 1 + .../java/relations/objects/R1CSConstraintsRDD.java | 13 +++++++++++-- 3 files changed, 14 insertions(+), 2 deletions(-) diff --git a/src/main/java/relations/objects/LinearTerm.java b/src/main/java/relations/objects/LinearTerm.java index b12c8af..04e28f2 100755 --- a/src/main/java/relations/objects/LinearTerm.java +++ b/src/main/java/relations/objects/LinearTerm.java @@ -9,6 +9,8 @@ import java.io.Serializable; +// Equivalent to: +// https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/variable.hpp#L92 public class LinearTerm implements Serializable { private final long index; diff --git a/src/main/java/relations/objects/R1CSConstraint.java b/src/main/java/relations/objects/R1CSConstraint.java index d82491e..3c1242f 100755 --- a/src/main/java/relations/objects/R1CSConstraint.java +++ b/src/main/java/relations/objects/R1CSConstraint.java @@ -20,6 +20,7 @@ * *

A R1CSRelation constraint is used to construct a R1CSRelation constraint system. */ +// Similar to https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/constraint_satisfaction_problems/r1cs/r1cs.tcc#L31-L37 public class R1CSConstraint> implements Serializable { diff --git a/src/main/java/relations/objects/R1CSConstraintsRDD.java b/src/main/java/relations/objects/R1CSConstraintsRDD.java index 9d3de9b..c237f1d 100755 --- a/src/main/java/relations/objects/R1CSConstraintsRDD.java +++ b/src/main/java/relations/objects/R1CSConstraintsRDD.java @@ -16,8 +16,8 @@ * *

{ < A_k , X > * < B_k , X > = < C_k , X > }_{k=1}^{n} * - *

In other words, the system is satisfied if and only if there exist a USCS variable assignment - * for which each R1CSRelation constraint is satisfied. + *

In other words, the system is satisfied if and only if there exists a USCS (Unitary-Square Constraint System) + * variable assignment for which each R1CSRelation constraint is satisfied. *

NOTE: The 0-th variable (i.e., "x_{0}") always represents the constant 1 Thus, the 0-th * variable is not included in num_variables. @@ -25,6 +25,15 @@ public class R1CSConstraintsRDD> implements Serializable { + // Linear combinations represent additive sub-circuits + // In other words, each LinearCombination can be represented as a dot product between X (the "wire" vector) + // and another vector "selecting each wires" to "route" them. + // + // An R1CS is a set of 3 matrices A,B,C that are represented here by a set of 3 sets of + // linear combinations. + // + // NOTE: `R1CSConstraintsRDD` is handled differently than "normal"/non-distributed R1CS which is a set of "constraints" each of which being a triple of linear combinations. + // TODO: Understand how the RDD R1CS is structured and processed. private JavaPairRDD> A; private JavaPairRDD> B; private JavaPairRDD> C; From fdae021934acd65c3e9dcaa1caf9e6f4c72164f9 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 14:21:09 +0100 Subject: [PATCH 06/94] Further renaming for wire assignments and added comments about RDD R1CS structure --- .../profiler/generation/R1CSConstruction.java | 143 ++++++++++-------- .../relations/objects/R1CSConstraints.java | 4 + .../relations/objects/R1CSConstraintsRDD.java | 4 +- 3 files changed, 86 insertions(+), 65 deletions(-) diff --git a/src/main/java/profiler/generation/R1CSConstruction.java b/src/main/java/profiler/generation/R1CSConstruction.java index 02d65dd..c3b93a9 100755 --- a/src/main/java/profiler/generation/R1CSConstruction.java +++ b/src/main/java/profiler/generation/R1CSConstruction.java @@ -39,38 +39,49 @@ Tuple3, Assignment, Assignment> serialConst FieldT a = fieldFactory.random(config.seed(), config.secureSeed()); FieldT b = fieldFactory.random(config.seed(), config.secureSeed()); - final Assignment oneFullAssignment = new Assignment<>(); - oneFullAssignment.add(one); - oneFullAssignment.add(a); - oneFullAssignment.add(b); - + // Wire assignment - this is the vector X containing all values of all wires in the circuit + // it is of the form: [ONE, var1, var2, ..., varN], where N is the total number of wires + // (i.e. |instance_set| + |witness_set|) + final Assignment fullAssignment = new Assignment<>(); + fullAssignment.add(one); // index 0 + fullAssignment.add(a); // index 1 + fullAssignment.add(b); // index 2 + + // Create the set of testing/profiling constraints final R1CSConstraints constraints = new R1CSConstraints<>(); for (int i = 0; i < numConstraints - 1; i++) { final LinearCombination A = new LinearCombination<>(); final LinearCombination B = new LinearCombination<>(); final LinearCombination C = new LinearCombination<>(); + // Below the indexes of the linear terms are shifted to avoid "capturing" ONE + // in the assignment vector (which index is 0) if (i % 2 != 0) { - // a * b = c. + // Simple multiplication gate: a * b = c A.add(new LinearTerm<>((long) i + 1, one)); B.add(new LinearTerm<>((long) i + 2, one)); C.add(new LinearTerm<>((long) i + 3, one)); + // Compute a valid assignment by setting the output wire to a*b and storing the + // right value in the assignment vector. 
final FieldT tmp = a.mul(b); a = b; b = tmp; - oneFullAssignment.add(tmp); + fullAssignment.add(tmp); } else { - // a + b = c + // Simple addition gate: a + b = c A.add(new LinearTerm<>((long) i + 1, one)); A.add(new LinearTerm<>((long) i + 2, one)); + // Select the ONE variable in the assignment vector during addition gates B.add(new LinearTerm<>((long) 0, one)); C.add(new LinearTerm<>((long) i + 3, one)); + // Compute a valid assignment by setting the output wire to a+b and storing the + // right value in the assignment vector. final FieldT tmp = a.add(b); a = b; b = tmp; - oneFullAssignment.add(tmp); + fullAssignment.add(tmp); } constraints.add(new R1CSConstraint<>(A, B, C)); @@ -85,21 +96,21 @@ Tuple3, Assignment, Assignment> serialConst A.add(new LinearTerm<>((long) i, one)); B.add(new LinearTerm<>((long) i, one)); - res = res.add(oneFullAssignment.get(i)); + res = res.add(fullAssignment.get(i)); } C.add(new LinearTerm<>((long) numVariables - 1, one)); - oneFullAssignment.add(res.square()); + fullAssignment.add(res.square()); constraints.add(new R1CSConstraint<>(A, B, C)); final R1CSRelation r1cs = new R1CSRelation<>(constraints, numInputs, numAuxiliary); - final Assignment primary = new Assignment<>(oneFullAssignment.subList(0, numInputs)); + final Assignment primary = new Assignment<>(fullAssignment.subList(0, numInputs)); final Assignment auxiliary = - new Assignment<>(oneFullAssignment.subList(numInputs, oneFullAssignment.size())); + new Assignment<>(fullAssignment.subList(numInputs, fullAssignment.size())); assert (r1cs.numInputs() == numInputs); assert (r1cs.numVariables() >= numInputs); - assert (r1cs.numVariables() == oneFullAssignment.size()); + assert (r1cs.numVariables() == fullAssignment.size()); assert (r1cs.numConstraints() == numConstraints); assert (r1cs.isSatisfied(primary, auxiliary)); @@ -140,7 +151,10 @@ Tuple3, Assignment, Assignment> serialConst part == numPartitions ? totalSize % (totalSize / numPartitions) : totalSize / numPartitions; - + // Linear combinations are now represented as: `ArrayList>>`. + // For instance, [Tuple2<0, LinearTerm(0,1)>, Tuple2<0, LinearTerm(1,5)>, Tuple2<0, LinearTerm(2,10)>] + // represents the 0th linear combination (0th line in the A matrix): A_0 = [1,5,10] + // In other words, the `Long` key in the tuple is the linear combination index. final ArrayList>> A = new ArrayList<>(); for (long i = 0; i < partSize; i++) { final long index = part * (totalSize / numPartitions) + i; @@ -275,7 +289,7 @@ Tuple3, Assignment, Assignment> serialConst if (totalSize % 2 != 0) { assignmentPartitions.add(numExecutors); } - JavaPairRDD oneFullAssignment = + JavaPairRDD fullAssignment = config .sparkContext() .parallelize(assignmentPartitions, numExecutors) @@ -352,9 +366,10 @@ Tuple3, Assignment, Assignment> serialConst FieldT serialA = a; FieldT serialB = b; final Assignment primary = new Assignment<>(); - primary.add(one); - primary.add(serialA); - primary.add(serialB); + primary.add(one); // Index 0 + primary.add(serialA); // Index 1 + primary.add(serialB); // Index 2 + // Start at index 3 for (int i = 4; i <= numInputs; i++) { if (i % 2 != 0) { // a * b = c @@ -371,8 +386,8 @@ Tuple3, Assignment, Assignment> serialConst } } - // This action will store oneFullAssignment and the linear combinations into persistent storage. - final long oneFullAssignmentSize = oneFullAssignment.count(); + // This action will store fullAssignment and the linear combinations into persistent storage. 
+ final long oneFullAssignmentSize = fullAssignment.count(); linearCombinationA.count(); linearCombinationB.count(); linearCombinationC.count(); @@ -388,9 +403,9 @@ Tuple3, Assignment, Assignment> serialConst assert (r1cs.numVariables() >= numInputs); assert (r1cs.numVariables() == oneFullAssignmentSize); assert (r1cs.numConstraints() == numConstraints); - assert (r1cs.isSatisfied(primary, oneFullAssignment)); + assert (r1cs.isSatisfied(primary, fullAssignment)); - return new Tuple3<>(r1cs, primary, oneFullAssignment); + return new Tuple3<>(r1cs, primary, fullAssignment); } /** Linear algebra applications */ @@ -721,7 +736,7 @@ public static > void printMa } } - JavaPairRDD oneFullAssignment = + JavaPairRDD fullAssignment = JavaPairRDD.fromJavaRDD(config.sparkContext().parallelize(assignment, numPartitions)); final R1CSConstraintsRDD constraints = new R1CSConstraintsRDD<>(ALC, BLC, CLC, numConstraints); @@ -731,15 +746,15 @@ public static > void printMa final Assignment primary = new Assignment<>( Utils.convertFromPairs( - oneFullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); + fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); assert (r1cs.numInputs() == numInputs); assert (r1cs.numVariables() >= numInputs); - assert (r1cs.numVariables() == oneFullAssignment.count()); + assert (r1cs.numVariables() == fullAssignment.count()); assert (r1cs.numConstraints() == numConstraints); - assert (r1cs.isSatisfied(primary, oneFullAssignment)); + assert (r1cs.isSatisfied(primary, fullAssignment)); - return new Tuple3<>(r1cs, primary, oneFullAssignment); + return new Tuple3<>(r1cs, primary, fullAssignment); } // Again, we have A, B, and C where AB = C @@ -1625,16 +1640,16 @@ Tuple3, JavaPairRDD, Long> matmulParCon // config.endLog("[R1CSConstruction] cogroup"); // config.beginLog("[R1CSConstruction] witness calculation"); - JavaPairRDD oneFullAssignment = + JavaPairRDD fullAssignment = cogroupResult.flatMapToPair( x -> matmulParAssignHelper( fieldFactory, x._1(), x._2(), n1, n2, n3, b1, b2, b3, assignmentOffsetOutput)); - // long assignmentCount = oneFullAssignment.count(); + // long assignmentCount = fullAssignment.count(); // cogroupResult.unpersist(); - // config.endLog("[R1CSConstruction] witness calculation (oneFullAssignment count is " + + // config.endLog("[R1CSConstruction] witness calculation (fullAssignment count is " + // assignmentCount + ")"); ArrayList intList = new ArrayList<>(); for (int i = 0; i < b1 * b3; i++) { @@ -1712,7 +1727,7 @@ Tuple3, JavaPairRDD, Long> matmulParCon // config.endLog("[R1CSConstruction] constraint generation"); - // Iterable> assignments = oneFullAssignment.collect(); + // Iterable> assignments = fullAssignment.collect(); // for (Tuple2 a : assignments) { // System.out.println("ASSIGNMENT [" + a._1() + "]: " + a._2()); // } @@ -1722,12 +1737,12 @@ Tuple3, JavaPairRDD, Long> matmulParCon // final Assignment primary = new Assignment<>( // Utils.convertFromPairs( - // oneFullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), + // fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), // numInputs)); System.out.println("[R1CSConstruction::matmulParConstruct] numConstraints = " + numConstraints); - return new Tuple3<>(constraints, oneFullAssignment, numAssignments); + return new Tuple3<>(constraints, fullAssignment, numAssignments); } public static > @@ -1806,7 +1821,7 @@ Tuple3, JavaPairRDD, Long> matmulParCon constraintOffset); final R1CSConstraintsRDD constraints = ret._1(); - 
JavaPairRDD oneFullAssignment = ret._2(); + JavaPairRDD fullAssignment = ret._2(); JavaPairRDD newARDD = aRDD.mapToPair( @@ -1820,16 +1835,16 @@ Tuple3, JavaPairRDD, Long> matmulParCon return new Tuple2(x._1() + 1 + n1 * n2, x._2()); }); - oneFullAssignment = oneFullAssignment.union(newARDD).union(newBRDD); - oneFullAssignment = - oneFullAssignment.union( + fullAssignment = fullAssignment.union(newARDD).union(newBRDD); + fullAssignment = + fullAssignment.union( config .sparkContext() .parallelizePairs(Collections.singletonList(new Tuple2<>((long) 0, one)))); - config.beginLog("[matmulApp] oneFullAssignment"); - long numVariables2 = oneFullAssignment.cache().count(); - config.endLog("[matmulApp] oneFullAssignment"); + config.beginLog("[matmulApp] fullAssignment"); + long numVariables2 = fullAssignment.cache().count(); + config.endLog("[matmulApp] fullAssignment"); assert (numVariables2 == numVariables); @@ -1844,13 +1859,13 @@ Tuple3, JavaPairRDD, Long> matmulParCon final Assignment primary = new Assignment<>( Utils.convertFromPairs( - oneFullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); + fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); config.endLog("[matmulApp] primary generation"); final R1CSRelationRDD r1cs = new R1CSRelationRDD<>(constraints, numInputs, numAuxiliary); - return new Tuple3<>(r1cs, primary, oneFullAssignment); + return new Tuple3<>(r1cs, primary, fullAssignment); } public static > @@ -2029,7 +2044,7 @@ R1CSConstraintsRDD equalityConstraintGen( JavaPairRDD y = JavaPairRDD.fromJavaRDD(config.sparkContext().parallelize(yList, numPartitions)); - JavaPairRDD oneFullAssignment = + JavaPairRDD fullAssignment = config .sparkContext() .parallelizePairs(Collections.singletonList(new Tuple2<>((long) 0, one))); @@ -2086,7 +2101,7 @@ R1CSConstraintsRDD equalityConstraintGen( // is " + rhsAssignmentsOffset); R1CSConstraintsRDD constraints = X2Constraints; - oneFullAssignment = oneFullAssignment.union(X2Assignments); + fullAssignment = fullAssignment.union(X2Assignments); // (X^T X) w, map to get the results of (X^T X) BlockIndexerC RHSBlockIndexer = new BlockIndexerC(assignmentOffset, d, n, d, bd, bn, bd); @@ -2128,7 +2143,7 @@ R1CSConstraintsRDD equalityConstraintGen( long lhsConstraintOffset = rhsConstraintOffset + RHSConstraints.size(); constraints.union(RHSConstraints); - oneFullAssignment = oneFullAssignment.union(RHSAssignments); + fullAssignment = fullAssignment.union(RHSAssignments); // Calculate X^T y LinearIndexer yIndexer = new LinearIndexer(1 + n * d + d * 1); @@ -2156,7 +2171,7 @@ R1CSConstraintsRDD equalityConstraintGen( long numLHSConstraints = LHSConstraints.size(); constraints.union(LHSConstraints); - oneFullAssignment = oneFullAssignment.union(LHSAssignments); + fullAssignment = fullAssignment.union(LHSAssignments); XT.unpersist(); long finalConstraintOffsetNumber = lhsConstraintOffset + numLHSConstraints; @@ -2187,10 +2202,10 @@ R1CSConstraintsRDD equalityConstraintGen( return new Tuple2(x._1() + 1 + n * d + d * 1, x._2()); }); - oneFullAssignment = oneFullAssignment.union(newXRDD).union(newWRDD).union(newYRDD); - config.beginLog("[Linear regression app] oneFullAssignment"); - long numVariables = oneFullAssignment.cache().count(); // UNUSED - config.endLog("[Linear regression app] oneFullAssignment"); + fullAssignment = fullAssignment.union(newXRDD).union(newWRDD).union(newYRDD); + config.beginLog("[Linear regression app] fullAssignment"); + long numVariables = fullAssignment.cache().count(); // UNUSED 
+ config.endLog("[Linear regression app] fullAssignment"); config.beginLog("[Linear regression app] constraints generation"); constraints.A().cache().count(); @@ -2203,7 +2218,7 @@ R1CSConstraintsRDD equalityConstraintGen( final Assignment primary = new Assignment<>( Utils.convertFromPairs( - oneFullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); + fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); config.endLog("[Linear regression app] primary generation"); long numVariables2 = 1 + numInputs + d * d * (2 * n) + d * (2 * d) + d * (2 * n); @@ -2213,7 +2228,7 @@ R1CSConstraintsRDD equalityConstraintGen( new R1CSRelationRDD<>(constraints, numInputs, numVariables2 - numInputs); return new Tuple3, Assignment, JavaPairRDD>( - r1cs, primary, oneFullAssignment); + r1cs, primary, fullAssignment); } // Does a linear combination of matrices/vectors @@ -2650,11 +2665,11 @@ public int compare(Tuple2 tupleA, Tuple2 tupleB) { d * d); constraints.union(finalConstraints); - JavaPairRDD oneFullAssignment = xm.union(xm2._2()).union(covN); + JavaPairRDD fullAssignment = xm.union(xm2._2()).union(covN); long numAssignments = n * d + xm2NumAssignments + d * d; - return new Tuple3<>(constraints, oneFullAssignment, numAssignments); + return new Tuple3<>(constraints, fullAssignment, numAssignments); } public static > @@ -3066,7 +3081,7 @@ Tuple3, Assignment, JavaPairRDD> g constraintOffset); R1CSConstraintsRDD constraints = ret._1(); - JavaPairRDD oneFullAssignment = ret._2(); + JavaPairRDD fullAssignment = ret._2(); // remap X, mean, and covariance to have the correct indices JavaPairRDD newX = @@ -3085,9 +3100,9 @@ Tuple3, Assignment, JavaPairRDD> g return new Tuple2(x._1() + 1 + n * d + d, x._2()); }); - oneFullAssignment = oneFullAssignment.union(newX).union(newMean).union(newCov); - oneFullAssignment = - oneFullAssignment.union( + fullAssignment = fullAssignment.union(newX).union(newMean).union(newCov); + fullAssignment = + fullAssignment.union( config .sparkContext() .parallelizePairs(Collections.singletonList(new Tuple2<>((long) 0, one)))); @@ -3105,9 +3120,9 @@ public int compare(Tuple2 tupleA, Tuple2 tupleB) { } }; - config.beginLog("[gaussianFitApp] oneFullAssignment"); - long numVariables = oneFullAssignment.cache().count(); - config.endLog("[gaussianFitApp] oneFullAssignment"); + config.beginLog("[gaussianFitApp] fullAssignment"); + long numVariables = fullAssignment.cache().count(); + config.endLog("[gaussianFitApp] fullAssignment"); config.beginLog("[gaussianFitApp] constraints generation"); constraints.A().cache().count(); @@ -3122,13 +3137,13 @@ public int compare(Tuple2 tupleA, Tuple2 tupleB) { final Assignment primary = new Assignment( Utils.convertFromPairs( - oneFullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); + fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), numInputs)); config.endLog("[gaussianFitApp] primary generation"); final R1CSRelationRDD r1cs = new R1CSRelationRDD(constraints, numInputs, numVariables - numInputs); return new Tuple3, Assignment, JavaPairRDD>( - r1cs, primary, oneFullAssignment); + r1cs, primary, fullAssignment); } } diff --git a/src/main/java/relations/objects/R1CSConstraints.java b/src/main/java/relations/objects/R1CSConstraints.java index b77d6d2..d34e3ca 100755 --- a/src/main/java/relations/objects/R1CSConstraints.java +++ b/src/main/java/relations/objects/R1CSConstraints.java @@ -27,6 +27,10 @@ public class R1CSConstraints private ArrayList> constraints; 
+ // TODO: + // We may want to add "annotations" as in libsnark. Something like: + // private ArrayList constraints_annotations; + public R1CSConstraints() { constraints = new ArrayList<>(); } diff --git a/src/main/java/relations/objects/R1CSConstraintsRDD.java b/src/main/java/relations/objects/R1CSConstraintsRDD.java index c237f1d..5e1f78a 100755 --- a/src/main/java/relations/objects/R1CSConstraintsRDD.java +++ b/src/main/java/relations/objects/R1CSConstraintsRDD.java @@ -33,7 +33,9 @@ public class R1CSConstraintsRDD>` (which is "implied" by the `JavaPairRDD`) represents the "index" of the linear combination of interest. + // For instance, the 1*a+2*b+5*d represented by <1,2,0,5> . , leading to the ith matrix line [1 2 0 5] in the R1CS can be represented by: + // < Tuple2>(i, LinearTerm(0, 1)), Tuple2>(i, LinearTerm(1, 2)), Tuple2>(i, LinearTerm(3, 0)), Tuple2>(i, LinearTerm(4, 5)) > private JavaPairRDD> A; private JavaPairRDD> B; private JavaPairRDD> C; From eb5337fe0522bb73f5cfc81d00d58451c71fb310 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 14:24:07 +0100 Subject: [PATCH 07/94] Formatted files --- .../profiler/generation/R1CSConstruction.java | 9 +++++--- .../reductions/r1cs_to_qap/R1CStoQAP.java | 4 ++-- .../reductions/r1cs_to_qap/R1CStoQAPRDD.java | 10 +++++---- .../relations/objects/R1CSConstraint.java | 3 ++- .../relations/objects/R1CSConstraintsRDD.java | 21 ++++++++++++------- .../zkSNARK/SerialProver.java | 3 ++- .../zkSNARK/objects/ProvingKeyRDD.java | 3 ++- 7 files changed, 34 insertions(+), 19 deletions(-) diff --git a/src/main/java/profiler/generation/R1CSConstruction.java b/src/main/java/profiler/generation/R1CSConstruction.java index c3b93a9..68deefa 100755 --- a/src/main/java/profiler/generation/R1CSConstruction.java +++ b/src/main/java/profiler/generation/R1CSConstruction.java @@ -151,9 +151,12 @@ Tuple3, Assignment, Assignment> serialConst part == numPartitions ? totalSize % (totalSize / numPartitions) : totalSize / numPartitions; - // Linear combinations are now represented as: `ArrayList>>`. - // For instance, [Tuple2<0, LinearTerm(0,1)>, Tuple2<0, LinearTerm(1,5)>, Tuple2<0, LinearTerm(2,10)>] - // represents the 0th linear combination (0th line in the A matrix): A_0 = [1,5,10] + // Linear combinations are now represented as: `ArrayList>>`. + // For instance, [Tuple2<0, LinearTerm(0,1)>, Tuple2<0, LinearTerm(1,5)>, + // Tuple2<0, LinearTerm(2,10)>] + // represents the 0th linear combination (0th line in the A matrix): A_0 = + // [1,5,10] // In other words, the `Long` key in the tuple is the linear combination index. final ArrayList>> A = new ArrayList<>(); for (long i = 0; i < partSize; i++) { diff --git a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java index 0515889..d8272cb 100755 --- a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java +++ b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java @@ -96,8 +96,8 @@ QAPRelation R1CStoQAPRelation(final R1CSRelation r1cs, final Fie * *

More precisely, compute the coefficients of the polynomial H(z) := * (A(z)*B(z)-C(z))/Z(z) where: A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z), B(z) := - * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + - * d3 * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = + * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + d3 + * * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = * degree of the QAP * *

This is done as follows: (1) compute evaluations of A,B,C on S = {sigma_1,...,sigma_n} (2) diff --git a/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java b/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java index 1c78261..5d2dbc0 100755 --- a/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java +++ b/src/main/java/reductions/r1cs_to_qap/R1CStoQAPRDD.java @@ -168,8 +168,8 @@ QAPRelationRDD R1CStoQAPRelation( * *

More precisely, compute the coefficients h_0,h_1,...,h_n of the polynomial H(z) := * (A(z)*B(z)-C(z))/Z(z) where: A(z) := A_0(z) + \sum_{k=1}^{m} w_k A_k(z) + d1 * Z(z), B(z) := - * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + - * d3 * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = + * B_0(z) + \sum_{k=1}^{m} w_k B_k(z) + d2 * Z(z), C(z) := C_0(z) + \sum_{k=1}^{m} w_k C_k(z) + d3 + * * Z(z), Z(z) := "vanishing polynomial of set S" and m = number of variables of the QAP n = * degree of the QAP * *

This is done as follows: (1) compute evaluations of A,B,C on S = {sigma_1,...,sigma_n} (2) @@ -209,8 +209,10 @@ QAPWitnessRDD R1CStoQAPWitness( // TODO (howardwu): Replace hardcoded popular variable assignment indices with list of // these indices from R1CSRelationRDD. config.beginLog("Compute evaluation of polynomials A, B, and C, on set S."); - // r1cs.constraints() returns a `R1CSConstraintsRDD` which is a set of `JavaPairRDD>`. - // In other, words, A, B, C are all vectors of linear terms (i.e. matrices that will be intepolated on the chosen domain) + // r1cs.constraints() returns a `R1CSConstraintsRDD` which is a set of `JavaPairRDD>`. + // In other, words, A, B, C are all vectors of linear terms (i.e. matrices that will be + // intepolated on the chosen domain) final JavaPairRDD zeroIndexedA = r1cs.constraints() .A() diff --git a/src/main/java/relations/objects/R1CSConstraint.java b/src/main/java/relations/objects/R1CSConstraint.java index 3c1242f..2e1dfd9 100755 --- a/src/main/java/relations/objects/R1CSConstraint.java +++ b/src/main/java/relations/objects/R1CSConstraint.java @@ -20,7 +20,8 @@ * *

A R1CSRelation constraint is used to construct a R1CSRelation constraint system. */ -// Similar to https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/constraint_satisfaction_problems/r1cs/r1cs.tcc#L31-L37 +// Similar to +// https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/constraint_satisfaction_problems/r1cs/r1cs.tcc#L31-L37 public class R1CSConstraint> implements Serializable { diff --git a/src/main/java/relations/objects/R1CSConstraintsRDD.java b/src/main/java/relations/objects/R1CSConstraintsRDD.java index 5e1f78a..4e59a0a 100755 --- a/src/main/java/relations/objects/R1CSConstraintsRDD.java +++ b/src/main/java/relations/objects/R1CSConstraintsRDD.java @@ -16,8 +16,8 @@ * *

{ < A_k , X > * < B_k , X > = < C_k , X > }_{k=1}^{n} * - *

In other words, the system is satisfied if and only if there exists a USCS (Unitary-Square Constraint System) - * variable assignment for which each R1CSRelation constraint is satisfied. + *

In other words, the system is satisfied if and only if there exists a USCS (Unitary-Square + * Constraint System) variable assignment for which each R1CSRelation constraint is satisfied. *

NOTE: The 0-th variable (i.e., "x_{0}") always represents the constant 1 Thus, the 0-th * variable is not included in num_variables. @@ -26,16 +26,23 @@ public class R1CSConstraintsRDD>` (which is "implied" by the `JavaPairRDD`) represents the "index" of the linear combination of interest. - // For instance, the 1*a+2*b+5*d represented by <1,2,0,5> . , leading to the ith matrix line [1 2 0 5] in the R1CS can be represented by: - // < Tuple2>(i, LinearTerm(0, 1)), Tuple2>(i, LinearTerm(1, 2)), Tuple2>(i, LinearTerm(3, 0)), Tuple2>(i, LinearTerm(4, 5)) > + // NOTE: `R1CSConstraintsRDD` is handled differently than "normal"/non-distributed R1CS which is a + // set of "constraints" each of which being a triple of linear combinations. + // In fact, here the linear combinations are represented using "lists" of Linear Terms. The `Long` + // key of the tuple/pair `Tuple2>` (which is "implied" by the + // `JavaPairRDD`) represents the "index" of the linear combination of interest. + // For instance, the 1*a+2*b+5*d represented by <1,2,0,5> . , leading to the ith matrix + // line [1 2 0 5] in the R1CS can be represented by: + // < Tuple2>(i, LinearTerm(0, 1)), Tuple2>(i, + // LinearTerm(1, 2)), Tuple2>(i, LinearTerm(3, 0)), Tuple2>(i, LinearTerm(4, 5)) > private JavaPairRDD> A; private JavaPairRDD> B; private JavaPairRDD> C; diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java index 8a7fdd2..0529c32 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java @@ -49,7 +49,8 @@ Proof prove( final FieldT zero = fieldFactory.zero(); // 1. Verify that H has a non-zero d-2 coefficient assert (!qapWitness.coefficientsH(qapWitness.degree() - 2).equals(zero)); - // 2. Make sure that coefficients d-1 and d are 0 to make sure that the polynomial hasn't a degree higher than d-2 + // 2. Make sure that coefficients d-1 and d are 0 to make sure that the polynomial hasn't a + // degree higher than d-2 assert (qapWitness.coefficientsH(qapWitness.degree() - 1).equals(zero)); assert (qapWitness.coefficientsH(qapWitness.degree()).equals(zero)); // Check that the witness satisfies the QAP relation. diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java index 20a4db1..41c249b 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java @@ -22,7 +22,8 @@ public class ProvingKeyRDD< implements Serializable { // Below, [x]_1 (resp. [x]_2 and []_T) represents the encoding of x in G1 (resp. G2 and GT) - // We follow the notations in Groth16 (namely, polynomials are denoted u, v, w, h, t instead of A, B, C, H, Z. Moreoverm the evaluation point is denoted by x) + // We follow the notations in Groth16 (namely, polynomials are denoted u, v, w, h, t instead of A, + // B, C, H, Z. 
Moreover the evaluation point is denoted by x) // // [alpha]_1 private final G1T alphaG1; From d13e3593731ce5487c7589596c935c714a1dab22 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 15:38:18 +0100 Subject: [PATCH 08/94] Added test data for importing JSON R1CS programs --- src/test/java/data/README.md | 3 + src/test/java/data/simple_gadget_r1cs.json | 110 +++++++++++++++++++++ 2 files changed, 113 insertions(+) create mode 100644 src/test/java/data/README.md create mode 100644 src/test/java/data/simple_gadget_r1cs.json diff --git a/src/test/java/data/README.md b/src/test/java/data/README.md new file mode 100644 index 0000000..c23d15d --- /dev/null +++ b/src/test/java/data/README.md @@ -0,0 +1,3 @@ +# Test data + +The file `simple_gadget_r1cs.json` contains the exported JSON R1CS of the `simple_gadget` in Zeth (see: [here](https://github.com/clearmatics/zeth/blob/master/libzeth/tests/circuits/simple_test.cpp)). It is used for testing purposes here, especially to test the loading of arbitrary R1CS programs in an RDD on the cluster here. \ No newline at end of file diff --git a/src/test/java/data/simple_gadget_r1cs.json b/src/test/java/data/simple_gadget_r1cs.json new file mode 100644 index 0000000..fe5ce15 --- /dev/null +++ b/src/test/java/data/simple_gadget_r1cs.json @@ -0,0 +1,110 @@ +{ + "scalar_field_characteristic":"Not yet supported. Should be bigint in hexadecimal", + "num_variables":4, + "num_constraints":3, + "num_inputs":1, + "variables_annotations":[ + { + "index":0, + "annotation":"ONE" + }, + { + "index":1, + "annotation":"y" + }, + { + "index":2, + "annotation":"x" + }, + { + "index":3, + "annotation":"g1" + } + ], + "constraints":[ + { + "constraint_id":0, + "constraint_annotation":"g1", + "linear_combination":{ + "A":[ + { + "index":2, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "B":[ + { + "index":2, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "C":[ + { + "index":3, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ] + } + }, + { + "constraint_id":1, + "constraint_annotation":"g2", + "linear_combination":{ + "A":[ + { + "index":3, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "B":[ + { + "index":2, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "C":[ + { + "index":4, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ] + } + }, + { + "constraint_id":2, + "constraint_annotation":"y", + "linear_combination":{ + "A":[ + { + "index":0, + "value":"0x0000000000000000000000000000000000000000000000000000000000000005" + }, + { + "index":2, + "value":"0x0000000000000000000000000000000000000000000000000000000000000002" + }, + { + "index":3, + "value":"0x0000000000000000000000000000000000000000000000000000000000000004" + }, + { + "index":4, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "B":[ + { + "index":0, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ], + "C":[ + { + "index":1, + "value":"0x0000000000000000000000000000000000000000000000000000000000000001" + } + ] + } + } + ] + } \ No newline at end of file From dbc83f5e6607c2e27586d8f947612094435c0481 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 17:32:17 +0100 Subject: [PATCH 09/94] Added skeleton IO files --- src/main/java/io/LoadR1CSRDD.java | 5 +++++ 
src/main/java/io/LoadSerialR1CS.java | 5 +++++ src/main/java/io/README.md | 4 ++++ 3 files changed, 14 insertions(+) create mode 100644 src/main/java/io/LoadR1CSRDD.java create mode 100644 src/main/java/io/LoadSerialR1CS.java create mode 100644 src/main/java/io/README.md diff --git a/src/main/java/io/LoadR1CSRDD.java b/src/main/java/io/LoadR1CSRDD.java new file mode 100644 index 0000000..a39b5bd --- /dev/null +++ b/src/main/java/io/LoadR1CSRDD.java @@ -0,0 +1,5 @@ +package io; + +public class LoadR1CSRDD { + +} diff --git a/src/main/java/io/LoadSerialR1CS.java b/src/main/java/io/LoadSerialR1CS.java new file mode 100644 index 0000000..2fa7256 --- /dev/null +++ b/src/main/java/io/LoadSerialR1CS.java @@ -0,0 +1,5 @@ +package io; + +public class LoadSerialR1CS { + +} diff --git a/src/main/java/io/README.md b/src/main/java/io/README.md new file mode 100644 index 0000000..a9e29c4 --- /dev/null +++ b/src/main/java/io/README.md @@ -0,0 +1,4 @@ +# DIZK - I/O functionalities + +This directory contains the implementation of useful primitives for I/O, such as: +- Loading JSON R1CS \ No newline at end of file From 244df005fa03841e8b1dc28600bfc9761d103944 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 17:52:46 +0100 Subject: [PATCH 10/94] Renamed R1CSConstuctor in profiling --- src/main/java/io/LoadR1CSRDD.java | 5 --- src/main/java/io/LoadSerialR1CS.java | 5 --- .../profiler/generation/R1CSConstruction.java | 34 +++++++++---------- .../MatrixMultiplicationProfiling.java | 8 ++--- .../profiling/R1CStoQAPRelationProfiling.java | 6 ++-- .../profiling/R1CStoQAPWitnessProfiling.java | 6 ++-- .../profiler/profiling/ZKSNARKProfiling.java | 10 +++--- .../zkSNARK/DistributedProver.java | 34 +++++++++++++------ .../java/reductions/R1CStoQAPRDDTest.java | 6 ++-- src/test/java/relations/MatMulTest.java | 12 +++---- .../java/relations/R1CSConstructionTest.java | 8 ++--- .../zkSNARK/DistributedzkSNARKTest.java | 6 ++-- .../zkSNARK/SerialzkSNARKTest.java | 6 ++-- troubleshooting.md | 2 +- 14 files changed, 76 insertions(+), 72 deletions(-) delete mode 100644 src/main/java/io/LoadR1CSRDD.java delete mode 100644 src/main/java/io/LoadSerialR1CS.java diff --git a/src/main/java/io/LoadR1CSRDD.java b/src/main/java/io/LoadR1CSRDD.java deleted file mode 100644 index a39b5bd..0000000 --- a/src/main/java/io/LoadR1CSRDD.java +++ /dev/null @@ -1,5 +0,0 @@ -package io; - -public class LoadR1CSRDD { - -} diff --git a/src/main/java/io/LoadSerialR1CS.java b/src/main/java/io/LoadSerialR1CS.java deleted file mode 100644 index 2fa7256..0000000 --- a/src/main/java/io/LoadSerialR1CS.java +++ /dev/null @@ -1,5 +0,0 @@ -package io; - -public class LoadSerialR1CS { - -} diff --git a/src/main/java/profiler/generation/R1CSConstruction.java b/src/main/java/profiler/generation/R1CSConstruction.java index 68deefa..f24fa33 100755 --- a/src/main/java/profiler/generation/R1CSConstruction.java +++ b/src/main/java/profiler/generation/R1CSConstruction.java @@ -22,7 +22,7 @@ import scala.Tuple2; import scala.Tuple3; -public class R1CSConstruction implements Serializable { +public class R1CSConstructor implements Serializable { public static > Tuple3, Assignment, Assignment> serialConstruct( @@ -464,7 +464,7 @@ public static > void constra new Tuple2<>(index, new LinearTerm<>(zOffset + (row * n3 + col) * (n2) + i, one))); break; default: - System.out.println("[R1CSConstruction.constraintAssignment] unknown constraint"); + System.out.println("[R1CSConstructor.constraintAssignment] unknown constraint"); break; } @@ -498,7 +498,7 @@ 
public static > void constra new Tuple2<>(index, new LinearTerm<>(cOffset + row * n3 + col, one))); break; default: - System.out.println("[R1CSConstruction.constraintAssignment] unknown constraint"); + System.out.println("[R1CSConstructor.constraintAssignment] unknown constraint"); break; } @@ -519,7 +519,7 @@ public static > void constra new Tuple2<>(index, new LinearTerm<>(sOffset + (row * n3 + col) * (n2 - 1), one))); break; default: - System.out.println("[R1CSConstruction.constraintAssignment] unknown constraint"); + System.out.println("[R1CSConstructor.constraintAssignment] unknown constraint"); break; } @@ -542,7 +542,7 @@ public static > void constra index, new LinearTerm<>(sOffset + (row * n3 + col) * (n2 - 1) + i, one))); break; default: - System.out.println("[R1CSConstruction.constraintAssignment] unknown constraint"); + System.out.println("[R1CSConstructor.constraintAssignment] unknown constraint"); break; } } @@ -1624,25 +1624,25 @@ Tuple3, JavaPairRDD, Long> matmulParCon JavaPairRDD> aFullRDD = A.>flatMapToPair( data -> - R1CSConstruction.matmulParAssignShuffleHelper( + R1CSConstructor.matmulParAssignShuffleHelper( data, 0, n1, n2, n3, b1, b2, b3)); JavaPairRDD> bFullRDD = B.>flatMapToPair( data -> - R1CSConstruction.matmulParAssignShuffleHelper( + R1CSConstructor.matmulParAssignShuffleHelper( data, 1, n1, n2, n3, b1, b2, b3)); // Shuffle to cogroup the tuples together // Map each group to do the matrix multiplication locally - // config.beginLog("[R1CSConstruction] cogroup"); + // config.beginLog("[R1CSConstructor] cogroup"); JavaPairRDD>, Iterable>>> cogroupResult = aFullRDD.cogroup(bFullRDD); // cogroupResult.cache(); // cogroupResult.count(); - // config.endLog("[R1CSConstruction] cogroup"); + // config.endLog("[R1CSConstructor] cogroup"); - // config.beginLog("[R1CSConstruction] witness calculation"); + // config.beginLog("[R1CSConstructor] witness calculation"); JavaPairRDD fullAssignment = cogroupResult.flatMapToPair( x -> @@ -1652,14 +1652,14 @@ Tuple3, JavaPairRDD, Long> matmulParCon // long assignmentCount = fullAssignment.count(); // cogroupResult.unpersist(); - // config.endLog("[R1CSConstruction] witness calculation (fullAssignment count is " + + // config.endLog("[R1CSConstructor] witness calculation (fullAssignment count is " + // assignmentCount + ")"); ArrayList intList = new ArrayList<>(); for (int i = 0; i < b1 * b3; i++) { intList.add(i); } - // config.beginLog("[R1CSConstruction] constraint generation"); + // config.beginLog("[R1CSConstructor] constraint generation"); // Generate the constraints in parallel JavaPairRDD> ALC = config @@ -1728,7 +1728,7 @@ Tuple3, JavaPairRDD, Long> matmulParCon // BLC.cache().count(); // CLC.cache().count(); - // config.endLog("[R1CSConstruction] constraint generation"); + // config.endLog("[R1CSConstructor] constraint generation"); // Iterable> assignments = fullAssignment.collect(); // for (Tuple2 a : assignments) { @@ -1743,7 +1743,7 @@ Tuple3, JavaPairRDD, Long> matmulParCon // fullAssignment.filter(e -> e._1 >= 0 && e._1 < numInputs).collect(), // numInputs)); - System.out.println("[R1CSConstruction::matmulParConstruct] numConstraints = " + numConstraints); + System.out.println("[R1CSConstructor::matmulParConstruct] numConstraints = " + numConstraints); return new Tuple3<>(constraints, fullAssignment, numAssignments); } @@ -2278,7 +2278,7 @@ ArrayList>> linearCombinationConstraintsGen( break; default: System.out.println( - "[R1CSConstruction.linearCombinationConstraintGen] unknown constraint"); + 
"[R1CSConstructor.linearCombinationConstraintGen] unknown constraint"); break; } } @@ -2384,7 +2384,7 @@ Iterator>> xMinusMeanConstraintsHelper( LinearIndexer newOutputOffset = new LinearIndexer(outputOffset.getIndex(0) + li.getIndex(0)); ArrayList>> tmp = - R1CSConstruction.linearCombinationConstraintsGen( + R1CSConstructor.linearCombinationConstraintsGen( fieldFactory, cType, inputOffsets, @@ -2733,7 +2733,7 @@ Iterator>> meanVerifyHelper( indexers.add(new LinearIndexer(cLowerCols)); CompositeIndexer newMeanOffset = new CompositeIndexer(indexers); ArrayList>> ret = - R1CSConstruction.linearCombinationConstraintsGen( + R1CSConstructor.linearCombinationConstraintsGen( fieldFactory, cType, inputOffsets, diff --git a/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java b/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java index 22e3da9..6ff21f7 100755 --- a/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java +++ b/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java @@ -4,7 +4,7 @@ import algebra.fields.fieldparameters.LargeFpParameters; import configuration.Configuration; import org.apache.spark.api.java.JavaPairRDD; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import reductions.r1cs_to_qap.R1CStoQAPRDD; import relations.objects.Assignment; import relations.qap.QAPWitnessRDD; @@ -21,7 +21,7 @@ public static void MatrixMultiplicationProfile( config.beginLog("Matmul circuit generation"); final Tuple3, Assignment, JavaPairRDD> R1CSExampleRDD = - R1CSConstruction.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); + R1CSConstructor.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); config.endLog("Matmul circuit generation"); final R1CSRelationRDD r1csRDD = R1CSExampleRDD._1(); @@ -43,7 +43,7 @@ public static void LRProfile(final Configuration config, int n, int d, int bn, i config.beginLog("Linear regression circuit generation"); final Tuple3, Assignment, JavaPairRDD> R1CSExampleRDD = - R1CSConstruction.linearRegressionApp(fieldFactory, config, n, d, bn, bd); + R1CSConstructor.linearRegressionApp(fieldFactory, config, n, d, bn, bd); config.endLog("Linear regression circuit generation"); final R1CSRelationRDD r1csRDD = R1CSExampleRDD._1(); @@ -65,7 +65,7 @@ public static void GaussianProfile(final Configuration config, int n, int d, int config.beginLog("Gaussian circuit generation"); final Tuple3, Assignment, JavaPairRDD> R1CSExampleRDD = - R1CSConstruction.gaussianFitApp(fieldFactory, config, n, d, bn, bd); + R1CSConstructor.gaussianFitApp(fieldFactory, config, n, d, bn, bd); config.endLog("Gaussian circuit generation"); final R1CSRelationRDD r1csRDD = R1CSExampleRDD._1(); diff --git a/src/main/java/profiler/profiling/R1CStoQAPRelationProfiling.java b/src/main/java/profiler/profiling/R1CStoQAPRelationProfiling.java index 7cbae23..3b8aac2 100755 --- a/src/main/java/profiler/profiling/R1CStoQAPRelationProfiling.java +++ b/src/main/java/profiler/profiling/R1CStoQAPRelationProfiling.java @@ -3,7 +3,7 @@ import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import configuration.Configuration; import org.apache.spark.api.java.JavaPairRDD; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import reductions.r1cs_to_qap.R1CStoQAP; import reductions.r1cs_to_qap.R1CStoQAPRDD; import relations.objects.Assignment; @@ -20,7 +20,7 @@ public static void serialQAPRelation(final Configuration config, final long numC 
final int numInputs = 1023; final Tuple3, Assignment, Assignment> R1CSExample = - R1CSConstruction.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = R1CSExample._1(); config.setContext("QAPRelation-Serial"); @@ -40,7 +40,7 @@ public static void distributedQAPRelation(final Configuration config, final long final Tuple3, Assignment, JavaPairRDD> R1CSExampleRDD = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1csRDD = R1CSExampleRDD._1(); config.setContext("QAPRelation"); diff --git a/src/main/java/profiler/profiling/R1CStoQAPWitnessProfiling.java b/src/main/java/profiler/profiling/R1CStoQAPWitnessProfiling.java index 68de670..0dac849 100755 --- a/src/main/java/profiler/profiling/R1CStoQAPWitnessProfiling.java +++ b/src/main/java/profiler/profiling/R1CStoQAPWitnessProfiling.java @@ -3,7 +3,7 @@ import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import configuration.Configuration; import org.apache.spark.api.java.JavaPairRDD; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import reductions.r1cs_to_qap.R1CStoQAP; import reductions.r1cs_to_qap.R1CStoQAPRDD; import relations.objects.Assignment; @@ -20,7 +20,7 @@ public static void serialQAPWitness(final Configuration config, final long numCo final int numInputs = 1023; final Tuple3, Assignment, Assignment> R1CSExample = - R1CSConstruction.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = R1CSExample._1(); final Assignment primary = R1CSExample._2(); final Assignment auxiliary = R1CSExample._3(); @@ -42,7 +42,7 @@ public static void distributedQAPWitness(final Configuration config, final long final Tuple3, Assignment, JavaPairRDD> R1CSExampleRDD = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1csRDD = R1CSExampleRDD._1(); final Assignment primary = R1CSExampleRDD._2(); final JavaPairRDD fullAssignmentRDD = R1CSExampleRDD._3(); diff --git a/src/main/java/profiler/profiling/ZKSNARKProfiling.java b/src/main/java/profiler/profiling/ZKSNARKProfiling.java index 0daa085..917e706 100755 --- a/src/main/java/profiler/profiling/ZKSNARKProfiling.java +++ b/src/main/java/profiler/profiling/ZKSNARKProfiling.java @@ -16,7 +16,7 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import configuration.Configuration; import org.apache.spark.api.java.JavaPairRDD; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; @@ -36,7 +36,7 @@ public static void serialzkSNARKProfiling(final Configuration config, final long final BN254aPairing pairing = new BN254aPairing(); final Tuple3, Assignment, Assignment> construction = - R1CSConstruction.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = 
construction._2(); final Assignment auxiliary = construction._3(); @@ -91,7 +91,7 @@ public static void serialzkSNARKLargeProfiling( final BN254bPairing pairing = new BN254bPairing(); final Tuple3, Assignment, Assignment> construction = - R1CSConstruction.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct((int) numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); final Assignment auxiliary = construction._3(); @@ -147,7 +147,7 @@ public static void distributedzkSNARKProfiling( final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1cs = construction._1(); final Assignment primary = construction._2(); final JavaPairRDD fullAssignment = construction._3(); @@ -203,7 +203,7 @@ public static void distributedzkSNARKLargeProfiling( final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1cs = construction._1(); final Assignment primary = construction._2(); final JavaPairRDD fullAssignment = construction._3(); diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java index daf260f..128c052 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java @@ -33,10 +33,12 @@ Proof prove( final JavaPairRDD fullAssignment, final FieldT fieldFactory, final Configuration config) { - // Note: `R1CStoQAPWitness` already checks the value of the configuration `debugFlag`, and already checks that the R1CS is satisfied on input `primary` and `fullAssignment`. No need to do it again it, this is redundant. - //if (config.debugFlag()) { + // Note: `R1CStoQAPWitness` already checks the value of the configuration `debugFlag`, and + // already checks that the R1CS is satisfied on input `primary` and `fullAssignment`. No need to + // do it again it, this is redundant. + // if (config.debugFlag()) { // assert (provingKey.r1cs().isSatisfied(primary, fullAssignment)); - //} + // } config.beginLog("Computing witness polynomial"); final QAPWitnessRDD qapWitness = @@ -95,14 +97,20 @@ Proof prove( config.beginRuntime("Generate proof"); config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); - // Get an RDD containing all pairs of elements with **matching keys** in `fullAssignment` and `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, v1) is in `fullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the right scalar to the right group element in light of the `VariableBaseMSM` triggered at the next line. + // Get an RDD containing all pairs of elements with **matching keys** in `fullAssignment` and + // `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, + // v1) is in `fullAssignment` and (k, v2) is in `provingKey.queryA()`. 
Then, the `.value()` + // returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the + // right scalar to the right group element in light of the `VariableBaseMSM` triggered at the + // next line. final JavaRDD> computationA = fullAssignment.join(provingKey.queryA(), numPartitions).values(); // `evaluationAt` = \sum_{i=0}^{m} a_i * u_i(x) (in Groth16) // where m = total number of wires // a_i = ith wire/variable final G1T evaluationAt = VariableBaseMSM.distributedMSM(computationA); - // Once `queryA` is not useful anymore, mark the RDD as non-persistent, and remove all blocks for it from memory and disk. + // Once `queryA` is not useful anymore, mark the RDD as non-persistent, and remove all blocks + // for it from memory and disk. provingKey.queryA().unpersist(); config.endLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); @@ -112,7 +120,9 @@ Proof prove( // `evaluationBt` = \sum_{i=0}^{m} a_i * v_i(x) (in Groth16) // where m = total number of wires // a_i = ith wire/variable - // Note: We get an evaluation in G1 and G2, because B \in G2 is formed using this term, and C (\in G1) also uses this term (see below, B is of type `Tuple2` and will actually be computed in both G1 and G2 for this exact purpose). + // Note: We get an evaluation in G1 and G2, because B \in G2 is formed using this term, and C + // (\in G1) also uses this term (see below, B is of type `Tuple2` and will actually be + // computed in both G1 and G2 for this exact purpose). final Tuple2 evaluationBt = VariableBaseMSM.distributedDoubleMSM(computationB); provingKey.queryB().unpersist(); config.endLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); @@ -131,13 +141,16 @@ Proof prove( config.endLog("Computing evaluation to deltaABC"); config.beginLog("Computing evaluation to query H"); - // In Groth16 notations, `queryH` is the encoding in G1 of the vector <(x^i * t(x))/delta>, for i \in [0, n-2] - // As such, the value of `evaluationHtZtOverDelta` actually is: (h(x)t(x))/delta if we follow Groth's notations + // In Groth16 notations, `queryH` is the encoding in G1 of the vector <(x^i * t(x))/delta>, for + // i \in [0, n-2] + // As such, the value of `evaluationHtZtOverDelta` actually is: (h(x)t(x))/delta if we follow + // Groth's notations final JavaRDD> computationH = qapWitness.coefficientsH().join(provingKey.queryH(), numPartitions).values(); final G1T evaluationHtZtOverDelta = VariableBaseMSM.distributedMSM(computationH); provingKey.queryH().unpersist(); - // Add H(t)*Z(t)/delta to `evaluationABC` to get the first term of C, namely (following Groth's notations): + // Add H(t)*Z(t)/delta to `evaluationABC` to get the first term of C, namely (following Groth's + // notations): // [\sum_{i = l+1}^{m} a_i * (beta * u_i(x) + alpha * v_i(x) + w_i(x) + h(x)t(x))]/delta evaluationABC = evaluationABC.add(evaluationHtZtOverDelta); config.endLog("Computing evaluation to query H"); @@ -151,7 +164,8 @@ Proof prove( betaG1.add(evaluationBt._1).add(deltaG1.mul(s)), betaG2.add(evaluationBt._2).add(deltaG2.mul(s))); - // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta + // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - + // r*s*delta final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); config.endRuntime("Generate proof"); diff --git a/src/test/java/reductions/R1CStoQAPRDDTest.java b/src/test/java/reductions/R1CStoQAPRDDTest.java index 89f2010..eb46368 
100755 --- a/src/test/java/reductions/R1CStoQAPRDDTest.java +++ b/src/test/java/reductions/R1CStoQAPRDDTest.java @@ -19,7 +19,7 @@ import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import reductions.r1cs_to_qap.R1CStoQAP; import reductions.r1cs_to_qap.R1CStoQAPRDD; import relations.objects.Assignment; @@ -47,9 +47,9 @@ public void setUp() { final int numInputs = 1023; final int numConstraints = 1024; - R1CSExample = R1CSConstruction.serialConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSExample = R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); R1CSExampleRDD = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); } @AfterEach diff --git a/src/test/java/relations/MatMulTest.java b/src/test/java/relations/MatMulTest.java index 1e52ef3..f23c089 100755 --- a/src/test/java/relations/MatMulTest.java +++ b/src/test/java/relations/MatMulTest.java @@ -20,7 +20,7 @@ import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelationRDD; import scala.Tuple3; @@ -47,7 +47,7 @@ public void tearDown() { public void runMatmul() { // OLD: serial matmul witness generation // final Tuple3, Assignment, JavaPairRDD> constructionRDD = - // R1CSConstruction.matmulConstruct(n1, n2, n3, fieldFactory, config); + // R1CSConstructor.matmulConstruct(n1, n2, n3, fieldFactory, config); // final R1CSRelationRDD r1csRDD = constructionRDD._1(); // final Assignment primaryTwo = constructionRDD._2(); // final JavaPairRDD oneFullAssignmentRDD = constructionRDD._3(); @@ -65,7 +65,7 @@ public void runMatmul() { int b3 = 2; final Tuple3, Assignment, JavaPairRDD> parConstructionRDD = - R1CSConstruction.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); + R1CSConstructor.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); // Retrieve each elements of the tuple by their index R1CSRelationRDD r1csRDDPar = parConstructionRDD._1(); Assignment primaryTwoPar = parConstructionRDD._2(); @@ -86,7 +86,7 @@ public void runMatmul() { b3 = 1; final Tuple3, Assignment, JavaPairRDD> matVectorProduct = - R1CSConstruction.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); + R1CSConstructor.matmulParConstructApp(fieldFactory, b1, b2, b3, n1, n2, n3, config); r1csRDDPar = matVectorProduct._1(); primaryTwoPar = matVectorProduct._2(); oneFullAssignmentRDDPar = matVectorProduct._3(); @@ -107,7 +107,7 @@ public void runLR() { final BN254aFr fieldFactory = new BN254aFr(2L); final Tuple3, Assignment, JavaPairRDD> - LRProduct = R1CSConstruction.linearRegressionApp(fieldFactory, config, n, d, bn, bd); + LRProduct = R1CSConstructor.linearRegressionApp(fieldFactory, config, n, d, bn, bd); R1CSRelationRDD r1csRDDPar = LRProduct._1(); Assignment primaryTwoPar = LRProduct._2(); JavaPairRDD oneFullAssignmentRDDPar = LRProduct._3(); @@ -126,7 +126,7 @@ public void runGaussian() { final int bd = 3; final Tuple3, Assignment, JavaPairRDD> meanCov = - R1CSConstruction.gaussianFitApp(fieldFactory, config, n, d, bn, bd); + R1CSConstructor.gaussianFitApp(fieldFactory, config, n, d, bn, bd); 
R1CSRelationRDD r1csRDDPar = meanCov._1(); Assignment primaryTwoPar = meanCov._2(); JavaPairRDD oneFullAssignmentRDDPar = meanCov._3(); diff --git a/src/test/java/relations/R1CSConstructionTest.java b/src/test/java/relations/R1CSConstructionTest.java index 0317a2b..a612bed 100755 --- a/src/test/java/relations/R1CSConstructionTest.java +++ b/src/test/java/relations/R1CSConstructionTest.java @@ -19,13 +19,13 @@ import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; import scala.Tuple3; -public class R1CSConstructionTest implements Serializable { +public class R1CSConstructorTest implements Serializable { private transient JavaSparkContext sc; private Configuration config; private Fp fieldFactory; @@ -49,14 +49,14 @@ public void ConstructionTest() { final int numConstraints = 1024; final Tuple3, Assignment, Assignment> construction = - R1CSConstruction.serialConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); final Assignment auxiliary = construction._3(); final Assignment oneFullAssignment = new Assignment<>(primary, auxiliary); final Tuple3, Assignment, JavaPairRDD> constructionRDD = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1csRDD = constructionRDD._1(); final Assignment primaryTwo = constructionRDD._2(); final JavaPairRDD oneFullAssignmentRDD = constructionRDD._3(); diff --git a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java index 4f0c433..7290b15 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java @@ -30,7 +30,7 @@ import org.junit.jupiter.api.AfterEach; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import profiler.utils.SparkUtils; import relations.objects.Assignment; import relations.r1cs.R1CSRelationRDD; @@ -99,7 +99,7 @@ void DistributedBNProofSystemTest( BNG2T g2Factory, BNPairingT pairing) { final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1cs = construction._1(); final Assignment primary = construction._2(); final JavaPairRDD fullAssignment = construction._3(); @@ -132,7 +132,7 @@ public void DistributedFakeProofSystemTest() { final FakePairing fakePairing = new FakePairing(); final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstruction.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1cs = construction._1(); final Assignment primary = construction._2(); final JavaPairRDD fullAssignment = construction._3(); diff --git 
a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java index 19dbac7..4638353 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java @@ -34,7 +34,7 @@ import java.io.Serializable; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; -import profiler.generation.R1CSConstruction; +import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelation; import scala.Tuple3; @@ -88,7 +88,7 @@ void SerialBNProofSystemTest( BNG2T g2Factory, BNPairingT pairing) { final Tuple3, Assignment, Assignment> construction = - R1CSConstruction.serialConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); final Assignment fullAssignment = construction._3(); @@ -117,7 +117,7 @@ public void SerialFakeProofSystemTest() { final FakePairing fakePairing = new FakePairing(); final Tuple3, Assignment, Assignment> construction = - R1CSConstruction.serialConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); final Assignment auxiliary = construction._3(); diff --git a/troubleshooting.md b/troubleshooting.md index bae6737..5400932 100644 --- a/troubleshooting.md +++ b/troubleshooting.md @@ -61,7 +61,7 @@ Compile the project without running or compiling unit tests. Copy the **jar** file to `/home/ec2-user/` (because this is where the profiler scripts are looking for it. -Notice that the $SIZE in profile.sh must be greater than the hard-coded values for number of inputs (1023) that are provided in the R1CSConstruction class. +Notice that the $SIZE in profile.sh must be greater than the hard-coded values for number of inputs (1023) that are provided in the R1CSConstructor class. [This](https://spark.apache.org/docs/1.6.2/ec2-scripts.html) is useful. 
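Before moving on to the next patch, a short aside on the `DistributedProver` comments reflowed above: they describe a join-then-MSM pattern in which each wire's scalar from `fullAssignment` is paired, by wire index, with the corresponding encoded query element before the variable-base MSM is run. The snippet below is a minimal, self-contained Spark sketch of that pairing step only, assuming stand-in `Long`/`String` types and an invented class name; it is not part of the patch series or of the DIZK codebase, just an illustration of the `join`/`values` semantics.

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class JoinThenMsmSketch {
  public static void main(String[] args) {
    final SparkConf conf = new SparkConf().setAppName("join-then-msm-sketch").setMaster("local[*]");
    final JavaSparkContext sc = new JavaSparkContext(conf);

    // fullAssignment: wire index -> scalar a_i (stand-in Long values).
    final JavaPairRDD<Long, Long> fullAssignment =
        sc.parallelizePairs(
            Arrays.asList(new Tuple2<>(0L, 1L), new Tuple2<>(1L, 7L), new Tuple2<>(2L, 3L)));

    // queryA: wire index -> encoded group element [A_i(t)] (stand-in String values).
    final JavaPairRDD<Long, String> queryA =
        sc.parallelizePairs(
            Arrays.asList(
                new Tuple2<>(0L, "[A_0(t)]"),
                new Tuple2<>(1L, "[A_1(t)]"),
                new Tuple2<>(2L, "[A_2(t)]")));

    // Join on the wire index, then drop the keys: each record is the (scalar, element)
    // pair that a variable-base MSM would consume.
    final JavaRDD<Tuple2<Long, String>> computationA = fullAssignment.join(queryA).values();

    // A real prover would hand `computationA` to a distributed MSM; here we only print the pairs.
    computationA.collect().forEach(System.out::println);

    sc.stop();
  }
}
```

The point of calling `.values()` is that, once the join has matched scalars and bases by wire index, the MSM no longer needs the keys and can treat the records as an unordered bag of (scalar, base) pairs.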
From 34c5562c19c77bd88aefb8999943b1870f081dfb Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 6 Oct 2020 18:21:57 +0100 Subject: [PATCH 11/94] [WIP] Started to craft the json loader --- pom.xml | 5 ++ src/main/java/io/JSONR1CSLoader.java | 70 ++++++++++++++++++++++++ src/test/java/io/JSONR1CSLoaderTest.java | 26 +++++++++ 3 files changed, 101 insertions(+) create mode 100644 src/main/java/io/JSONR1CSLoader.java create mode 100644 src/test/java/io/JSONR1CSLoaderTest.java diff --git a/pom.xml b/pom.xml index 2cf3500..af00305 100644 --- a/pom.xml +++ b/pom.xml @@ -53,6 +53,11 @@ versions-maven-plugin 2.8.1 + + com.googlecode.json-simple + json-simple + 1.1.1 + diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java new file mode 100644 index 0000000..cce5a9e --- /dev/null +++ b/src/main/java/io/JSONR1CSLoader.java @@ -0,0 +1,70 @@ +package io; + +import relations.r1cs.R1CSRelation; +import relations.r1cs.R1CSRelationRDD; +import algebra.fields.AbstractFieldElementExpanded; +import configuration.Configuration; +import relations.objects.LinearTerm; +import relations.objects.R1CSConstraintsRDD; + +import java.io.FileNotFoundException; +import java.io.FileReader; +import java.io.IOException; + +import org.json.simple.JSONArray; +import org.json.simple.JSONObject; +import org.json.simple.parser.JSONParser; +import org.json.simple.parser.ParseException; + +/** + * Class that implements all necessary functions to import and load an R1CS in JSON format + */ +public class JSONR1CSLoader { + private String filename; + + public JSONR1CSLoader(){}; + public JSONR1CSLoader(String jsonFile){ + this.filename = jsonFile; + }; + + // Loads the file to a "local" (i.e. non-distributed) R1CS instance + public void loadSerial(){ + //JSON parser object to parse read file + JSONParser jsonParser = new JSONParser(); + + try (FileReader reader = new FileReader(this.filename)) { + //Read JSON file + Object obj = jsonParser.parse(reader); + + JSONObject r1cs = (JSONObject) obj; + System.out.println(r1cs); + + JSONObject constraints = (JSONObject) r1cs.get("constraints"); + System.out.println(constraints); + + JSONObject numInputs = (JSONObject) r1cs.get("num_inputs"); + System.out.println(numInputs); + } catch (FileNotFoundException e) { + e.printStackTrace(); + } catch (IOException e) { + e.printStackTrace(); + } catch (ParseException e) { + e.printStackTrace(); + } + } + + // Loads the file to an RDD (i.e. 
distributed) R1CS instance + /* + public R1CSRelationRDD> loadRDD( + final FieldT fieldFactory, + final Configuration config + ){ + final R1CSConstraintsRDD loadedConstraints; + final int loadedNumInputs; + final long loadedNumAuxiliary; + + final R1CSRelationRDD loadedRelationRDD = new R1CSRelationRDD(loadedConstraints, loadedNumInputs, loadedNumAuxiliary); + return loadedRelationRDD; + } + */ +} diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java new file mode 100644 index 0000000..d04a813 --- /dev/null +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -0,0 +1,26 @@ +package io; + +import relations.r1cs.R1CSRelation; +import relations.r1cs.R1CSRelationRDD; +import algebra.fields.AbstractFieldElementExpanded; +import configuration.Configuration; +import relations.objects.LinearTerm; +import relations.objects.R1CSConstraintsRDD; + +import org.junit.jupiter.api.Test; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; +import io.JSONR1CSLoader; + +public class JSONR1CSLoaderTest { + + @Test + public void loadSerialTest(){ + // Load the test data file + // TODO: Use the java equivalent to boost::filesystem to support path manip on all platforms + String pathToTestFile = "test/java/data/simple_gadget_r1cs.json"; + JSONR1CSLoader loader = new JSONR1CSLoader(pathToTestFile); + loader.loadSerial(); + } +} From 4f394e04e6553811f8c1611686d9702594b5e38f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 15:32:34 +0100 Subject: [PATCH 12/94] Added setup env script --- README.md | 4 +++- setup_env | 3 +++ 2 files changed, 6 insertions(+), 1 deletion(-) create mode 100644 setup_env diff --git a/README.md b/README.md index 7c26345..2804104 100755 --- a/README.md +++ b/README.md @@ -94,8 +94,10 @@ While other libraries for zero knowledge proof systems are written in low-level Start by cloning this repository and entering the repository working directory: ```bash -git clone https://github.com/scipr-lab/dizk.git +git clone https://github.com/clearmatics/dizk.git cd dizk +# Set up your environment +. 
./setup_env ``` Next, fetch the dependency modules: diff --git a/setup_env b/setup_env new file mode 100644 index 0000000..287971c --- /dev/null +++ b/setup_env @@ -0,0 +1,3 @@ +#!/usr/bin/env bash + +export DIZK=`pwd` From 8cf1fd40f73ff4dfe53445de76be655b4c266bf2 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 15:58:55 +0100 Subject: [PATCH 13/94] Renamed R1CSConsructor file after renaming class --- .../generation/{R1CSConstruction.java => R1CSConstructor.java} | 0 .../{R1CSConstructionTest.java => R1CSConstructorTest.java} | 0 2 files changed, 0 insertions(+), 0 deletions(-) rename src/main/java/profiler/generation/{R1CSConstruction.java => R1CSConstructor.java} (100%) rename src/test/java/relations/{R1CSConstructionTest.java => R1CSConstructorTest.java} (100%) diff --git a/src/main/java/profiler/generation/R1CSConstruction.java b/src/main/java/profiler/generation/R1CSConstructor.java similarity index 100% rename from src/main/java/profiler/generation/R1CSConstruction.java rename to src/main/java/profiler/generation/R1CSConstructor.java diff --git a/src/test/java/relations/R1CSConstructionTest.java b/src/test/java/relations/R1CSConstructorTest.java similarity index 100% rename from src/test/java/relations/R1CSConstructionTest.java rename to src/test/java/relations/R1CSConstructorTest.java From 9c5b076ca154f602db7e52e1d3be94604ea91cbc Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 15:59:31 +0100 Subject: [PATCH 14/94] First working dummy test for the json loader --- src/main/java/io/JSONR1CSLoader.java | 6 +++--- src/test/java/io/JSONR1CSLoaderTest.java | 13 ++++++++++--- 2 files changed, 13 insertions(+), 6 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index cce5a9e..3519cdf 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -39,11 +39,11 @@ public void loadSerial(){ JSONObject r1cs = (JSONObject) obj; System.out.println(r1cs); - JSONObject constraints = (JSONObject) r1cs.get("constraints"); + JSONArray constraints = (JSONArray) r1cs.get("constraints"); System.out.println(constraints); - JSONObject numInputs = (JSONObject) r1cs.get("num_inputs"); - System.out.println(numInputs); + Long numInputs = (Long) r1cs.get("num_inputs"); + System.out.println(numInputs); } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException e) { diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index d04a813..3978fbf 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -9,18 +9,25 @@ import org.junit.jupiter.api.Test; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.fail; import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import io.JSONR1CSLoader; +import java.nio.file.Paths; +import java.nio.file.Path; +import java.nio.file.Files; public class JSONR1CSLoaderTest { @Test public void loadSerialTest(){ // Load the test data file - // TODO: Use the java equivalent to boost::filesystem to support path manip on all platforms - String pathToTestFile = "test/java/data/simple_gadget_r1cs.json"; - JSONR1CSLoader loader = new JSONR1CSLoader(pathToTestFile); + String dizkHome = System.getenv("DIZK"); + Path pathToFile = Paths.get(dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); + if (!Files.exists(pathToFile)) { + fail("Test r1cs file not found."); + } + JSONR1CSLoader 
loader = new JSONR1CSLoader(pathToFile.toString()); loader.loadSerial(); } } From 6109908ce3a6c8ab307d2cc678f7a5ae318865e4 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 17:30:22 +0100 Subject: [PATCH 15/94] Started to fight with java generics to load field elements in memory --- src/main/java/io/JSONR1CSLoader.java | 48 +++++++++++++++++++++--- src/test/java/io/JSONR1CSLoaderTest.java | 5 ++- 2 files changed, 45 insertions(+), 8 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 3519cdf..82fb3be 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -6,6 +6,7 @@ import configuration.Configuration; import relations.objects.LinearTerm; import relations.objects.R1CSConstraintsRDD; +import relations.objects.R1CSConstraints; import java.io.FileNotFoundException; import java.io.FileReader; @@ -16,10 +17,14 @@ import org.json.simple.parser.JSONParser; import org.json.simple.parser.ParseException; +import java.util.ArrayList; +import java.math.BigInteger; + /** * Class that implements all necessary functions to import and load an R1CS in JSON format */ public class JSONR1CSLoader { + // File to parse and load in memory private String filename; public JSONR1CSLoader(){}; @@ -28,7 +33,8 @@ public JSONR1CSLoader(String jsonFile){ }; // Loads the file to a "local" (i.e. non-distributed) R1CS instance - public void loadSerial(){ + public > + void loadSerial(){ //JSON parser object to parse read file JSONParser jsonParser = new JSONParser(); @@ -36,13 +42,43 @@ public void loadSerial(){ //Read JSON file Object obj = jsonParser.parse(reader); - JSONObject r1cs = (JSONObject) obj; - System.out.println(r1cs); + JSONObject jsonR1CS = (JSONObject) obj; + + // TODO: Retrieve the field characteristic for type safety + // Once recovered, we can assert that r = FieldT::r to make sure types match + // Long fieldChar = (Long) jsonR1CS.get("scalar_field_characteristic"); - JSONArray constraints = (JSONArray) r1cs.get("constraints"); - System.out.println(constraints); + JSONArray jsonConstraints = (JSONArray) jsonR1CS.get("constraints"); + System.out.println(jsonConstraints); // DEBUG ONLY - to remove + for (int i = 0; i < jsonConstraints.size(); i++) { + JSONObject constraint = (JSONObject) jsonConstraints.get(i); + JSONObject linCombs = (JSONObject) constraint.get("linear_combination"); + JSONArray linCombA = (JSONArray) linCombs.get("A"); + ///ArrayList> termsA; + for (int j = 0; j < linCombA.size(); j++) { + JSONObject jsonTerm = (JSONObject) linCombA.get(i); + String valueStr = (String) jsonTerm.get("value"); + System.out.println("HERE2: valueStr = " + valueStr); + // Wire values are exported as hexadecimal strings + // (we remove the '0x' prefix using `substring`) + BigInteger value = new BigInteger(valueStr.substring(2), 16); + System.out.println("value = " + value.toString()); + // fieldFactory is assumed to follow the same "interface" as the "BN fields" + // and must, hence, have a `FrParameters` attibute as in: https://github.com/clearmatics/dizk/blob/master/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java#L16 + // Likewise, we assume below that the constuctor of FieldT follows the convention of + // https://github.com/clearmatics/dizk/blob/master/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java#L24-L26 + // and thus does not require to pass the parameters as second attribute + ///FieldT valueField = (FieldT) new FieldT(value); + 
///termsA.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); + } + JSONArray linCombB = (JSONArray) linCombs.get("B"); + System.out.println("linCombB = " + linCombB.toString()); + JSONArray linCombC = (JSONArray) linCombs.get("C"); + System.out.println("linCombC = " + linCombC.toString()); + } + //R1CSConstraints constrainsts; - Long numInputs = (Long) r1cs.get("num_inputs"); + Long numInputs = (Long) jsonR1CS.get("num_inputs"); System.out.println(numInputs); } catch (FileNotFoundException e) { e.printStackTrace(); diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 3978fbf..683c63c 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -23,11 +23,12 @@ public class JSONR1CSLoaderTest { public void loadSerialTest(){ // Load the test data file String dizkHome = System.getenv("DIZK"); - Path pathToFile = Paths.get(dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); + Path pathToFile = Paths.get( + dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); if (!Files.exists(pathToFile)) { fail("Test r1cs file not found."); } JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - loader.loadSerial(); + loader.loadSerial(); } } From b3f1bec67b176c747eaf2f6ac41b2e52d414eedc Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 18:07:46 +0100 Subject: [PATCH 16/94] Extended the abstract class AbstractFieldElementExpanded to build FieldT from bigints --- .../curves/barreto_naehrig/bn254a/BN254aFields.java | 8 ++++++++ .../curves/barreto_naehrig/bn254b/BN254bFields.java | 8 ++++++++ .../java/algebra/fields/AbstractFieldElementExpanded.java | 3 +++ 3 files changed, 19 insertions(+) diff --git a/src/main/java/algebra/curves/barreto_naehrig/bn254a/BN254aFields.java b/src/main/java/algebra/curves/barreto_naehrig/bn254a/BN254aFields.java index 9223e99..80b66d5 100755 --- a/src/main/java/algebra/curves/barreto_naehrig/bn254a/BN254aFields.java +++ b/src/main/java/algebra/curves/barreto_naehrig/bn254a/BN254aFields.java @@ -56,6 +56,10 @@ public BN254aFr multiplicativeGenerator() { return MULTIPLICATIVE_GENERATOR; } + public BN254aFr construct(final BigInteger number) { + return new BN254aFr(number); + } + public BN254aFr construct(final long number) { return new BN254aFr(number); } @@ -116,6 +120,10 @@ public BN254aFq multiplicativeGenerator() { return MULTIPLICATIVE_GENERATOR; } + public BN254aFq construct(final BigInteger number) { + return new BN254aFq(number); + } + public BN254aFq construct(final Fp element) { return new BN254aFq(element); } diff --git a/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java b/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java index 3853f6d..8c4159d 100755 --- a/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java +++ b/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java @@ -57,6 +57,10 @@ public BN254bFr multiplicativeGenerator() { return MULTIPLICATIVE_GENERATOR; } + public BN254bFr construct(final BigInteger number) { + return new BN254bFr(number); + } + public BN254bFr construct(final long number) { return new BN254bFr(number); } @@ -117,6 +121,10 @@ public BN254bFq multiplicativeGenerator() { return MULTIPLICATIVE_GENERATOR; } + public BN254bFq construct(final BigInteger number) { + return new BN254bFq(number); + } + public BN254bFq construct(final Fp element) { return new BN254bFq(element); } diff --git 
a/src/main/java/algebra/fields/AbstractFieldElementExpanded.java b/src/main/java/algebra/fields/AbstractFieldElementExpanded.java index 418be69..2530ed5 100755 --- a/src/main/java/algebra/fields/AbstractFieldElementExpanded.java +++ b/src/main/java/algebra/fields/AbstractFieldElementExpanded.java @@ -22,6 +22,9 @@ public abstract class AbstractFieldElementExpanded< /* Returns field element as FieldT(value) */ public abstract FieldT construct(final long value); + /* Returns field element as FieldT(value) */ + public abstract FieldT construct(final BigInteger value); + /* Returns this as a BigInteger */ public abstract BigInteger toBigInteger(); } From 527972b507bb88c09bae5a645ad141ff035f2b66 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 18:08:28 +0100 Subject: [PATCH 17/94] Passed field factory and used construct from bigints to bypass issues with generics --- src/main/java/io/JSONR1CSLoader.java | 22 ++++++++++------------ src/test/java/io/JSONR1CSLoaderTest.java | 2 +- 2 files changed, 11 insertions(+), 13 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 82fb3be..32008a2 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -33,13 +33,14 @@ public JSONR1CSLoader(String jsonFile){ }; // Loads the file to a "local" (i.e. non-distributed) R1CS instance + // Need to pass `fieldONE` as a was to bypass the limitations of java generics. + // The `construct` function is used to instantiate elements of type FieldT from `fieldONE` public > - void loadSerial(){ + void loadSerial(FieldT fieldONE){ //JSON parser object to parse read file JSONParser jsonParser = new JSONParser(); try (FileReader reader = new FileReader(this.filename)) { - //Read JSON file Object obj = jsonParser.parse(reader); JSONObject jsonR1CS = (JSONObject) obj; @@ -54,23 +55,20 @@ void loadSerial(){ JSONObject constraint = (JSONObject) jsonConstraints.get(i); JSONObject linCombs = (JSONObject) constraint.get("linear_combination"); JSONArray linCombA = (JSONArray) linCombs.get("A"); - ///ArrayList> termsA; + ArrayList> termsA = new ArrayList>(); for (int j = 0; j < linCombA.size(); j++) { - JSONObject jsonTerm = (JSONObject) linCombA.get(i); + JSONObject jsonTerm = (JSONObject) linCombA.get(j); String valueStr = (String) jsonTerm.get("value"); - System.out.println("HERE2: valueStr = " + valueStr); // Wire values are exported as hexadecimal strings // (we remove the '0x' prefix using `substring`) BigInteger value = new BigInteger(valueStr.substring(2), 16); System.out.println("value = " + value.toString()); - // fieldFactory is assumed to follow the same "interface" as the "BN fields" - // and must, hence, have a `FrParameters` attibute as in: https://github.com/clearmatics/dizk/blob/master/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java#L16 - // Likewise, we assume below that the constuctor of FieldT follows the convention of - // https://github.com/clearmatics/dizk/blob/master/src/main/java/algebra/curves/barreto_naehrig/bn254b/BN254bFields.java#L24-L26 - // and thus does not require to pass the parameters as second attribute - ///FieldT valueField = (FieldT) new FieldT(value); - ///termsA.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); + // FieldT is restricted to extend `AbstractFieldElementExpanded` + // which has a constructor from BigInteger + FieldT valueField = fieldONE.construct(value); + termsA.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); } + 
System.out.println("termsA = " + termsA.toString()); JSONArray linCombB = (JSONArray) linCombs.get("B"); System.out.println("linCombB = " + linCombB.toString()); JSONArray linCombC = (JSONArray) linCombs.get("C"); diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 683c63c..7ef6fd9 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -29,6 +29,6 @@ public void loadSerialTest(){ fail("Test r1cs file not found."); } JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - loader.loadSerial(); + loader.loadSerial(BN254bFr.ONE); } } From 7a0513a5fcb8bbfc14f11a8687213072b2604c7b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 8 Oct 2020 18:38:11 +0100 Subject: [PATCH 18/94] Implemented linearComination and Constraints loading --- src/main/java/io/JSONR1CSLoader.java | 81 ++++++++++++++++++---------- 1 file changed, 53 insertions(+), 28 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 32008a2..7c7fdc1 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -12,6 +12,7 @@ import java.io.FileReader; import java.io.IOException; +import org.apache.arrow.memory.BaseAllocator.Reservation; import org.json.simple.JSONArray; import org.json.simple.JSONObject; import org.json.simple.parser.JSONParser; @@ -20,6 +21,9 @@ import java.util.ArrayList; import java.math.BigInteger; +import relations.objects.LinearCombination; +import relations.objects.R1CSConstraint; + /** * Class that implements all necessary functions to import and load an R1CS in JSON format */ @@ -32,6 +36,46 @@ public JSONR1CSLoader(String jsonFile){ this.filename = jsonFile; }; + private > + LinearCombination loadLinearCombination(FieldT fieldONE, JSONArray linearComb){ + LinearCombination res = new LinearCombination(); + for (int j = 0; j < linearComb.size(); j++) { + JSONObject jsonTerm = (JSONObject) linearComb.get(j); + String valueStr = (String) jsonTerm.get("value"); + // Wire values are exported as hexadecimal strings + // (we remove the '0x' prefix using `substring`) + BigInteger value = new BigInteger(valueStr.substring(2), 16); + // FieldT extends `AbstractFieldElementExpanded` which has `construct` from BigInteger + FieldT valueField = fieldONE.construct(value); + res.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); + } + + return res; + } + + private > + R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint){ + JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); + + // Load `linear_combination.A` in memory + JSONArray jsonLinCombA = (JSONArray) jsonLinCombs.get("A"); + LinearCombination linearCombA = loadLinearCombination(fieldONE, jsonLinCombA); + System.out.println("linearCombA = " + linearCombA.toString()); + + // Load `linear_combination.B` in memory + JSONArray jsonLinCombB = (JSONArray) jsonLinCombs.get("B"); + LinearCombination linearCombB = loadLinearCombination(fieldONE, jsonLinCombB); + System.out.println("linearCombB = " + linearCombB.toString()); + + // Load `linear_combination.C` in memory + JSONArray jsonLinCombC = (JSONArray) jsonLinCombs.get("C"); + LinearCombination linearCombC = loadLinearCombination(fieldONE, jsonLinCombC); + System.out.println("linearCombC = " + linearCombC.toString()); + + R1CSConstraint res = new R1CSConstraint(linearCombA, linearCombB, linearCombC); + return res; + } + // Loads the file to a "local" (i.e. 
non-distributed) R1CS instance // Need to pass `fieldONE` as a was to bypass the limitations of java generics. // The `construct` function is used to instantiate elements of type FieldT from `fieldONE` @@ -42,42 +86,23 @@ void loadSerial(FieldT fieldONE){ try (FileReader reader = new FileReader(this.filename)) { Object obj = jsonParser.parse(reader); - JSONObject jsonR1CS = (JSONObject) obj; // TODO: Retrieve the field characteristic for type safety // Once recovered, we can assert that r = FieldT::r to make sure types match // Long fieldChar = (Long) jsonR1CS.get("scalar_field_characteristic"); + // assert (fieldChar == FieldT::mod); - JSONArray jsonConstraints = (JSONArray) jsonR1CS.get("constraints"); - System.out.println(jsonConstraints); // DEBUG ONLY - to remove - for (int i = 0; i < jsonConstraints.size(); i++) { - JSONObject constraint = (JSONObject) jsonConstraints.get(i); - JSONObject linCombs = (JSONObject) constraint.get("linear_combination"); - JSONArray linCombA = (JSONArray) linCombs.get("A"); - ArrayList> termsA = new ArrayList>(); - for (int j = 0; j < linCombA.size(); j++) { - JSONObject jsonTerm = (JSONObject) linCombA.get(j); - String valueStr = (String) jsonTerm.get("value"); - // Wire values are exported as hexadecimal strings - // (we remove the '0x' prefix using `substring`) - BigInteger value = new BigInteger(valueStr.substring(2), 16); - System.out.println("value = " + value.toString()); - // FieldT is restricted to extend `AbstractFieldElementExpanded` - // which has a constructor from BigInteger - FieldT valueField = fieldONE.construct(value); - termsA.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); - } - System.out.println("termsA = " + termsA.toString()); - JSONArray linCombB = (JSONArray) linCombs.get("B"); - System.out.println("linCombB = " + linCombB.toString()); - JSONArray linCombC = (JSONArray) linCombs.get("C"); - System.out.println("linCombC = " + linCombC.toString()); + JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); + R1CSConstraints constraintArray = new R1CSConstraints(); + for (int i = 0; i < jsonConstraintArray.size(); i++) { + R1CSConstraint constraint = loadConstraint(fieldONE, (JSONObject) jsonConstraintArray.get(i)); + constraintArray.add(constraint); } - //R1CSConstraints constrainsts; - Long numInputs = (Long) jsonR1CS.get("num_inputs"); - System.out.println(numInputs); + //Long numInputs = (Long) jsonR1CS.get("num_inputs"); + // TODO: Compute numAuxiliay inputs + //R1CSRelation relation = new R1CSRelation(constraintArray, numInputs, 42/*numAuxiliary*/); } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException e) { From b5ae2ba2b43664e224b2d681905d0804112a6468 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 10:53:15 +0100 Subject: [PATCH 19/94] Added construct from BigInteger --- src/main/java/algebra/fields/ComplexField.java | 7 +++++++ src/main/java/algebra/fields/Fp.java | 4 ++++ 2 files changed, 11 insertions(+) diff --git a/src/main/java/algebra/fields/ComplexField.java b/src/main/java/algebra/fields/ComplexField.java index 15aef97..7dafc9f 100755 --- a/src/main/java/algebra/fields/ComplexField.java +++ b/src/main/java/algebra/fields/ComplexField.java @@ -122,6 +122,13 @@ public int bitSize() { return DOUBLE_LENGTH; } + // Additional function required to comply with the extended `AbstractFieldElementExpanded` + // abstract class (with a constructor from BigInt). 
Do not use, converting BigInts to double isn't + // without risks (lose precision + risk to obtain `Double.NEGATIVE_INFINITY` or `Double.POSITIVE_INFINITY`) + public ComplexField construct(final BigInteger number) { + return new ComplexField(number.doubleValue()); + } + public ComplexField construct(final long value) { return new ComplexField(value); } diff --git a/src/main/java/algebra/fields/Fp.java b/src/main/java/algebra/fields/Fp.java index 27d5a18..1c3eafb 100755 --- a/src/main/java/algebra/fields/Fp.java +++ b/src/main/java/algebra/fields/Fp.java @@ -111,6 +111,10 @@ public int bitSize() { return number.bitLength(); } + public Fp construct(final BigInteger number) { + return new Fp(number, FpParameters); + } + public Fp construct(final long value) { return new Fp(value, FpParameters); } From fd3d2733152370a564bbc21808f20c83f9317812 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 10:53:50 +0100 Subject: [PATCH 20/94] Added hex modulus for BNa254 in json r1cs test data --- src/test/java/data/simple_gadget_r1cs.json | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/test/java/data/simple_gadget_r1cs.json b/src/test/java/data/simple_gadget_r1cs.json index fe5ce15..3a6552b 100644 --- a/src/test/java/data/simple_gadget_r1cs.json +++ b/src/test/java/data/simple_gadget_r1cs.json @@ -1,5 +1,5 @@ { - "scalar_field_characteristic":"Not yet supported. Should be bigint in hexadecimal", + "scalar_field_characteristic":"0x30644e72e131a029b85045b68181585d2833e84879b9709143e1f593f0000001", "num_variables":4, "num_constraints":3, "num_inputs":1, From 55e88d6397de7f45bb9b935608278d71e013115d Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 10:54:14 +0100 Subject: [PATCH 21/94] Added modulus check and switched tests to BNa254 --- src/main/java/io/JSONR1CSLoader.java | 14 ++++++++------ src/test/java/io/JSONR1CSLoaderTest.java | 6 +++--- 2 files changed, 11 insertions(+), 9 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 7c7fdc1..84edb89 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -3,6 +3,7 @@ import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; import algebra.fields.AbstractFieldElementExpanded; +import algebra.fields.abstractfieldparameters.AbstractFpParameters; import configuration.Configuration; import relations.objects.LinearTerm; import relations.objects.R1CSConstraintsRDD; @@ -79,8 +80,8 @@ R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint // Loads the file to a "local" (i.e. non-distributed) R1CS instance // Need to pass `fieldONE` as a was to bypass the limitations of java generics. 
// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` - public > - void loadSerial(FieldT fieldONE){ + public , FieldParamsT extends AbstractFpParameters> + void loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ //JSON parser object to parse read file JSONParser jsonParser = new JSONParser(); @@ -88,10 +89,11 @@ void loadSerial(FieldT fieldONE){ Object obj = jsonParser.parse(reader); JSONObject jsonR1CS = (JSONObject) obj; - // TODO: Retrieve the field characteristic for type safety - // Once recovered, we can assert that r = FieldT::r to make sure types match - // Long fieldChar = (Long) jsonR1CS.get("scalar_field_characteristic"); - // assert (fieldChar == FieldT::mod); + // Retrieve the field characteristic for type safety + // Once recovered, we assert that r = FieldT::r to make sure types match + String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); + BigInteger mod = new BigInteger(jsonMod.substring(2), 16); + assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); R1CSConstraints constraintArray = new R1CSConstraints(); diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 7ef6fd9..3360068 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -11,7 +11,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import static org.junit.jupiter.api.Assertions.fail; -import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; +import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import io.JSONR1CSLoader; import java.nio.file.Paths; import java.nio.file.Path; @@ -21,7 +21,7 @@ public class JSONR1CSLoaderTest { @Test public void loadSerialTest(){ - // Load the test data file + // Load the test data String dizkHome = System.getenv("DIZK"); Path pathToFile = Paths.get( dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); @@ -29,6 +29,6 @@ public void loadSerialTest(){ fail("Test r1cs file not found."); } JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - loader.loadSerial(BN254bFr.ONE); + loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); } } From c634709a021319fb639e1195be12c30b8398fd4d Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 14:52:38 +0100 Subject: [PATCH 22/94] Fixed linear combination evaluation by multiplying the ternary operation result by the coefficient --- .../relations/objects/LinearCombination.java | 30 ++++++++++++++++++- 1 file changed, 29 insertions(+), 1 deletion(-) diff --git a/src/main/java/relations/objects/LinearCombination.java b/src/main/java/relations/objects/LinearCombination.java index 2cd7c60..78cdd13 100755 --- a/src/main/java/relations/objects/LinearCombination.java +++ b/src/main/java/relations/objects/LinearCombination.java @@ -35,13 +35,41 @@ public boolean isValid(final int numVariables) { return true; } + /// WARNING: There are a few differences between how the protoboard is implemented in libsnark and how + /// the constraint system is handled here. In libsnark, the variable ONE is added by default as the first variable + /// on the protoboard (the user does not have to bother specifying ONE in their assignement). 
This is why
+  /// in the linear combination evaluation (see here: https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/variable.tcc#L267)
+  /// there is a shift by one (in `assignment[lt.index-1]`): linear_combination.size() = assignment.size() + 1, since
+  /// the linear combination's first entry is for ONE, while the first entry in `assignment` is for the first user-defined
+  /// variable. As such we have, for instance:
+  ///    ___
+  ///   | 5 |      ____
+  ///   | 0 |     | 12 |
+  ///   | 2 |     |  1 |
+  ///   | 4 |     |  1 |
+  ///   | 1 |     |  1 |
+  ///     ^          ^
+  ///     |          |
+  ///  LinComb   Assignment
+  ///
+  /// Will return: 5*ONE + 0*12 + 2*1 + 4*1 + 1*1 = 5+2+4+1 = 12
+  /// when evaluated by the function below (i.e. the first entry in the linear combination is interpreted as the coefficient of ONE).
+  ///
+  /// HOWEVER, here in DIZK, things are managed differently: the ONE variable needs to be given as part of the assignment.
+  /// As such, there is no such shift by one; the ith entry in LinComb gets multiplied by the ith entry in the assignment
+  /// (and not by the (i-1)th entry as in libsnark).
  public FieldT evaluate(final Assignment<FieldT> input) {
    FieldT result = input.get(0).zero();
    final FieldT one = result.one();

+    // Note: Assuming the "coefficient" representation of the linear combination A = [0 0 1 0 0],
+    // the 0 terms are not represented in the LinearCombination object. In fact, in this
+    // example, A will be an Array of LinearTerms of size 1, with a single term LinearTerm(index = 2, value = 1).
+    // Hence, the ternary operation in the `for` loop below uses `one` as the wire value only when the `ONE`
+    // variable is selected in the linear combination.
    for (int i = 0; i < terms.size(); i++) {
      final long index = terms.get(i).index();
-      final FieldT value = index == 0 ? one : input.get((int) index).mul(terms.get(i).value());
+      final FieldT value = (index == 0 ? 
one : input.get((int) index)).mul(terms.get(i).value()); result = result.add(value); } From c125e777f23244d93824679e58e9597a614e7f68 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 14:55:09 +0100 Subject: [PATCH 23/94] Added ONE variable management to bridge with libsnark protoboard --- src/main/java/io/JSONR1CSLoader.java | 44 +++++++++++++++------------- 1 file changed, 23 insertions(+), 21 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 84edb89..a31e3d4 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -1,29 +1,22 @@ package io; -import relations.r1cs.R1CSRelation; -import relations.r1cs.R1CSRelationRDD; -import algebra.fields.AbstractFieldElementExpanded; -import algebra.fields.abstractfieldparameters.AbstractFpParameters; -import configuration.Configuration; -import relations.objects.LinearTerm; -import relations.objects.R1CSConstraintsRDD; -import relations.objects.R1CSConstraints; - import java.io.FileNotFoundException; import java.io.FileReader; import java.io.IOException; +import java.math.BigInteger; -import org.apache.arrow.memory.BaseAllocator.Reservation; import org.json.simple.JSONArray; import org.json.simple.JSONObject; import org.json.simple.parser.JSONParser; import org.json.simple.parser.ParseException; -import java.util.ArrayList; -import java.math.BigInteger; - +import algebra.fields.AbstractFieldElementExpanded; +import algebra.fields.abstractfieldparameters.AbstractFpParameters; import relations.objects.LinearCombination; +import relations.objects.LinearTerm; import relations.objects.R1CSConstraint; +import relations.objects.R1CSConstraints; +import relations.r1cs.R1CSRelation; /** * Class that implements all necessary functions to import and load an R1CS in JSON format @@ -61,17 +54,14 @@ R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint // Load `linear_combination.A` in memory JSONArray jsonLinCombA = (JSONArray) jsonLinCombs.get("A"); LinearCombination linearCombA = loadLinearCombination(fieldONE, jsonLinCombA); - System.out.println("linearCombA = " + linearCombA.toString()); // Load `linear_combination.B` in memory JSONArray jsonLinCombB = (JSONArray) jsonLinCombs.get("B"); LinearCombination linearCombB = loadLinearCombination(fieldONE, jsonLinCombB); - System.out.println("linearCombB = " + linearCombB.toString()); // Load `linear_combination.C` in memory JSONArray jsonLinCombC = (JSONArray) jsonLinCombs.get("C"); LinearCombination linearCombC = loadLinearCombination(fieldONE, jsonLinCombC); - System.out.println("linearCombC = " + linearCombC.toString()); R1CSConstraint res = new R1CSConstraint(linearCombA, linearCombB, linearCombC); return res; @@ -81,10 +71,11 @@ R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint // Need to pass `fieldONE` as a was to bypass the limitations of java generics. 
// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` public , FieldParamsT extends AbstractFpParameters> - void loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ + R1CSRelation loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ //JSON parser object to parse read file JSONParser jsonParser = new JSONParser(); - + + R1CSRelation empty = new R1CSRelation(new R1CSConstraints(), 0, 0); try (FileReader reader = new FileReader(this.filename)) { Object obj = jsonParser.parse(reader); JSONObject jsonR1CS = (JSONObject) obj; @@ -102,9 +93,17 @@ void loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ constraintArray.add(constraint); } - //Long numInputs = (Long) jsonR1CS.get("num_inputs"); - // TODO: Compute numAuxiliay inputs - //R1CSRelation relation = new R1CSRelation(constraintArray, numInputs, 42/*numAuxiliary*/); + // Convert Long to int "safely": an exception is raised if Long overflows the int + // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) + // Add +1 to `numInputs` to account for the manual handling of ONE + int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; + // Add +1 to `numVariables` to account for the manual handling of ONE + int numVariables = Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; + int numAuxiliary = numVariables - numInputs; + R1CSRelation relation = new R1CSRelation(constraintArray, numInputs, numAuxiliary); + + assert relation.isValid(): "Loaded relation is invalid"; + return relation; } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException e) { @@ -112,6 +111,9 @@ void loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ } catch (ParseException e) { e.printStackTrace(); } + + // Return the empty relation if the body of the `try` threw + return empty; } // Loads the file to an RDD (i.e. 
distributed) R1CS instance From 2279b1ecc4d743375c32145640b735436f3b9ffb Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 14:56:26 +0100 Subject: [PATCH 24/94] Provided valid wire assignment to test loaded R1CS --- src/test/java/io/JSONR1CSLoaderTest.java | 22 +++++++++++++++++++++- 1 file changed, 21 insertions(+), 1 deletion(-) diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 3360068..575e7f0 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -17,6 +17,11 @@ import java.nio.file.Path; import java.nio.file.Files; +import relations.objects.Assignment; + +import java.util.ArrayList; +import java.util.List; + public class JSONR1CSLoaderTest { @Test @@ -29,6 +34,21 @@ public void loadSerialTest(){ fail("Test r1cs file not found."); } JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); + R1CSRelation loadedRelation = loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); + assertTrue(loadedRelation.isValid()); + + // Make sure the loaded relation is satisfied with a valid assignment + Assignment primary = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in `LinearCombination.java` + primary.add(BN254aFr.ONE); + primary.add(new BN254aFr("12")); + Assignment auxiliary = new Assignment(); + auxiliary.add(new BN254aFr("1")); + auxiliary.add(new BN254aFr("1")); + auxiliary.add(new BN254aFr("1")); + //assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); + boolean res = loadedRelation.isSatisfied(primary, auxiliary); + System.out.println("Res: " + res); } } From ae6243c2fbfcd674ab6d399b1b334c07907e413b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 9 Oct 2020 15:44:37 +0100 Subject: [PATCH 25/94] Added invalid assignment test to serial loader test --- src/test/java/io/JSONR1CSLoaderTest.java | 17 +++++++++++++---- 1 file changed, 13 insertions(+), 4 deletions(-) diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 575e7f0..3aa7b7d 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -9,6 +9,7 @@ import org.junit.jupiter.api.Test; import static org.junit.jupiter.api.Assertions.assertTrue; +import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.fail; import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; @@ -37,7 +38,7 @@ public void loadSerialTest(){ R1CSRelation loadedRelation = loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); assertTrue(loadedRelation.isValid()); - // Make sure the loaded relation is satisfied with a valid assignment + // Make sure the loaded relation is satisfied with a VALID assignment Assignment primary = new Assignment(); // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) // see further discussion in the `evaluate` function in `LinearCombination.java` @@ -47,8 +48,16 @@ public void loadSerialTest(){ auxiliary.add(new BN254aFr("1")); auxiliary.add(new BN254aFr("1")); auxiliary.add(new BN254aFr("1")); - //assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); - boolean res = loadedRelation.isSatisfied(primary, auxiliary); - System.out.println("Res: " + res); + assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); + + // Make 
sure the loaded relation is satisfied with an INVALID assignment + Assignment invalidPrimary = new Assignment(); + invalidPrimary.add(BN254aFr.ONE); + invalidPrimary.add(new BN254aFr("12")); + Assignment invalidAuxiliary = new Assignment(); + invalidAuxiliary.add(new BN254aFr("2")); + invalidAuxiliary.add(new BN254aFr("1")); + invalidAuxiliary.add(new BN254aFr("1")); + assertFalse(loadedRelation.isSatisfied(invalidPrimary, invalidAuxiliary)); } } From e1e830d8d934a89011bef24e5e8dd8fb2d821e80 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 16:39:39 +0100 Subject: [PATCH 26/94] Added comment in Configuration class --- src/main/java/configuration/Configuration.java | 12 +++++++++++- 1 file changed, 11 insertions(+), 1 deletion(-) diff --git a/src/main/java/configuration/Configuration.java b/src/main/java/configuration/Configuration.java index 0f81e3e..1d3d1ee 100755 --- a/src/main/java/configuration/Configuration.java +++ b/src/main/java/configuration/Configuration.java @@ -36,10 +36,20 @@ public class Configuration implements Serializable { /* HashMap>>> */ private HashMap>>> runtimeLogs; - /* System Configurations */ + /* System Configuration */ + // Nb of executors + // Note: Executors are worker nodes' processes in charge of running + // individual tasks in a given Spark job. private int numExecutors; + // Nb cores per executor private int numCores; + // Amount GB RAM memory per executor private int numMemory; + // Nb of partitions per RDD + // Resilient Distributed Datasets (RDD) are a collection of various data + // that are so big in size, that they cannot fit into a single node and + // should be partitioned across various nodes. + // See: https://www.talend.com/blog/2018/03/05/intro-apache-spark-partitioning-need-know/ private int numPartitions; private StorageLevel storageLevel; private JavaSparkContext sc; From 7acc0bc6646541bad4f198bcc5a8782caafc4081 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 16:40:09 +0100 Subject: [PATCH 27/94] Implemented first shot of RDD loader --- src/main/java/io/JSONR1CSLoader.java | 181 ++++++++++++++++++++++++--- 1 file changed, 166 insertions(+), 15 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index a31e3d4..b413075 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -4,6 +4,7 @@ import java.io.FileReader; import java.io.IOException; import java.math.BigInteger; +import java.util.ArrayList; import org.json.simple.JSONArray; import org.json.simple.JSONObject; @@ -17,6 +18,12 @@ import relations.objects.R1CSConstraint; import relations.objects.R1CSConstraints; import relations.r1cs.R1CSRelation; +import relations.r1cs.R1CSRelationRDD; +import scala.Tuple2; +import relations.objects.R1CSConstraintsRDD; + +import configuration.Configuration; +import org.apache.spark.api.java.JavaPairRDD; /** * Class that implements all necessary functions to import and load an R1CS in JSON format @@ -67,15 +74,22 @@ R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint return res; } - // Loads the file to a "local" (i.e. non-distributed) R1CS instance - // Need to pass `fieldONE` as a was to bypass the limitations of java generics. - // The `construct` function is used to instantiate elements of type FieldT from `fieldONE` - public , FieldParamsT extends AbstractFpParameters> - R1CSRelation loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ + /// Loads the file to a "local" (i.e. 
non-distributed) R1CS instance + /// Need to pass `fieldONE` as a was to bypass the limitations of java generics. + /// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` + public < + FieldT extends AbstractFieldElementExpanded, + FieldParamsT extends AbstractFpParameters> + R1CSRelation loadSerial( + final FieldT fieldONE, + final FieldParamsT fieldParams + ){ //JSON parser object to parse read file JSONParser jsonParser = new JSONParser(); R1CSRelation empty = new R1CSRelation(new R1CSConstraints(), 0, 0); + // TODO: Mark the function with a `throws` instead of having such a big try/catch + // i.e. add `throws IOException, ParseException` try (FileReader reader = new FileReader(this.filename)) { Object obj = jsonParser.parse(reader); JSONObject jsonR1CS = (JSONObject) obj; @@ -116,18 +130,155 @@ R1CSRelation loadSerial(FieldT fieldONE, FieldParamsT fieldParams){ return empty; } - // Loads the file to an RDD (i.e. distributed) R1CS instance - /* - public R1CSRelationRDD> loadRDD( - final FieldT fieldFactory, + /// RDD + + private > + JavaPairRDD> loadLinearCombinationRDD( + final FieldT fieldONE, + final ArrayList partitionSet, + final JSONArray jsonConstraintArray, + final String linearCombKey, final Configuration config ){ - final R1CSConstraintsRDD loadedConstraints; - final int loadedNumInputs; - final long loadedNumAuxiliary; + final int nbPartitions = config.numPartitions(); + final int nbConstraints = jsonConstraintArray.size(); + + // Normally, Spark tries to set the number of partitions automatically based on the cluster. + // However, it is possible to pass it manually via a second parameter to parallelize + // (e.g. sc.parallelize(data, 10)) + // See: https://spark.apache.org/docs/3.0.0/rdd-programming-guide.html#parallelized-collections + // + // NOTE1: Here "the initial dataset" is just an array of integers (the "partitions" which is + // an ArrayList). This array (e.g. [0,1,2]) is used to build the data in each partitions + // via the use of a lambda expression in `flatMapToPair` which maps an Integer to the output + // of the block below (i.e. `return linearCombinationChunk.iterator();`). + // In other words, for each "partition index" a partition "content" is returned. + // See: https://www.w3schools.com/java/java_lambda.asp for syntax of java lambda functions + // + // NOTE2: The signature of the `flatMapToPair` function is: + // `flatMapToPair[K2, V2](f: PairFlatMapFunction[T, K2, V2]): JavaPairRDD[K2, V2]` + // Hence, it returns a JavaRDDPair just as expected. + return config.sparkContext().parallelize(partitionSet, nbPartitions) + .flatMapToPair(partitionID -> { + // Compute the size of the data set processed by the partition `partitionID` + // Note: all partitions will have data sets of the same size, apart from the last one + // (if it exists) which will have a dataset made of the remaining entries (i.e. the + // cardinality of this set will be < than the one of the other partitions) + final long partitionSize = (partitionID == nbPartitions ? 
+ nbConstraints % nbPartitions : nbConstraints / nbPartitions); + + final ArrayList>> linearCombinationChunk = new ArrayList<>(); + // Build the partition data set + // For each constraints in the data set + for (long i = 0; i < partitionSize; i++) { + final long constraintIndex = partitionID * (nbConstraints / nbPartitions) + i; + + // Here `constraintIndex` determines the constraint ID + // and `constraintArrayIndex` determines the lin comb of this constraint + // Hence, `next` contains the linear combination to parse + JSONObject jsonConstraint = (JSONObject) jsonConstraintArray.get((int) constraintIndex); + JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); + JSONArray jsonLinComb = (JSONArray) jsonLinCombs.get(linearCombKey); + + // Parse all LinearTerms of the linear combination of interest + for (int j = 0; j < jsonLinComb.size(); j++) { + JSONObject jsonTerm = (JSONObject) jsonLinComb.get(j); + String valueStr = (String) jsonTerm.get("value"); + // Wire values are exported as hexadecimal strings + // (we remove the '0x' prefix using `substring`) + BigInteger value = new BigInteger(valueStr.substring(2), 16); + // FieldT extends `AbstractFieldElementExpanded` which has `construct` from BigInteger + FieldT valueField = fieldONE.construct(value); + linearCombinationChunk.add(new Tuple2<>(constraintIndex, new LinearTerm((Long) jsonTerm.get("index"), valueField))); + } + } + return linearCombinationChunk.iterator(); + }); + } + + // TODO: Spend some time to properly understand the behavior of the JSON library and the file reader + // we don't want to load all the file in memory, ideally everything should be "streamed". + // This may lead to several passes on the file (as we need to navigate all constraints) multiple + // times to load each individual linear combination, but this may be more desirable than reading + // and keeping an enormous file in memory (e.g. an R1CS of billion gates). + // Else, refactor the code to replace `loadLinearCombinationsRDD` by `loadConstraintsRDD` and + // only navigate the constraint set once. + + // Loads the file to an RDD (i.e. distributed) R1CS instance + public , FieldParamsT extends AbstractFpParameters> + R1CSRelationRDD loadRDD( + final FieldT fieldONE, + final FieldParamsT fieldParams, + final Configuration config + ) throws FileNotFoundException, IOException, ParseException { + // JSON parser object to parse read file + JSONParser jsonParser = new JSONParser(); + + FileReader reader = new FileReader(this.filename); + Object obj = jsonParser.parse(reader); + JSONObject jsonR1CS = (JSONObject) obj; + + // Retrieve the field characteristic for type safety + // Once recovered, we assert that r = FieldT::r to make sure types match + String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); + BigInteger mod = new BigInteger(jsonMod.substring(2), 16); + assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; + + long nbJSONConstraintArray = (Long) jsonR1CS.get("num_constraints"); + final int numPartitions = config.numPartitions(); + + // This array of partitions is useful to feed into the lambda function to map + // a partition ID (an index in [numPartition]) to the corresponding R1CS chunk + // See in `loadLinearCombinationRDD`. + // + // The default config has 2 partitions per RDD. + // We add "partition 0" and "partition 1" to the array of partitions. 
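        // For intuition, a small worked example (illustrative numbers, not taken from the test data):
        // with numPartitions = 2 and 5 constraints, the `partitions` array built below ends up as
        // [0, 1, 2]; partitions 0 and 1 each process 5 / 2 = 2 constraints, and the extra
        // partition 2 processes the 5 % 2 = 1 remaining constraint.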
+ final ArrayList partitions = new ArrayList<>(); + for (int i = 0; i < numPartitions; i++) { + partitions.add(i); + } + // If the number of constraints is not a multiple of `numPartitions`, + // we add an extra partition to process the remaining part of the dataset. + // NOTE: This only makes sense when the # constraints > # partitions + // which will be the case in most (realistic) scenarios + if (nbJSONConstraintArray % numPartitions != 0) { + partitions.add(numPartitions); + } + + // Get a ref to the array of constraints + JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); + + // Load all LinearCombinations "A" + JavaPairRDD> linCombinationA = loadLinearCombinationRDD( + fieldONE, partitions, jsonConstraintArray, "A", config); + + // Load all LinearCombinations "B" + JavaPairRDD> linCombinationB = loadLinearCombinationRDD( + fieldONE, partitions, jsonConstraintArray, "B", config); + + // Load all LinearCombinations "C" + JavaPairRDD> linCombinationC = loadLinearCombinationRDD( + fieldONE, partitions, jsonConstraintArray, "C", config); + + // Make sure that the loaded data is sound + assert linCombinationA.count() == linCombinationB.count(); + assert linCombinationB.count() == linCombinationC.count(); + assert linCombinationC.count() == nbJSONConstraintArray; + + final R1CSConstraintsRDD constraints = new R1CSConstraintsRDD<>( + linCombinationA, linCombinationB, linCombinationC, nbJSONConstraintArray); + + // Convert Long to int "safely": an exception is raised if Long overflows the int + // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) + // Add +1 to `numInputs` to account for the manual handling of ONE + int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; + // Add +1 to `numVariables` to account for the manual handling of ONE + int numVariables = Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; + int numAuxiliary = numVariables - numInputs; + + R1CSRelationRDD relation = new R1CSRelationRDD(constraints, numInputs, numAuxiliary); + assert relation.isValid(): "Loaded relation is invalid"; - final R1CSRelationRDD loadedRelationRDD = new R1CSRelationRDD(loadedConstraints, loadedNumInputs, loadedNumAuxiliary); - return loadedRelationRDD; + return relation; } - */ } From a8400733def3221332959e127a906d29558e7a1f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 17:26:08 +0100 Subject: [PATCH 28/94] Added precision about .count method on RDD --- src/main/java/io/JSONR1CSLoader.java | 1 + 1 file changed, 1 insertion(+) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index b413075..4a8213d 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -261,6 +261,7 @@ R1CSRelationRDD loadRDD( fieldONE, partitions, jsonConstraintArray, "C", config); // Make sure that the loaded data is sound + // `.count()` returns the number of elements in the RDD. 
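    // Note that `.count()` is a Spark action, so each call below triggers a job over the full RDD;
    // this is fine as a sanity check on small inputs, but it is not free on large instances.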
assert linCombinationA.count() == linCombinationB.count(); assert linCombinationB.count() == linCombinationC.count(); assert linCombinationC.count() == nbJSONConstraintArray; From 165dde787818ab8d93461f8c74d51c045a95fda3 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 17:26:48 +0100 Subject: [PATCH 29/94] Implemented first shot of RDD load test --- src/test/java/io/JSONR1CSLoaderTest.java | 75 +++++++++++++++++++++++- 1 file changed, 73 insertions(+), 2 deletions(-) diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 3aa7b7d..b385847 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -2,6 +2,7 @@ import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; +import scala.Tuple2; import algebra.fields.AbstractFieldElementExpanded; import configuration.Configuration; import relations.objects.LinearTerm; @@ -16,13 +17,25 @@ import io.JSONR1CSLoader; import java.nio.file.Paths; import java.nio.file.Path; -import java.nio.file.Files; +import java.io.FileNotFoundException; +import java.io.IOException; +import java.nio.file.Files; import relations.objects.Assignment; import java.util.ArrayList; +import java.util.Arrays; import java.util.List; +import profiler.utils.SparkUtils; + +import org.apache.spark.api.java.JavaPairRDD; +import org.apache.spark.api.java.JavaRDD; +import org.apache.spark.api.java.JavaSparkContext; +import org.apache.spark.storage.StorageLevel; +import org.json.simple.parser.ParseException; +import org.apache.spark.SparkConf; + public class JSONR1CSLoaderTest { @Test @@ -50,7 +63,7 @@ public void loadSerialTest(){ auxiliary.add(new BN254aFr("1")); assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); - // Make sure the loaded relation is satisfied with an INVALID assignment + // Make sure the loaded relation is NOT satisfied with an INVALID assignment Assignment invalidPrimary = new Assignment(); invalidPrimary.add(BN254aFr.ONE); invalidPrimary.add(new BN254aFr("12")); @@ -60,4 +73,62 @@ public void loadSerialTest(){ invalidAuxiliary.add(new BN254aFr("1")); assertFalse(loadedRelation.isSatisfied(invalidPrimary, invalidAuxiliary)); } + + @Test + public void loadRDDTest(){ + // Load the test data + String dizkHome = System.getenv("DIZK"); + Path pathToFile = Paths.get( + dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); + if (!Files.exists(pathToFile)) { + fail("Test r1cs file not found."); + } + + // Set up configuration and SPARK context + final SparkConf conf = new SparkConf().setMaster("local").setAppName("loader"); + conf.set("spark.files.overwrite", "true"); + conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer"); + conf.registerKryoClasses(SparkUtils.zksparkClasses()); + + final JavaSparkContext sc; + final Configuration config; + sc = new JavaSparkContext(conf); + config = new Configuration(1, 1, 1, 2, sc, StorageLevel.MEMORY_ONLY()); + config.setRuntimeFlag(false); + config.setDebugFlag(true); + + JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); + + try{ + R1CSRelationRDD loadedRelationRDD = loader.loadRDD(BN254aFr.ONE, BN254aFr.FrParameters, config); + assertTrue(loadedRelationRDD.isValid()); + + // Make sure the loaded relation is satisfied with a VALID assignment + Assignment primary = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in 
`LinearCombination.java` + primary.add(BN254aFr.ONE); + primary.add(new BN254aFr("12")); + + List> fullAssignment = Arrays.asList( + new Tuple2<>((long) 0, BN254aFr.ONE), // Primary + new Tuple2<>((long) 1, new BN254aFr("12")), + new Tuple2<>((long) 2, new BN254aFr("1")), // Auxiliary + new Tuple2<>((long) 3, new BN254aFr("1")), + new Tuple2<>((long) 4, new BN254aFr("1")) + ); + JavaRDD> assignmentRDD = sc.parallelize(fullAssignment); + JavaPairRDD pairAssignmentRDD = JavaPairRDD.fromJavaRDD(assignmentRDD); + + boolean result = loadedRelationRDD.isSatisfied(primary, pairAssignmentRDD); + System.out.println("Result after assignment: " + result); + assertTrue(result); + } catch (FileNotFoundException e) { + e.printStackTrace(); + } catch (IOException e) { + e.printStackTrace(); + } catch (ParseException e) { + e.printStackTrace(); + } + } } From 8842bcc26a24d4546e953d640057171d52a57fcf Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 17:47:54 +0100 Subject: [PATCH 30/94] Added debug block to assert on RDDs --- src/main/java/io/JSONR1CSLoader.java | 16 +++++++++++++--- src/test/java/io/JSONR1CSLoaderTest.java | 4 ++-- 2 files changed, 15 insertions(+), 5 deletions(-) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 4a8213d..899a754 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -262,9 +262,19 @@ R1CSRelationRDD loadRDD( // Make sure that the loaded data is sound // `.count()` returns the number of elements in the RDD. - assert linCombinationA.count() == linCombinationB.count(); - assert linCombinationB.count() == linCombinationC.count(); - assert linCombinationC.count() == nbJSONConstraintArray; + + // [BEGIN DEBUG BLOCK] + // Be careful, these operations are very expensive, only do with very small datasets! 
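        // Two things worth keeping in mind about the checks below:
        // - `groupByKey()` shuffles every (constraintIndex, LinearTerm) pair across the cluster to
        //   materialise one group per constraint, and `collect()` pulls the whole result back to
        //   the driver.
        // - `groupByKey().count()` counts distinct constraint indices, which is what should match
        //   `nbJSONConstraintArray`; a plain `count()` would count linear terms instead, and a
        //   constraint generally contributes several terms.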
+ //System.out.println("linCombinationA.groupByKey().count() = " + linCombinationA.groupByKey().count()); + //System.out.println("linCombinationB.groupByKey().count() = " + linCombinationB.groupByKey().count()); + //System.out.println("linCombinationC.groupByKey().count() = " + linCombinationC.groupByKey().count()); + //for(Tuple2>> entry:linCombinationA.groupByKey().collect()){ + // System.out.println("* " + entry); + //} + assert linCombinationA.groupByKey().count() == linCombinationB.groupByKey().count(); + assert linCombinationB.groupByKey().count() == linCombinationC.groupByKey().count(); + assert linCombinationC.groupByKey().count() == nbJSONConstraintArray; + // [END DEBUG BLOCK] final R1CSConstraintsRDD constraints = new R1CSConstraintsRDD<>( linCombinationA, linCombinationB, linCombinationC, nbJSONConstraintArray); diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index b385847..6ee0e61 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -110,7 +110,7 @@ public void loadRDDTest(){ primary.add(BN254aFr.ONE); primary.add(new BN254aFr("12")); - List> fullAssignment = Arrays.asList( + List> fullAssignment = Arrays.asList( new Tuple2<>((long) 0, BN254aFr.ONE), // Primary new Tuple2<>((long) 1, new BN254aFr("12")), new Tuple2<>((long) 2, new BN254aFr("1")), // Auxiliary @@ -121,7 +121,7 @@ public void loadRDDTest(){ JavaPairRDD pairAssignmentRDD = JavaPairRDD.fromJavaRDD(assignmentRDD); boolean result = loadedRelationRDD.isSatisfied(primary, pairAssignmentRDD); - System.out.println("Result after assignment: " + result); + System.out.println("==========> Result after assignment: " + result); assertTrue(result); } catch (FileNotFoundException e) { e.printStackTrace(); From 146abf2f66d81f0c41e9e70227de5eb59d6dd8ec Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 17:51:27 +0100 Subject: [PATCH 31/94] Added negative test case for RDD loader --- src/test/java/io/JSONR1CSLoaderTest.java | 21 +++++++++++++++++++++ 1 file changed, 21 insertions(+) diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index 6ee0e61..a2c903d 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -123,6 +123,27 @@ public void loadRDDTest(){ boolean result = loadedRelationRDD.isSatisfied(primary, pairAssignmentRDD); System.out.println("==========> Result after assignment: " + result); assertTrue(result); + + // Make sure the loaded relation is NOT satisfied with an INVALID assignment + Assignment primaryInvalid = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in `LinearCombination.java` + primaryInvalid.add(BN254aFr.ONE); + primaryInvalid.add(new BN254aFr("12")); + + List> fullAssignmentInvalid = Arrays.asList( + new Tuple2<>((long) 0, BN254aFr.ONE), // Primary + new Tuple2<>((long) 1, new BN254aFr("12")), + new Tuple2<>((long) 2, new BN254aFr("2")), // Invalid Auxiliary + new Tuple2<>((long) 3, new BN254aFr("1")), + new Tuple2<>((long) 4, new BN254aFr("1")) + ); + JavaRDD> invalidAssignmentRDD = sc.parallelize(fullAssignmentInvalid); + JavaPairRDD invalidPairAssignmentRDD = JavaPairRDD.fromJavaRDD(invalidAssignmentRDD); + + boolean invalidResult = loadedRelationRDD.isSatisfied(primaryInvalid, invalidPairAssignmentRDD); + System.out.println("==========> Result after INVALID assignment: " + 
invalidResult); + assertFalse(invalidResult); } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException e) { From 31790c2fde8e6eef087c8ebf6d1b7c7baa71047b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 17:52:42 +0100 Subject: [PATCH 32/94] Formatted code --- .../java/algebra/fields/ComplexField.java | 3 +- src/main/java/io/JSONR1CSLoader.java | 546 +++++++++--------- .../relations/objects/LinearCombination.java | 35 +- src/test/java/io/JSONR1CSLoaderTest.java | 258 ++++----- 4 files changed, 430 insertions(+), 412 deletions(-) diff --git a/src/main/java/algebra/fields/ComplexField.java b/src/main/java/algebra/fields/ComplexField.java index 7dafc9f..14564a0 100755 --- a/src/main/java/algebra/fields/ComplexField.java +++ b/src/main/java/algebra/fields/ComplexField.java @@ -124,7 +124,8 @@ public int bitSize() { // Additional function required to comply with the extended `AbstractFieldElementExpanded` // abstract class (with a constructor from BigInt). Do not use, converting BigInts to double isn't - // without risks (lose precision + risk to obtain `Double.NEGATIVE_INFINITY` or `Double.POSITIVE_INFINITY`) + // without risks (lose precision + risk to obtain `Double.NEGATIVE_INFINITY` or + // `Double.POSITIVE_INFINITY`) public ComplexField construct(final BigInteger number) { return new ComplexField(number.doubleValue()); } diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 899a754..649ae18 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -1,295 +1,309 @@ package io; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.fields.abstractfieldparameters.AbstractFpParameters; +import configuration.Configuration; import java.io.FileNotFoundException; import java.io.FileReader; import java.io.IOException; import java.math.BigInteger; import java.util.ArrayList; - +import org.apache.spark.api.java.JavaPairRDD; import org.json.simple.JSONArray; import org.json.simple.JSONObject; import org.json.simple.parser.JSONParser; import org.json.simple.parser.ParseException; - -import algebra.fields.AbstractFieldElementExpanded; -import algebra.fields.abstractfieldparameters.AbstractFpParameters; import relations.objects.LinearCombination; import relations.objects.LinearTerm; import relations.objects.R1CSConstraint; import relations.objects.R1CSConstraints; +import relations.objects.R1CSConstraintsRDD; import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; import scala.Tuple2; -import relations.objects.R1CSConstraintsRDD; -import configuration.Configuration; -import org.apache.spark.api.java.JavaPairRDD; - -/** - * Class that implements all necessary functions to import and load an R1CS in JSON format - */ +/** Class that implements all necessary functions to import and load an R1CS in JSON format */ public class JSONR1CSLoader { - // File to parse and load in memory - private String filename; - - public JSONR1CSLoader(){}; - public JSONR1CSLoader(String jsonFile){ - this.filename = jsonFile; - }; - - private > - LinearCombination loadLinearCombination(FieldT fieldONE, JSONArray linearComb){ - LinearCombination res = new LinearCombination(); - for (int j = 0; j < linearComb.size(); j++) { - JSONObject jsonTerm = (JSONObject) linearComb.get(j); - String valueStr = (String) jsonTerm.get("value"); - // Wire values are exported as hexadecimal strings - // (we remove the '0x' prefix using `substring`) - BigInteger value = new 
BigInteger(valueStr.substring(2), 16); - // FieldT extends `AbstractFieldElementExpanded` which has `construct` from BigInteger - FieldT valueField = fieldONE.construct(value); - res.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); - } - - return res; + // File to parse and load in memory + private String filename; + + public JSONR1CSLoader() {} + ; + + public JSONR1CSLoader(String jsonFile) { + this.filename = jsonFile; + } + ; + + private > + LinearCombination loadLinearCombination(FieldT fieldONE, JSONArray linearComb) { + LinearCombination res = new LinearCombination(); + for (int j = 0; j < linearComb.size(); j++) { + JSONObject jsonTerm = (JSONObject) linearComb.get(j); + String valueStr = (String) jsonTerm.get("value"); + // Wire values are exported as hexadecimal strings + // (we remove the '0x' prefix using `substring`) + BigInteger value = new BigInteger(valueStr.substring(2), 16); + // FieldT extends `AbstractFieldElementExpanded` which has `construct` from BigInteger + FieldT valueField = fieldONE.construct(value); + res.add(new LinearTerm((Long) jsonTerm.get("index"), valueField)); } - private > - R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint){ - JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); - - // Load `linear_combination.A` in memory - JSONArray jsonLinCombA = (JSONArray) jsonLinCombs.get("A"); - LinearCombination linearCombA = loadLinearCombination(fieldONE, jsonLinCombA); - - // Load `linear_combination.B` in memory - JSONArray jsonLinCombB = (JSONArray) jsonLinCombs.get("B"); - LinearCombination linearCombB = loadLinearCombination(fieldONE, jsonLinCombB); - - // Load `linear_combination.C` in memory - JSONArray jsonLinCombC = (JSONArray) jsonLinCombs.get("C"); - LinearCombination linearCombC = loadLinearCombination(fieldONE, jsonLinCombC); - - R1CSConstraint res = new R1CSConstraint(linearCombA, linearCombB, linearCombC); - return res; + return res; + } + + private > + R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint) { + JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); + + // Load `linear_combination.A` in memory + JSONArray jsonLinCombA = (JSONArray) jsonLinCombs.get("A"); + LinearCombination linearCombA = loadLinearCombination(fieldONE, jsonLinCombA); + + // Load `linear_combination.B` in memory + JSONArray jsonLinCombB = (JSONArray) jsonLinCombs.get("B"); + LinearCombination linearCombB = loadLinearCombination(fieldONE, jsonLinCombB); + + // Load `linear_combination.C` in memory + JSONArray jsonLinCombC = (JSONArray) jsonLinCombs.get("C"); + LinearCombination linearCombC = loadLinearCombination(fieldONE, jsonLinCombC); + + R1CSConstraint res = new R1CSConstraint(linearCombA, linearCombB, linearCombC); + return res; + } + + /// Loads the file to a "local" (i.e. non-distributed) R1CS instance + /// Need to pass `fieldONE` as a was to bypass the limitations of java generics. + /// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` + public < + FieldT extends AbstractFieldElementExpanded, + FieldParamsT extends AbstractFpParameters> + R1CSRelation loadSerial(final FieldT fieldONE, final FieldParamsT fieldParams) { + // JSON parser object to parse read file + JSONParser jsonParser = new JSONParser(); + + R1CSRelation empty = new R1CSRelation(new R1CSConstraints(), 0, 0); + // TODO: Mark the function with a `throws` instead of having such a big try/catch + // i.e. 
add `throws IOException, ParseException` + try (FileReader reader = new FileReader(this.filename)) { + Object obj = jsonParser.parse(reader); + JSONObject jsonR1CS = (JSONObject) obj; + + // Retrieve the field characteristic for type safety + // Once recovered, we assert that r = FieldT::r to make sure types match + String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); + BigInteger mod = new BigInteger(jsonMod.substring(2), 16); + assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; + + JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); + R1CSConstraints constraintArray = new R1CSConstraints(); + for (int i = 0; i < jsonConstraintArray.size(); i++) { + R1CSConstraint constraint = + loadConstraint(fieldONE, (JSONObject) jsonConstraintArray.get(i)); + constraintArray.add(constraint); + } + + // Convert Long to int "safely": an exception is raised if Long overflows the int + // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) + // Add +1 to `numInputs` to account for the manual handling of ONE + int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; + // Add +1 to `numVariables` to account for the manual handling of ONE + int numVariables = Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; + int numAuxiliary = numVariables - numInputs; + R1CSRelation relation = + new R1CSRelation(constraintArray, numInputs, numAuxiliary); + + assert relation.isValid() : "Loaded relation is invalid"; + return relation; + } catch (FileNotFoundException e) { + e.printStackTrace(); + } catch (IOException e) { + e.printStackTrace(); + } catch (ParseException e) { + e.printStackTrace(); } - /// Loads the file to a "local" (i.e. non-distributed) R1CS instance - /// Need to pass `fieldONE` as a was to bypass the limitations of java generics. - /// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` - public < - FieldT extends AbstractFieldElementExpanded, - FieldParamsT extends AbstractFpParameters> - R1CSRelation loadSerial( - final FieldT fieldONE, - final FieldParamsT fieldParams - ){ - //JSON parser object to parse read file - JSONParser jsonParser = new JSONParser(); - - R1CSRelation empty = new R1CSRelation(new R1CSConstraints(), 0, 0); - // TODO: Mark the function with a `throws` instead of having such a big try/catch - // i.e. 
add `throws IOException, ParseException` - try (FileReader reader = new FileReader(this.filename)) { - Object obj = jsonParser.parse(reader); - JSONObject jsonR1CS = (JSONObject) obj; - - // Retrieve the field characteristic for type safety - // Once recovered, we assert that r = FieldT::r to make sure types match - String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); - BigInteger mod = new BigInteger(jsonMod.substring(2), 16); - assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; - - JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); - R1CSConstraints constraintArray = new R1CSConstraints(); - for (int i = 0; i < jsonConstraintArray.size(); i++) { - R1CSConstraint constraint = loadConstraint(fieldONE, (JSONObject) jsonConstraintArray.get(i)); - constraintArray.add(constraint); - } - - // Convert Long to int "safely": an exception is raised if Long overflows the int - // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) - // Add +1 to `numInputs` to account for the manual handling of ONE - int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; - // Add +1 to `numVariables` to account for the manual handling of ONE - int numVariables = Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; - int numAuxiliary = numVariables - numInputs; - R1CSRelation relation = new R1CSRelation(constraintArray, numInputs, numAuxiliary); - - assert relation.isValid(): "Loaded relation is invalid"; - return relation; - } catch (FileNotFoundException e) { - e.printStackTrace(); - } catch (IOException e) { - e.printStackTrace(); - } catch (ParseException e) { - e.printStackTrace(); - } - - // Return the empty relation if the body of the `try` threw - return empty; + // Return the empty relation if the body of the `try` threw + return empty; + } + + /// RDD + + private > + JavaPairRDD> loadLinearCombinationRDD( + final FieldT fieldONE, + final ArrayList partitionSet, + final JSONArray jsonConstraintArray, + final String linearCombKey, + final Configuration config) { + final int nbPartitions = config.numPartitions(); + final int nbConstraints = jsonConstraintArray.size(); + + // Normally, Spark tries to set the number of partitions automatically based on the cluster. + // However, it is possible to pass it manually via a second parameter to parallelize + // (e.g. sc.parallelize(data, 10)) + // See: https://spark.apache.org/docs/3.0.0/rdd-programming-guide.html#parallelized-collections + // + // NOTE1: Here "the initial dataset" is just an array of integers (the "partitions" which is + // an ArrayList). This array (e.g. [0,1,2]) is used to build the data in each + // partitions + // via the use of a lambda expression in `flatMapToPair` which maps an Integer to the output + // of the block below (i.e. `return linearCombinationChunk.iterator();`). + // In other words, for each "partition index" a partition "content" is returned. + // See: https://www.w3schools.com/java/java_lambda.asp for syntax of java lambda functions + // + // NOTE2: The signature of the `flatMapToPair` function is: + // `flatMapToPair[K2, V2](f: PairFlatMapFunction[T, K2, V2]): JavaPairRDD[K2, V2]` + // Hence, it returns a JavaRDDPair just as expected. 
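            // As a toy illustration of the pattern used below (made-up values, not from this codebase):
            //   sc.parallelize(Arrays.asList(0, 1), 2)
            //     .flatMapToPair(id -> Arrays.asList(
            //         new Tuple2<>((long) id, "x"),
            //         new Tuple2<>((long) id, "y")).iterator())
            // yields a JavaPairRDD<Long, String> holding {(0,"x"), (0,"y"), (1,"x"), (1,"y")}:
            // each element of the parallelized collection is expanded into the (key, value) pairs
            // produced by the lambda.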
+ return config + .sparkContext() + .parallelize(partitionSet, nbPartitions) + .flatMapToPair( + partitionID -> { + // Compute the size of the data set processed by the partition `partitionID` + // Note: all partitions will have data sets of the same size, apart from the last one + // (if it exists) which will have a dataset made of the remaining entries (i.e. the + // cardinality of this set will be < than the one of the other partitions) + final long partitionSize = + (partitionID == nbPartitions + ? nbConstraints % nbPartitions + : nbConstraints / nbPartitions); + + final ArrayList>> linearCombinationChunk = + new ArrayList<>(); + // Build the partition data set + // For each constraints in the data set + for (long i = 0; i < partitionSize; i++) { + final long constraintIndex = partitionID * (nbConstraints / nbPartitions) + i; + + // Here `constraintIndex` determines the constraint ID + // and `constraintArrayIndex` determines the lin comb of this constraint + // Hence, `next` contains the linear combination to parse + JSONObject jsonConstraint = + (JSONObject) jsonConstraintArray.get((int) constraintIndex); + JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); + JSONArray jsonLinComb = (JSONArray) jsonLinCombs.get(linearCombKey); + + // Parse all LinearTerms of the linear combination of interest + for (int j = 0; j < jsonLinComb.size(); j++) { + JSONObject jsonTerm = (JSONObject) jsonLinComb.get(j); + String valueStr = (String) jsonTerm.get("value"); + // Wire values are exported as hexadecimal strings + // (we remove the '0x' prefix using `substring`) + BigInteger value = new BigInteger(valueStr.substring(2), 16); + // FieldT extends `AbstractFieldElementExpanded` which has `construct` from + // BigInteger + FieldT valueField = fieldONE.construct(value); + linearCombinationChunk.add( + new Tuple2<>( + constraintIndex, + new LinearTerm((Long) jsonTerm.get("index"), valueField))); + } + } + return linearCombinationChunk.iterator(); + }); + } + + // TODO: Spend some time to properly understand the behavior of the JSON library and the file + // reader + // we don't want to load all the file in memory, ideally everything should be "streamed". + // This may lead to several passes on the file (as we need to navigate all constraints) multiple + // times to load each individual linear combination, but this may be more desirable than reading + // and keeping an enormous file in memory (e.g. an R1CS of billion gates). + // Else, refactor the code to replace `loadLinearCombinationsRDD` by `loadConstraintsRDD` and + // only navigate the constraint set once. + + // Loads the file to an RDD (i.e. 
distributed) R1CS instance + public < + FieldT extends AbstractFieldElementExpanded, + FieldParamsT extends AbstractFpParameters> + R1CSRelationRDD loadRDD( + final FieldT fieldONE, final FieldParamsT fieldParams, final Configuration config) + throws FileNotFoundException, IOException, ParseException { + // JSON parser object to parse read file + JSONParser jsonParser = new JSONParser(); + + FileReader reader = new FileReader(this.filename); + Object obj = jsonParser.parse(reader); + JSONObject jsonR1CS = (JSONObject) obj; + + // Retrieve the field characteristic for type safety + // Once recovered, we assert that r = FieldT::r to make sure types match + String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); + BigInteger mod = new BigInteger(jsonMod.substring(2), 16); + assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; + + long nbJSONConstraintArray = (Long) jsonR1CS.get("num_constraints"); + final int numPartitions = config.numPartitions(); + + // This array of partitions is useful to feed into the lambda function to map + // a partition ID (an index in [numPartition]) to the corresponding R1CS chunk + // See in `loadLinearCombinationRDD`. + // + // The default config has 2 partitions per RDD. + // We add "partition 0" and "partition 1" to the array of partitions. + final ArrayList partitions = new ArrayList<>(); + for (int i = 0; i < numPartitions; i++) { + partitions.add(i); } - - /// RDD - - private > - JavaPairRDD> loadLinearCombinationRDD( - final FieldT fieldONE, - final ArrayList partitionSet, - final JSONArray jsonConstraintArray, - final String linearCombKey, - final Configuration config - ){ - final int nbPartitions = config.numPartitions(); - final int nbConstraints = jsonConstraintArray.size(); - - // Normally, Spark tries to set the number of partitions automatically based on the cluster. - // However, it is possible to pass it manually via a second parameter to parallelize - // (e.g. sc.parallelize(data, 10)) - // See: https://spark.apache.org/docs/3.0.0/rdd-programming-guide.html#parallelized-collections - // - // NOTE1: Here "the initial dataset" is just an array of integers (the "partitions" which is - // an ArrayList). This array (e.g. [0,1,2]) is used to build the data in each partitions - // via the use of a lambda expression in `flatMapToPair` which maps an Integer to the output - // of the block below (i.e. `return linearCombinationChunk.iterator();`). - // In other words, for each "partition index" a partition "content" is returned. - // See: https://www.w3schools.com/java/java_lambda.asp for syntax of java lambda functions - // - // NOTE2: The signature of the `flatMapToPair` function is: - // `flatMapToPair[K2, V2](f: PairFlatMapFunction[T, K2, V2]): JavaPairRDD[K2, V2]` - // Hence, it returns a JavaRDDPair just as expected. - return config.sparkContext().parallelize(partitionSet, nbPartitions) - .flatMapToPair(partitionID -> { - // Compute the size of the data set processed by the partition `partitionID` - // Note: all partitions will have data sets of the same size, apart from the last one - // (if it exists) which will have a dataset made of the remaining entries (i.e. the - // cardinality of this set will be < than the one of the other partitions) - final long partitionSize = (partitionID == nbPartitions ? 
- nbConstraints % nbPartitions : nbConstraints / nbPartitions); - - final ArrayList>> linearCombinationChunk = new ArrayList<>(); - // Build the partition data set - // For each constraints in the data set - for (long i = 0; i < partitionSize; i++) { - final long constraintIndex = partitionID * (nbConstraints / nbPartitions) + i; - - // Here `constraintIndex` determines the constraint ID - // and `constraintArrayIndex` determines the lin comb of this constraint - // Hence, `next` contains the linear combination to parse - JSONObject jsonConstraint = (JSONObject) jsonConstraintArray.get((int) constraintIndex); - JSONObject jsonLinCombs = (JSONObject) jsonConstraint.get("linear_combination"); - JSONArray jsonLinComb = (JSONArray) jsonLinCombs.get(linearCombKey); - - // Parse all LinearTerms of the linear combination of interest - for (int j = 0; j < jsonLinComb.size(); j++) { - JSONObject jsonTerm = (JSONObject) jsonLinComb.get(j); - String valueStr = (String) jsonTerm.get("value"); - // Wire values are exported as hexadecimal strings - // (we remove the '0x' prefix using `substring`) - BigInteger value = new BigInteger(valueStr.substring(2), 16); - // FieldT extends `AbstractFieldElementExpanded` which has `construct` from BigInteger - FieldT valueField = fieldONE.construct(value); - linearCombinationChunk.add(new Tuple2<>(constraintIndex, new LinearTerm((Long) jsonTerm.get("index"), valueField))); - } - } - return linearCombinationChunk.iterator(); - }); + // If the number of constraints is not a multiple of `numPartitions`, + // we add an extra partition to process the remaining part of the dataset. + // NOTE: This only makes sense when the # constraints > # partitions + // which will be the case in most (realistic) scenarios + if (nbJSONConstraintArray % numPartitions != 0) { + partitions.add(numPartitions); } - // TODO: Spend some time to properly understand the behavior of the JSON library and the file reader - // we don't want to load all the file in memory, ideally everything should be "streamed". - // This may lead to several passes on the file (as we need to navigate all constraints) multiple - // times to load each individual linear combination, but this may be more desirable than reading - // and keeping an enormous file in memory (e.g. an R1CS of billion gates). - // Else, refactor the code to replace `loadLinearCombinationsRDD` by `loadConstraintsRDD` and - // only navigate the constraint set once. - - // Loads the file to an RDD (i.e. 
distributed) R1CS instance - public , FieldParamsT extends AbstractFpParameters> - R1CSRelationRDD loadRDD( - final FieldT fieldONE, - final FieldParamsT fieldParams, - final Configuration config - ) throws FileNotFoundException, IOException, ParseException { - // JSON parser object to parse read file - JSONParser jsonParser = new JSONParser(); - - FileReader reader = new FileReader(this.filename); - Object obj = jsonParser.parse(reader); - JSONObject jsonR1CS = (JSONObject) obj; - - // Retrieve the field characteristic for type safety - // Once recovered, we assert that r = FieldT::r to make sure types match - String jsonMod = (String) jsonR1CS.get("scalar_field_characteristic"); - BigInteger mod = new BigInteger(jsonMod.substring(2), 16); - assert (mod.equals(fieldParams.modulus())) : "Modulus mismatch while loading R1CS"; - - long nbJSONConstraintArray = (Long) jsonR1CS.get("num_constraints"); - final int numPartitions = config.numPartitions(); - - // This array of partitions is useful to feed into the lambda function to map - // a partition ID (an index in [numPartition]) to the corresponding R1CS chunk - // See in `loadLinearCombinationRDD`. - // - // The default config has 2 partitions per RDD. - // We add "partition 0" and "partition 1" to the array of partitions. - final ArrayList partitions = new ArrayList<>(); - for (int i = 0; i < numPartitions; i++) { - partitions.add(i); - } - // If the number of constraints is not a multiple of `numPartitions`, - // we add an extra partition to process the remaining part of the dataset. - // NOTE: This only makes sense when the # constraints > # partitions - // which will be the case in most (realistic) scenarios - if (nbJSONConstraintArray % numPartitions != 0) { - partitions.add(numPartitions); - } - - // Get a ref to the array of constraints - JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); - - // Load all LinearCombinations "A" - JavaPairRDD> linCombinationA = loadLinearCombinationRDD( - fieldONE, partitions, jsonConstraintArray, "A", config); - - // Load all LinearCombinations "B" - JavaPairRDD> linCombinationB = loadLinearCombinationRDD( - fieldONE, partitions, jsonConstraintArray, "B", config); - - // Load all LinearCombinations "C" - JavaPairRDD> linCombinationC = loadLinearCombinationRDD( - fieldONE, partitions, jsonConstraintArray, "C", config); - - // Make sure that the loaded data is sound - // `.count()` returns the number of elements in the RDD. - - // [BEGIN DEBUG BLOCK] - // Be careful, these operations are very expensive, only do with very small datasets! 
- //System.out.println("linCombinationA.groupByKey().count() = " + linCombinationA.groupByKey().count()); - //System.out.println("linCombinationB.groupByKey().count() = " + linCombinationB.groupByKey().count()); - //System.out.println("linCombinationC.groupByKey().count() = " + linCombinationC.groupByKey().count()); - //for(Tuple2>> entry:linCombinationA.groupByKey().collect()){ - // System.out.println("* " + entry); - //} - assert linCombinationA.groupByKey().count() == linCombinationB.groupByKey().count(); - assert linCombinationB.groupByKey().count() == linCombinationC.groupByKey().count(); - assert linCombinationC.groupByKey().count() == nbJSONConstraintArray; - // [END DEBUG BLOCK] - - final R1CSConstraintsRDD constraints = new R1CSConstraintsRDD<>( + // Get a ref to the array of constraints + JSONArray jsonConstraintArray = (JSONArray) jsonR1CS.get("constraints"); + + // Load all LinearCombinations "A" + JavaPairRDD> linCombinationA = + loadLinearCombinationRDD(fieldONE, partitions, jsonConstraintArray, "A", config); + + // Load all LinearCombinations "B" + JavaPairRDD> linCombinationB = + loadLinearCombinationRDD(fieldONE, partitions, jsonConstraintArray, "B", config); + + // Load all LinearCombinations "C" + JavaPairRDD> linCombinationC = + loadLinearCombinationRDD(fieldONE, partitions, jsonConstraintArray, "C", config); + + // Make sure that the loaded data is sound + // `.count()` returns the number of elements in the RDD. + + // [BEGIN DEBUG BLOCK] + // Be careful, these operations are very expensive, only do with very small datasets! + // System.out.println("linCombinationA.groupByKey().count() = " + + // linCombinationA.groupByKey().count()); + // System.out.println("linCombinationB.groupByKey().count() = " + + // linCombinationB.groupByKey().count()); + // System.out.println("linCombinationC.groupByKey().count() = " + + // linCombinationC.groupByKey().count()); + // for(Tuple2>> entry:linCombinationA.groupByKey().collect()){ + // System.out.println("* " + entry); + // } + assert linCombinationA.groupByKey().count() == linCombinationB.groupByKey().count(); + assert linCombinationB.groupByKey().count() == linCombinationC.groupByKey().count(); + assert linCombinationC.groupByKey().count() == nbJSONConstraintArray; + // [END DEBUG BLOCK] + + final R1CSConstraintsRDD constraints = + new R1CSConstraintsRDD<>( linCombinationA, linCombinationB, linCombinationC, nbJSONConstraintArray); - - // Convert Long to int "safely": an exception is raised if Long overflows the int - // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) - // Add +1 to `numInputs` to account for the manual handling of ONE - int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; - // Add +1 to `numVariables` to account for the manual handling of ONE - int numVariables = Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; - int numAuxiliary = numVariables - numInputs; - - R1CSRelationRDD relation = new R1CSRelationRDD(constraints, numInputs, numAuxiliary); - assert relation.isValid(): "Loaded relation is invalid"; - - return relation; - } + + // Convert Long to int "safely": an exception is raised if Long overflows the int + // see: https://docs.oracle.com/javase/10/docs/api/java/lang/Math.html#toIntExact(long) + // Add +1 to `numInputs` to account for the manual handling of ONE + int numInputs = Math.toIntExact((Long) jsonR1CS.get("num_inputs")) + 1; + // Add +1 to `numVariables` to account for the manual handling of ONE + int numVariables = 
Math.toIntExact((Long) jsonR1CS.get("num_variables")) + 1; + int numAuxiliary = numVariables - numInputs; + + R1CSRelationRDD relation = + new R1CSRelationRDD(constraints, numInputs, numAuxiliary); + assert relation.isValid() : "Loaded relation is invalid"; + + return relation; + } } diff --git a/src/main/java/relations/objects/LinearCombination.java b/src/main/java/relations/objects/LinearCombination.java index 78cdd13..fe839dc 100755 --- a/src/main/java/relations/objects/LinearCombination.java +++ b/src/main/java/relations/objects/LinearCombination.java @@ -35,12 +35,18 @@ public boolean isValid(final int numVariables) { return true; } - /// WARNING: There are a few differences between how the protoboard is implemented in libsnark and how - /// the constraint system is handled here. In libsnark, the variable ONE is added by default as the first variable - /// on the protoboard (the user does not have to bother specifying ONE in their assignement). This is why - /// in the linear combination evaluation (see here: https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/variable.tcc#L267) - /// there is shift by one (in `assignment[lt.index-1]`) because the linear_combination.size() = assignment.size() + 1 since - /// the linear combination's first entry will be for ONE, while the first entry in `assignment` will be for the next/first user-defined + /// WARNING: There are a few differences between how the protoboard is implemented in libsnark and + // how + /// the constraint system is handled here. In libsnark, the variable ONE is added by default as + // the first variable + /// on the protoboard (the user does not have to bother specifying ONE in their assignement). This + // is why + /// in the linear combination evaluation (see here: + // https://github.com/clearmatics/libsnark/blob/master/libsnark/relations/variable.tcc#L267) + /// there is shift by one (in `assignment[lt.index-1]`) because the linear_combination.size() = + // assignment.size() + 1 since + /// the linear combination's first entry will be for ONE, while the first entry in `assignment` + // will be for the next/first user-defined /// variable. As such we have, for instance: /// ___ /// | 5 | ____ @@ -53,19 +59,24 @@ public boolean isValid(final int numVariables) { /// LinComb Assignment /// /// Will return: 5*ONE + 12*0 + 2*1 + 4*1 + 1*1 = 5+2+4+1 = 12 - /// when evaluated by the function below (i.e. the first entry in the linear combination is interpreted as factor of ONE) + /// when evaluated by the function below (i.e. the first entry in the linear combination is + // interpreted as factor of ONE) /// - /// HOWEVER, here in DIZK, things are managed differently, and the ONE variable needs to be given as part of the assignment. - /// As such, there is not such "shift-by-one", hence, the ith entry in LinComb gets multiplied by the ith entry in assignment - /// (and not by the (i-1)th entry as in libsnark). + /// HOWEVER, here in DIZK, things are managed differently, and the ONE variable needs to be given + // as part of the assignment. + /// As such, there is not such "shift-by-one", hence, the ith entry in LinComb gets multiplied by + // the ith entry in assignment + /// (and not by the (i-1)th entry as in libsnark). public FieldT evaluate(final Assignment input) { FieldT result = input.get(0).zero(); final FieldT one = result.one(); // Note: Assuming the "coefficient" representation of the linear combination A = [0 0 1 0 0] // then the 0 terms are not represented in the LinearCombination object. 
In fact, in this - // example, A will be an Array of LinearTerms of size 1, with a single term LinearTerm(index = 2, value = 1) - // Hence, the ternary operation in the `for` loop below will assign `one` to `value` only when the `ONE` variable + // example, A will be an Array of LinearTerms of size 1, with a single term LinearTerm(index = + // 2, value = 1) + // Hence, the ternary operation in the `for` loop below will assign `one` to `value` only when + // the `ONE` variable // is selected in the linear combination. for (int i = 0; i < terms.size(); i++) { final long index = terms.get(i).index(); diff --git a/src/test/java/io/JSONR1CSLoaderTest.java b/src/test/java/io/JSONR1CSLoaderTest.java index a2c903d..131caeb 100644 --- a/src/test/java/io/JSONR1CSLoaderTest.java +++ b/src/test/java/io/JSONR1CSLoaderTest.java @@ -1,155 +1,147 @@ package io; -import relations.r1cs.R1CSRelation; -import relations.r1cs.R1CSRelationRDD; -import scala.Tuple2; -import algebra.fields.AbstractFieldElementExpanded; -import configuration.Configuration; -import relations.objects.LinearTerm; -import relations.objects.R1CSConstraintsRDD; - -import org.junit.jupiter.api.Test; -import static org.junit.jupiter.api.Assertions.assertTrue; import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; import static org.junit.jupiter.api.Assertions.fail; import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; -import io.JSONR1CSLoader; -import java.nio.file.Paths; -import java.nio.file.Path; +import configuration.Configuration; import java.io.FileNotFoundException; import java.io.IOException; import java.nio.file.Files; - -import relations.objects.Assignment; - -import java.util.ArrayList; +import java.nio.file.Path; +import java.nio.file.Paths; import java.util.Arrays; import java.util.List; - -import profiler.utils.SparkUtils; - +import org.apache.spark.SparkConf; import org.apache.spark.api.java.JavaPairRDD; import org.apache.spark.api.java.JavaRDD; import org.apache.spark.api.java.JavaSparkContext; import org.apache.spark.storage.StorageLevel; import org.json.simple.parser.ParseException; -import org.apache.spark.SparkConf; +import org.junit.jupiter.api.Test; +import profiler.utils.SparkUtils; +import relations.objects.Assignment; +import relations.r1cs.R1CSRelation; +import relations.r1cs.R1CSRelationRDD; +import scala.Tuple2; public class JSONR1CSLoaderTest { - - @Test - public void loadSerialTest(){ - // Load the test data - String dizkHome = System.getenv("DIZK"); - Path pathToFile = Paths.get( - dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); - if (!Files.exists(pathToFile)) { - fail("Test r1cs file not found."); - } - JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - R1CSRelation loadedRelation = loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); - assertTrue(loadedRelation.isValid()); - - // Make sure the loaded relation is satisfied with a VALID assignment - Assignment primary = new Assignment(); - // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) - // see further discussion in the `evaluate` function in `LinearCombination.java` - primary.add(BN254aFr.ONE); - primary.add(new BN254aFr("12")); - Assignment auxiliary = new Assignment(); - auxiliary.add(new BN254aFr("1")); - auxiliary.add(new BN254aFr("1")); - auxiliary.add(new BN254aFr("1")); - assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); - - // Make sure the loaded relation is NOT satisfied 
with an INVALID assignment - Assignment invalidPrimary = new Assignment(); - invalidPrimary.add(BN254aFr.ONE); - invalidPrimary.add(new BN254aFr("12")); - Assignment invalidAuxiliary = new Assignment(); - invalidAuxiliary.add(new BN254aFr("2")); - invalidAuxiliary.add(new BN254aFr("1")); - invalidAuxiliary.add(new BN254aFr("1")); - assertFalse(loadedRelation.isSatisfied(invalidPrimary, invalidAuxiliary)); + + @Test + public void loadSerialTest() { + // Load the test data + String dizkHome = System.getenv("DIZK"); + Path pathToFile = Paths.get(dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); + if (!Files.exists(pathToFile)) { + fail("Test r1cs file not found."); + } + JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); + R1CSRelation loadedRelation = loader.loadSerial(BN254aFr.ONE, BN254aFr.FrParameters); + assertTrue(loadedRelation.isValid()); + + // Make sure the loaded relation is satisfied with a VALID assignment + Assignment primary = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in `LinearCombination.java` + primary.add(BN254aFr.ONE); + primary.add(new BN254aFr("12")); + Assignment auxiliary = new Assignment(); + auxiliary.add(new BN254aFr("1")); + auxiliary.add(new BN254aFr("1")); + auxiliary.add(new BN254aFr("1")); + assertTrue(loadedRelation.isSatisfied(primary, auxiliary)); + + // Make sure the loaded relation is NOT satisfied with an INVALID assignment + Assignment invalidPrimary = new Assignment(); + invalidPrimary.add(BN254aFr.ONE); + invalidPrimary.add(new BN254aFr("12")); + Assignment invalidAuxiliary = new Assignment(); + invalidAuxiliary.add(new BN254aFr("2")); + invalidAuxiliary.add(new BN254aFr("1")); + invalidAuxiliary.add(new BN254aFr("1")); + assertFalse(loadedRelation.isSatisfied(invalidPrimary, invalidAuxiliary)); + } + + @Test + public void loadRDDTest() { + // Load the test data + String dizkHome = System.getenv("DIZK"); + Path pathToFile = Paths.get(dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); + if (!Files.exists(pathToFile)) { + fail("Test r1cs file not found."); } - @Test - public void loadRDDTest(){ - // Load the test data - String dizkHome = System.getenv("DIZK"); - Path pathToFile = Paths.get( - dizkHome, "src", "test", "java", "data", "simple_gadget_r1cs.json"); - if (!Files.exists(pathToFile)) { - fail("Test r1cs file not found."); - } - - // Set up configuration and SPARK context - final SparkConf conf = new SparkConf().setMaster("local").setAppName("loader"); - conf.set("spark.files.overwrite", "true"); - conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer"); - conf.registerKryoClasses(SparkUtils.zksparkClasses()); - - final JavaSparkContext sc; - final Configuration config; - sc = new JavaSparkContext(conf); - config = new Configuration(1, 1, 1, 2, sc, StorageLevel.MEMORY_ONLY()); - config.setRuntimeFlag(false); - config.setDebugFlag(true); - - JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); - - try{ - R1CSRelationRDD loadedRelationRDD = loader.loadRDD(BN254aFr.ONE, BN254aFr.FrParameters, config); - assertTrue(loadedRelationRDD.isValid()); - - // Make sure the loaded relation is satisfied with a VALID assignment - Assignment primary = new Assignment(); - // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) - // see further discussion in the `evaluate` function in `LinearCombination.java` - 
primary.add(BN254aFr.ONE); - primary.add(new BN254aFr("12")); - - List> fullAssignment = Arrays.asList( - new Tuple2<>((long) 0, BN254aFr.ONE), // Primary - new Tuple2<>((long) 1, new BN254aFr("12")), - new Tuple2<>((long) 2, new BN254aFr("1")), // Auxiliary - new Tuple2<>((long) 3, new BN254aFr("1")), - new Tuple2<>((long) 4, new BN254aFr("1")) - ); - JavaRDD> assignmentRDD = sc.parallelize(fullAssignment); - JavaPairRDD pairAssignmentRDD = JavaPairRDD.fromJavaRDD(assignmentRDD); - - boolean result = loadedRelationRDD.isSatisfied(primary, pairAssignmentRDD); - System.out.println("==========> Result after assignment: " + result); - assertTrue(result); - - // Make sure the loaded relation is NOT satisfied with an INVALID assignment - Assignment primaryInvalid = new Assignment(); - // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) - // see further discussion in the `evaluate` function in `LinearCombination.java` - primaryInvalid.add(BN254aFr.ONE); - primaryInvalid.add(new BN254aFr("12")); - - List> fullAssignmentInvalid = Arrays.asList( - new Tuple2<>((long) 0, BN254aFr.ONE), // Primary - new Tuple2<>((long) 1, new BN254aFr("12")), - new Tuple2<>((long) 2, new BN254aFr("2")), // Invalid Auxiliary - new Tuple2<>((long) 3, new BN254aFr("1")), - new Tuple2<>((long) 4, new BN254aFr("1")) - ); - JavaRDD> invalidAssignmentRDD = sc.parallelize(fullAssignmentInvalid); - JavaPairRDD invalidPairAssignmentRDD = JavaPairRDD.fromJavaRDD(invalidAssignmentRDD); - - boolean invalidResult = loadedRelationRDD.isSatisfied(primaryInvalid, invalidPairAssignmentRDD); - System.out.println("==========> Result after INVALID assignment: " + invalidResult); - assertFalse(invalidResult); - } catch (FileNotFoundException e) { - e.printStackTrace(); - } catch (IOException e) { - e.printStackTrace(); - } catch (ParseException e) { - e.printStackTrace(); - } + // Set up configuration and SPARK context + final SparkConf conf = new SparkConf().setMaster("local").setAppName("loader"); + conf.set("spark.files.overwrite", "true"); + conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer"); + conf.registerKryoClasses(SparkUtils.zksparkClasses()); + + final JavaSparkContext sc; + final Configuration config; + sc = new JavaSparkContext(conf); + config = new Configuration(1, 1, 1, 2, sc, StorageLevel.MEMORY_ONLY()); + config.setRuntimeFlag(false); + config.setDebugFlag(true); + + JSONR1CSLoader loader = new JSONR1CSLoader(pathToFile.toString()); + + try { + R1CSRelationRDD loadedRelationRDD = + loader.loadRDD(BN254aFr.ONE, BN254aFr.FrParameters, config); + assertTrue(loadedRelationRDD.isValid()); + + // Make sure the loaded relation is satisfied with a VALID assignment + Assignment primary = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in `LinearCombination.java` + primary.add(BN254aFr.ONE); + primary.add(new BN254aFr("12")); + + List> fullAssignment = + Arrays.asList( + new Tuple2<>((long) 0, BN254aFr.ONE), // Primary + new Tuple2<>((long) 1, new BN254aFr("12")), + new Tuple2<>((long) 2, new BN254aFr("1")), // Auxiliary + new Tuple2<>((long) 3, new BN254aFr("1")), + new Tuple2<>((long) 4, new BN254aFr("1"))); + JavaRDD> assignmentRDD = sc.parallelize(fullAssignment); + JavaPairRDD pairAssignmentRDD = + JavaPairRDD.fromJavaRDD(assignmentRDD); + + boolean result = loadedRelationRDD.isSatisfied(primary, pairAssignmentRDD); + 
System.out.println("==========> Result after assignment: " + result); + assertTrue(result); + + // Make sure the loaded relation is NOT satisfied with an INVALID assignment + Assignment primaryInvalid = new Assignment(); + // Allocate ONE - needs to be done manually (as opposed to how things are done in libsnark) + // see further discussion in the `evaluate` function in `LinearCombination.java` + primaryInvalid.add(BN254aFr.ONE); + primaryInvalid.add(new BN254aFr("12")); + + List> fullAssignmentInvalid = + Arrays.asList( + new Tuple2<>((long) 0, BN254aFr.ONE), // Primary + new Tuple2<>((long) 1, new BN254aFr("12")), + new Tuple2<>((long) 2, new BN254aFr("2")), // Invalid Auxiliary + new Tuple2<>((long) 3, new BN254aFr("1")), + new Tuple2<>((long) 4, new BN254aFr("1"))); + JavaRDD> invalidAssignmentRDD = sc.parallelize(fullAssignmentInvalid); + JavaPairRDD invalidPairAssignmentRDD = + JavaPairRDD.fromJavaRDD(invalidAssignmentRDD); + + boolean invalidResult = + loadedRelationRDD.isSatisfied(primaryInvalid, invalidPairAssignmentRDD); + System.out.println("==========> Result after INVALID assignment: " + invalidResult); + assertFalse(invalidResult); + } catch (FileNotFoundException e) { + e.printStackTrace(); + } catch (IOException e) { + e.printStackTrace(); + } catch (ParseException e) { + e.printStackTrace(); } + } } From 95435742278e6730975f0a54442beb6e5d7167d5 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 12 Oct 2020 18:13:27 +0100 Subject: [PATCH 33/94] Added missing env var to CI --- .github/workflows/main.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml index d44fcf6..dac45e6 100644 --- a/.github/workflows/main.yml +++ b/.github/workflows/main.yml @@ -14,7 +14,7 @@ jobs: with: java-version: '11' - name: Run tests with Maven - run: mvn clean test + run: DIZK=`pwd` mvn clean test check-fmt: runs-on: ubuntu-20.04 From 0fa8fda0b963f662c1ef14cb04459bc5a02f02cd Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 4 Jan 2021 17:54:32 +0000 Subject: [PATCH 34/94] Added a few misc comments and fixed typos --- src/main/java/algebra/msm/NaiveMSM.java | 27 ++++++++++++++----- src/main/java/algebra/msm/README.md | 6 +++-- .../java/algebra/msm/VariableBaseMSM.java | 1 + src/main/java/common/MathUtils.java | 3 +++ src/main/java/common/NaiveEvaluation.java | 22 +++++++++++++-- src/main/java/common/Utils.java | 13 ++++----- src/main/java/io/JSONR1CSLoader.java | 2 +- 7 files changed, 57 insertions(+), 17 deletions(-) diff --git a/src/main/java/algebra/msm/NaiveMSM.java b/src/main/java/algebra/msm/NaiveMSM.java index 55706d8..04a8f44 100755 --- a/src/main/java/algebra/msm/NaiveMSM.java +++ b/src/main/java/algebra/msm/NaiveMSM.java @@ -14,10 +14,15 @@ import java.util.List; import org.apache.spark.api.java.JavaPairRDD; +/** + * Class exposing naive methods for MSM (i.e. no speedup used to compute the sum + * of multiple EC multiplications, the MSM result is obtained by summing the + * individual multiplications.) + */ public class NaiveMSM { - public static > ArrayList fixedBaseMSM( - List scalars, GroupT base) { + public static > + ArrayList fixedBaseMSM(List scalars, GroupT base) { ArrayList result = new ArrayList<>(scalars.size()); for (int i = 0; i < scalars.size(); i++) { @@ -27,8 +32,13 @@ public static > ArrayList fixedBase return result; } - public static > GroupT variableBaseMSM( - ArrayList scalars, ArrayList bases) { + /** + * Computes a Multi-Scalar Multiplication. + * i.e. 
on input bases and scalars , the function + * returns the group element R = s1 * P1 + ... + sn * Pn + */ + public static > + GroupT variableBaseMSM(ArrayList scalars, ArrayList bases) { assert (scalars.size() == bases.size()); assert (scalars.size() > 0); @@ -41,8 +51,13 @@ public static > GroupT variableBaseMSM( return result; } - public static > FieldT variableBaseMSM( - List scalars, List bases) { + /** + * Computes a Multi-Scalar Multiplication. + * i.e. on input bases and scalars , the function + * returns the field element R = s1 * P1 + ... + sn * Pn + */ + public static > + FieldT variableBaseMSM(List scalars, List bases) { assert (scalars.size() == bases.size()); assert (scalars.size() > 0); diff --git a/src/main/java/algebra/msm/README.md b/src/main/java/algebra/msm/README.md index ab8b07c..a862f98 100755 --- a/src/main/java/algebra/msm/README.md +++ b/src/main/java/algebra/msm/README.md @@ -4,8 +4,10 @@ The MSM package provides serial and parallel implementations of fixed-base and v ## Fixed Base (fixMSM) -The fixed-base multi-scalar multiplication is defined as follows: given a group _G_, an element _P_ in _G_, and scalars [_a1, ... , an_] in _Z^n_, compute [_a1 * P_, ... , _an * P_] in G^n. +The fixed-base multi-scalar multiplication is defined as follows: +Given a group _G_, an element _P_ in _G_, and scalars [_a1, ... , an_] in _Z^n_, compute [_a1 * P_, ... , _an * P_] in G^n. ## Variable Base (varMSM) -The variable-base multi-scalar multiplication is defined as follows: given a group _G_, elements [_P1_, ... , _Pn_] in _G^n_, and scalars [_a1, ... , an_] in _Z^n_, compute [_a1 * P1_, ... , _an * Pn_] in G^n. \ No newline at end of file +The variable-base multi-scalar multiplication is defined as follows: +Given a group _G_, elements [_P1_, ... , _Pn_] in _G^n_, and scalars [_a1, ... , an_] in _Z^n_, compute [_a1 * P1_, ... , _an * Pn_] in G^n. \ No newline at end of file diff --git a/src/main/java/algebra/msm/VariableBaseMSM.java b/src/main/java/algebra/msm/VariableBaseMSM.java index 5ae4bf5..4834f33 100755 --- a/src/main/java/algebra/msm/VariableBaseMSM.java +++ b/src/main/java/algebra/msm/VariableBaseMSM.java @@ -21,6 +21,7 @@ public class VariableBaseMSM { + /** See: Bos-Coster's algorithm for efficient MSMs */ public static final BigInteger BOS_COSTER_MSM_THRESHOLD = new BigInteger("1048576"); /** diff --git a/src/main/java/common/MathUtils.java b/src/main/java/common/MathUtils.java index 76430b4..c94d269 100755 --- a/src/main/java/common/MathUtils.java +++ b/src/main/java/common/MathUtils.java @@ -15,10 +15,12 @@ public static boolean isPowerOfTwo(final long x) { return (x & (x - 1)) == 0; } + /** Returns the smallest power of 2 greater or equal to n (where n is int) */ public static int lowestPowerOfTwo(final int n) { if (n < 1) { return 1; } + int result = 1; while (result < n) { result <<= 1; @@ -26,6 +28,7 @@ public static int lowestPowerOfTwo(final int n) { return result; } + /** Returns the smallest power of 2 greater or equal to n (where n is long) */ public static long lowestPowerOfTwo(final long n) { if (n < 1) { return 1; diff --git a/src/main/java/common/NaiveEvaluation.java b/src/main/java/common/NaiveEvaluation.java index 6cb80a6..a673dd3 100755 --- a/src/main/java/common/NaiveEvaluation.java +++ b/src/main/java/common/NaiveEvaluation.java @@ -14,8 +14,26 @@ public class NaiveEvaluation { - public static > FieldT evaluatePolynomial( - final List input, final FieldT t) { + /** + * Function evaluating polynomials using Horner's rule. + * i.e. 
a polynomial of the form: + * p(x) = c_m * x^m + ... + c_1 * x + c_0 + * is decomposed as: + * p(x) = (c_m * x^{m-1} + ... + c_1) * x + c_0, where q(x) = c_m * x^{m-1} + ... c_2 * x + c_1 + * is also defined as the product of a polynomial r(x) times x, deg(r) = deg(q) - 1. + * Example: + * p(x) = 3x^3 + 7x^2 + x + 10 + * = (3x^2 + 7x + 1)*x + 10 + * = ((3x + 7)*x + 1)*x + 10 + * = (((3)*x + 7)*x + 1)*x + 10 + * which is equal to: + * (((0*x + 3)*x + 7)*x + 1)*x + 10 + * + * Hence a polynomial of degree m will be evaluated via m iterations of a loop in which a single + * muliplication by the evaluation point is done + a coefficient addition. + */ + public static > + FieldT evaluatePolynomial(final List input, final FieldT t) { final int m = input.size(); FieldT result = t.zero(); diff --git a/src/main/java/common/Utils.java b/src/main/java/common/Utils.java index 4f440f4..4e414d6 100755 --- a/src/main/java/common/Utils.java +++ b/src/main/java/common/Utils.java @@ -22,8 +22,8 @@ public class Utils { * Pads the given array input to length size. Note, if the input is already of length size or * greater, then nothing is done. */ - public static > ArrayList padArray( - final ArrayList input, final int size) { + public static > + ArrayList padArray(final ArrayList input, final int size) { if (input.size() >= size) { return input; } else { @@ -41,7 +41,8 @@ public static > ArrayList ArrayList> convertToPairs(final List input) { + public static + ArrayList> convertToPairs(final List input) { ArrayList> result = new ArrayList<>(input.size()); for (int i = 0; i < input.size(); i++) { @@ -56,7 +57,7 @@ public static ArrayList> convertToPairs(final List input) * elements of input indexed by the Long value storing the FieldT element. */ public static > - ArrayList convertFromPairs(final List> input, final int size) { + ArrayList convertFromPairs(final List> input, final int size) { // assert (input.size() == size); final FieldT zero = input.get(0)._2.zero(); ArrayList result = new ArrayList<>(Collections.nCopies(size, zero)); @@ -82,8 +83,8 @@ public static > ArrayList convertFr } /** Initializes a new JavaPairRDD of length size, indexed and filled with the given element. */ - public static JavaPairRDD fillRDD( - final long size, final T element, final Configuration config) { + public static + JavaPairRDD fillRDD(final long size, final T element, final Configuration config) { final int numPartitions = size >= config.numPartitions() ? config.numPartitions() : (int) size; final ArrayList partitions = new ArrayList<>(numPartitions); diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 649ae18..55f499a 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -73,7 +73,7 @@ R1CSConstraint loadConstraint(FieldT fieldONE, JSONObject jsonConstraint } /// Loads the file to a "local" (i.e. non-distributed) R1CS instance - /// Need to pass `fieldONE` as a was to bypass the limitations of java generics. + /// Need to pass `fieldONE` as a way to bypass the limitations of java generics. 
/// The `construct` function is used to instantiate elements of type FieldT from `fieldONE` public < FieldT extends AbstractFieldElementExpanded, From 9359015f867948fddd2b802f25e4c67737dca55e Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 4 Jan 2021 18:00:35 +0000 Subject: [PATCH 35/94] Applied formatting --- src/main/java/algebra/msm/NaiveMSM.java | 27 ++++++++++------------- src/main/java/common/NaiveEvaluation.java | 27 +++++++++-------------- src/main/java/common/Utils.java | 13 +++++------ 3 files changed, 28 insertions(+), 39 deletions(-) diff --git a/src/main/java/algebra/msm/NaiveMSM.java b/src/main/java/algebra/msm/NaiveMSM.java index 04a8f44..6bfd0db 100755 --- a/src/main/java/algebra/msm/NaiveMSM.java +++ b/src/main/java/algebra/msm/NaiveMSM.java @@ -15,14 +15,13 @@ import org.apache.spark.api.java.JavaPairRDD; /** - * Class exposing naive methods for MSM (i.e. no speedup used to compute the sum - * of multiple EC multiplications, the MSM result is obtained by summing the - * individual multiplications.) + * Class exposing naive methods for MSM (i.e. no speedup used to compute the sum of multiple EC + * multiplications, the MSM result is obtained by summing the individual multiplications.) */ public class NaiveMSM { - public static > - ArrayList fixedBaseMSM(List scalars, GroupT base) { + public static > ArrayList fixedBaseMSM( + List scalars, GroupT base) { ArrayList result = new ArrayList<>(scalars.size()); for (int i = 0; i < scalars.size(); i++) { @@ -33,12 +32,11 @@ ArrayList fixedBaseMSM(List scalars, GroupT base) { } /** - * Computes a Multi-Scalar Multiplication. - * i.e. on input bases and scalars , the function - * returns the group element R = s1 * P1 + ... + sn * Pn + * Computes a Multi-Scalar Multiplication. i.e. on input bases and scalars , the function returns the group element R = s1 * P1 + ... + sn * Pn */ - public static > - GroupT variableBaseMSM(ArrayList scalars, ArrayList bases) { + public static > GroupT variableBaseMSM( + ArrayList scalars, ArrayList bases) { assert (scalars.size() == bases.size()); assert (scalars.size() > 0); @@ -52,12 +50,11 @@ GroupT variableBaseMSM(ArrayList scalars, ArrayList bases) { } /** - * Computes a Multi-Scalar Multiplication. - * i.e. on input bases and scalars , the function - * returns the field element R = s1 * P1 + ... + sn * Pn + * Computes a Multi-Scalar Multiplication. i.e. on input bases and scalars , the function returns the field element R = s1 * P1 + ... + sn * Pn */ - public static > - FieldT variableBaseMSM(List scalars, List bases) { + public static > FieldT variableBaseMSM( + List scalars, List bases) { assert (scalars.size() == bases.size()); assert (scalars.size() > 0); diff --git a/src/main/java/common/NaiveEvaluation.java b/src/main/java/common/NaiveEvaluation.java index a673dd3..ee41464 100755 --- a/src/main/java/common/NaiveEvaluation.java +++ b/src/main/java/common/NaiveEvaluation.java @@ -15,25 +15,18 @@ public class NaiveEvaluation { /** - * Function evaluating polynomials using Horner's rule. - * i.e. a polynomial of the form: - * p(x) = c_m * x^m + ... + c_1 * x + c_0 - * is decomposed as: - * p(x) = (c_m * x^{m-1} + ... + c_1) * x + c_0, where q(x) = c_m * x^{m-1} + ... c_2 * x + c_1 - * is also defined as the product of a polynomial r(x) times x, deg(r) = deg(q) - 1. 
- * Example: - * p(x) = 3x^3 + 7x^2 + x + 10 - * = (3x^2 + 7x + 1)*x + 10 - * = ((3x + 7)*x + 1)*x + 10 - * = (((3)*x + 7)*x + 1)*x + 10 - * which is equal to: - * (((0*x + 3)*x + 7)*x + 1)*x + 10 + * Function evaluating polynomials using Horner's rule. i.e. a polynomial of the form: p(x) = c_m + * * x^m + ... + c_1 * x + c_0 is decomposed as: p(x) = (c_m * x^{m-1} + ... + c_1) * x + c_0, + * where q(x) = c_m * x^{m-1} + ... c_2 * x + c_1 is also defined as the product of a polynomial + * r(x) times x, deg(r) = deg(q) - 1. Example: p(x) = 3x^3 + 7x^2 + x + 10 = (3x^2 + 7x + 1)*x + + * 10 = ((3x + 7)*x + 1)*x + 10 = (((3)*x + 7)*x + 1)*x + 10 which is equal to: (((0*x + 3)*x + + * 7)*x + 1)*x + 10 * - * Hence a polynomial of degree m will be evaluated via m iterations of a loop in which a single - * muliplication by the evaluation point is done + a coefficient addition. + *

Hence a polynomial of degree m will be evaluated via m iterations of a loop in which a + * single muliplication by the evaluation point is done + a coefficient addition. */ - public static > - FieldT evaluatePolynomial(final List input, final FieldT t) { + public static > FieldT evaluatePolynomial( + final List input, final FieldT t) { final int m = input.size(); FieldT result = t.zero(); diff --git a/src/main/java/common/Utils.java b/src/main/java/common/Utils.java index 4e414d6..4f440f4 100755 --- a/src/main/java/common/Utils.java +++ b/src/main/java/common/Utils.java @@ -22,8 +22,8 @@ public class Utils { * Pads the given array input to length size. Note, if the input is already of length size or * greater, then nothing is done. */ - public static > - ArrayList padArray(final ArrayList input, final int size) { + public static > ArrayList padArray( + final ArrayList input, final int size) { if (input.size() >= size) { return input; } else { @@ -41,8 +41,7 @@ ArrayList padArray(final ArrayList input, final int size) { } /** Indexes the given list of inputs and returns an ArrayList of (Long, T) pairs. */ - public static - ArrayList> convertToPairs(final List input) { + public static ArrayList> convertToPairs(final List input) { ArrayList> result = new ArrayList<>(input.size()); for (int i = 0; i < input.size(); i++) { @@ -57,7 +56,7 @@ ArrayList> convertToPairs(final List input) { * elements of input indexed by the Long value storing the FieldT element. */ public static > - ArrayList convertFromPairs(final List> input, final int size) { + ArrayList convertFromPairs(final List> input, final int size) { // assert (input.size() == size); final FieldT zero = input.get(0)._2.zero(); ArrayList result = new ArrayList<>(Collections.nCopies(size, zero)); @@ -83,8 +82,8 @@ public static > ArrayList convertFr } /** Initializes a new JavaPairRDD of length size, indexed and filled with the given element. */ - public static - JavaPairRDD fillRDD(final long size, final T element, final Configuration config) { + public static JavaPairRDD fillRDD( + final long size, final T element, final Configuration config) { final int numPartitions = size >= config.numPartitions() ? config.numPartitions() : (int) size; final ArrayList partitions = new ArrayList<>(numPartitions); From cc9d60e020e584c3e70fce81b77f6b5f10c0128f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 4 Jan 2021 17:55:46 +0000 Subject: [PATCH 36/94] Added ref to bgm groth variant --- src/main/java/zk_proof_systems/zkSNARK/README.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/README.md b/src/main/java/zk_proof_systems/zkSNARK/README.md index 211bdd9..79110a2 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/README.md +++ b/src/main/java/zk_proof_systems/zkSNARK/README.md @@ -1,5 +1,4 @@ # zkSNARK -This folder contains an implementation of a publicly-verifiable preprocessing zkSNARK of Groth [\[Gro16\]](https://eprint.iacr.org/2016/260). -The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). -See [\[WZCPS18\]](https://eprint.iacr.org/2018/691) for pseudocode of the protocol and definition of R1CS. \ No newline at end of file +This folder contains an implementation of a variant of the publicly-verifiable preprocessing zkSNARK [\[Gro16\]](https://eprint.iacr.org/2016/260) proposed by Bowe et al [\[BGM17\]](https://eprint.iacr.org/2017/1050.pdf). +The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). 
\ No newline at end of file From 675d60c08b3eefef9eb906dbb3538277dfe3ada0 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 4 Jan 2021 17:57:27 +0000 Subject: [PATCH 37/94] Switched to BGM verification key shape --- .../zkSNARK/objects/VerificationKey.java | 22 +++++++++---------- 1 file changed, 11 insertions(+), 11 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java index 3bcd20f..0437a17 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java @@ -16,25 +16,25 @@ public class VerificationKey< G1T extends AbstractG1, G2T extends AbstractG2, GTT extends AbstractGT> { - private final GTT alphaG1betaG2; - private final G2T gammaG2; + private final G1T alphaG1; + private final G2T betaG2; private final G2T deltaG2; private final List gammaABC; public VerificationKey( - final GTT _alphaG1betaG2, final G2T _gammaG2, final G2T _deltaG2, final List _gammaABC) { - alphaG1betaG2 = _alphaG1betaG2; - gammaG2 = _gammaG2; + final G1T _alphaG1, final G2T _betaG2, final G2T _deltaG2, final List _gammaABC) { + alphaG1 = _alphaG1; + betaG2 = _betaG2; deltaG2 = _deltaG2; gammaABC = _gammaABC; } public boolean equals(final VerificationKey other) { - if (!alphaG1betaG2.equals(other.alphaG1betaG2())) { + if (!alphaG1.equals(other.alphaG1())) { return false; } - if (!gammaG2.equals(other.gammaG2())) { + if (!betaG2.equals(other.betaG2())) { return false; } @@ -55,12 +55,12 @@ public boolean equals(final VerificationKey other) { return true; } - public GTT alphaG1betaG2() { - return alphaG1betaG2; + public G1T alphaG1() { + return alphaG1; } - public G2T gammaG2() { - return gammaG2; + public G2T betaG2() { + return betaG2; } public G2T deltaG2() { From d0ab0e1049c208d63717991fe13a7bd3f74a6e0e Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 5 Jan 2021 17:53:28 +0000 Subject: [PATCH 38/94] Implemented serial bgm algos --- .../zkSNARK/SerialProver.java | 25 ++++++++-- .../zk_proof_systems/zkSNARK/SerialSetup.java | 48 ++++++++++--------- .../zk_proof_systems/zkSNARK/Verifier.java | 30 ++++++------ .../zk_proof_systems/zkSNARK/objects/CRS.java | 14 +++--- .../zkSNARK/objects/ProvingKey.java | 7 +++ .../zkSNARK/objects/VerificationKey.java | 32 +++++++------ 6 files changed, 93 insertions(+), 63 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java index 0529c32..ed4b682 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java @@ -57,12 +57,21 @@ Proof prove( final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(provingKey.r1cs(), t); assert (qap.isSatisfied(qapWitness)); + System.out.println("\n\t ===== qap.isSatisfied(qapWitness) TRUTH value: " + qap.isSatisfied(qapWitness)); } // Choose two random field elements for prover zero-knowledge. 
final FieldT r = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT s = fieldFactory.random(config.seed(), config.secureSeed()); + if (config.debugFlag()) { + assert(qapWitness.coefficientsABC().size() == qapWitness.numVariables()); + assert(provingKey.queryA().size() == qapWitness.numVariables()); + assert(provingKey.queryH().size() == qapWitness.degree() - 1); + assert(provingKey.deltaABCG1().size() == qapWitness.numVariables() - qapWitness.numInputs()); + System.out.println("\n\t ===== Asserts on size pass ====="); + } + // Get initial parameters from the proving key. final G1T alphaG1 = provingKey.alphaG1(); final G1T betaG1 = provingKey.betaG1(); @@ -77,6 +86,9 @@ Proof prove( config.beginRuntime("Proof"); config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + // A = alpha + \sum_{i=0}^{numVariables} var_i * A_i(t) + r * delta + // Below, the summation is decomposed as: + // \sum_{i=0}^{numInputs} pubInp_i * A_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * A_i(t) G1T evaluationAt = VariableBaseMSM.serialMSM(primary.elements(), provingKey.queryA().subList(0, numInputs)); evaluationAt = @@ -86,6 +98,9 @@ Proof prove( config.endLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); + // B = beta + \sum_{i=0}^{numVariables} var_i * B_i(t) + s * delta + // Below, the summation is decomposed as: + // \sum_{i=0}^{numInputs} pubInp_i * B_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * B_i(t) final Tuple2 evaluationBtPrimary = VariableBaseMSM.doubleMSM(primary.elements(), provingKey.queryB().subList(0, numInputs)); final Tuple2 evaluationBtWitness = @@ -100,12 +115,15 @@ Proof prove( VariableBaseMSM.serialMSM(qapWitness.coefficientsH(), provingKey.queryH()); config.endLog("Computing evaluation to query H"); - // Compute evaluationABC = a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. + // Compute evaluationABC = \sum_{i=numInputs+1}^{numVariables} inp_i * ((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. 
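Collecting the per-term comments of this hunk in one place (a restatement of the surrounding code in the same notation, where var_i denotes the full assignment and r, s the blinding scalars sampled above):

    A = alpha + \sum_{i=0}^{numVariables} var_i * A_i(t) + r * delta
    B = beta  + \sum_{i=0}^{numVariables} var_i * B_i(t) + s * delta
    C = ( \sum_{i=numInputs+1}^{numVariables} var_i * (beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t) ) / delta
        + s*A + r*B - r*s*delta

A and C are encoded in G1, B in both G1 and G2; the G1 copy of B is the one that enters C.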
config.beginLog("Computing evaluation to deltaABC"); final int numWitness = numVariables - numInputs; + //G1T evaluationABC = + // VariableBaseMSM.serialMSM( + // auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); G1T evaluationABC = VariableBaseMSM.serialMSM( - auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); + auxiliary.elements(), provingKey.deltaABCG1().subList(0, numWitness)); evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta config.endLog("Computing evaluation to deltaABC"); @@ -118,8 +136,7 @@ Proof prove( betaG1.add(evaluationBtG1).add(deltaG1.mul(s)), betaG2.add(evaluationBtG2).add(deltaG2.mul(s))); - // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - - // r*s*delta + // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); config.endRuntime("Proof"); diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java index 77bc90a..8cfaa21 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java @@ -31,20 +31,17 @@ public class SerialSetup { G2T extends AbstractG2, GTT extends AbstractGT, PairingT extends AbstractPairing> - CRS generate( + CRS generate( final R1CSRelation r1cs, final FieldT fieldFactory, final G1T g1Factory, final G2T g2Factory, - final PairingT pairing, final Configuration config) { // Generate secret randomness. final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT alpha = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT beta = fieldFactory.random(config.seed(), config.secureSeed()); - final FieldT gamma = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT delta = fieldFactory.random(config.seed(), config.secureSeed()); - final FieldT inverseGamma = gamma.inverse(); final FieldT inverseDelta = delta.inverse(); // A quadratic arithmetic program evaluated at t. @@ -55,18 +52,22 @@ CRS generate( System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); System.out.println("\tQAP - degree: " + qap.degree()); + // Size of the instance final int numInputs = qap.numInputs(); + // Number of circuit wires final int numVariables = qap.numVariables(); - // The gamma inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * gamma^{-1}. - config.beginLog("Computing gammaABC for R1CS verification key"); - final List gammaABC = new ArrayList<>(numInputs); + // ABC for vk: {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=0}^{numInputs} + config.beginLog("Computing ABC for R1CS verification key"); + final List vkABC = new ArrayList<>(numInputs); + // TODO: (Double check) I don't think we need to add 1 to the upper bound here (i.e. < numInputs + 1) + // because we manually add ONE to the inputs outside of this function when we construct the R1CS. for (int i = 0; i < numInputs; i++) { - gammaABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i)).mul(inverseGamma)); + vkABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i))); } - config.endLog("Computing gammaABC for R1CS verification key"); + config.endLog("Computing ABC for R1CS verification key"); - // The delta inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * delta^{-1}. 
+ // The delta inverse product component: {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=numInputs+1}^{numVariables} config.beginLog("Computing deltaABC for R1CS proving key"); final List deltaABC = new ArrayList<>(numVariables - numInputs); for (int i = numInputs; i < numVariables; i++) { @@ -88,7 +89,10 @@ CRS generate( config.endLog("Computing query densities"); config.beginLog("Generating G1 MSM Window Table"); - final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + //final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + // We take ONE in both G1 and G2 as generator, else we need to add the choosen generator + // as part of the SRS for the verifier to use it in the computation of [evaluationABC*1]_T + final G1T generatorG1 = g1Factory.one(); final int scalarCountG1 = nonZeroAt + nonZeroBt + numVariables; final int scalarSizeG1 = generatorG1.bitSize(); final int windowSizeG1 = FixedBaseMSM.getWindowSize(scalarCountG1, generatorG1); @@ -97,7 +101,10 @@ CRS generate( config.endLog("Generating G1 MSM Window Table"); config.beginLog("Generating G2 MSM Window Table"); - final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + //final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + // We take ONE in both G1 and G2 as generator, else we need to add the choosen generator + // as part of the SRS for the verifier to use it in the computation of [evaluationABC*1]_T + final G2T generatorG2 = g2Factory.one(); final int scalarCountG2 = nonZeroBt; final int scalarSizeG2 = generatorG2.bitSize(); final int windowSizeG2 = FixedBaseMSM.getWindowSize(scalarCountG2, generatorG2); @@ -137,7 +144,7 @@ CRS generate( config.endLog("Computing query B", false); config.beginLog("Computing query H", false); - final FieldT inverseDeltaZt = qap.Zt().mul(delta.inverse()); + final FieldT inverseDeltaZt = qap.Zt().mul(inverseDelta); for (int i = 0; i < qap.Ht().size(); i++) { qap.Ht().set(i, qap.Ht().get(i).mul(inverseDeltaZt)); } @@ -151,13 +158,10 @@ CRS generate( config.beginLog("Generating R1CS verification key"); config.beginRuntime("Verification Key"); - final GTT alphaG1betaG2 = pairing.reducedPairing(alphaG1, betaG2); - final G2T gammaG2 = generatorG2.mul(gamma); - - config.beginLog("Encoding gammaABC for R1CS verification key"); - final List gammaABCG1 = - FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, gammaABC); - config.endLog("Encoding gammaABC for R1CS verification key"); + config.beginLog("Encoding ABC for R1CS verification key"); + final List vkABCG1 = + FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, vkABC); + config.endLog("Encoding ABC for R1CS verification key"); config.endLog("Generating R1CS verification key"); config.endRuntime("Verification Key"); @@ -168,8 +172,8 @@ CRS generate( alphaG1, betaG1, betaG2, deltaG1, deltaG2, deltaABCG1, queryA, queryB, queryH, r1cs); // Construct the verification key. 
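For orientation, the keys constructed below bundle the encodings computed above. In the bracket notation used in the comments of this patch series, the proving key carries

    [alpha]_1, [beta]_1, [beta]_2, [delta]_1, [delta]_2,
    {[A_i(t)]_1}_i, {[B_i(t)]_1, [B_i(t)]_2}_i,
    {[(beta*A_i(t) + alpha*B_i(t) + C_i(t))/delta]_1}_{i > numInputs},
    {[t^i * Z(t)/delta]_1}_i,

plus the R1CS relation itself, while the verification key carries

    [alpha]_1, [beta]_2, [delta]_2, {[beta*A_i(t) + alpha*B_i(t) + C_i(t)]_1}_{i <= numInputs}.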
- final VerificationKey verificationKey = - new VerificationKey<>(alphaG1betaG2, gammaG2, deltaG2, gammaABCG1); + final VerificationKey verificationKey = + new VerificationKey<>(alphaG1, betaG2, deltaG2, vkABCG1); return new CRS<>(provingKey, verificationKey); } diff --git a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/Verifier.java index 3d659b1..cafeb4a 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java +++ b/src/main/java/zk_proof_systems/zkSNARK/Verifier.java @@ -26,7 +26,7 @@ public class Verifier { GTT extends AbstractGT, PairingT extends AbstractPairing> boolean verify( - final VerificationKey verificationKey, + final VerificationKey verificationKey, final Assignment primaryInput, final Proof proof, final PairingT pairing, @@ -35,26 +35,28 @@ boolean verify( final FieldT firstElement = primaryInput.get(0); assert (firstElement.equals(firstElement.one())); - // Compute the left hand side: A * B. - final GTT AB = pairing.reducedPairing(proof.gA(), proof.gB()); + // LHS: Compute A * B + final GTT LHS = pairing.reducedPairing(proof.gA(), proof.gB()); - // Get alpha, beta, gamma, and delta from the verification key. - final GTT alphaBeta = verificationKey.alphaG1betaG2(); // alpha * beta - final G2T gamma = verificationKey.gammaG2(); + // RHS: Compute [alpha * beta]_T + final GTT alphaBeta = pairing.reducedPairing(verificationKey.alphaG1(),verificationKey.betaG2()); + + // RHS: Compute [C * delta]_T final G2T delta = verificationKey.deltaG2(); - final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); // C * delta + final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); - // Compute the summation of primaryInput[i] * (beta*u_i(x) + alpha*v_i(x) + w_i(x)) * gamma^{-1} + // RHS: Compute \sum_{i=0}^{numInputs} pubInp_i * (beta*A_i(x) + alpha*B_i(x) + C_i(x)) final G1T evaluationABC = - VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.gammaABC()); + VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.ABC()); - // Compute the right hand side: alpha*beta + evaluationABC*gamma + C*delta - final GTT C = alphaBeta.add(pairing.reducedPairing(evaluationABC, gamma)).add(CDelta); + // Compute the RHS: [alpha*beta + evaluationABC*1 + C*delta]_T + final G2T generatorG2 = delta.one(); // See SerialSetup, we take ONE in both G1 and G2 as generator. 
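Written out, the pairing check assembled in the next statement is, in the [.]_T notation of the comments above (with pubInp_i the primary input and ABC_i the verification key terms):

    [A * B]_T  ==  [alpha * beta]_T + [(\sum_{i=0}^{numInputs} pubInp_i * ABC_i) * 1]_T + [C * delta]_T

that is, the Groth16 equation with gamma set to 1, the simplification adopted from the BGM17 variant referenced in the README.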
+ final GTT RHS = alphaBeta.add(pairing.reducedPairing(evaluationABC, generatorG2)).add(CDelta); - final boolean verifierResult = AB.equals(C); + final boolean verifierResult = LHS.equals(RHS); if (!verifierResult && config.verboseFlag()) { - System.out.println("Verifier failed: "); - System.out.println("\n\tA * B = " + AB + "\nC = " + C); + System.out.println("Verification failed: "); + System.out.println("\n\tLHS = " + LHS + "\n\tRHS = " + RHS); } return verifierResult; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java b/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java index ad2bed2..72d59c1 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java @@ -9,25 +9,23 @@ import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; -import algebra.curves.AbstractGT; import algebra.fields.AbstractFieldElementExpanded; import java.io.Serializable; -/** Groth16 Common Reference String (CRS) */ +/** Groth16-BGM17 Common Reference String (CRS) */ public class CRS< FieldT extends AbstractFieldElementExpanded, G1T extends AbstractG1, - G2T extends AbstractG2, - GTT extends AbstractGT> + G2T extends AbstractG2> implements Serializable { private final ProvingKey provingKey; private final ProvingKeyRDD provingKeyRDD; - private final VerificationKey verificationKey; + private final VerificationKey verificationKey; public CRS( final ProvingKeyRDD _provingKeyRDD, - final VerificationKey _verificationKey) { + final VerificationKey _verificationKey) { provingKey = null; provingKeyRDD = _provingKeyRDD; verificationKey = _verificationKey; @@ -35,7 +33,7 @@ public CRS( public CRS( final ProvingKey _provingKey, - final VerificationKey _verificationKey) { + final VerificationKey _verificationKey) { provingKey = _provingKey; provingKeyRDD = null; verificationKey = _verificationKey; @@ -49,7 +47,7 @@ public ProvingKeyRDD provingKeyRDD() { return provingKeyRDD; } - public VerificationKey verificationKey() { + public VerificationKey verificationKey() { return verificationKey; } } diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java index bf31d3a..556f96a 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java @@ -22,14 +22,21 @@ public class ProvingKey< G2T extends AbstractG2> implements Serializable { + // [alpha]_1 private final G1T alphaG1; + // [beta] private final G1T betaG1; private final G2T betaG2; + // [delta] private final G1T deltaG1; private final G2T deltaG2; + // {[(beta * A_i(t) + alpha * B_i(t) + C_i(t))/delta]_1}_{i=numInputs+1}^{numVariables} private final List deltaABCG1; + // {[A_i(t)]_1}_{i=0}^{numVariables} private final List queryA; + // {[B_i(t)]}_{i=1}^{numVariables} private final List> queryB; + // {[t^i * Z(t)/delta]_1}_{i=1}^{numVariables - 2} private final List queryH; // The proving key holds the arithmetized relation private final R1CSRelation r1cs; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java index 0437a17..3bf919d 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java @@ -9,27 +9,29 @@ import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; -import algebra.curves.AbstractGT; import 
java.util.List; -/** Groth16 verification key */ -public class VerificationKey< - G1T extends AbstractG1, G2T extends AbstractG2, GTT extends AbstractGT> { +/** Groth16-BGM17 verification key */ +public class VerificationKey, G2T extends AbstractG2> { + // [alpha]_1 private final G1T alphaG1; + // [beta]_2 private final G2T betaG2; + // [delta]_2 private final G2T deltaG2; - private final List gammaABC; + // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=0}^{numInputs} + private final List ABC; public VerificationKey( - final G1T _alphaG1, final G2T _betaG2, final G2T _deltaG2, final List _gammaABC) { + final G1T _alphaG1, final G2T _betaG2, final G2T _deltaG2, final List _ABC) { alphaG1 = _alphaG1; betaG2 = _betaG2; deltaG2 = _deltaG2; - gammaABC = _gammaABC; + ABC = _ABC; } - public boolean equals(final VerificationKey other) { + public boolean equals(final VerificationKey other) { if (!alphaG1.equals(other.alphaG1())) { return false; } @@ -42,12 +44,12 @@ public boolean equals(final VerificationKey other) { return false; } - if (gammaABC.size() != other.gammaABC().size()) { + if (ABC.size() != other.ABC().size()) { return false; } - for (int i = 0; i < gammaABC.size(); i++) { - if (!gammaABC(i).equals(other.gammaABC(i))) { + for (int i = 0; i < ABC.size(); i++) { + if (!ABC(i).equals(other.ABC(i))) { return false; } } @@ -67,11 +69,11 @@ public G2T deltaG2() { return deltaG2; } - public G1T gammaABC(final int i) { - return gammaABC.get(i); + public G1T ABC(final int i) { + return ABC.get(i); } - public List gammaABC() { - return gammaABC; + public List ABC() { + return ABC; } } From 0ee1615e12094699af82c20368a2df6ef5c77a4c Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 5 Jan 2021 17:55:38 +0000 Subject: [PATCH 39/94] Removed useless GTT/pairing template/args --- .../zk_proof_systems/zkSNARK/SerialzkSNARKTest.java | 13 +++++++------ 1 file changed, 7 insertions(+), 6 deletions(-) diff --git a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java index 4638353..65e2a5d 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java @@ -13,7 +13,7 @@ import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; -import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +import algebra.curves.barreto_naehrig.bn254a.BN254aFields; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; import algebra.curves.barreto_naehrig.bn254a.BN254aG2; import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; @@ -93,8 +93,8 @@ void SerialBNProofSystemTest( final Assignment primary = construction._2(); final Assignment fullAssignment = construction._3(); - final CRS CRS = - SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); final Proof proof = SerialProver.prove(CRS.provingKey(), primary, fullAssignment, fieldFactory, config); @@ -122,8 +122,8 @@ public void SerialFakeProofSystemTest() { final Assignment primary = construction._2(); final Assignment auxiliary = construction._3(); - final CRS CRS = - SerialSetup.generate(r1cs, fieldFactory, fakeG1Factory, fakeG2Factory, fakePairing, config); + final CRS CRS = + 
SerialSetup.generate(r1cs, fieldFactory, fakeG1Factory, fakeG2Factory, config); final Proof proof = SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); final boolean isValid = @@ -137,7 +137,8 @@ public void SerialFakeProofSystemTest() { public void SerialBN254aProofSystemTest() { final int numInputs = 1023; final int numConstraints = 1024; - final BN254aFr fieldFactory = BN254aFr.ONE; + //final BN254aFr fieldFactory = BN254aFr.ONE; + final BN254aFields.BN254aFr fieldFactory = new BN254aFields.BN254aFr(1); final BN254aG1 g1Factory = BN254aG1Parameters.ONE; final BN254aG2 g2Factory = BN254aG2Parameters.ONE; final BN254aPairing pairing = new BN254aPairing(); From c387a4fbb436f9b28b5eb600f71aab238b38289e Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 5 Jan 2021 18:56:16 +0000 Subject: [PATCH 40/94] Removed erroneous assert and added debug levels manually --- src/main/java/zk_proof_systems/zkSNARK/SerialProver.java | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java index ed4b682..75918f7 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java @@ -57,7 +57,7 @@ Proof prove( final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(provingKey.r1cs(), t); assert (qap.isSatisfied(qapWitness)); - System.out.println("\n\t ===== qap.isSatisfied(qapWitness) TRUTH value: " + qap.isSatisfied(qapWitness)); + System.out.println("\n\t ===== [DEBUG] qap.isSatisfied(qapWitness) TRUTH value: " + qap.isSatisfied(qapWitness)); } // Choose two random field elements for prover zero-knowledge. @@ -67,9 +67,8 @@ Proof prove( if (config.debugFlag()) { assert(qapWitness.coefficientsABC().size() == qapWitness.numVariables()); assert(provingKey.queryA().size() == qapWitness.numVariables()); - assert(provingKey.queryH().size() == qapWitness.degree() - 1); assert(provingKey.deltaABCG1().size() == qapWitness.numVariables() - qapWitness.numInputs()); - System.out.println("\n\t ===== Asserts on size pass ====="); + System.out.println("\n\t ===== [DEBUG] Asserts on size pass ====="); } // Get initial parameters from the proving key. From 16407e772bcd2330e0c8c0a22f4515d1410b696b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 6 Jan 2021 17:58:34 +0000 Subject: [PATCH 41/94] Added clarification comment about parallelize() --- src/main/java/io/JSONR1CSLoader.java | 3 +++ 1 file changed, 3 insertions(+) diff --git a/src/main/java/io/JSONR1CSLoader.java b/src/main/java/io/JSONR1CSLoader.java index 55f499a..da8b710 100644 --- a/src/main/java/io/JSONR1CSLoader.java +++ b/src/main/java/io/JSONR1CSLoader.java @@ -144,6 +144,9 @@ JavaPairRDD> loadLinearCombinationRDD( // (e.g. sc.parallelize(data, 10)) // See: https://spark.apache.org/docs/3.0.0/rdd-programming-guide.html#parallelized-collections // + // WARNING: SparkContext’s parallelize() method may not be suitable outside of testing and + // prototyping because this method requires entire dataset on one machine. + // // NOTE1: Here "the initial dataset" is just an array of integers (the "partitions" which is // an ArrayList). This array (e.g. 
[0,1,2]) is used to build the data in each // partitions From 873afe0cb66662488bf1f86446c61e6b2d0ad675 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 6 Jan 2021 17:59:27 +0000 Subject: [PATCH 42/94] Removed GTT --- .../java/profiler/profiling/ZKSNARKProfiling.java | 14 ++++++-------- 1 file changed, 6 insertions(+), 8 deletions(-) diff --git a/src/main/java/profiler/profiling/ZKSNARKProfiling.java b/src/main/java/profiler/profiling/ZKSNARKProfiling.java index 917e706..b04f7e1 100755 --- a/src/main/java/profiler/profiling/ZKSNARKProfiling.java +++ b/src/main/java/profiler/profiling/ZKSNARKProfiling.java @@ -3,14 +3,12 @@ import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; import algebra.curves.barreto_naehrig.bn254a.BN254aG2; -import algebra.curves.barreto_naehrig.bn254a.BN254aGT; import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; -import algebra.curves.barreto_naehrig.bn254b.BN254bGT; import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; @@ -46,8 +44,8 @@ public static void serialzkSNARKProfiling(final Configuration config, final long config.beginLog(config.context()); config.beginRuntime("Setup"); - final CRS CRS = - SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); config.endLog(config.context()); config.endRuntime("Setup"); @@ -101,8 +99,8 @@ public static void serialzkSNARKLargeProfiling( config.beginLog(config.context()); config.beginRuntime("Setup"); - final CRS CRS = - SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); config.endLog(config.context()); config.endRuntime("Setup"); @@ -157,7 +155,7 @@ public static void distributedzkSNARKProfiling( config.beginLog(config.context()); config.beginRuntime("Setup"); - final CRS CRS = + final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); config.endLog(config.context()); config.endRuntime("Setup"); @@ -213,7 +211,7 @@ public static void distributedzkSNARKLargeProfiling( config.beginLog(config.context()); config.beginRuntime("Setup"); - final CRS CRS = + final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); config.endLog(config.context()); config.endRuntime("Setup"); From b493d50f32b20002c0c6f86c4fae062755c5244c Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 11:08:22 +0000 Subject: [PATCH 43/94] Tightened comments --- .../zkSNARK/SerialProver.java | 22 ++++++++++--------- .../zk_proof_systems/zkSNARK/SerialSetup.java | 12 +++++++--- .../zk_proof_systems/zkSNARK/Verifier.java | 4 ++-- .../zkSNARK/objects/ProvingKey.java | 7 ++++-- .../zkSNARK/objects/VerificationKey.java | 2 +- 5 files changed, 29 insertions(+), 18 deletions(-) diff --git 
a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java index 75918f7..5a26741 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java @@ -47,12 +47,14 @@ Proof prove( // We are dividing degree 2(d-1) polynomial by degree d polynomial // and not adding a PGHR-style ZK-patch, so our H is degree d-2. final FieldT zero = fieldFactory.zero(); - // 1. Verify that H has a non-zero d-2 coefficient - assert (!qapWitness.coefficientsH(qapWitness.degree() - 2).equals(zero)); - // 2. Make sure that coefficients d-1 and d are 0 to make sure that the polynomial hasn't a - // degree higher than d-2 + // Make sure that H has at most d+1 coeffs (which bounds deg(H(x)) <= d) + assert (qapWitness.coefficientsH().size() == qapWitness.degree() + 1); + // 1. Make sure that coefficients d-1 and d are 0, so that the polynomial does not have a + // degree higher than d-2 (these checks refine the upper bound to deg(H(x)) <= d-2) assert (qapWitness.coefficientsH(qapWitness.degree() - 1).equals(zero)); assert (qapWitness.coefficientsH(qapWitness.degree()).equals(zero)); + // 2. Make sure that H has a non-zero coefficient at index d-2 (so that deg(H(x)) = d-2 exactly) + assert (!qapWitness.coefficientsH(qapWitness.degree() - 2).equals(zero)); // Check that the witness satisfies the QAP relation. final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(provingKey.r1cs(), t); @@ -61,6 +63,7 @@ Proof prove( } // Choose two random field elements for prover zero-knowledge. + // r, s \sample \FF^2 final FieldT r = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT s = fieldFactory.random(config.seed(), config.secureSeed()); @@ -114,28 +117,27 @@ Proof prove( VariableBaseMSM.serialMSM(qapWitness.coefficientsH(), provingKey.queryH()); config.endLog("Computing evaluation to query H"); - // Compute evaluationABC = \sum_{i=numInputs+1}^{numVariables} inp_i * ((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. + // Compute evaluationABC = \sum_{i=numInputs+1}^{numVariables} var_i * ((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta.
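// For intuition, a naive (unoptimized) sketch of what the serial MSM below evaluates, assuming
// serialMSM(scalars, bases) returns \sum_i scalars_i * bases_i (here: scalars = auxiliary
// assignment, bases = the deltaABC G1 elements of the proving key):
//   G1T acc = deltaG1.zero();
//   for (int i = 0; i < auxiliary.elements().size(); i++) {
//     acc = acc.add(provingKey.deltaABCG1().get(i).mul(auxiliary.elements().get(i)));
//   }
// so that acc = \sum_{i=numInputs+1}^{numVariables} var_i * (beta*A_i(t) + alpha*B_i(t) + C_i(t))/delta.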
config.beginLog("Computing evaluation to deltaABC"); final int numWitness = numVariables - numInputs; //G1T evaluationABC = // VariableBaseMSM.serialMSM( // auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); G1T evaluationABC = - VariableBaseMSM.serialMSM( - auxiliary.elements(), provingKey.deltaABCG1().subList(0, numWitness)); + VariableBaseMSM.serialMSM(auxiliary.elements(), provingKey.deltaABCG1()); evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta config.endLog("Computing evaluation to deltaABC"); - // A = alpha + sum_i(a_i*A_i(t)) + r*delta + // A = alpha + \sum_{i=0}^{numVariables}(var_i*A_i(t)) + r*delta final G1T A = alphaG1.add(evaluationAt).add(deltaG1.mul(r)); - // B = beta + sum_i(a_i*B_i(t)) + s*delta + // B = beta + sum_{i=0}^{numVariables}(var_i*B_i(t)) + s*delta final Tuple2 B = new Tuple2<>( betaG1.add(evaluationBtG1).add(deltaG1.mul(s)), betaG2.add(evaluationBtG2).add(deltaG2.mul(s))); - // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta + // C = \sum_{i=numInputs+1}^{numVariables}(var_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); config.endRuntime("Proof"); diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java index 8cfaa21..8631530 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java @@ -57,17 +57,19 @@ CRS generate( // Number of circuit wires final int numVariables = qap.numVariables(); - // ABC for vk: {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=0}^{numInputs} + // ABC for vk: + // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} config.beginLog("Computing ABC for R1CS verification key"); final List vkABC = new ArrayList<>(numInputs); - // TODO: (Double check) I don't think we need to add 1 to the upper bound here (i.e. < numInputs + 1) + // TODO: (Double check) I don't think we need to add 1 to the bounds here (i.e. i = 1 and i < numInputs + 1) // because we manually add ONE to the inputs outside of this function when we construct the R1CS. 
for (int i = 0; i < numInputs; i++) { vkABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i))); } config.endLog("Computing ABC for R1CS verification key"); - // The delta inverse product component: {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=numInputs+1}^{numVariables} + // The delta inverse product component: + // {[(beta * A_i(t) + alpha * B_i(t) + C_i(t))/delta]_1}_{i=numInputs+1}^{numVariables} config.beginLog("Computing deltaABC for R1CS proving key"); final List deltaABC = new ArrayList<>(numVariables - numInputs); for (int i = numInputs; i < numVariables; i++) { @@ -115,9 +117,12 @@ CRS generate( config.beginLog("Generating R1CS proving key"); config.beginRuntime("Proving Key"); + // [alpha]_1 final G1T alphaG1 = generatorG1.mul(alpha); + // [beta] final G1T betaG1 = generatorG1.mul(beta); final G2T betaG2 = generatorG2.mul(beta); + // [delta] final G1T deltaG1 = generatorG1.mul(delta); final G2T deltaG2 = generatorG2.mul(delta); @@ -144,6 +149,7 @@ CRS generate( config.endLog("Computing query B", false); config.beginLog("Computing query H", false); + // TODO: Check size of queryH to make sure elements are in [0..n-2] final FieldT inverseDeltaZt = qap.Zt().mul(inverseDelta); for (int i = 0; i < qap.Ht().size(); i++) { qap.Ht().set(i, qap.Ht().get(i).mul(inverseDeltaZt)); diff --git a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/Verifier.java index cafeb4a..22e95b5 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java +++ b/src/main/java/zk_proof_systems/zkSNARK/Verifier.java @@ -31,11 +31,11 @@ boolean verify( final Proof proof, final PairingT pairing, final Configuration config) { - // Assert first element == FieldT.one(). + // Assert that the first element of the primary inputs is FieldT.one(). final FieldT firstElement = primaryInput.get(0); assert (firstElement.equals(firstElement.one())); - // LHS: Compute A * B + // LHS: Compute [A * B]_T final GTT LHS = pairing.reducedPairing(proof.gA(), proof.gB()); // RHS: Compute [alpha * beta]_T diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java index 556f96a..633b2cd 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java @@ -34,9 +34,12 @@ public class ProvingKey< private final List deltaABCG1; // {[A_i(t)]_1}_{i=0}^{numVariables} private final List queryA; - // {[B_i(t)]}_{i=1}^{numVariables} + // {[B_i(t)]}_{i=0}^{numVariables} + // Note: queryB is taken in both G1 and G2 because it speeds up the prover: + // - the G2 part is used in the computation of proof.B (encoded in G2) + // - the G1 part is used in the computation of proof.C (i.e.
the + rB term, encoded in G1) private final List> queryB; - // {[t^i * Z(t)/delta]_1}_{i=1}^{numVariables - 2} + // {[t^i * Z(t)/delta]_1}_{i=0}^{deg(Z(x)) - 2} private final List queryH; // The proving key holds the arithmetized relation private final R1CSRelation r1cs; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java index 3bf919d..d7c5697 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java @@ -20,7 +20,7 @@ public class VerificationKey, G2T extends AbstractG2 private final G2T betaG2; // [delta]_2 private final G2T deltaG2; - // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]}_{i=0}^{numInputs} + // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} private final List ABC; public VerificationKey( From f89d0f9f48e0ccb83f03a5902723a82def8ba15b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 13:17:37 +0000 Subject: [PATCH 44/94] First shot implementation of distributed BGM-GROTH --- .../profiler/generation/R1CSConstructor.java | 1 + .../zkSNARK/DistributedSetup.java | 74 ++++++++++--------- .../zk_proof_systems/zkSNARK/SerialSetup.java | 11 ++- .../zkSNARK/objects/ProvingKeyRDD.java | 23 +++--- .../zkSNARK/DistributedzkSNARKTest.java | 2 +- 5 files changed, 58 insertions(+), 53 deletions(-) diff --git a/src/main/java/profiler/generation/R1CSConstructor.java b/src/main/java/profiler/generation/R1CSConstructor.java index f24fa33..9f24dc3 100755 --- a/src/main/java/profiler/generation/R1CSConstructor.java +++ b/src/main/java/profiler/generation/R1CSConstructor.java @@ -104,6 +104,7 @@ Tuple3, Assignment, Assignment> serialConst constraints.add(new R1CSConstraint<>(A, B, C)); final R1CSRelation r1cs = new R1CSRelation<>(constraints, numInputs, numAuxiliary); + // Split the fullAssignment in 2 subsets: fullAssignment = primary \cup auxiliary final Assignment primary = new Assignment<>(fullAssignment.subList(0, numInputs)); final Assignment auxiliary = new Assignment<>(fullAssignment.subList(numInputs, fullAssignment.size())); diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java b/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java index d2f0ab4..5f5eaba 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java @@ -32,7 +32,7 @@ public class DistributedSetup { G2T extends AbstractG2, GTT extends AbstractGT, PairingT extends AbstractPairing> - CRS generate( + CRS generate( final R1CSRelationRDD r1cs, final FieldT fieldFactory, final G1T g1Factory, @@ -43,16 +43,17 @@ CRS generate( final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT alpha = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT beta = fieldFactory.random(config.seed(), config.secureSeed()); - final FieldT gamma = fieldFactory.random(config.seed(), config.secureSeed()); final FieldT delta = fieldFactory.random(config.seed(), config.secureSeed()); - final FieldT inverseGamma = gamma.inverse(); final FieldT inverseDelta = delta.inverse(); // A quadratic arithmetic program evaluated at t. 
final QAPRelationRDD qap = R1CStoQAPRDD.R1CStoQAPRelation(r1cs, t, config); + // Size of the instance final int numInputs = qap.numInputs(); + // Number of circuit wires final long numVariables = qap.numVariables(); + final int numPartitions = config.numPartitions(); System.out.println("\tQAP - primary input size: " + numInputs); @@ -60,22 +61,22 @@ CRS generate( System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); System.out.println("\tQAP - degree: " + qap.degree()); - // The gamma inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * gamma^{-1} - // The delta inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * delta^{-1} - config.beginLog("Computing deltaABC and gammaABC for R1CS proving key and verification key"); + config.beginLog("Computing deltaABC for R1CS proving key and verification key"); final JavaPairRDD betaAt = qap.At().mapValues(a -> a.mul(beta)); final JavaPairRDD alphaBt = qap.Bt().mapValues(b -> b.mul(alpha)); + // ABC for vk: + // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} final JavaPairRDD ABC = betaAt .union(alphaBt) .union(qap.Ct()) .reduceByKey(FieldT::add) .persist(config.storageLevel()); - final JavaPairRDD gammaABC = - ABC.filter(e -> e._1 < numInputs).mapValues(e -> e.mul(inverseGamma)); + // The delta inverse product component: + // {[(beta * A_i(t) + alpha * B_i(t) + C_i(t))/delta]_1}_{i=numInputs+1}^{numVariables} final JavaPairRDD deltaABC = ABC.filter(e -> e._1 >= numInputs).mapValues(e -> e.mul(inverseDelta)); - config.endLog("Computing deltaABC and gammaABC for R1CS proving key and verification key"); + config.endLog("Computing deltaABC for R1CS proving key and verification key"); config.beginLog("Computing query densities"); final long numNonZeroAt = qap.At().filter(e -> !e._2.isZero()).count(); @@ -83,16 +84,20 @@ CRS generate( config.endLog("Computing query densities"); config.beginLog("Generating G1 MSM Window Table"); - final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + // For testing with the cpp code, take the identity instead of a random generator + //final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + final G1T generatorG1 = g1Factory.one(); final int scalarSizeG1 = generatorG1.bitSize(); final long scalarCountG1 = numNonZeroAt + numNonZeroBt + numVariables; + // Get window size per partition final int windowSizeG1 = FixedBaseMSM.getWindowSize(scalarCountG1 / numPartitions, generatorG1); final List> windowTableG1 = FixedBaseMSM.getWindowTable(generatorG1, scalarSizeG1, windowSizeG1); config.endLog("Generating G1 MSM Window Table"); config.beginLog("Generating G2 MSM Window Table"); - final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + //final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + final G2T generatorG2 = g2Factory.one(); final int scalarSizeG2 = generatorG2.bitSize(); final long scalarCountG2 = numNonZeroBt; final int windowSizeG2 = FixedBaseMSM.getWindowSize(scalarCountG2 / numPartitions, generatorG2); @@ -103,21 +108,15 @@ CRS generate( config.beginLog("Generating R1CS proving key"); config.beginRuntime("Proving Key"); + // [alpha]_1 final G1T alphaG1 = generatorG1.mul(alpha); + // [beta] final G1T betaG1 = generatorG1.mul(beta); final G2T betaG2 = generatorG2.mul(beta); + // [delta] final G1T deltaG1 = generatorG1.mul(delta); final G2T deltaG2 = generatorG2.mul(delta); - config.beginLog("Encoding deltaABC for R1CS proving key"); - final JavaPairRDD deltaABCG1 = - 
FixedBaseMSM.distributedBatchMSM( - scalarSizeG1, windowSizeG1, windowTableG1, deltaABC, config.sparkContext()) - .persist(config.storageLevel()); - deltaABCG1.count(); - qap.Ct().unpersist(); - config.endLog("Encoding deltaABC for R1CS proving key"); - config.beginLog("Computing query A"); final JavaPairRDD queryA = FixedBaseMSM.distributedBatchMSM( @@ -155,24 +154,29 @@ CRS generate( qap.Ht().unpersist(); config.endLog("Computing query H"); + config.beginLog("Encoding deltaABC for R1CS proving key"); + final JavaPairRDD deltaABCG1 = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, deltaABC, config.sparkContext()) + .persist(config.storageLevel()); + deltaABCG1.count(); + qap.Ct().unpersist(); + config.endLog("Encoding deltaABC for R1CS proving key"); + config.endLog("Generating R1CS proving key"); config.endRuntime("Proving Key"); - config.beginLog("Computing gammaABC for R1CS verification key"); + config.beginLog("Computing ABC for R1CS verification key"); config.beginRuntime("Verification Key"); - final GTT alphaG1betaG2 = pairing.reducedPairing(alphaG1, betaG2); - final G2T gammaG2 = generatorG2.mul(gamma); - final JavaPairRDD gammaABCG1 = - FixedBaseMSM.distributedBatchMSM( - scalarSizeG1, windowSizeG1, windowTableG1, gammaABC, config.sparkContext()) - .persist(config.storageLevel()); - final JavaPairRDD fullGammaABCG1 = - Utils.fillRDD(numInputs, generatorG1.zero(), config) - .union(gammaABCG1) - .reduceByKey(G1T::add); - final List UVWGammaG1 = Utils.convertFromPair(fullGammaABCG1.collect(), numInputs); + final JavaPairRDD vkABC = ABC.filter(e -> e._1 < numInputs); + final JavaPairRDD vkABCG1 = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, vkABC, config.sparkContext()) + .persist(config.storageLevel()); + // ABC is not stored as an RDD in the verification key, so we recover a `List` + final List vkABCFinalG1 = Utils.convertFromPair(vkABCG1.collect(), numInputs); ABC.unpersist(); - config.endLog("Computing gammaABC for R1CS verification key"); + config.endLog("Computing ABC for R1CS verification key"); config.endRuntime("Verification Key"); // Construct the proving key. @@ -181,8 +185,8 @@ CRS generate( alphaG1, betaG1, betaG2, deltaG1, deltaG2, deltaABCG1, queryA, queryB, queryH, r1cs); // Construct the verification key. - final VerificationKey verificationKey = - new VerificationKey<>(alphaG1betaG2, gammaG2, deltaG2, UVWGammaG1); + final VerificationKey verificationKey = + new VerificationKey<>(alphaG1, betaG2, deltaG2, vkABCFinalG1); return new CRS<>(provingKey, verificationKey); } diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java index 8631530..1da1da3 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java @@ -47,16 +47,16 @@ CRS generate( // A quadratic arithmetic program evaluated at t. 
final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(r1cs, t); - System.out.println("\tQAP - primary input size: " + qap.numInputs()); - System.out.println("\tQAP - total input size: " + qap.numVariables()); - System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); - System.out.println("\tQAP - degree: " + qap.degree()); - // Size of the instance final int numInputs = qap.numInputs(); // Number of circuit wires final int numVariables = qap.numVariables(); + System.out.println("\tQAP - primary input size: " + numInputs); + System.out.println("\tQAP - total input size: " + numVariables); + System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); + System.out.println("\tQAP - degree: " + qap.degree()); + // ABC for vk: // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} config.beginLog("Computing ABC for R1CS verification key"); final List vkABC = new ArrayList<>(numInputs); @@ -149,7 +149,6 @@ CRS generate( config.endLog("Computing query B", false); config.beginLog("Computing query H", false); - // TODO: Check size of queryH to make sure elements are in [0..n-2] final FieldT inverseDeltaZt = qap.Zt().mul(inverseDelta); for (int i = 0; i < qap.Ht().size(); i++) { qap.Ht().set(i, qap.Ht().get(i).mul(inverseDeltaZt)); diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java index 41c249b..4fb13ab 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java +++ b/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java @@ -21,27 +21,28 @@ public class ProvingKeyRDD< G2T extends AbstractG2> implements Serializable { - // Below, [x]_1 (resp. [x]_2 and []_T) represents the encoding of x in G1 (resp. G2 and GT) - // We follow the notations in Groth16 (namely, polynomials are denoted u, v, w, h, t instead of A, - // B, C, H, Z. Moreover the evaluation point is denoted by x) + // Below, [x]_1 (resp. [x]_2 and [x]_T) represents the encoding of x in G1 (resp. G2 and GT) + // We do not follow the notations in Groth16: polynomials are denoted A, B, C, H, Z + // instead of u, v, w, h, t, and the evaluation point is denoted by t instead of x + // // [alpha]_1 private final G1T alphaG1; - // [beta]_1 + // [beta] private final G1T betaG1; - // [beta]_2 private final G2T betaG2; - // [delta]_1 + // [delta] private final G1T deltaG1; - // [delta]_2 private final G2T deltaG2; - // {[(beta * u_i(x) + alpha * v_i(x) + w_i(x))/delta]_1} + // {[(beta * A_i(t) + alpha * B_i(t) + C_i(t))/delta]_1}_{i=numInputs+1}^{numVariables} private final JavaPairRDD deltaABCG1; - // {[u_i(x)]_1} + // {[A_i(t)]_1}_{i=0}^{numVariables} private final JavaPairRDD queryA; - // {[v_i(x)]_1} + // {[B_i(t)]}_{i=0}^{numVariables} + // Note: queryB is taken in both G1 and G2 because it speeds up the prover: + // - the G2 part is used in the computation of proof.B (encoded in G2) + // - the G1 part is used in the computation of proof.C (i.e.
the + rB term, encoded in G1) private final JavaPairRDD> queryB; - // {[(x^i * t(x))/delta]_1} + // {[t^i * Z(t)/delta]_1}_{i=0}^{deg(Z(x)) - 2} private final JavaPairRDD queryH; // The proving key contains an arithmetized relation private final R1CSRelationRDD r1cs; diff --git a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java index 7290b15..f3d5ee7 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java @@ -104,7 +104,7 @@ void DistributedBNProofSystemTest( final Assignment primary = construction._2(); final JavaPairRDD fullAssignment = construction._3(); - final CRS CRS = + final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); final Proof proof = From 42b6e98de5606546076536df361e6f47746540a4 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 14:41:54 +0000 Subject: [PATCH 45/94] Moved bgmGroth into a separate folder --- .../java/zk_proof_systems/zkSNARK/README.md | 4 -- .../{ => grothBGM17}/DistributedProver.java | 6 +- .../{ => grothBGM17}/DistributedSetup.java | 8 +-- .../zkSNARK/grothBGM17/README.md | 60 +++++++++++++++++++ .../{ => grothBGM17}/SerialProver.java | 9 ++- .../zkSNARK/{ => grothBGM17}/SerialSetup.java | 8 +-- .../zkSNARK/{ => grothBGM17}/Verifier.java | 9 ++- .../zkSNARK/{ => grothBGM17}/objects/CRS.java | 2 +- .../{ => grothBGM17}/objects/Proof.java | 2 +- .../{ => grothBGM17}/objects/ProvingKey.java | 2 +- .../objects/ProvingKeyRDD.java | 2 +- .../objects/VerificationKey.java | 2 +- .../DistributedzkSNARKTest.java | 14 +++-- .../{ => grothBGM17}/SerialzkSNARKTest.java | 43 ++++++------- 14 files changed, 120 insertions(+), 51 deletions(-) delete mode 100755 src/main/java/zk_proof_systems/zkSNARK/README.md rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/DistributedProver.java (98%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/DistributedSetup.java (97%) create mode 100755 src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/SerialProver.java (93%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/SerialSetup.java (97%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/Verifier.java (84%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/objects/CRS.java (96%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/objects/Proof.java (93%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/objects/ProvingKey.java (98%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/objects/ProvingKeyRDD.java (98%) rename src/main/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/objects/VerificationKey.java (97%) rename src/test/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/DistributedzkSNARKTest.java (92%) rename src/test/java/zk_proof_systems/zkSNARK/{ => grothBGM17}/SerialzkSNARKTest.java (83%) diff --git a/src/main/java/zk_proof_systems/zkSNARK/README.md b/src/main/java/zk_proof_systems/zkSNARK/README.md deleted file mode 100755 index 79110a2..0000000 --- a/src/main/java/zk_proof_systems/zkSNARK/README.md +++ /dev/null @@ -1,4 +0,0 @@ -# zkSNARK - -This folder contains an implementation of a variant of the publicly-verifiable preprocessing zkSNARK [\[Gro16\]](https://eprint.iacr.org/2016/260) proposed by Bowe et al 
[\[BGM17\]](https://eprint.iacr.org/2017/1050.pdf). -The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). \ No newline at end of file diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java similarity index 98% rename from src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java index 128c052..9bf76ea 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; @@ -19,8 +19,8 @@ import relations.qap.QAPRelationRDD; import relations.qap.QAPWitnessRDD; import scala.Tuple2; -import zk_proof_systems.zkSNARK.objects.Proof; -import zk_proof_systems.zkSNARK.objects.ProvingKeyRDD; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.objects.ProvingKeyRDD; public class DistributedProver { public static < diff --git a/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java similarity index 97% rename from src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java index 5f5eaba..603a172 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/DistributedSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; @@ -21,9 +21,9 @@ import relations.qap.QAPRelationRDD; import relations.r1cs.R1CSRelationRDD; import scala.Tuple2; -import zk_proof_systems.zkSNARK.objects.CRS; -import zk_proof_systems.zkSNARK.objects.ProvingKeyRDD; -import zk_proof_systems.zkSNARK.objects.VerificationKey; +import zk_proof_systems.zkSNARK.grothBGM17.objects.CRS; +import zk_proof_systems.zkSNARK.grothBGM17.objects.ProvingKeyRDD; +import zk_proof_systems.zkSNARK.grothBGM17.objects.VerificationKey; public class DistributedSetup { public static < diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md new file mode 100755 index 0000000..156f13e --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md @@ -0,0 +1,60 @@ +# zkSNARK.grothBGM17 + +This folder contains an implementation of **a variant** of the publicly-verifiable preprocessing zkSNARK [\[Gro16\]](https://eprint.iacr.org/2016/260) proposed by Bowe et al. [\[BGM17\]](https://eprint.iacr.org/2017/1050.pdf). +The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). + +## zkSNARK.groth16 + +``` +Let \REL be defined as: +\REL = (\FF, inp, {A_i(x), B_i(x), C_i(x)}_{i=0}^{m}, t(x)) + +where: +- inp = prim_inputs \cup aux_inputs and where \inp_0 = \FF.mult_id (i.e. one) +- prim_inputs = (inp_1, ..., inp_l), i.e. 
l = |prim_inputs| +- aux_inputs = (inp_{l+1}, ..., inp_m), i.e. m-l = |aux_inputs|, m = |wires| + +(\sum_{i=0}^{m} inp_i * A_i(x)) * (\sum_{i=0}^{m} inp_i * B_i(x)) = \sum_{i=0}^{m} inp_i * C_i(x) + h(x)t(x) + +- deg(h(x)) = n-2 +- deg(t(x)) = n +``` + +### Setup + +``` +(\crs, \trap) \gets Setup(R) + +where: +- \trap \gets (\alpha, \beta, \gamma, \delta, x) +- \crs \gets (\crs_1, \crs_2), where \crs_1 (resp. \crs_2) is a set of \GG_1 (resp. \GG_2) elements + +\crs_1 = \alpha, \beta, \delta, {x^i}_{i=0}^{n-1}, {(\beta * A_i + \alpha * B_i + C_i) / \gamma}_{i=0}^{l}, {(\beta * A_i + \alpha * B_i + C_i) / \delta}_{i=l+1}^{m}, {(x^i * t(x)) / \delta}_{i=0}^{n-2} + +\crs_2 = \beta, \gamma, \delta, {x^i}_{i=0}^{n-1} +``` + +### Prove + +``` +(\pi) \gets Prove(R, \crs, {inp_1, ..., inp_m}) + +\pi = (A, B, C) + +where: +- r, s \sample \FF +- A = \alpha + \sum_{i=0}^{m} inp_i*A_i(x) + r*\delta, A in \GG_1 +- B = \beta + \sum_{i=0}^{m} inp_i*B_i(x) + s*\delta, B in \GG_2 +- C = [\sum_{i=l+1}^{m} inp_i * (\beta * A_i(x) + \alpha * B_i(x) + C_i(x)) + h(x)t(x)] / \delta + A*s + B*r - rs*\delta +``` + +### Verify + +``` +(\res) \gets Verify(R, \crs, {inp_1, ..., inp_l}, \pi) + +\res \in \{0,1\} + +where \res = 1 iff the following pairing equation holds: +Pairing(\pi.A, \pi.B) == Pairing(\alpha, \beta) * Pairing(\sum_{i=0}^{l} inp_i * (\beta*A_i(x) + \alpha * B_i(x) + C_i(x)) / \gamma, \gamma) * Pairing(\pi.C, \delta) +``` \ No newline at end of file diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java similarity index 93% rename from src/main/java/zk_proof_systems/zkSNARK/SerialProver.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java index 5a26741..22044a2 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; @@ -17,8 +17,8 @@ import relations.qap.QAPRelation; import relations.qap.QAPWitness; import scala.Tuple2; -import zk_proof_systems.zkSNARK.objects.Proof; -import zk_proof_systems.zkSNARK.objects.ProvingKey; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.objects.ProvingKey; public class SerialProver { public static < @@ -123,6 +123,9 @@ Proof prove( //G1T evaluationABC = // VariableBaseMSM.serialMSM( // auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); + System.out.println("[DEBUG] Prover, auxiliary.elements().size(): " + auxiliary.elements().size()); + System.out.println("[DEBUG] Prover, provingKey.deltaABCG1().size(): " + provingKey.deltaABCG1().size()); + System.out.println("[DEBUG] Both expected to be: numWitness = " + numWitness); G1T evaluationABC = VariableBaseMSM.serialMSM(auxiliary.elements(), provingKey.deltaABCG1()); evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta config.endLog("Computing evaluation to deltaABC"); diff --git a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java similarity index 97% rename from src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java index 1da1da3..47eee66 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/SerialSetup.java +++
b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; @@ -20,9 +20,9 @@ import relations.qap.QAPRelation; import relations.r1cs.R1CSRelation; import scala.Tuple2; -import zk_proof_systems.zkSNARK.objects.CRS; -import zk_proof_systems.zkSNARK.objects.ProvingKey; -import zk_proof_systems.zkSNARK.objects.VerificationKey; +import zk_proof_systems.zkSNARK.grothBGM17.objects.CRS; +import zk_proof_systems.zkSNARK.grothBGM17.objects.ProvingKey; +import zk_proof_systems.zkSNARK.grothBGM17.objects.VerificationKey; public class SerialSetup { public static < diff --git a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java similarity index 84% rename from src/main/java/zk_proof_systems/zkSNARK/Verifier.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java index 22e95b5..d25dd24 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/Verifier.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; @@ -15,8 +15,8 @@ import algebra.msm.VariableBaseMSM; import configuration.Configuration; import relations.objects.Assignment; -import zk_proof_systems.zkSNARK.objects.Proof; -import zk_proof_systems.zkSNARK.objects.VerificationKey; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.objects.VerificationKey; public class Verifier { public static < @@ -46,6 +46,9 @@ boolean verify( final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); // RHS: Compute \sum_{i=0}^{numInputs} pubInp_i * (beta*A_i(x) + alpha*B_i(x) + C_i(x)) + System.out.println("[DEBUG] Verifier RHS, primaryInput.elements().size(): " + primaryInput.elements().size()); + System.out.println("[DEBUG] Verifier RHS, verificationKey.ABC().size(): " + verificationKey.ABC().size()); + System.out.println("[DEBUG] Both expected to be: numInputs"); final G1T evaluationABC = VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.ABC()); diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java similarity index 96% rename from src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java index 72d59c1..d00b052 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/CRS.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK.objects; +package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/Proof.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java similarity index 93% rename from 
src/main/java/zk_proof_systems/zkSNARK/objects/Proof.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java index 3531bb4..6ffcf9f 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/Proof.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK.objects; +package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java similarity index 98% rename from src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java index 633b2cd..7d82413 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK.objects; +package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java similarity index 98% rename from src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java index 4fb13ab..7ec7b42 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/ProvingKeyRDD.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK.objects; +package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; diff --git a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java similarity index 97% rename from src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java rename to src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java index d7c5697..9eb6bbf 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/objects/VerificationKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK.objects; +package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; diff --git a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java similarity index 92% rename from src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java rename to src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java index f3d5ee7..e2c71df 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/DistributedzkSNARKTest.java 
+++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java @@ -14,6 +14,12 @@ import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; +import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +import algebra.curves.barreto_naehrig.bn254a.BN254aG1; +import algebra.curves.barreto_naehrig.bn254a.BN254aG2; +import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; @@ -94,10 +100,10 @@ public void tearDown() { void DistributedBNProofSystemTest( final int numInputs, final int numConstraints, - BNFrT fieldFactory, - BNG1T g1Factory, - BNG2T g2Factory, - BNPairingT pairing) { + final BNFrT fieldFactory, + final BNG1T g1Factory, + final BNG2T g2Factory, + final BNPairingT pairing) { final Tuple3, Assignment, JavaPairRDD> construction = R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelationRDD r1cs = construction._1(); diff --git a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java similarity index 83% rename from src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java rename to src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index 65e2a5d..f86d05c 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -13,23 +13,23 @@ import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; -import algebra.curves.barreto_naehrig.bn254a.BN254aFields; -import algebra.curves.barreto_naehrig.bn254a.BN254aG1; -import algebra.curves.barreto_naehrig.bn254a.BN254aG2; -import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; -import algebra.curves.barreto_naehrig.bn254b.BN254bFields; +//import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +//import algebra.curves.barreto_naehrig.bn254a.BN254aG1; +//import algebra.curves.barreto_naehrig.bn254a.BN254aG2; +//import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; +//import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; +//import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; +import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.fake.*; 
-import algebra.curves.fake.fake_parameters.FakeFqParameters; -import algebra.curves.fake.fake_parameters.FakeG1Parameters; -import algebra.curves.fake.fake_parameters.FakeG2Parameters; -import algebra.fields.Fp; +//import algebra.curves.fake.*; +//import algebra.curves.fake.fake_parameters.FakeFqParameters; +//import algebra.curves.fake.fake_parameters.FakeG1Parameters; +//import algebra.curves.fake.fake_parameters.FakeG2Parameters; +//import algebra.fields.Fp; import configuration.Configuration; import java.io.Serializable; import org.junit.jupiter.api.BeforeEach; @@ -83,21 +83,21 @@ public void setUp() { void SerialBNProofSystemTest( final int numInputs, final int numConstraints, - BNFrT fieldFactory, - BNG1T g1Factory, - BNG2T g2Factory, - BNPairingT pairing) { + final BNFrT fieldFactory, + final BNG1T g1Factory, + final BNG2T g2Factory, + final BNPairingT pairing) { final Tuple3, Assignment, Assignment> construction = R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); - final Assignment fullAssignment = construction._3(); + final Assignment auxiliary = construction._3(); final CRS CRS = SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); final Proof proof = - SerialProver.prove(CRS.provingKey(), primary, fullAssignment, fieldFactory, config); + SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); @@ -105,6 +105,7 @@ void SerialBNProofSystemTest( assertTrue(isValid); } + /* @Test public void SerialFakeProofSystemTest() { final int numInputs = 1023; @@ -137,20 +138,20 @@ public void SerialFakeProofSystemTest() { public void SerialBN254aProofSystemTest() { final int numInputs = 1023; final int numConstraints = 1024; - //final BN254aFr fieldFactory = BN254aFr.ONE; - final BN254aFields.BN254aFr fieldFactory = new BN254aFields.BN254aFr(1); + final BN254aFr fieldFactory = BN254aFr.ONE; final BN254aG1 g1Factory = BN254aG1Parameters.ONE; final BN254aG2 g2Factory = BN254aG2Parameters.ONE; final BN254aPairing pairing = new BN254aPairing(); SerialBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); } + */ @Test public void SerialBN254bProofSystemTest() { final int numInputs = 1023; final int numConstraints = 1024; - final BN254bFields.BN254bFr fieldFactory = new BN254bFields.BN254bFr(1); + final BN254bFr fieldFactory = BN254bFr.ONE; final BN254bG1 g1Factory = BN254bG1Parameters.ONE; final BN254bG2 g2Factory = BN254bG2Parameters.ONE; final BN254bPairing pairing = new BN254bPairing(); From b7970e5b01811aef0981e6d86156df8d86d7c130 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 14:50:44 +0000 Subject: [PATCH 46/94] Added clarifying comment about FFT domain --- src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java | 1 + 1 file changed, 1 insertion(+) diff --git a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java index d8272cb..803acdb 100755 --- a/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java +++ b/src/main/java/reductions/r1cs_to_qap/R1CStoQAP.java @@ -36,6 +36,7 @@ QAPRelation R1CStoQAPRelation(final R1CSRelation r1cs, final Fie final int numInputs = r1cs.numInputs(); final int numVariables = r1cs.numVariables(); final int numConstraints = r1cs.numConstraints(); + // SerialFFT returns a pow 2 domain final SerialFFT 
domain = new SerialFFT<>(numConstraints + numInputs, t); final FieldT zero = t.zero(); From 5ce1d79e3fa46ed677dbab72816637d3b119ca4b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 15:24:48 +0000 Subject: [PATCH 47/94] Pointed to BGM snark in profiler and introduced back plain groth snark --- src/main/java/profiler/Profiler.java | 12 +- .../profiler/profiling/ZKSNARKProfiling.java | 21 +- .../zkSNARK/groth16/DistributedProver.java | 175 ++++++++++++++++ .../zkSNARK/groth16/DistributedSetup.java | 189 ++++++++++++++++++ .../zkSNARK/groth16/README.md | 58 ++++++ .../zkSNARK/groth16/SerialProver.java | 129 ++++++++++++ .../zkSNARK/groth16/SerialSetup.java | 176 ++++++++++++++++ .../zkSNARK/groth16/Verifier.java | 62 ++++++ .../zkSNARK/groth16/objects/CRS.java | 55 +++++ .../zkSNARK/groth16/objects/Proof.java | 37 ++++ .../zkSNARK/groth16/objects/ProvingKey.java | 99 +++++++++ .../groth16/objects/ProvingKeyRDD.java | 111 ++++++++++ .../groth16/objects/VerificationKey.java | 77 +++++++ .../zkSNARK/grothBGM17/README.md | 56 +----- .../grothBGM17/DistributedzkSNARKTest.java | 6 +- .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 6 +- 16 files changed, 1205 insertions(+), 64 deletions(-) create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedProver.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedSetup.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/README.md create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/SerialProver.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/SerialSetup.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/Verifier.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/objects/CRS.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/objects/Proof.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKey.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKeyRDD.java create mode 100755 src/main/java/zk_proof_systems/zkSNARK/groth16/objects/VerificationKey.java diff --git a/src/main/java/profiler/Profiler.java b/src/main/java/profiler/Profiler.java index db9d791..2b3ef4d 100755 --- a/src/main/java/profiler/Profiler.java +++ b/src/main/java/profiler/Profiler.java @@ -1,11 +1,19 @@ package profiler; -import configuration.Configuration; import org.apache.spark.SparkConf; import org.apache.spark.api.java.JavaSparkContext; import org.apache.spark.sql.SparkSession; import org.apache.spark.storage.StorageLevel; -import profiler.profiling.*; + +import configuration.Configuration; +import profiler.profiling.FFTProfiling; +import profiler.profiling.FixedBaseMSMProfiling; +import profiler.profiling.LagrangeProfiling; +import profiler.profiling.MatrixMultiplicationProfiling; +import profiler.profiling.R1CStoQAPRelationProfiling; +import profiler.profiling.R1CStoQAPWitnessProfiling; +import profiler.profiling.VariableBaseMSMProfiling; +import profiler.profiling.ZKSNARKProfiling; import profiler.utils.SparkUtils; public class Profiler { diff --git a/src/main/java/profiler/profiling/ZKSNARKProfiling.java b/src/main/java/profiler/profiling/ZKSNARKProfiling.java index b04f7e1..5943665 100755 --- a/src/main/java/profiler/profiling/ZKSNARKProfiling.java +++ b/src/main/java/profiler/profiling/ZKSNARKProfiling.java @@ -1,5 +1,7 @@ package profiler.profiling; +import org.apache.spark.api.java.JavaPairRDD; + import 
algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; import algebra.curves.barreto_naehrig.bn254a.BN254aG2; @@ -13,15 +15,26 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import configuration.Configuration; -import org.apache.spark.api.java.JavaPairRDD; import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelation; import relations.r1cs.R1CSRelationRDD; import scala.Tuple3; -import zk_proof_systems.zkSNARK.*; -import zk_proof_systems.zkSNARK.objects.CRS; -import zk_proof_systems.zkSNARK.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.DistributedProver; +import zk_proof_systems.zkSNARK.grothBGM17.DistributedSetup; +import zk_proof_systems.zkSNARK.grothBGM17.SerialProver; +import zk_proof_systems.zkSNARK.grothBGM17.SerialSetup; +import zk_proof_systems.zkSNARK.grothBGM17.Verifier; +import zk_proof_systems.zkSNARK.grothBGM17.objects.CRS; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; + +// TODO: Implement a more flexible way to switch from one proof system to another +// (as done in Zeth). This will certainly require to have a type parameter for the +// SNARK from which we can access the objects. + +// WARNING: Switching to groth16 here will not work out of the box. This would require +// to update the type of the CRS to introduce the GTT parameter back: +// i.e. CRS -> CRS public class ZKSNARKProfiling { diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedProver.java new file mode 100755 index 0000000..953e15c --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedProver.java @@ -0,0 +1,175 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.msm.VariableBaseMSM; +import configuration.Configuration; +import org.apache.spark.api.java.JavaPairRDD; +import org.apache.spark.api.java.JavaRDD; +import reductions.r1cs_to_qap.R1CStoQAPRDD; +import relations.objects.Assignment; +import relations.qap.QAPRelationRDD; +import relations.qap.QAPWitnessRDD; +import scala.Tuple2; +import zk_proof_systems.zkSNARK.groth16.objects.Proof; +import zk_proof_systems.zkSNARK.groth16.objects.ProvingKeyRDD; + +public class DistributedProver { + public static < + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2> + Proof prove( + final ProvingKeyRDD provingKey, + final Assignment primary, + final JavaPairRDD fullAssignment, + final FieldT fieldFactory, + final Configuration config) { + // Note: `R1CStoQAPWitness` already checks the value of the configuration `debugFlag`, and + // already checks that the R1CS is satisfied on input `primary` and `fullAssignment`. No need to + // do it again it, this is redundant. 
+ // if (config.debugFlag()) { + // assert (provingKey.r1cs().isSatisfied(primary, fullAssignment)); + // } + + config.beginLog("Computing witness polynomial"); + final QAPWitnessRDD qapWitness = + R1CStoQAPRDD.R1CStoQAPWitness( + provingKey.r1cs(), primary, fullAssignment, fieldFactory, config); + config.endLog("Computing witness polynomial"); + + if (config.debugFlag()) { + // We are dividing degree 2(d-1) polynomial by degree d polynomial + // and not adding a PGHR-style ZK-patch, so our H is degree d-2. + final FieldT zero = fieldFactory.zero(); + // 1. Filter the coeffs to only get the 3 coeffs at indices d-2, d-1, d + // 2. Carry out checks on these 3 coeffs, namely: + // - coeff at d-2 can not be 0 (we want a poly of deg d-2) + // - coeffs at d-1, d must be 0 (we want a poly of deg d-2) + qapWitness + .coefficientsH() + .filter(e -> e._1 >= qapWitness.degree() - 2) + .foreach( + coeff -> { + if (coeff._1 == qapWitness.degree() - 2) { + assert (!coeff._2.equals(zero)); + } else if (coeff._1 > qapWitness.degree() - 2) { + assert (coeff._2.equals(zero)); + } + }); + // Check that the witness satisfies the QAP relation + // To that end, we pick a random evaluation point t and check + // that the QAP is satisfied (i.e. this random evaluation point is not the one in the SRS) + final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); + final QAPRelationRDD qap = + R1CStoQAPRDD.R1CStoQAPRelation(provingKey.r1cs(), t, config); + assert (qap.isSatisfied(qapWitness)); + } + + // Unpersist the R1CS constraints RDDs and free up memory. + provingKey.r1cs().constraints().A().unpersist(); + provingKey.r1cs().constraints().B().unpersist(); + provingKey.r1cs().constraints().C().unpersist(); + + // Choose two random field elements for prover zero-knowledge. + final FieldT r = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT s = fieldFactory.random(config.seed(), config.secureSeed()); + + // Get initial parameters from the proving key. + final G1T alphaG1 = provingKey.alphaG1(); + final G1T betaG1 = provingKey.betaG1(); + final G2T betaG2 = provingKey.betaG2(); + final G1T deltaG1 = provingKey.deltaG1(); + final G2T deltaG2 = provingKey.deltaG2(); + final G1T rsDelta = deltaG1.mul(r.mul(s)); + + // Number of partitions per RDD set in the config + final int numPartitions = config.numPartitions(); + + config.beginRuntime("Generate proof"); + + config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + // Get an RDD containing all pairs of elements with **matching keys** in `fullAssignment` and + // `provingKey.queryA()`. The result of this `.join()` will be a (k, (v1, v2)) tuple, where (k, + // v1) is in `fullAssignment` and (k, v2) is in `provingKey.queryA()`. Then, the `.value()` + // returns the tuple (v1, v2) - removing the keys - which is just an index used to associate the + // right scalar to the right group element in light of the `VariableBaseMSM` triggered at the + // next line. + final JavaRDD> computationA = + fullAssignment.join(provingKey.queryA(), numPartitions).values(); + // `evaluationAt` = \sum_{i=0}^{m} a_i * u_i(x) (in Groth16) + // where m = total number of wires + // a_i = ith wire/variable + final G1T evaluationAt = VariableBaseMSM.distributedMSM(computationA); + // Once `queryA` is not useful anymore, mark the RDD as non-persistent, and remove all blocks + // for it from memory and disk. 
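// For intuition, a toy sketch (hypothetical values) of the join/values/MSM pipeline used for
// each query in this method; once its MSM has been evaluated, a query RDD such as queryA can be
// dropped as done just below:
//   fullAssignment      : {(0, a_0), (1, a_1), ..., (m, a_m)}         // wire index -> scalar
//   provingKey.queryA() : {(0, [A_0(t)]_1), ..., (m, [A_m(t)]_1)}     // wire index -> G1 element
//   join(...).values()  : {(a_0, [A_0(t)]_1), ..., (a_m, [A_m(t)]_1)} // keys dropped, pairs kept
//   distributedMSM(...) : a_0*[A_0(t)]_1 + a_1*[A_1(t)]_1 + ... + a_m*[A_m(t)]_1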
+ provingKey.queryA().unpersist(); + config.endLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + + config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); + final JavaRDD>> computationB = + fullAssignment.join(provingKey.queryB(), numPartitions).values(); + // `evaluationBt` = \sum_{i=0}^{m} a_i * v_i(x) (in Groth16) + // where m = total number of wires + // a_i = ith wire/variable + // Note: We get an evaluation in G1 and G2, because B \in G2 is formed using this term, and C + // (\in G1) also uses this term (see below, B is of type `Tuple2` and will actually be + // computed in both G1 and G2 for this exact purpose). + final Tuple2 evaluationBt = VariableBaseMSM.distributedDoubleMSM(computationB); + provingKey.queryB().unpersist(); + config.endLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); + + // Compute evaluationABC = variable_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. + // In Groth16 notation, this is used for the computation of C: + // [\sum_{i=l+1}^{m} a_i*((beta*u_i(x) + alpha*v_i(x) + w_i(x))] /delta. + // where m = total number of wires + // a_i = ith wire/variable + config.beginLog("Computing evaluation to deltaABC"); + final JavaRDD> deltaABCAuxiliary = + fullAssignment.join(provingKey.deltaABCG1(), numPartitions).values(); + G1T evaluationABC = VariableBaseMSM.distributedMSM(deltaABCAuxiliary); + provingKey.deltaABCG1().unpersist(); + fullAssignment.unpersist(); + config.endLog("Computing evaluation to deltaABC"); + + config.beginLog("Computing evaluation to query H"); + // In Groth16 notations, `queryH` is the encoding in G1 of the vector <(x^i * t(x))/delta>, for + // i \in [0, n-2] + // As such, the value of `evaluationHtZtOverDelta` actually is: (h(x)t(x))/delta if we follow + // Groth's notations + final JavaRDD> computationH = + qapWitness.coefficientsH().join(provingKey.queryH(), numPartitions).values(); + final G1T evaluationHtZtOverDelta = VariableBaseMSM.distributedMSM(computationH); + provingKey.queryH().unpersist(); + // Add H(t)*Z(t)/delta to `evaluationABC` to get the first term of C, namely (following Groth's + // notations): + // [\sum_{i = l+1}^{m} a_i * (beta * u_i(x) + alpha * v_i(x) + w_i(x) + h(x)t(x))]/delta + evaluationABC = evaluationABC.add(evaluationHtZtOverDelta); + config.endLog("Computing evaluation to query H"); + + // A = alpha + sum_i(a_i*A_i(t)) + r*delta + final G1T A = alphaG1.add(evaluationAt).add(deltaG1.mul(r)); + + // B = beta + sum_i(a_i*B_i(t)) + s*delta + final Tuple2 B = + new Tuple2<>( + betaG1.add(evaluationBt._1).add(deltaG1.mul(s)), + betaG2.add(evaluationBt._2).add(deltaG2.mul(s))); + + // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - + // r*s*delta + final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); + + config.endRuntime("Generate proof"); + + return new Proof<>(A, B._2, C); + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedSetup.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedSetup.java new file mode 100755 index 0000000..f67f0cb --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/DistributedSetup.java @@ -0,0 +1,189 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.curves.AbstractGT; +import algebra.curves.AbstractPairing; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.msm.FixedBaseMSM; +import common.Utils; +import configuration.Configuration; +import java.util.List; +import org.apache.spark.api.java.JavaPairRDD; +import reductions.r1cs_to_qap.R1CStoQAPRDD; +import relations.qap.QAPRelationRDD; +import relations.r1cs.R1CSRelationRDD; +import scala.Tuple2; +import zk_proof_systems.zkSNARK.groth16.objects.CRS; +import zk_proof_systems.zkSNARK.groth16.objects.ProvingKeyRDD; +import zk_proof_systems.zkSNARK.groth16.objects.VerificationKey; + +public class DistributedSetup { + public static < + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2, + GTT extends AbstractGT, + PairingT extends AbstractPairing> + CRS generate( + final R1CSRelationRDD r1cs, + final FieldT fieldFactory, + final G1T g1Factory, + final G2T g2Factory, + final PairingT pairing, + final Configuration config) { + // Generate secret randomness. + final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT alpha = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT beta = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT gamma = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT delta = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT inverseGamma = gamma.inverse(); + final FieldT inverseDelta = delta.inverse(); + + // A quadratic arithmetic program evaluated at t. 
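+    // In Groth's notation, t plays the role of the secret evaluation point x: the QAP instance
+    // below holds the evaluations At = {A_i(t)}, Bt = {B_i(t)}, Ct = {C_i(t)}, the powers
+    // Ht = {t^i} and the vanishing polynomial Zt = Z(t), from which the CRS elements below
+    // are computed.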
+ final QAPRelationRDD qap = R1CStoQAPRDD.R1CStoQAPRelation(r1cs, t, config); + + final int numInputs = qap.numInputs(); + final long numVariables = qap.numVariables(); + final int numPartitions = config.numPartitions(); + + System.out.println("\tQAP - primary input size: " + numInputs); + System.out.println("\tQAP - total input size: " + numVariables); + System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); + System.out.println("\tQAP - degree: " + qap.degree()); + + // The gamma inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * gamma^{-1} + // The delta inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * delta^{-1} + config.beginLog("Computing deltaABC and gammaABC for R1CS proving key and verification key"); + final JavaPairRDD betaAt = qap.At().mapValues(a -> a.mul(beta)); + final JavaPairRDD alphaBt = qap.Bt().mapValues(b -> b.mul(alpha)); + final JavaPairRDD ABC = + betaAt + .union(alphaBt) + .union(qap.Ct()) + .reduceByKey(FieldT::add) + .persist(config.storageLevel()); + final JavaPairRDD gammaABC = + ABC.filter(e -> e._1 < numInputs).mapValues(e -> e.mul(inverseGamma)); + final JavaPairRDD deltaABC = + ABC.filter(e -> e._1 >= numInputs).mapValues(e -> e.mul(inverseDelta)); + config.endLog("Computing deltaABC and gammaABC for R1CS proving key and verification key"); + + config.beginLog("Computing query densities"); + final long numNonZeroAt = qap.At().filter(e -> !e._2.isZero()).count(); + final long numNonZeroBt = qap.Bt().filter(e -> !e._2.isZero()).count(); + config.endLog("Computing query densities"); + + config.beginLog("Generating G1 MSM Window Table"); + final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + final int scalarSizeG1 = generatorG1.bitSize(); + final long scalarCountG1 = numNonZeroAt + numNonZeroBt + numVariables; + final int windowSizeG1 = FixedBaseMSM.getWindowSize(scalarCountG1 / numPartitions, generatorG1); + final List> windowTableG1 = + FixedBaseMSM.getWindowTable(generatorG1, scalarSizeG1, windowSizeG1); + config.endLog("Generating G1 MSM Window Table"); + + config.beginLog("Generating G2 MSM Window Table"); + final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + final int scalarSizeG2 = generatorG2.bitSize(); + final long scalarCountG2 = numNonZeroBt; + final int windowSizeG2 = FixedBaseMSM.getWindowSize(scalarCountG2 / numPartitions, generatorG2); + final List> windowTableG2 = + FixedBaseMSM.getWindowTable(generatorG2, scalarSizeG2, windowSizeG2); + config.endLog("Generating G2 MSM Window Table"); + + config.beginLog("Generating R1CS proving key"); + config.beginRuntime("Proving Key"); + + final G1T alphaG1 = generatorG1.mul(alpha); + final G1T betaG1 = generatorG1.mul(beta); + final G2T betaG2 = generatorG2.mul(beta); + final G1T deltaG1 = generatorG1.mul(delta); + final G2T deltaG2 = generatorG2.mul(delta); + + config.beginLog("Encoding deltaABC for R1CS proving key"); + final JavaPairRDD deltaABCG1 = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, deltaABC, config.sparkContext()) + .persist(config.storageLevel()); + deltaABCG1.count(); + qap.Ct().unpersist(); + config.endLog("Encoding deltaABC for R1CS proving key"); + + config.beginLog("Computing query A"); + final JavaPairRDD queryA = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, qap.At(), config.sparkContext()) + .persist(config.storageLevel()); + queryA.count(); + qap.At().unpersist(); + config.endLog("Computing query A"); + + 
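+    // `queryB` is encoded in both G1 and G2: the G2 encoding of {B_i(t)} is used for the proof
+    // element B, while the G1 encoding is needed by the prover when assembling C (see
+    // DistributedProver), hence the double batch MSM below.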
config.beginLog("Computing query B"); + final JavaPairRDD> queryB = + FixedBaseMSM.distributedDoubleBatchMSM( + scalarSizeG1, + windowSizeG1, + windowTableG1, + scalarSizeG2, + windowSizeG2, + windowTableG2, + qap.Bt(), + config.sparkContext()) + .persist(config.storageLevel()); + queryB.count(); + qap.Bt().unpersist(); + config.endLog("Computing query B"); + + config.beginLog("Computing query H"); + final FieldT inverseDeltaZt = qap.Zt().mul(delta.inverse()); + final JavaPairRDD inverseDeltaHtZt = + qap.Ht().mapValues((e) -> e.mul(inverseDeltaZt)); + final JavaPairRDD queryH = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, inverseDeltaHtZt, config.sparkContext()) + .persist(config.storageLevel()); + queryH.count(); + qap.Ht().unpersist(); + config.endLog("Computing query H"); + + config.endLog("Generating R1CS proving key"); + config.endRuntime("Proving Key"); + + config.beginLog("Computing gammaABC for R1CS verification key"); + config.beginRuntime("Verification Key"); + final GTT alphaG1betaG2 = pairing.reducedPairing(alphaG1, betaG2); + final G2T gammaG2 = generatorG2.mul(gamma); + final JavaPairRDD gammaABCG1 = + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, gammaABC, config.sparkContext()) + .persist(config.storageLevel()); + final JavaPairRDD fullGammaABCG1 = + Utils.fillRDD(numInputs, generatorG1.zero(), config) + .union(gammaABCG1) + .reduceByKey(G1T::add); + final List UVWGammaG1 = Utils.convertFromPair(fullGammaABCG1.collect(), numInputs); + ABC.unpersist(); + config.endLog("Computing gammaABC for R1CS verification key"); + config.endRuntime("Verification Key"); + + // Construct the proving key. + final ProvingKeyRDD provingKey = + new ProvingKeyRDD<>( + alphaG1, betaG1, betaG2, deltaG1, deltaG2, deltaABCG1, queryA, queryB, queryH, r1cs); + + // Construct the verification key. + final VerificationKey verificationKey = + new VerificationKey<>(alphaG1betaG2, gammaG2, deltaG2, UVWGammaG1); + + return new CRS<>(provingKey, verificationKey); + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/README.md b/src/main/java/zk_proof_systems/zkSNARK/groth16/README.md new file mode 100755 index 0000000..d32f892 --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/README.md @@ -0,0 +1,58 @@ +# zkSNARK.groth16 + +This folder contains an implementation of a publicly-verifiable preprocessing zkSNARK of Groth [\[Gro16\]](https://eprint.iacr.org/2016/260). +The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). + +``` +Let \REL be defined as: +\REL = (\FF, inp, {A_i(x), B_i(x), C_i(x)}_{i=0}^{m}, t(x)) + +where: +- inp = prim_inputs \cup aux_inputs and where \inp_0 = \FF.mult_id (i.e. one) +- prim_inputs = (inp_1, ..., inp_l), i.e. l = |prim_inputs| +- aux_inputs = (inp_{l+1}, ..., inp_m), i.e. m-l = |aux_inputs|, m = |wires| + +\sum_{i=0}^{m} inp_i * A_i(x) * \sum_{i=0}^{m} inp_i * B_i(x) = \sum_{i=0}^{m} inp_i * C_i(x) + h(x)t(x) + +- deg(h(x)) = n-2 +- deg(t(x)) = n +``` + +## Setup + +``` +(\crs, \trap) \gets Setup(R) + +where: +- \trap \gets (\alpha, \beta, \gamma, \delta, x) +- \crs \gets (\crs_1, \crs_2), where \crs_1 (resp. \crs_2) is a set of \GG_1 (resp. 
G2) elements + +\crs_1 = \alpha, \beta, \delta, {x^i}_{i=0}^{n-1}, {(\beta * A_i + \alpha * B_i + C_i) / \gamma}_{i=0}^{l}, {(\beta * A_i + \alpha * B_i + C_i) / \delta}_{i=l+1}^{m}, {(x^i * t(x)) / \delta}_{i=0}^{n-2} + +\crs_2 = \beta, \gamma, \delta, {x^i}_{i=0}^{n-1} +``` + +## Prove + +``` +(\pi) \gets Prove(R, \crs, {inp_1, ..., inp_m}) + +\pi = (A, B, C) + +where: +- r, s \sample \FF +- A = \alpha + \sum_{i=0}^{m} inp_i*A_i(x) + r*\delta, A in \GG_1 +- B = \beta + \sum_{i=0}^{m} inp_i*B_i(x) + s*\delta, B in \GG_2 +- C = [\sum_{i=l+1}^{m} inp_i * (\beta * A_i(x) + \alpha * B_i(x) + C_i(x)) + h(x)t(x)] / \delta + A*s + B*r - rs*\delta +``` + +## Verify + +``` +(\res) \gets Verify(R, \crs, {inp_1, ..., inp_l}, \pi) + +\res \in \{0,1\} + +and where: +\res = Pairing(\pi.A, \pi.B) * Pairing(\sum_{i=0}^{l} inp_i *(\beta*A_i(x) + \alpha * B_i(x) + C_i(x)) / \gamma, \gamma) * Pairing(\pi.C, \delta) +``` \ No newline at end of file diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialProver.java new file mode 100755 index 0000000..9d3c490 --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialProver.java @@ -0,0 +1,129 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.msm.VariableBaseMSM; +import configuration.Configuration; +import reductions.r1cs_to_qap.R1CStoQAP; +import relations.objects.Assignment; +import relations.qap.QAPRelation; +import relations.qap.QAPWitness; +import scala.Tuple2; +import zk_proof_systems.zkSNARK.groth16.objects.Proof; +import zk_proof_systems.zkSNARK.groth16.objects.ProvingKey; + +public class SerialProver { + public static < + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2> + Proof prove( + final ProvingKey provingKey, + final Assignment primary, + final Assignment auxiliary, + final FieldT fieldFactory, + final Configuration config) { + // If the debug flag is set, check up-front that the R1CS is satisfied + if (config.debugFlag()) { + assert (provingKey.r1cs().isSatisfied(primary, auxiliary)); + } + + config.beginRuntime("Witness"); + config.beginLog("Computing witness polynomial"); + final QAPWitness qapWitness = + R1CStoQAP.R1CStoQAPWitness(provingKey.r1cs(), primary, auxiliary, fieldFactory, config); + config.endLog("Computing witness polynomial"); + config.endRuntime("Witness"); + + if (config.debugFlag()) { + // We are dividing degree 2(d-1) polynomial by degree d polynomial + // and not adding a PGHR-style ZK-patch, so our H is degree d-2. + final FieldT zero = fieldFactory.zero(); + // 1. Verify that H has a non-zero d-2 coefficient + assert (!qapWitness.coefficientsH(qapWitness.degree() - 2).equals(zero)); + // 2. Make sure that coefficients d-1 and d are 0 to make sure that the polynomial hasn't a + // degree higher than d-2 + assert (qapWitness.coefficientsH(qapWitness.degree() - 1).equals(zero)); + assert (qapWitness.coefficientsH(qapWitness.degree()).equals(zero)); + // Check that the witness satisfies the QAP relation. 
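+      // To that end, pick a fresh random evaluation point t (independent of the one used to
+      // generate the SRS) and check the QAP divisibility identity at that point:
+      //   A(t) * B(t) - C(t) == H(t) * Z(t)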
+ final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); + final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(provingKey.r1cs(), t); + assert (qap.isSatisfied(qapWitness)); + } + + // Choose two random field elements for prover zero-knowledge. + final FieldT r = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT s = fieldFactory.random(config.seed(), config.secureSeed()); + + // Get initial parameters from the proving key. + final G1T alphaG1 = provingKey.alphaG1(); + final G1T betaG1 = provingKey.betaG1(); + final G2T betaG2 = provingKey.betaG2(); + final G1T deltaG1 = provingKey.deltaG1(); + final G2T deltaG2 = provingKey.deltaG2(); + final G1T rsDelta = deltaG1.mul(r.mul(s)); + + final int numInputs = provingKey.r1cs().numInputs(); + final int numVariables = provingKey.r1cs().numVariables(); + + config.beginRuntime("Proof"); + + config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + G1T evaluationAt = + VariableBaseMSM.serialMSM(primary.elements(), provingKey.queryA().subList(0, numInputs)); + evaluationAt = + evaluationAt.add( + VariableBaseMSM.serialMSM( + auxiliary.elements(), provingKey.queryA().subList(numInputs, numVariables))); + config.endLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); + + config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); + final Tuple2 evaluationBtPrimary = + VariableBaseMSM.doubleMSM(primary.elements(), provingKey.queryB().subList(0, numInputs)); + final Tuple2 evaluationBtWitness = + VariableBaseMSM.doubleMSM( + auxiliary.elements(), provingKey.queryB().subList(numInputs, numVariables)); + final G1T evaluationBtG1 = evaluationBtPrimary._1.add(evaluationBtWitness._1); + final G2T evaluationBtG2 = evaluationBtPrimary._2.add(evaluationBtWitness._2); + config.endLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); + + config.beginLog("Computing evaluation to query H"); + final G1T evaluationHtZt = + VariableBaseMSM.serialMSM(qapWitness.coefficientsH(), provingKey.queryH()); + config.endLog("Computing evaluation to query H"); + + // Compute evaluationABC = a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. 
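+    // Note that this summation only runs over the auxiliary (witness) variables, i.e.
+    // i in [numInputs, numVariables): the corresponding term for the primary inputs is
+    // encoded in the verification key (gammaABC) and recomputed by the verifier.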
+ config.beginLog("Computing evaluation to deltaABC"); + final int numWitness = numVariables - numInputs; + G1T evaluationABC = + VariableBaseMSM.serialMSM( + auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); + evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta + config.endLog("Computing evaluation to deltaABC"); + + // A = alpha + sum_i(a_i*A_i(t)) + r*delta + final G1T A = alphaG1.add(evaluationAt).add(deltaG1.mul(r)); + + // B = beta + sum_i(a_i*B_i(t)) + s*delta + final Tuple2 B = + new Tuple2<>( + betaG1.add(evaluationBtG1).add(deltaG1.mul(s)), + betaG2.add(evaluationBtG2).add(deltaG2.mul(s))); + + // C = sum_i(a_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - + // r*s*delta + final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); + + config.endRuntime("Proof"); + + return new Proof<>(A, B._2, C); + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialSetup.java new file mode 100755 index 0000000..e60ea3d --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/SerialSetup.java @@ -0,0 +1,176 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.curves.AbstractGT; +import algebra.curves.AbstractPairing; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.msm.FixedBaseMSM; +import configuration.Configuration; +import java.util.ArrayList; +import java.util.List; +import reductions.r1cs_to_qap.R1CStoQAP; +import relations.qap.QAPRelation; +import relations.r1cs.R1CSRelation; +import scala.Tuple2; +import zk_proof_systems.zkSNARK.groth16.objects.CRS; +import zk_proof_systems.zkSNARK.groth16.objects.ProvingKey; +import zk_proof_systems.zkSNARK.groth16.objects.VerificationKey; + +public class SerialSetup { + public static < + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2, + GTT extends AbstractGT, + PairingT extends AbstractPairing> + CRS generate( + final R1CSRelation r1cs, + final FieldT fieldFactory, + final G1T g1Factory, + final G2T g2Factory, + final PairingT pairing, + final Configuration config) { + // Generate secret randomness. + final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT alpha = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT beta = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT gamma = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT delta = fieldFactory.random(config.seed(), config.secureSeed()); + final FieldT inverseGamma = gamma.inverse(); + final FieldT inverseDelta = delta.inverse(); + + // A quadratic arithmetic program evaluated at t. 
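+    // t is part of the trapdoor (alpha, beta, gamma, delta, t) and must not leak outside of
+    // this function: only the group encodings derived from it below are published in the CRS.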
+ final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(r1cs, t); + + System.out.println("\tQAP - primary input size: " + qap.numInputs()); + System.out.println("\tQAP - total input size: " + qap.numVariables()); + System.out.println("\tQAP - pre degree: " + r1cs.numConstraints()); + System.out.println("\tQAP - degree: " + qap.degree()); + + final int numInputs = qap.numInputs(); + final int numVariables = qap.numVariables(); + + // The gamma inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * gamma^{-1}. + config.beginLog("Computing gammaABC for R1CS verification key"); + final List gammaABC = new ArrayList<>(numInputs); + for (int i = 0; i < numInputs; i++) { + gammaABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i)).mul(inverseGamma)); + } + config.endLog("Computing gammaABC for R1CS verification key"); + + // The delta inverse product component: (beta*A_i(t) + alpha*B_i(t) + C_i(t)) * delta^{-1}. + config.beginLog("Computing deltaABC for R1CS proving key"); + final List deltaABC = new ArrayList<>(numVariables - numInputs); + for (int i = numInputs; i < numVariables; i++) { + deltaABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i)).mul(inverseDelta)); + } + config.endLog("Computing deltaABC for R1CS proving key"); + + config.beginLog("Computing query densities"); + int nonZeroAt = 0; + int nonZeroBt = 0; + for (int i = 0; i < qap.numVariables(); i++) { + if (!qap.At(i).isZero()) { + nonZeroAt++; + } + if (!qap.Bt(i).isZero()) { + nonZeroBt++; + } + } + config.endLog("Computing query densities"); + + config.beginLog("Generating G1 MSM Window Table"); + final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + final int scalarCountG1 = nonZeroAt + nonZeroBt + numVariables; + final int scalarSizeG1 = generatorG1.bitSize(); + final int windowSizeG1 = FixedBaseMSM.getWindowSize(scalarCountG1, generatorG1); + final List> windowTableG1 = + FixedBaseMSM.getWindowTable(generatorG1, scalarSizeG1, windowSizeG1); + config.endLog("Generating G1 MSM Window Table"); + + config.beginLog("Generating G2 MSM Window Table"); + final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + final int scalarCountG2 = nonZeroBt; + final int scalarSizeG2 = generatorG2.bitSize(); + final int windowSizeG2 = FixedBaseMSM.getWindowSize(scalarCountG2, generatorG2); + final List> windowTableG2 = + FixedBaseMSM.getWindowTable(generatorG2, scalarSizeG2, windowSizeG2); + config.endLog("Generating G2 MSM Window Table"); + + config.beginLog("Generating R1CS proving key"); + config.beginRuntime("Proving Key"); + + final G1T alphaG1 = generatorG1.mul(alpha); + final G1T betaG1 = generatorG1.mul(beta); + final G2T betaG2 = generatorG2.mul(beta); + final G1T deltaG1 = generatorG1.mul(delta); + final G2T deltaG2 = generatorG2.mul(delta); + + config.beginLog("Encode deltaABC for R1CS proving key", false); + final List deltaABCG1 = + FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, deltaABC); + config.endLog("Encode deltaABC for R1CS proving key", false); + + config.beginLog("Computing query A", false); + final List queryA = + FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, qap.At()); + config.endLog("Computing query A", false); + + config.beginLog("Computing query B", false); + final List> queryB = + FixedBaseMSM.doubleBatchMSM( + scalarSizeG1, + windowSizeG1, + windowTableG1, + scalarSizeG2, + windowSizeG2, + windowTableG2, + qap.Bt()); + config.endLog("Computing query B", false); + + 
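+    // `queryH` encodes {(t^i * Z(t)) / delta} in G1 for i in [0, n-2]; combined with the
+    // coefficients of H (a degree n-2 polynomial) computed by the prover, it yields the
+    // [H(t)*Z(t)/delta]_1 term of C.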
config.beginLog("Computing query H", false); + final FieldT inverseDeltaZt = qap.Zt().mul(delta.inverse()); + for (int i = 0; i < qap.Ht().size(); i++) { + qap.Ht().set(i, qap.Ht().get(i).mul(inverseDeltaZt)); + } + final List queryH = + FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, qap.Ht()); + config.endLog("Computing query H", false); + + config.endLog("Generating R1CS proving key"); + config.endRuntime("Proving Key"); + + config.beginLog("Generating R1CS verification key"); + config.beginRuntime("Verification Key"); + + final GTT alphaG1betaG2 = pairing.reducedPairing(alphaG1, betaG2); + final G2T gammaG2 = generatorG2.mul(gamma); + + config.beginLog("Encoding gammaABC for R1CS verification key"); + final List gammaABCG1 = + FixedBaseMSM.batchMSM(scalarSizeG1, windowSizeG1, windowTableG1, gammaABC); + config.endLog("Encoding gammaABC for R1CS verification key"); + + config.endLog("Generating R1CS verification key"); + config.endRuntime("Verification Key"); + + // Construct the proving key. + final ProvingKey provingKey = + new ProvingKey<>( + alphaG1, betaG1, betaG2, deltaG1, deltaG2, deltaABCG1, queryA, queryB, queryH, r1cs); + + // Construct the verification key. + final VerificationKey verificationKey = + new VerificationKey<>(alphaG1betaG2, gammaG2, deltaG2, gammaABCG1); + + return new CRS<>(provingKey, verificationKey); + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/Verifier.java new file mode 100755 index 0000000..c291bc9 --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/Verifier.java @@ -0,0 +1,62 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.curves.AbstractGT; +import algebra.curves.AbstractPairing; +import algebra.fields.AbstractFieldElementExpanded; +import algebra.msm.VariableBaseMSM; +import configuration.Configuration; +import relations.objects.Assignment; +import zk_proof_systems.zkSNARK.groth16.objects.Proof; +import zk_proof_systems.zkSNARK.groth16.objects.VerificationKey; + +public class Verifier { + public static < + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2, + GTT extends AbstractGT, + PairingT extends AbstractPairing> + boolean verify( + final VerificationKey verificationKey, + final Assignment primaryInput, + final Proof proof, + final PairingT pairing, + final Configuration config) { + // Assert first element == FieldT.one(). + final FieldT firstElement = primaryInput.get(0); + assert (firstElement.equals(firstElement.one())); + + // Compute the left hand side: A * B. + final GTT AB = pairing.reducedPairing(proof.gA(), proof.gB()); + + // Get alpha, beta, gamma, and delta from the verification key. 
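+    // The pairing check assembled below is:
+    //   e(A, B) == e([alpha]_1, [beta]_2)
+    //             * e(\sum_{i=0}^{l} inp_i * [(beta*A_i(x) + alpha*B_i(x) + C_i(x))/gamma]_1, [gamma]_2)
+    //             * e(C, [delta]_2)
+    // where e([alpha]_1, [beta]_2) is precomputed in the verification key (alphaG1betaG2).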
+ final GTT alphaBeta = verificationKey.alphaG1betaG2(); // alpha * beta + final G2T gamma = verificationKey.gammaG2(); + final G2T delta = verificationKey.deltaG2(); + final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); // C * delta + + // Compute the summation of primaryInput[i] * (beta*u_i(x) + alpha*v_i(x) + w_i(x)) * gamma^{-1} + final G1T evaluationABC = + VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.gammaABC()); + + // Compute the right hand side: alpha*beta + evaluationABC*gamma + C*delta + final GTT C = alphaBeta.add(pairing.reducedPairing(evaluationABC, gamma)).add(CDelta); + + final boolean verifierResult = AB.equals(C); + if (!verifierResult && config.verboseFlag()) { + System.out.println("Verifier failed: "); + System.out.println("\n\tA * B = " + AB + "\nC = " + C); + } + + return verifierResult; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/CRS.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/CRS.java new file mode 100755 index 0000000..b9b2ccf --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/CRS.java @@ -0,0 +1,55 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16.objects; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.curves.AbstractGT; +import algebra.fields.AbstractFieldElementExpanded; +import java.io.Serializable; + +/** Groth16 Common Reference String (CRS) */ +public class CRS< + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2, + GTT extends AbstractGT> + implements Serializable { + + private final ProvingKey provingKey; + private final ProvingKeyRDD provingKeyRDD; + private final VerificationKey verificationKey; + + public CRS( + final ProvingKeyRDD _provingKeyRDD, + final VerificationKey _verificationKey) { + provingKey = null; + provingKeyRDD = _provingKeyRDD; + verificationKey = _verificationKey; + } + + public CRS( + final ProvingKey _provingKey, + final VerificationKey _verificationKey) { + provingKey = _provingKey; + provingKeyRDD = null; + verificationKey = _verificationKey; + } + + public ProvingKey provingKey() { + return provingKey; + } + + public ProvingKeyRDD provingKeyRDD() { + return provingKeyRDD; + } + + public VerificationKey verificationKey() { + return verificationKey; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/Proof.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/Proof.java new file mode 100755 index 0000000..d22843e --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/Proof.java @@ -0,0 +1,37 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16.objects; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; + +/** Groth16 argument */ +public class Proof, G2T extends AbstractG2> { + + private final G1T gA; + private final G2T gB; + private final G1T gC; + + public Proof(final G1T _gA, final G2T _gB, final G1T _gC) { + gA = _gA; + gB = _gB; + gC = _gC; + } + + public G1T gA() { + return gA; + } + + public G2T gB() { + return gB; + } + + public G1T gC() { + return gC; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKey.java new file mode 100755 index 0000000..4dca625 --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKey.java @@ -0,0 +1,99 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16.objects; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.fields.AbstractFieldElementExpanded; +import java.io.Serializable; +import java.util.List; +import relations.r1cs.R1CSRelation; +import scala.Tuple2; + +/** Groth16 proving key */ +public class ProvingKey< + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2> + implements Serializable { + + private final G1T alphaG1; + private final G1T betaG1; + private final G2T betaG2; + private final G1T deltaG1; + private final G2T deltaG2; + private final List deltaABCG1; + private final List queryA; + private final List> queryB; + private final List queryH; + // The proving key holds the arithmetized relation + private final R1CSRelation r1cs; + + public ProvingKey( + final G1T _alphaG1, + final G1T _betaG1, + final G2T _betaG2, + final G1T _deltaG1, + final G2T _deltaG2, + final List _deltaABCG1, + final List _queryA, + final List> _queryB, + final List _queryH, + final R1CSRelation _r1cs) { + alphaG1 = _alphaG1; + betaG1 = _betaG1; + betaG2 = _betaG2; + deltaG1 = _deltaG1; + deltaG2 = _deltaG2; + deltaABCG1 = _deltaABCG1; + queryA = _queryA; + queryB = _queryB; + queryH = _queryH; + r1cs = _r1cs; + } + + public G1T alphaG1() { + return alphaG1; + } + + public G1T betaG1() { + return betaG1; + } + + public G2T betaG2() { + return betaG2; + } + + public G1T deltaG1() { + return deltaG1; + } + + public G2T deltaG2() { + return deltaG2; + } + + public List deltaABCG1() { + return deltaABCG1; + } + + public List queryA() { + return queryA; + } + + public List> queryB() { + return queryB; + } + + public List queryH() { + return queryH; + } + + public R1CSRelation r1cs() { + return r1cs; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKeyRDD.java new file mode 100755 index 0000000..670ca58 --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/ProvingKeyRDD.java @@ -0,0 +1,111 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16.objects; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.fields.AbstractFieldElementExpanded; +import java.io.Serializable; +import org.apache.spark.api.java.JavaPairRDD; +import relations.r1cs.R1CSRelationRDD; +import scala.Tuple2; + +public class ProvingKeyRDD< + FieldT extends AbstractFieldElementExpanded, + G1T extends AbstractG1, + G2T extends AbstractG2> + implements Serializable { + + // Below, [x]_1 (resp. [x]_2 and []_T) represents the encoding of x in G1 (resp. G2 and GT) + // We follow the notations in Groth16 (namely, polynomials are denoted u, v, w, h, t instead of A, + // B, C, H, Z. Moreover the evaluation point is denoted by x) + // + // [alpha]_1 + private final G1T alphaG1; + // [beta]_1 + private final G1T betaG1; + // [beta]_2 + private final G2T betaG2; + // [delta]_1 + private final G1T deltaG1; + // [delta]_2 + private final G2T deltaG2; + // {[(beta * u_i(x) + alpha * v_i(x) + w_i(x))/delta]_1} + private final JavaPairRDD deltaABCG1; + // {[u_i(x)]_1} + private final JavaPairRDD queryA; + // {[v_i(x)]_1} + private final JavaPairRDD> queryB; + // {[(x^i * t(x))/delta]_1} + private final JavaPairRDD queryH; + // The proving key contains an arithmetized relation + private final R1CSRelationRDD r1cs; + + public ProvingKeyRDD( + final G1T _alphaG1, + final G1T _betaG1, + final G2T _betaG2, + final G1T _deltaG1, + final G2T _deltaG2, + final JavaPairRDD _deltaABCG1, + final JavaPairRDD _queryA, + final JavaPairRDD> _queryB, + final JavaPairRDD _queryH, + final R1CSRelationRDD _r1cs) { + alphaG1 = _alphaG1; + betaG1 = _betaG1; + betaG2 = _betaG2; + deltaG1 = _deltaG1; + deltaG2 = _deltaG2; + deltaABCG1 = _deltaABCG1; + queryA = _queryA; + queryB = _queryB; + queryH = _queryH; + r1cs = _r1cs; + } + + public G1T alphaG1() { + return alphaG1; + } + + public G1T betaG1() { + return betaG1; + } + + public G2T betaG2() { + return betaG2; + } + + public G1T deltaG1() { + return deltaG1; + } + + public G2T deltaG2() { + return deltaG2; + } + + public JavaPairRDD deltaABCG1() { + return deltaABCG1; + } + + public JavaPairRDD queryA() { + return queryA; + } + + public JavaPairRDD> queryB() { + return queryB; + } + + public JavaPairRDD queryH() { + return queryH; + } + + public R1CSRelationRDD r1cs() { + return r1cs; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/VerificationKey.java new file mode 100755 index 0000000..e064a6a --- /dev/null +++ b/src/main/java/zk_proof_systems/zkSNARK/groth16/objects/VerificationKey.java @@ -0,0 +1,77 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16.objects; + +import algebra.curves.AbstractG1; +import algebra.curves.AbstractG2; +import algebra.curves.AbstractGT; +import java.util.List; + +/** Groth16 verification key */ +public class VerificationKey< + G1T extends AbstractG1, G2T extends AbstractG2, GTT extends AbstractGT> { + + private final GTT alphaG1betaG2; + private final G2T gammaG2; + private final G2T deltaG2; + private final List gammaABC; + + public VerificationKey( + final GTT _alphaG1betaG2, final G2T _gammaG2, final G2T _deltaG2, final List _gammaABC) { + alphaG1betaG2 = _alphaG1betaG2; + gammaG2 = _gammaG2; + deltaG2 = _deltaG2; + gammaABC = _gammaABC; + } + + public boolean equals(final VerificationKey other) { + if (!alphaG1betaG2.equals(other.alphaG1betaG2())) { + return false; + } + + if (!gammaG2.equals(other.gammaG2())) { + return false; + } + + if (!deltaG2.equals(other.deltaG2())) { + return false; + } + + if (gammaABC.size() != other.gammaABC().size()) { + return false; + } + + for (int i = 0; i < gammaABC.size(); i++) { + if (!gammaABC(i).equals(other.gammaABC(i))) { + return false; + } + } + + return true; + } + + public GTT alphaG1betaG2() { + return alphaG1betaG2; + } + + public G2T gammaG2() { + return gammaG2; + } + + public G2T deltaG2() { + return deltaG2; + } + + public G1T gammaABC(final int i) { + return gammaABC.get(i); + } + + public List gammaABC() { + return gammaABC; + } +} diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md index 156f13e..cdefd93 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/README.md @@ -3,58 +3,10 @@ This folder contains an implementation of **a variant** of the publicly-verifiable preprocessing zkSNARK [\[Gro16\]](https://eprint.iacr.org/2016/260) proposed by Bowe et al. [\[BGM17\]](https://eprint.iacr.org/2017/1050.pdf). The NP relation supported by this protocol is *Rank-1 Constraint Satisfaction* (R1CS). -## zkSNARK.groth16 +## CRS structure modifications ``` -Let \REL be defined as: -\REL = (\FF, inp, {A_i(x), B_i(x), C_i(x)}_{i=0}^{m}, t(x)) - -where: -- inp = prim_inputs \cup aux_inputs and where \inp_0 = \FF.mult_id (i.e. one) -- prim_inputs = (inp_1, ..., inp_l), i.e. l = |prim_inputs| -- aux_inputs = (inp_{l+1}, ..., inp_m), i.e. m-l = |aux_inputs|, m = |wires| - -\sum_{i=0}^{m} inp_i * A_i(x) * \sum_{i=0}^{m} inp_i * B_i(x) = \sum_{i=0}^{m} inp_i * C_i(x) + h(x)t(x) - -- deg(h(x)) = n-2 -- deg(t(x)) = n -``` - -### Setup - -``` -(\crs, \trap) \gets Setup(R) - -where: -- \trap \gets (\alpha, \beta, \gamma, \delta, x) -- \crs \gets (\crs_1, \crs_2), where \crs_1 (resp. \crs_2) is a set of \GG_1 (resp. 
G2) elements - -\crs_1 = \alpha, \beta, \delta, {x^i}_{i=0}^{n-1}, {(\beta * A_i + \alpha * B_i + C_i) / \gamma}_{i=0}^{l}, {(\beta * A_i + \alpha * B_i + C_i) / \delta}_{i=l+1}^{m}, {(x^i * t(x)) / \delta}_{i=0}^{n-2} - -\crs_2 = \beta, \gamma, \delta, {x^i}_{i=0}^{n-1} -``` - -### Prove - -``` -(\pi) \gets Prove(R, \crs, {inp_1, ..., inp_m}) - -\pi = (A, B, C) - -where: -- r, s \sample \FF -- A = \alpha + \sum_{i=0}^{m} inp_i*A_i(x) + r*\delta, A in \GG_1 -- B = \beta + \sum_{i=0}^{m} inp_i*B_i(x) + s*\delta, B in \GG_2 -- C = [\sum_{i=l+1}^{m} inp_i * (\beta * A_i(x) + \alpha * B_i(x) + C_i(x)) + h(x)t(x)] / \delta + A*s + B*r - rs*\delta -``` - -### Verify - -``` -(\res) \gets Verify(R, \crs, {inp_1, ..., inp_l}, \pi) - -\res \in \{0,1\} - -and where: -\res = Pairing(\pi.A, \pi.B) * Pairing(\sum_{i=0}^{l} inp_i *(\beta*A_i(x) + \alpha * B_i(x) + C_i(x)) / \gamma, \gamma) * Pairing(\pi.C, \delta) +- \gamma elements are removed +- Additional "powers of tau" are computed during the MPC +- {[A_i(x)]_1}_{i=0}^{m}, {[B_i(x)]_2}_{i=0}^{m} are added to the SRS to fasten the Prover algorithm ``` \ No newline at end of file diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java index e2c71df..33e0b79 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import static org.junit.jupiter.api.Assertions.assertTrue; @@ -41,8 +41,8 @@ import relations.objects.Assignment; import relations.r1cs.R1CSRelationRDD; import scala.Tuple3; -import zk_proof_systems.zkSNARK.objects.CRS; -import zk_proof_systems.zkSNARK.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.objects.CRS; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; public class DistributedzkSNARKTest implements Serializable { private transient JavaSparkContext sc; diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index f86d05c..c1e7550 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package zk_proof_systems.zkSNARK; +package zk_proof_systems.zkSNARK.grothBGM17; import static org.junit.jupiter.api.Assertions.assertTrue; @@ -38,8 +38,8 @@ import relations.objects.Assignment; import relations.r1cs.R1CSRelation; import scala.Tuple3; -import zk_proof_systems.zkSNARK.objects.CRS; -import zk_proof_systems.zkSNARK.objects.Proof; +import zk_proof_systems.zkSNARK.grothBGM17.objects.CRS; +import zk_proof_systems.zkSNARK.grothBGM17.objects.Proof; public class SerialzkSNARKTest implements Serializable { private Configuration config; From 0acbf427a2340dfff6c3ef566c0028ae04518587 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 15:25:08 +0000 Subject: [PATCH 48/94] Plain groth tests --- .../groth16/DistributedzkSNARKTest.java | 176 ++++++++++++++++++ .../zkSNARK/groth16/SerialzkSNARKTest.java | 159 ++++++++++++++++ 
2 files changed, 335 insertions(+) create mode 100755 src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java create mode 100755 src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java diff --git a/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java new file mode 100755 index 0000000..76988c2 --- /dev/null +++ b/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java @@ -0,0 +1,176 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.curves.barreto_naehrig.*; +import algebra.curves.barreto_naehrig.BNFields.*; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; +import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; +import algebra.curves.barreto_naehrig.bn254b.BN254bG1; +import algebra.curves.barreto_naehrig.bn254b.BN254bG2; +import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; +import algebra.curves.fake.*; +import configuration.Configuration; +import java.io.Serializable; +import org.apache.spark.SparkConf; +import org.apache.spark.api.java.JavaPairRDD; +import org.apache.spark.api.java.JavaSparkContext; +import org.apache.spark.storage.StorageLevel; +import org.junit.jupiter.api.AfterEach; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import profiler.generation.R1CSConstructor; +import profiler.utils.SparkUtils; +import relations.objects.Assignment; +import relations.r1cs.R1CSRelationRDD; +import scala.Tuple3; +import zk_proof_systems.zkSNARK.groth16.objects.CRS; +import zk_proof_systems.zkSNARK.groth16.objects.Proof; + +public class DistributedzkSNARKTest implements Serializable { + private transient JavaSparkContext sc; + private Configuration config; + + @BeforeEach + public void setUp() { + final SparkConf conf = new SparkConf().setMaster("local").setAppName("default"); + conf.set("spark.files.overwrite", "true"); + conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer"); + conf.registerKryoClasses(SparkUtils.zksparkClasses()); + + sc = new JavaSparkContext(conf); + + config = new Configuration(1, 1, 1, 2, sc, StorageLevel.MEMORY_ONLY()); + config.setRuntimeFlag(false); + config.setDebugFlag(true); + } + + @AfterEach + public void tearDown() { + sc.stop(); + sc = null; + } + + private < + BNFrT extends BNFr, + BNFqT extends BNFq, + BNFq2T extends BNFq2, + BNFq6T extends BNFq6, + BNFq12T extends BNFq12, + BNG1T extends BNG1, + BNG2T extends BNG2, + BNGTT extends BNGT, + BNG1ParametersT extends AbstractBNG1Parameters, + BNG2ParametersT extends + AbstractBNG2Parameters, + BNGTParametersT extends + AbstractBNGTParameters, + BNPublicParametersT extends BNPublicParameters, + BNPairingT extends + BNPairing< + BNFrT, + 
BNFqT, + BNFq2T, + BNFq6T, + BNFq12T, + BNG1T, + BNG2T, + BNGTT, + BNG1ParametersT, + BNG2ParametersT, + BNGTParametersT, + BNPublicParametersT>> + void DistributedBNProofSystemTest( + final int numInputs, + final int numConstraints, + BNFrT fieldFactory, + BNG1T g1Factory, + BNG2T g2Factory, + BNPairingT pairing) { + final Tuple3, Assignment, JavaPairRDD> construction = + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelationRDD r1cs = construction._1(); + final Assignment primary = construction._2(); + final JavaPairRDD fullAssignment = construction._3(); + + final CRS CRS = + DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + + final Proof proof = + DistributedProver.prove(CRS.provingKeyRDD(), primary, fullAssignment, fieldFactory, config); + + final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); + + System.out.println(isValid); + assertTrue(isValid); + } + + // TODO: + // Remove this comment when: https://github.com/clearmatics/dizk/issues/1 + // is fixed. + /* + @Test + public void DistributedFakeProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + + FakeInitialize.init(); + final Fp fieldFactory = new FakeFqParameters().ONE(); + final FakeG1 fakeG1Factory = new FakeG1Parameters().ONE(); + final FakeG2 fakeG2Factory = new FakeG2Parameters().ONE(); + final FakePairing fakePairing = new FakePairing(); + + final Tuple3, Assignment, JavaPairRDD> construction = + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelationRDD r1cs = construction._1(); + final Assignment primary = construction._2(); + final JavaPairRDD fullAssignment = construction._3(); + + final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, + fakeG1Factory, fakeG2Factory, fakePairing, config); + final Proof proof = DistributedProver.prove(CRS.provingKeyRDD(), primary, + fullAssignment, fieldFactory, config); + final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, + fakePairing, config); + + System.out.println(isValid); + assertTrue(isValid); + } + + @Test + public void DistributedBN254aProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254aFr fieldFactory = new BN254aFr(1); + final BN254aG1 g1Factory = BN254aG1Parameters.ONE; + final BN254aG2 g2Factory = BN254aG2Parameters.ONE; + final BN254aPairing pairing = new BN254aPairing(); + + DistributedBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } + */ + + @Test + public void DistributedBN254bProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254bFr fieldFactory = new BN254bFr(1); + final BN254bG1 g1Factory = BN254bG1Parameters.ONE; + final BN254bG2 g2Factory = BN254bG2Parameters.ONE; + final BN254bPairing pairing = new BN254bPairing(); + + DistributedBNProofSystemTest( + numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } +} diff --git a/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java new file mode 100755 index 0000000..6e52e01 --- /dev/null +++ b/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java @@ -0,0 +1,159 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + 
* and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package zk_proof_systems.zkSNARK.groth16; + +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.curves.barreto_naehrig.*; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; +import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; +import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +import algebra.curves.barreto_naehrig.bn254a.BN254aG1; +import algebra.curves.barreto_naehrig.bn254a.BN254aG2; +import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; +import algebra.curves.barreto_naehrig.bn254b.BN254bFields; +import algebra.curves.barreto_naehrig.bn254b.BN254bG1; +import algebra.curves.barreto_naehrig.bn254b.BN254bG2; +import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; +import algebra.curves.fake.*; +import algebra.curves.fake.fake_parameters.FakeFqParameters; +import algebra.curves.fake.fake_parameters.FakeG1Parameters; +import algebra.curves.fake.fake_parameters.FakeG2Parameters; +import algebra.fields.Fp; +import configuration.Configuration; +import java.io.Serializable; +import org.junit.jupiter.api.BeforeEach; +import org.junit.jupiter.api.Test; +import profiler.generation.R1CSConstructor; +import relations.objects.Assignment; +import relations.r1cs.R1CSRelation; +import scala.Tuple3; +import zk_proof_systems.zkSNARK.groth16.objects.CRS; +import zk_proof_systems.zkSNARK.groth16.objects.Proof; + +public class SerialzkSNARKTest implements Serializable { + private Configuration config; + + @BeforeEach + public void setUp() { + config = new Configuration(); + config.setRuntimeFlag(false); + config.setDebugFlag(true); + } + + private < + BNFrT extends BNFields.BNFr, + BNFqT extends BNFields.BNFq, + BNFq2T extends BNFields.BNFq2, + BNFq6T extends BNFields.BNFq6, + BNFq12T extends BNFields.BNFq12, + BNG1T extends BNG1, + BNG2T extends BNG2, + BNGTT extends BNGT, + BNG1ParametersT extends AbstractBNG1Parameters, + BNG2ParametersT extends + AbstractBNG2Parameters, + BNGTParametersT extends + AbstractBNGTParameters, + BNPublicParametersT extends BNPublicParameters, + BNPairingT extends + BNPairing< + BNFrT, + BNFqT, + BNFq2T, + BNFq6T, + BNFq12T, + BNG1T, + BNG2T, + BNGTT, + BNG1ParametersT, + BNG2ParametersT, + BNGTParametersT, + BNPublicParametersT>> + void SerialBNProofSystemTest( + final int numInputs, + final int numConstraints, + BNFrT fieldFactory, + BNG1T g1Factory, + BNG2T g2Factory, + BNPairingT pairing) { + final Tuple3, Assignment, Assignment> construction = + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelation r1cs = construction._1(); + final Assignment primary = construction._2(); + final Assignment fullAssignment = construction._3(); + + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + + final Proof proof = + SerialProver.prove(CRS.provingKey(), primary, fullAssignment, fieldFactory, 
config); + + final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); + + System.out.println(isValid); + assertTrue(isValid); + } + + @Test + public void SerialFakeProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + + FakeInitialize.init(); + final Fp fieldFactory = new FakeFqParameters().ONE(); + final FakeG1 fakeG1Factory = new FakeG1Parameters().ONE(); + final FakeG2 fakeG2Factory = new FakeG2Parameters().ONE(); + final FakePairing fakePairing = new FakePairing(); + + final Tuple3, Assignment, Assignment> construction = + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelation r1cs = construction._1(); + final Assignment primary = construction._2(); + final Assignment auxiliary = construction._3(); + + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, fakeG1Factory, fakeG2Factory, fakePairing, config); + final Proof proof = + SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); + final boolean isValid = + Verifier.verify(CRS.verificationKey(), primary, proof, fakePairing, config); + + System.out.println(isValid); + assertTrue(isValid); + } + + @Test + public void SerialBN254aProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254aFr fieldFactory = BN254aFr.ONE; + final BN254aG1 g1Factory = BN254aG1Parameters.ONE; + final BN254aG2 g2Factory = BN254aG2Parameters.ONE; + final BN254aPairing pairing = new BN254aPairing(); + + SerialBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } + + @Test + public void SerialBN254bProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254bFields.BN254bFr fieldFactory = new BN254bFields.BN254bFr(1); + final BN254bG1 g1Factory = BN254bG1Parameters.ONE; + final BN254bG2 g2Factory = BN254bG2Parameters.ONE; + final BN254bPairing pairing = new BN254bPairing(); + + SerialBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } +} From 1152fa28fcd1ffcff57efa6517258f677fd632bc Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 15:26:54 +0000 Subject: [PATCH 49/94] Applied spotless --- src/main/java/profiler/Profiler.java | 3 +- .../profiler/profiling/ZKSNARKProfiling.java | 3 +- .../zkSNARK/grothBGM17/DistributedSetup.java | 10 ++--- .../zkSNARK/grothBGM17/SerialProver.java | 37 +++++++++++-------- .../zkSNARK/grothBGM17/SerialSetup.java | 10 +++-- .../zkSNARK/grothBGM17/Verifier.java | 12 ++++-- .../grothBGM17/DistributedzkSNARKTest.java | 6 --- .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 22 +++++------ 8 files changed, 54 insertions(+), 49 deletions(-) diff --git a/src/main/java/profiler/Profiler.java b/src/main/java/profiler/Profiler.java index 2b3ef4d..1d257a8 100755 --- a/src/main/java/profiler/Profiler.java +++ b/src/main/java/profiler/Profiler.java @@ -1,11 +1,10 @@ package profiler; +import configuration.Configuration; import org.apache.spark.SparkConf; import org.apache.spark.api.java.JavaSparkContext; import org.apache.spark.sql.SparkSession; import org.apache.spark.storage.StorageLevel; - -import configuration.Configuration; import profiler.profiling.FFTProfiling; import profiler.profiling.FixedBaseMSMProfiling; import profiler.profiling.LagrangeProfiling; diff --git a/src/main/java/profiler/profiling/ZKSNARKProfiling.java b/src/main/java/profiler/profiling/ZKSNARKProfiling.java index 5943665..0f404d0 100755 --- 
a/src/main/java/profiler/profiling/ZKSNARKProfiling.java +++ b/src/main/java/profiler/profiling/ZKSNARKProfiling.java @@ -1,7 +1,5 @@ package profiler.profiling; -import org.apache.spark.api.java.JavaPairRDD; - import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; import algebra.curves.barreto_naehrig.bn254a.BN254aG2; @@ -15,6 +13,7 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import configuration.Configuration; +import org.apache.spark.api.java.JavaPairRDD; import profiler.generation.R1CSConstructor; import relations.objects.Assignment; import relations.r1cs.R1CSRelation; diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java index 603a172..c16b656 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java @@ -85,7 +85,7 @@ CRS generate( config.beginLog("Generating G1 MSM Window Table"); // For testing with the cpp code, take the identity instead of a random generator - //final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + // final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); final G1T generatorG1 = g1Factory.one(); final int scalarSizeG1 = generatorG1.bitSize(); final long scalarCountG1 = numNonZeroAt + numNonZeroBt + numVariables; @@ -96,7 +96,7 @@ CRS generate( config.endLog("Generating G1 MSM Window Table"); config.beginLog("Generating G2 MSM Window Table"); - //final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + // final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); final G2T generatorG2 = g2Factory.one(); final int scalarSizeG2 = generatorG2.bitSize(); final long scalarCountG2 = numNonZeroBt; @@ -170,9 +170,9 @@ CRS generate( config.beginRuntime("Verification Key"); final JavaPairRDD vkABC = ABC.filter(e -> e._1 < numInputs); final JavaPairRDD vkABCG1 = - FixedBaseMSM.distributedBatchMSM( - scalarSizeG1, windowSizeG1, windowTableG1, vkABC, config.sparkContext()) - .persist(config.storageLevel()); + FixedBaseMSM.distributedBatchMSM( + scalarSizeG1, windowSizeG1, windowTableG1, vkABC, config.sparkContext()) + .persist(config.storageLevel()); // ABC is not stored as an RDD in the verification key, so we recover a `List` final List vkABCFinalG1 = Utils.convertFromPair(vkABCG1.collect(), numInputs); ABC.unpersist(); diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java index 22044a2..7629640 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java @@ -59,7 +59,9 @@ Proof prove( final FieldT t = fieldFactory.random(config.seed(), config.secureSeed()); final QAPRelation qap = R1CStoQAP.R1CStoQAPRelation(provingKey.r1cs(), t); assert (qap.isSatisfied(qapWitness)); - System.out.println("\n\t ===== [DEBUG] qap.isSatisfied(qapWitness) TRUTH value: " + qap.isSatisfied(qapWitness)); + System.out.println( + "\n\t ===== [DEBUG] qap.isSatisfied(qapWitness) TRUTH value: " + + qap.isSatisfied(qapWitness)); } // Choose two random field elements for prover zero-knowledge. 
@@ -68,11 +70,11 @@ Proof prove( final FieldT s = fieldFactory.random(config.seed(), config.secureSeed()); if (config.debugFlag()) { - assert(qapWitness.coefficientsABC().size() == qapWitness.numVariables()); - assert(provingKey.queryA().size() == qapWitness.numVariables()); - assert(provingKey.deltaABCG1().size() == qapWitness.numVariables() - qapWitness.numInputs()); - System.out.println("\n\t ===== [DEBUG] Asserts on size pass ====="); - } + assert (qapWitness.coefficientsABC().size() == qapWitness.numVariables()); + assert (provingKey.queryA().size() == qapWitness.numVariables()); + assert (provingKey.deltaABCG1().size() == qapWitness.numVariables() - qapWitness.numInputs()); + System.out.println("\n\t ===== [DEBUG] Asserts on size pass ====="); + } // Get initial parameters from the proving key. final G1T alphaG1 = provingKey.alphaG1(); @@ -90,7 +92,8 @@ Proof prove( config.beginLog("Computing evaluation to query A: summation of variable_i*A_i(t)"); // A = alpha + \sum_{i=0}^{numVariables} var_i * A_i(t) + r * delta // Below, the summation is decomposed as: - // \sum_{i=0}^{numInputs} pubInp_i * A_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * A_i(t) + // \sum_{i=0}^{numInputs} pubInp_i * A_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * + // A_i(t) G1T evaluationAt = VariableBaseMSM.serialMSM(primary.elements(), provingKey.queryA().subList(0, numInputs)); evaluationAt = @@ -102,7 +105,8 @@ Proof prove( config.beginLog("Computing evaluation to query B: summation of variable_i*B_i(t)"); // B = beta + \sum_{i=0}^{numVariables} var_i * B_i(t) + s * delta // Below, the summation is decomposed as: - // \sum_{i=0}^{numInputs} pubInp_i * B_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * B_i(t) + // \sum_{i=0}^{numInputs} pubInp_i * B_i(t) + \sum_{i=numInputs + 1}^{numVariables} auxInp_i * + // B_i(t) final Tuple2 evaluationBtPrimary = VariableBaseMSM.doubleMSM(primary.elements(), provingKey.queryB().subList(0, numInputs)); final Tuple2 evaluationBtWitness = @@ -117,17 +121,19 @@ Proof prove( VariableBaseMSM.serialMSM(qapWitness.coefficientsH(), provingKey.queryH()); config.endLog("Computing evaluation to query H"); - // Compute evaluationABC = \sum_{i=numInputs+1}^{numVariables} var_i * ((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. + // Compute evaluationABC = \sum_{i=numInputs+1}^{numVariables} var_i * ((beta*A_i(t) + + // alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta. 
config.beginLog("Computing evaluation to deltaABC"); final int numWitness = numVariables - numInputs; - //G1T evaluationABC = + // G1T evaluationABC = // VariableBaseMSM.serialMSM( // auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); - System.out.println("[DEBUG] Prover, auxiliary.elements().size(): " + auxiliary.elements().size()); - System.out.println("[DEBUG] Prover, provingKey.deltaABCG1().size(): " + provingKey.deltaABCG1().size()); + System.out.println( + "[DEBUG] Prover, auxiliary.elements().size(): " + auxiliary.elements().size()); + System.out.println( + "[DEBUG] Prover, provingKey.deltaABCG1().size(): " + provingKey.deltaABCG1().size()); System.out.println("[DEBUG] Both expected to be: numWitness = " + numWitness); - G1T evaluationABC = - VariableBaseMSM.serialMSM(auxiliary.elements(), provingKey.deltaABCG1()); + G1T evaluationABC = VariableBaseMSM.serialMSM(auxiliary.elements(), provingKey.deltaABCG1()); evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta config.endLog("Computing evaluation to deltaABC"); @@ -140,7 +146,8 @@ Proof prove( betaG1.add(evaluationBtG1).add(deltaG1.mul(s)), betaG2.add(evaluationBtG2).add(deltaG2.mul(s))); - // C = \sum_{i=numInputs+1}^{numVariables}(var_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + H(t)*Z(t))/delta) + A*s + r*b - r*s*delta + // C = \sum_{i=numInputs+1}^{numVariables}(var_i*((beta*A_i(t) + alpha*B_i(t) + C_i(t)) + + // H(t)*Z(t))/delta) + A*s + r*b - r*s*delta final G1T C = evaluationABC.add(A.mul(s)).add(B._1.mul(r)).sub(rsDelta); config.endRuntime("Proof"); diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java index 47eee66..eb2a70e 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java @@ -61,8 +61,10 @@ CRS generate( // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} config.beginLog("Computing ABC for R1CS verification key"); final List vkABC = new ArrayList<>(numInputs); - // TODO: (Double check) I don't think we need to add 1 to the bounds here (i.e. i = 1 and i < numInputs + 1) - // because we manually add ONE to the inputs outside of this function when we construct the R1CS. + // TODO: (Double check) I don't think we need to add 1 to the bounds here (i.e. i = 1 and i < + // numInputs + 1) + // because we manually add ONE to the inputs outside of this function when we construct the + // R1CS. 
for (int i = 0; i < numInputs; i++) { vkABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i))); } @@ -91,7 +93,7 @@ CRS generate( config.endLog("Computing query densities"); config.beginLog("Generating G1 MSM Window Table"); - //final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); + // final G1T generatorG1 = g1Factory.random(config.seed(), config.secureSeed()); // We take ONE in both G1 and G2 as generator, else we need to add the choosen generator // as part of the SRS for the verifier to use it in the computation of [evaluationABC*1]_T final G1T generatorG1 = g1Factory.one(); @@ -103,7 +105,7 @@ CRS generate( config.endLog("Generating G1 MSM Window Table"); config.beginLog("Generating G2 MSM Window Table"); - //final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); + // final G2T generatorG2 = g2Factory.random(config.seed(), config.secureSeed()); // We take ONE in both G1 and G2 as generator, else we need to add the choosen generator // as part of the SRS for the verifier to use it in the computation of [evaluationABC*1]_T final G2T generatorG2 = g2Factory.one(); diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java index d25dd24..598a6c7 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java @@ -39,21 +39,25 @@ boolean verify( final GTT LHS = pairing.reducedPairing(proof.gA(), proof.gB()); // RHS: Compute [alpha * beta]_T - final GTT alphaBeta = pairing.reducedPairing(verificationKey.alphaG1(),verificationKey.betaG2()); + final GTT alphaBeta = + pairing.reducedPairing(verificationKey.alphaG1(), verificationKey.betaG2()); // RHS: Compute [C * delta]_T final G2T delta = verificationKey.deltaG2(); final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); // RHS: Compute \sum_{i=0}^{numInputs} pubInp_i * (beta*A_i(x) + alpha*B_i(x) + C_i(x)) - System.out.println("[DEBUG] Verifier RHS, primaryInput.elements().size(): " + primaryInput.elements().size()); - System.out.println("[DEBUG] Verifier RHS, verificationKey.ABC().size(): " + verificationKey.ABC().size()); + System.out.println( + "[DEBUG] Verifier RHS, primaryInput.elements().size(): " + primaryInput.elements().size()); + System.out.println( + "[DEBUG] Verifier RHS, verificationKey.ABC().size(): " + verificationKey.ABC().size()); System.out.println("[DEBUG] Both expected to be: numInputs"); final G1T evaluationABC = VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.ABC()); // Compute the RHS: [alpha*beta + evaluationABC*1 + C*delta]_T - final G2T generatorG2 = delta.one(); // See SerialSetup, we take ONE in both G1 and G2 as generator. + final G2T generatorG2 = + delta.one(); // See SerialSetup, we take ONE in both G1 and G2 as generator. 
final GTT RHS = alphaBeta.add(pairing.reducedPairing(evaluationABC, generatorG2)).add(CDelta); final boolean verifierResult = LHS.equals(RHS); diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java index 33e0b79..33119d7 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java @@ -14,12 +14,6 @@ import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; -import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; -import algebra.curves.barreto_naehrig.bn254a.BN254aG1; -import algebra.curves.barreto_naehrig.bn254a.BN254aG2; -import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index c1e7550..86f4c7b 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -13,23 +13,23 @@ import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; -//import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; -//import algebra.curves.barreto_naehrig.bn254a.BN254aG1; -//import algebra.curves.barreto_naehrig.bn254a.BN254aG2; -//import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; -//import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; -//import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; +// import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +// import algebra.curves.barreto_naehrig.bn254a.BN254aG1; +// import algebra.curves.barreto_naehrig.bn254a.BN254aG2; +// import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; +// import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; +// import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -//import algebra.curves.fake.*; -//import algebra.curves.fake.fake_parameters.FakeFqParameters; -//import algebra.curves.fake.fake_parameters.FakeG1Parameters; -//import algebra.curves.fake.fake_parameters.FakeG2Parameters; -//import algebra.fields.Fp; +// import algebra.curves.fake.*; +// import 
algebra.curves.fake.fake_parameters.FakeFqParameters; +// import algebra.curves.fake.fake_parameters.FakeG1Parameters; +// import algebra.curves.fake.fake_parameters.FakeG2Parameters; +// import algebra.fields.Fp; import configuration.Configuration; import java.io.Serializable; import org.junit.jupiter.api.BeforeEach; From b7ebe52ebfbb51f2c5df509da028a0a1eefe077f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 15:58:01 +0000 Subject: [PATCH 50/94] Added simple negative test case for bgm verification --- .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 24 ++++++++++++++----- 1 file changed, 18 insertions(+), 6 deletions(-) diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index 86f4c7b..4bf2c97 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -7,6 +7,7 @@ package zk_proof_systems.zkSNARK.grothBGM17; +import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.curves.barreto_naehrig.*; @@ -96,13 +97,24 @@ void SerialBNProofSystemTest( final CRS CRS = SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); - final Proof proof = + // Make sure that a valid proof verifies + final Proof proofValid = SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); - - final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); - - System.out.println(isValid); - assertTrue(isValid); + final boolean isValidProofValid = + Verifier.verify(CRS.verificationKey(), primary, proofValid, pairing, config); + System.out.println("Verification bit of valid proof: " + isValidProofValid); + assertTrue(isValidProofValid); + + // Make sure that an invalid/random proof does NOT verify + final Proof proofInvalid = + new Proof( + g1Factory.random(config.seed(), config.secureSeed()), + g2Factory.random(config.seed(), config.secureSeed()), + g1Factory.random(config.seed(), config.secureSeed())); + final boolean isInvalidProofValid = + Verifier.verify(CRS.verificationKey(), primary, proofInvalid, pairing, config); + System.out.println("Verification bit of invalid proof: " + isInvalidProofValid); + assertFalse(isInvalidProofValid); } /* From be332597ce0733d64b4a1e5f6039cdabd0872dd3 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 16:05:53 +0000 Subject: [PATCH 51/94] Removed debug comments and added documentation comments in bgm files --- .../zkSNARK/grothBGM17/DistributedProver.java | 7 ------- .../zkSNARK/grothBGM17/DistributedSetup.java | 7 ------- .../zkSNARK/grothBGM17/SerialProver.java | 12 ------------ .../zkSNARK/grothBGM17/SerialSetup.java | 11 ----------- .../zkSNARK/grothBGM17/Verifier.java | 12 ------------ .../zkSNARK/grothBGM17/objects/CRS.java | 7 ------- .../zkSNARK/grothBGM17/objects/Proof.java | 9 +-------- .../zkSNARK/grothBGM17/objects/ProvingKey.java | 9 +-------- .../zkSNARK/grothBGM17/objects/ProvingKeyRDD.java | 8 +------- .../zkSNARK/grothBGM17/objects/VerificationKey.java | 9 +-------- 10 files changed, 4 insertions(+), 87 deletions(-) diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java index 9bf76ea..f16fa7b 100755 --- 
a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedProver.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java index c16b656..c6c323b 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedSetup.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java index 7629640..d4a3834 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialProver.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; @@ -128,11 +121,6 @@ Proof prove( // G1T evaluationABC = // VariableBaseMSM.serialMSM( // auxiliary.subList(0, numWitness), provingKey.deltaABCG1().subList(0, numWitness)); - System.out.println( - "[DEBUG] Prover, auxiliary.elements().size(): " + auxiliary.elements().size()); - System.out.println( - "[DEBUG] Prover, provingKey.deltaABCG1().size(): " + provingKey.deltaABCG1().size()); - System.out.println("[DEBUG] Both expected to be: numWitness = " + numWitness); G1T evaluationABC = VariableBaseMSM.serialMSM(auxiliary.elements(), provingKey.deltaABCG1()); evaluationABC = evaluationABC.add(evaluationHtZt); // H(t)*Z(t)/delta config.endLog("Computing evaluation to deltaABC"); diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java index eb2a70e..8018ab7 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/SerialSetup.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; @@ -61,10 +54,6 @@ CRS generate( // {[beta * A_i(t) + alpha * B_i(t) + C_i(t)]_1}_{i=0}^{numInputs} config.beginLog("Computing ABC for R1CS verification key"); final List vkABC = new ArrayList<>(numInputs); - // TODO: (Double check) I don't think we need to add 1 to the bounds here (i.e. i = 1 and i < - // numInputs + 1) - // because we manually add ONE to the inputs outside of this function when we construct the - // R1CS. for (int i = 0; i < numInputs; i++) { vkABC.add(beta.mul(qap.At(i)).add(alpha.mul(qap.Bt(i))).add(qap.Ct(i))); } diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java index 598a6c7..07d4bfd 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/Verifier.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17; import algebra.curves.AbstractG1; @@ -47,11 +40,6 @@ boolean verify( final GTT CDelta = pairing.reducedPairing(proof.gC(), delta); // RHS: Compute \sum_{i=0}^{numInputs} pubInp_i * (beta*A_i(x) + alpha*B_i(x) + C_i(x)) - System.out.println( - "[DEBUG] Verifier RHS, primaryInput.elements().size(): " + primaryInput.elements().size()); - System.out.println( - "[DEBUG] Verifier RHS, verificationKey.ABC().size(): " + verificationKey.ABC().size()); - System.out.println("[DEBUG] Both expected to be: numInputs"); final G1T evaluationABC = VariableBaseMSM.serialMSM(primaryInput.elements(), verificationKey.ABC()); diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java index d00b052..81b1adb 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/CRS.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java index 6ffcf9f..215b0c2 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/Proof.java @@ -1,16 +1,9 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; -/** Groth16 argument */ +/** Groth16-BGM17 Argument */ public class Proof, G2T extends AbstractG2> { private final G1T gA; diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java index 7d82413..86424e5 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKey.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; @@ -15,7 +8,7 @@ import relations.r1cs.R1CSRelation; import scala.Tuple2; -/** Groth16 proving key */ +/** Groth16-BGM17 Proving key */ public class ProvingKey< FieldT extends AbstractFieldElementExpanded, G1T extends AbstractG1, diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java index 7ec7b42..44d50a4 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/ProvingKeyRDD.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; @@ -15,6 +8,7 @@ import relations.r1cs.R1CSRelationRDD; import scala.Tuple2; +/** Groth16-BGM17 Proving key for distributed computation */ public class ProvingKeyRDD< FieldT extends AbstractFieldElementExpanded, G1T extends AbstractG1, diff --git a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java index 9eb6bbf..a740350 100755 --- a/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java +++ b/src/main/java/zk_proof_systems/zkSNARK/grothBGM17/objects/VerificationKey.java @@ -1,17 +1,10 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package zk_proof_systems.zkSNARK.grothBGM17.objects; import algebra.curves.AbstractG1; import algebra.curves.AbstractG2; import java.util.List; -/** Groth16-BGM17 verification key */ +/** Groth16-BGM17 Verification key */ public class VerificationKey, G2T extends AbstractG2> { // [alpha]_1 From 8b560a3eb88d4c05ddcfd892fafb42fbd2bc239c Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 16:34:34 +0000 Subject: [PATCH 52/94] Added README for bls12377 folder --- src/main/java/algebra/curves/bls/README.md | 24 ++++++++++++++++++++++ 1 file changed, 24 insertions(+) create mode 100644 src/main/java/algebra/curves/bls/README.md diff --git a/src/main/java/algebra/curves/bls/README.md b/src/main/java/algebra/curves/bls/README.md new file mode 100644 index 0000000..aac9f49 --- /dev/null +++ b/src/main/java/algebra/curves/bls/README.md @@ -0,0 +1,24 @@ +# Barreto Lynn Scott (BLS) curves + +This folder contains implementations and parameters for the following BLS curves: + +| BLS12377 | Order | # Bits | 2-adicity | +|--------------|---------------------------------------------------------------------------------------------------|------------|-----------| +| Base Field | 0x1ae3a4617c510eac63b05c06ca1493b1a22d9f300f5138f1ef3622fba094800170b5d44300000008508c00000000001 | 377 | 46 | +| Scalar Field | 0x12ab655e9a2ca55660b44d1e5c37b00159aa76fed00000010a11800000000001 | 253 | 47 | + +```python +# Sage excerpt +q = 258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458177 +r = 8444461749428370424248824938781546531375899335154063827935233455917409239041 + +factor(r-1) +# 2^47 * 3 * 5 * 7 * 13 * 499 * 958612291309063373 * 9586122913090633729^2 +factor(q-1) +# 2^46 * 3 * 7 * 13 * 53 * 409 * 499 * 2557 * 6633514200929891813 * 73387170334035996766247648424745786170238574695861388454532790956181 + +len(bin(r)[2:]) +# 253 +len(bin(q)[2:]) +# 377 +``` \ No newline at end of file From 25cb88b27dd9cfa44882fdfce3c52f0df2014b05 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 16:36:02 +0000 Subject: [PATCH 53/94] Added refs --- src/main/java/algebra/curves/bls/README.md | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/src/main/java/algebra/curves/bls/README.md b/src/main/java/algebra/curves/bls/README.md index aac9f49..5403b71 100644 --- a/src/main/java/algebra/curves/bls/README.md +++ b/src/main/java/algebra/curves/bls/README.md @@ -21,4 +21,9 @@ len(bin(r)[2:]) # 253 len(bin(q)[2:]) # 377 -``` \ No newline at end of file +``` + +## References + +- https://eprint.iacr.org/2002/088.pdf +- https://eprint.iacr.org/2018/962.pdf From c640b0006b4ccd24644771a624f2f31cbae2370d Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 17:34:14 +0000 Subject: [PATCH 54/94] Renamed bls repo --- .../{bls => barreto_lynn_scott}/README.md | 0 .../bls12_377/BLS12_377Fields.java | 290 ++++++++++++++++++ 2 files changed, 290 insertions(+) rename src/main/java/algebra/curves/{bls => barreto_lynn_scott}/README.md (100%) create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java diff --git a/src/main/java/algebra/curves/bls/README.md b/src/main/java/algebra/curves/barreto_lynn_scott/README.md similarity index 100% rename from src/main/java/algebra/curves/bls/README.md rename to src/main/java/algebra/curves/barreto_lynn_scott/README.md diff 
--git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java new file mode 100755 index 0000000..6fcfc4f --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java @@ -0,0 +1,290 @@ +package algebra.curves.barreto_lynn_scott.bls12_377; + +import algebra.curves.barreto_naehrig.BNFields.*; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.*; +import algebra.fields.Fp; +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.Fp2; +import algebra.fields.Fp6_3Over2; +import java.math.BigInteger; + +//public class BLS12_377Fields { +// /* Scalar field Fr */ +// public static class BLS12_377Fr extends BNFr { +// +// public static final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); +// public static final BLS12_377Fr ZERO = new BLS12_377Fr(FrParameters.ZERO()); +// public static final BLS12_377Fr ONE = new BLS12_377Fr(FrParameters.ONE()); +// public static final BLS12_377Fr MULTIPLICATIVE_GENERATOR = +// new BLS12_377Fr(FrParameters.multiplicativeGenerator()); +// +// public Fp element; +// +// public BLS12_377Fr(final BigInteger number) { +// this.element = new Fp(number, FrParameters); +// } +// +// public BLS12_377Fr(final Fp number) { +// this(number.toBigInteger()); +// } +// +// public BLS12_377Fr(final String number) { +// this(new BigInteger(number)); +// } +// +// public BLS12_377Fr(final long number) { +// this(BigInteger.valueOf(number)); +// } +// +// public BLS12_377Fr self() { +// return this; +// } +// +// public Fp element() { +// return element; +// } +// +// public BLS12_377Fr zero() { +// return ZERO; +// } +// +// public BLS12_377Fr one() { +// return ONE; +// } +// +// public BLS12_377Fr multiplicativeGenerator() { +// return MULTIPLICATIVE_GENERATOR; +// } +// +// public BLS12_377Fr construct(final BigInteger number) { +// return new BLS12_377Fr(number); +// } +// +// public BLS12_377Fr construct(final long number) { +// return new BLS12_377Fr(number); +// } +// +// public BLS12_377Fr construct(final Fp element) { +// return new BLS12_377Fr(element); +// } +// +// public String toString() { +// return this.element.toString(); +// } +// } +// +// /* Base field Fq */ +// public static class BLS12_377Fq extends BNFq { +// +// public static final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); +// public static final BLS12_377Fq ZERO = new BLS12_377Fq(FqParameters.ZERO()); +// public static final BLS12_377Fq ONE = new BLS12_377Fq(FqParameters.ONE()); +// public static final BLS12_377Fq MULTIPLICATIVE_GENERATOR = +// new BLS12_377Fq(FqParameters.multiplicativeGenerator()); +// +// public Fp element; +// +// public BLS12_377Fq(final Fp element) { +// this.element = element; +// } +// +// public BLS12_377Fq(final BigInteger number) { +// this.element = new Fp(number, FqParameters); +// } +// +// public BLS12_377Fq(final String number) { +// this(new BigInteger(number)); +// } +// +// public BLS12_377Fq(final long number) { +// this(BigInteger.valueOf(number)); +// } +// +// public BLS12_377Fq self() { +// return this; +// } +// +// public Fp element() { +// return element; +// } +// +// public BLS12_377Fq zero() { +// return ZERO; +// } +// +// public BLS12_377Fq one() { +// return ONE; +// } +// +// public BLS12_377Fq multiplicativeGenerator() { +// return MULTIPLICATIVE_GENERATOR; +// } +// +// public BLS12_377Fq construct(final BigInteger number) { +// return new 
BLS12_377Fq(number); +// } +// +// public BLS12_377Fq construct(final Fp element) { +// return new BLS12_377Fq(element); +// } +// +// public BLS12_377Fq construct(final String element) { +// return new BLS12_377Fq(element); +// } +// +// public BLS12_377Fq construct(final long number) { +// return new BLS12_377Fq(number); +// } +// +// public String toString() { +// return this.element.toString(); +// } +// } +// +// /* Twist field Fq2 */ +// public static class BLS12_377Fq2 extends BNFq2 { +// +// public static final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); +// public static BLS12_377Fq2 ZERO = new BLS12_377Fq2(Fq2Parameters.ZERO()); +// public static BLS12_377Fq2 ONE = new BLS12_377Fq2(Fq2Parameters.ONE()); +// +// public Fp2 element; +// +// public BLS12_377Fq2(final Fp2 element) { +// this.element = element; +// } +// +// public BLS12_377Fq2(final BigInteger c0, final BigInteger c1) { +// this.element = new Fp2(c0, c1, Fq2Parameters); +// } +// +// public BLS12_377Fq2(final BLS12_377Fq c0, final BLS12_377Fq c1) { +// this(c0.toBigInteger(), c1.toBigInteger()); +// } +// +// public BLS12_377Fq2(final long c0, final long c1) { +// this(BigInteger.valueOf(c0), BigInteger.valueOf(c1)); +// } +// +// public BLS12_377Fq2 self() { +// return this; +// } +// +// public Fp2 element() { +// return this.element; +// } +// +// public BLS12_377Fq2 zero() { +// return ZERO; +// } +// +// public BLS12_377Fq2 one() { +// return ONE; +// } +// +// public BLS12_377Fq2 construct(final Fp2 element) { +// return new BLS12_377Fq2(element); +// } +// +// public BLS12_377Fq2 construct(final BLS12_377Fq c0, final BLS12_377Fq c1) { +// return new BLS12_377Fq2(c0, c1); +// } +// +// public BLS12_377Fq2 construct(final long c0, final long c1) { +// return new BLS12_377Fq2(c0, c1); +// } +// +// public String toString() { +// return this.element.toString(); +// } +// } +// +// /* Field Fq6 */ +// public static class BLS12_377Fq6 extends BNFq6 { +// +// public static final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); +// public static BLS12_377Fq6 ZERO = new BLS12_377Fq6(Fq6Parameters.ZERO()); +// public static BLS12_377Fq6 ONE = new BLS12_377Fq6(Fq6Parameters.ONE()); +// +// public Fp6_3Over2 element; +// +// public BLS12_377Fq6(final Fp6_3Over2 element) { +// this.element = element; +// } +// +// public BLS12_377Fq6(final BLS12_377Fq2 c0, final BLS12_377Fq2 c1, final BLS12_377Fq2 c2) { +// this.element = new Fp6_3Over2(c0.element, c1.element, c2.element, Fq6Parameters); +// } +// +// public BLS12_377Fq6 self() { +// return this; +// } +// +// public Fp6_3Over2 element() { +// return this.element; +// } +// +// public BLS12_377Fq6 zero() { +// return ZERO; +// } +// +// public BLS12_377Fq6 one() { +// return ONE; +// } +// +// public Fp2 mulByNonResidue(final Fp2 other) { +// return Fq6Parameters.nonresidue().mul(other); +// } +// +// public BLS12_377Fq6 construct(final Fp6_3Over2 element) { +// return new BLS12_377Fq6(element); +// } +// +// public String toString() { +// return this.element.toString(); +// } +// } +// +// /* Field Fq12 */ +// public static class BLS12_377Fq12 extends BNFq12 { +// +// public static final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); +// public static BLS12_377Fq12 ZERO = new BLS12_377Fq12(Fq12Parameters.ZERO()); +// public static BLS12_377Fq12 ONE = new BLS12_377Fq12(Fq12Parameters.ONE()); +// +// public Fp12_2Over3Over2 element; +// +// public BLS12_377Fq12(final Fp12_2Over3Over2 element) { +// 
this.element = element; +// } +// +// public BLS12_377Fq12(final BLS12_377Fq6 c0, final BLS12_377Fq6 c1) { +// this.element = new Fp12_2Over3Over2(c0.element, c1.element, Fq12Parameters); +// } +// +// public BLS12_377Fq12 self() { +// return this; +// } +// +// public Fp12_2Over3Over2 element() { +// return this.element; +// } +// +// public BLS12_377Fq12 zero() { +// return ZERO; +// } +// +// public BLS12_377Fq12 one() { +// return ONE; +// } +// +// public BLS12_377Fq12 construct(final Fp12_2Over3Over2 element) { +// return new BLS12_377Fq12(element); +// } +// +// public String toString() { +// return this.element.toString(); +// } +// } +//} +// \ No newline at end of file From 6f4b0ad4f09b11f47fc8ef783210d848bffb4060 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 17:35:52 +0000 Subject: [PATCH 55/94] Introduced mock folder to hold testing data for the fields --- .../fields/{ => mock}/fieldparameters/LargeFpParameters.java | 2 +- src/main/java/algebra/fields/mock/fieldparameters/README.md | 3 +++ .../fieldparameters/SmallFp12_2Over3Over2_Parameters.java | 2 +- .../fields/{ => mock}/fieldparameters/SmallFp2Parameters.java | 2 +- .../fields/{ => mock}/fieldparameters/SmallFp3Parameters.java | 2 +- .../{ => mock}/fieldparameters/SmallFp6_2Over3_Parameters.java | 2 +- .../{ => mock}/fieldparameters/SmallFp6_3Over2_Parameters.java | 2 +- .../fields/{ => mock}/fieldparameters/SmallFpParameters.java | 2 +- .../java/profiler/profiling/MatrixMultiplicationProfiling.java | 2 +- src/test/java/algebra/curves/BilinearityTest.java | 2 +- src/test/java/algebra/fft/SerialFFTTest.java | 2 +- src/test/java/algebra/fields/FieldsTest.java | 2 +- src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java | 2 +- src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java | 2 +- src/test/java/algebra/msm/SerialFixedBaseMSMTest.java | 2 +- src/test/java/bace/BaceTest.java | 2 +- src/test/java/common/DistributedNaiveEvaluationTest.java | 2 +- src/test/java/common/SerialNaiveEvaluationTest.java | 2 +- src/test/java/reductions/R1CStoQAPRDDTest.java | 2 +- src/test/java/relations/MatMulTest.java | 2 +- src/test/java/relations/R1CSConstructorTest.java | 2 +- 21 files changed, 23 insertions(+), 20 deletions(-) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/LargeFpParameters.java (98%) create mode 100644 src/main/java/algebra/fields/mock/fieldparameters/README.md rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFp12_2Over3Over2_Parameters.java (98%) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFp2Parameters.java (98%) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFp3Parameters.java (97%) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFp6_2Over3_Parameters.java (97%) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFp6_3Over2_Parameters.java (98%) rename src/main/java/algebra/fields/{ => mock}/fieldparameters/SmallFpParameters.java (98%) diff --git a/src/main/java/algebra/fields/fieldparameters/LargeFpParameters.java b/src/main/java/algebra/fields/mock/fieldparameters/LargeFpParameters.java similarity index 98% rename from src/main/java/algebra/fields/fieldparameters/LargeFpParameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/LargeFpParameters.java index 34fcc14..e9284f8 100755 --- a/src/main/java/algebra/fields/fieldparameters/LargeFpParameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/LargeFpParameters.java @@ -5,7 +5,7 @@ * 
@copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp; import algebra.fields.abstractfieldparameters.AbstractFpParameters; diff --git a/src/main/java/algebra/fields/mock/fieldparameters/README.md b/src/main/java/algebra/fields/mock/fieldparameters/README.md new file mode 100644 index 0000000..6a57988 --- /dev/null +++ b/src/main/java/algebra/fields/mock/fieldparameters/README.md @@ -0,0 +1,3 @@ +# Mock field parameters + +Mock data used for testing purposes. \ No newline at end of file diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFp12_2Over3Over2_Parameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp12_2Over3Over2_Parameters.java similarity index 98% rename from src/main/java/algebra/fields/fieldparameters/SmallFp12_2Over3Over2_Parameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/SmallFp12_2Over3Over2_Parameters.java index 7927d30..8eb6491 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFp12_2Over3Over2_Parameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp12_2Over3Over2_Parameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp12_2Over3Over2; import algebra.fields.Fp2; diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFp2Parameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp2Parameters.java similarity index 98% rename from src/main/java/algebra/fields/fieldparameters/SmallFp2Parameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/SmallFp2Parameters.java index 7653449..9db1af3 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFp2Parameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp2Parameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp; import algebra.fields.Fp2; diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFp3Parameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp3Parameters.java similarity index 97% rename from src/main/java/algebra/fields/fieldparameters/SmallFp3Parameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/SmallFp3Parameters.java index 4d61a6f..176af28 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFp3Parameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp3Parameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp; import algebra.fields.Fp3; diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFp6_2Over3_Parameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_2Over3_Parameters.java similarity index 97% rename from src/main/java/algebra/fields/fieldparameters/SmallFp6_2Over3_Parameters.java rename to 
src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_2Over3_Parameters.java index 33b4bca..49d9246 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFp6_2Over3_Parameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_2Over3_Parameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp; import algebra.fields.Fp6_2Over3; diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFp6_3Over2_Parameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_3Over2_Parameters.java similarity index 98% rename from src/main/java/algebra/fields/fieldparameters/SmallFp6_3Over2_Parameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_3Over2_Parameters.java index 9b1173f..f7b4f00 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFp6_3Over2_Parameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFp6_3Over2_Parameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp2; import algebra.fields.Fp6_3Over2; diff --git a/src/main/java/algebra/fields/fieldparameters/SmallFpParameters.java b/src/main/java/algebra/fields/mock/fieldparameters/SmallFpParameters.java similarity index 98% rename from src/main/java/algebra/fields/fieldparameters/SmallFpParameters.java rename to src/main/java/algebra/fields/mock/fieldparameters/SmallFpParameters.java index 7926fd3..7d853d7 100755 --- a/src/main/java/algebra/fields/fieldparameters/SmallFpParameters.java +++ b/src/main/java/algebra/fields/mock/fieldparameters/SmallFpParameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.fields.fieldparameters; +package algebra.fields.mock.fieldparameters; import algebra.fields.Fp; import algebra.fields.abstractfieldparameters.AbstractFpParameters; diff --git a/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java b/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java index 6ff21f7..c2615e8 100755 --- a/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java +++ b/src/main/java/profiler/profiling/MatrixMultiplicationProfiling.java @@ -1,7 +1,7 @@ package profiler.profiling; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import configuration.Configuration; import org.apache.spark.api.java.JavaPairRDD; import profiler.generation.R1CSConstructor; diff --git a/src/test/java/algebra/curves/BilinearityTest.java b/src/test/java/algebra/curves/BilinearityTest.java index bb18c93..969ea5f 100755 --- a/src/test/java/algebra/curves/BilinearityTest.java +++ b/src/test/java/algebra/curves/BilinearityTest.java @@ -32,7 +32,7 @@ import algebra.curves.fake.fake_parameters.FakeGTParameters; import algebra.fields.AbstractFieldElementExpanded; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import org.junit.jupiter.api.Test; public class BilinearityTest { diff --git 
a/src/test/java/algebra/fft/SerialFFTTest.java b/src/test/java/algebra/fft/SerialFFTTest.java index afd8ea9..6677f57 100755 --- a/src/test/java/algebra/fft/SerialFFTTest.java +++ b/src/test/java/algebra/fft/SerialFFTTest.java @@ -11,7 +11,7 @@ import algebra.fields.ComplexField; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import common.NaiveEvaluation; import java.io.Serializable; import java.util.ArrayList; diff --git a/src/test/java/algebra/fields/FieldsTest.java b/src/test/java/algebra/fields/FieldsTest.java index 914ecd7..da49054 100755 --- a/src/test/java/algebra/fields/FieldsTest.java +++ b/src/test/java/algebra/fields/FieldsTest.java @@ -13,7 +13,7 @@ import algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; -import algebra.fields.fieldparameters.*; +import algebra.fields.mock.fieldparameters.*; import org.junit.jupiter.api.Test; public class FieldsTest { diff --git a/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java b/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java index 1ecd990..fee206d 100755 --- a/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java +++ b/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java @@ -14,7 +14,7 @@ import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import algebra.groups.AdditiveIntegerGroup; import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; import common.Utils; diff --git a/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java b/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java index b4ecc16..77d1cfc 100755 --- a/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java +++ b/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import algebra.groups.AdditiveIntegerGroup; import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; diff --git a/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java b/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java index ac9b10e..3aa3711 100755 --- a/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java +++ b/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import algebra.groups.AdditiveIntegerGroup; import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; diff --git a/src/test/java/bace/BaceTest.java b/src/test/java/bace/BaceTest.java index e71cc3c..07d8a44 100755 --- a/src/test/java/bace/BaceTest.java +++ b/src/test/java/bace/BaceTest.java @@ -11,7 +11,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; 
-import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import bace.circuit.Circuit; import bace.circuit.InputGate; import common.Utils; diff --git a/src/test/java/common/DistributedNaiveEvaluationTest.java b/src/test/java/common/DistributedNaiveEvaluationTest.java index 6bc9c1f..b8f8d95 100755 --- a/src/test/java/common/DistributedNaiveEvaluationTest.java +++ b/src/test/java/common/DistributedNaiveEvaluationTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import java.io.Serializable; import java.util.ArrayList; import java.util.List; diff --git a/src/test/java/common/SerialNaiveEvaluationTest.java b/src/test/java/common/SerialNaiveEvaluationTest.java index 6d73ff0..2a89cd9 100755 --- a/src/test/java/common/SerialNaiveEvaluationTest.java +++ b/src/test/java/common/SerialNaiveEvaluationTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import java.io.Serializable; import java.util.ArrayList; import org.junit.jupiter.api.Test; diff --git a/src/test/java/reductions/R1CStoQAPRDDTest.java b/src/test/java/reductions/R1CStoQAPRDDTest.java index eb46368..444fe60 100755 --- a/src/test/java/reductions/R1CStoQAPRDDTest.java +++ b/src/test/java/reductions/R1CStoQAPRDDTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.api.java.JavaPairRDD; diff --git a/src/test/java/relations/MatMulTest.java b/src/test/java/relations/MatMulTest.java index f23c089..19e3d29 100755 --- a/src/test/java/relations/MatMulTest.java +++ b/src/test/java/relations/MatMulTest.java @@ -11,7 +11,7 @@ import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.api.java.JavaPairRDD; diff --git a/src/test/java/relations/R1CSConstructorTest.java b/src/test/java/relations/R1CSConstructorTest.java index a612bed..25a1321 100755 --- a/src/test/java/relations/R1CSConstructorTest.java +++ b/src/test/java/relations/R1CSConstructorTest.java @@ -10,7 +10,7 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.Fp; -import algebra.fields.fieldparameters.LargeFpParameters; +import algebra.fields.mock.fieldparameters.LargeFpParameters; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.api.java.JavaPairRDD; From 3f36fc39dd80cca2e5dfd81f8d42d61b9048510e Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 17:58:32 +0000 Subject: [PATCH 56/94] Renamed fake curve as mock --- .../java/algebra/curves/{fake => mock}/FakeG1.java | 4 ++-- .../java/algebra/curves/{fake => mock}/FakeG2.java | 4 ++-- .../java/algebra/curves/{fake => mock}/FakeGT.java | 4 ++-- .../algebra/curves/{fake => mock}/FakeInitialize.java | 4 ++-- .../algebra/curves/{fake 
=> mock}/FakePairing.java | 7 ++++--- .../AbstractFakeG1Parameters.java | 7 ++++--- .../AbstractFakeG2Parameters.java | 7 ++++--- .../AbstractFakeGTParameters.java | 6 +++--- .../fake_parameters/FakeFqParameters.java | 2 +- .../fake_parameters/FakeG1Parameters.java | 7 ++++--- .../fake_parameters/FakeG2Parameters.java | 7 ++++--- .../fake_parameters/FakeGTParameters.java | 7 ++++--- src/test/java/algebra/curves/BilinearityTest.java | 8 ++++---- src/test/java/algebra/curves/CurvesTest.java | 10 +++++----- .../zkSNARK/groth16/DistributedzkSNARKTest.java | 2 +- .../zkSNARK/groth16/SerialzkSNARKTest.java | 8 ++++---- .../zkSNARK/grothBGM17/DistributedzkSNARKTest.java | 2 +- 17 files changed, 51 insertions(+), 45 deletions(-) rename src/main/java/algebra/curves/{fake => mock}/FakeG1.java (96%) rename src/main/java/algebra/curves/{fake => mock}/FakeG2.java (96%) rename src/main/java/algebra/curves/{fake => mock}/FakeGT.java (95%) rename src/main/java/algebra/curves/{fake => mock}/FakeInitialize.java (84%) rename src/main/java/algebra/curves/{fake => mock}/FakePairing.java (95%) rename src/main/java/algebra/curves/{fake => mock}/abstract_fake_parameters/AbstractFakeG1Parameters.java (79%) rename src/main/java/algebra/curves/{fake => mock}/abstract_fake_parameters/AbstractFakeG2Parameters.java (79%) rename src/main/java/algebra/curves/{fake => mock}/abstract_fake_parameters/AbstractFakeGTParameters.java (75%) rename src/main/java/algebra/curves/{fake => mock}/fake_parameters/FakeFqParameters.java (98%) rename src/main/java/algebra/curves/{fake => mock}/fake_parameters/FakeG1Parameters.java (89%) rename src/main/java/algebra/curves/{fake => mock}/fake_parameters/FakeG2Parameters.java (89%) rename src/main/java/algebra/curves/{fake => mock}/fake_parameters/FakeGTParameters.java (85%) diff --git a/src/main/java/algebra/curves/fake/FakeG1.java b/src/main/java/algebra/curves/mock/FakeG1.java similarity index 96% rename from src/main/java/algebra/curves/fake/FakeG1.java rename to src/main/java/algebra/curves/mock/FakeG1.java index 934bd24..7b2fd9c 100755 --- a/src/main/java/algebra/curves/fake/FakeG1.java +++ b/src/main/java/algebra/curves/mock/FakeG1.java @@ -5,10 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake; +package algebra.curves.mock; import algebra.curves.AbstractG1; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeG1Parameters; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG1Parameters; import algebra.fields.Fp; import java.math.BigInteger; import java.util.ArrayList; diff --git a/src/main/java/algebra/curves/fake/FakeG2.java b/src/main/java/algebra/curves/mock/FakeG2.java similarity index 96% rename from src/main/java/algebra/curves/fake/FakeG2.java rename to src/main/java/algebra/curves/mock/FakeG2.java index 51ff022..8b46731 100755 --- a/src/main/java/algebra/curves/fake/FakeG2.java +++ b/src/main/java/algebra/curves/mock/FakeG2.java @@ -5,10 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake; +package algebra.curves.mock; import algebra.curves.AbstractG2; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeG2Parameters; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG2Parameters; import algebra.fields.Fp; import java.math.BigInteger; import java.util.ArrayList; diff --git 
a/src/main/java/algebra/curves/fake/FakeGT.java b/src/main/java/algebra/curves/mock/FakeGT.java similarity index 95% rename from src/main/java/algebra/curves/fake/FakeGT.java rename to src/main/java/algebra/curves/mock/FakeGT.java index 22b202c..4f2e2c6 100755 --- a/src/main/java/algebra/curves/fake/FakeGT.java +++ b/src/main/java/algebra/curves/mock/FakeGT.java @@ -5,10 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake; +package algebra.curves.mock; import algebra.curves.AbstractGT; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeGTParameters; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeGTParameters; import algebra.fields.Fp; import java.io.Serializable; import java.math.BigInteger; diff --git a/src/main/java/algebra/curves/fake/FakeInitialize.java b/src/main/java/algebra/curves/mock/FakeInitialize.java similarity index 84% rename from src/main/java/algebra/curves/fake/FakeInitialize.java rename to src/main/java/algebra/curves/mock/FakeInitialize.java index d508944..041f55a 100755 --- a/src/main/java/algebra/curves/fake/FakeInitialize.java +++ b/src/main/java/algebra/curves/mock/FakeInitialize.java @@ -5,9 +5,9 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake; +package algebra.curves.mock; -import algebra.curves.fake.fake_parameters.FakeGTParameters; +import algebra.curves.mock.fake_parameters.FakeGTParameters; public class FakeInitialize { diff --git a/src/main/java/algebra/curves/fake/FakePairing.java b/src/main/java/algebra/curves/mock/FakePairing.java similarity index 95% rename from src/main/java/algebra/curves/fake/FakePairing.java rename to src/main/java/algebra/curves/mock/FakePairing.java index 8b51093..35ca5c0 100755 --- a/src/main/java/algebra/curves/fake/FakePairing.java +++ b/src/main/java/algebra/curves/mock/FakePairing.java @@ -5,12 +5,13 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake; - -import static algebra.curves.fake.FakeInitialize.GTParameters; +package algebra.curves.mock; import algebra.curves.AbstractPairing; import algebra.fields.Fp; + +import static algebra.curves.mock.FakeInitialize.GTParameters; + import java.math.BigInteger; public class FakePairing extends AbstractPairing { diff --git a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG1Parameters.java b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java similarity index 79% rename from src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG1Parameters.java rename to src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java index dfe0d5e..8fd5707 100755 --- a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG1Parameters.java +++ b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java @@ -5,12 +5,13 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.abstract_fake_parameters; +package algebra.curves.mock.abstract_fake_parameters; -import algebra.curves.fake.FakeG1; -import algebra.curves.fake.fake_parameters.FakeFqParameters; import java.util.ArrayList; +import algebra.curves.mock.FakeG1; 
+import algebra.curves.mock.fake_parameters.FakeFqParameters; + public abstract class AbstractFakeG1Parameters { public abstract FakeFqParameters FqParameters(); diff --git a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG2Parameters.java b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java similarity index 79% rename from src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG2Parameters.java rename to src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java index 24aaed4..c68d655 100755 --- a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeG2Parameters.java +++ b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java @@ -5,12 +5,13 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.abstract_fake_parameters; +package algebra.curves.mock.abstract_fake_parameters; -import algebra.curves.fake.FakeG2; -import algebra.curves.fake.fake_parameters.FakeFqParameters; import java.util.ArrayList; +import algebra.curves.mock.FakeG2; +import algebra.curves.mock.fake_parameters.FakeFqParameters; + public abstract class AbstractFakeG2Parameters { public abstract FakeFqParameters FqParameters(); diff --git a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeGTParameters.java b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeGTParameters.java similarity index 75% rename from src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeGTParameters.java rename to src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeGTParameters.java index 1905b90..b249967 100755 --- a/src/main/java/algebra/curves/fake/abstract_fake_parameters/AbstractFakeGTParameters.java +++ b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeGTParameters.java @@ -5,10 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.abstract_fake_parameters; +package algebra.curves.mock.abstract_fake_parameters; -import algebra.curves.fake.FakeGT; -import algebra.curves.fake.fake_parameters.FakeFqParameters; +import algebra.curves.mock.FakeGT; +import algebra.curves.mock.fake_parameters.FakeFqParameters; public abstract class AbstractFakeGTParameters { diff --git a/src/main/java/algebra/curves/fake/fake_parameters/FakeFqParameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeFqParameters.java similarity index 98% rename from src/main/java/algebra/curves/fake/fake_parameters/FakeFqParameters.java rename to src/main/java/algebra/curves/mock/fake_parameters/FakeFqParameters.java index 4948e4a..c49cf64 100755 --- a/src/main/java/algebra/curves/fake/fake_parameters/FakeFqParameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeFqParameters.java @@ -5,7 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.fake_parameters; +package algebra.curves.mock.fake_parameters; import algebra.fields.Fp; import algebra.fields.abstractfieldparameters.AbstractFpParameters; diff --git a/src/main/java/algebra/curves/fake/fake_parameters/FakeG1Parameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java similarity index 89% rename from 
src/main/java/algebra/curves/fake/fake_parameters/FakeG1Parameters.java rename to src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java index a454be9..64f5991 100755 --- a/src/main/java/algebra/curves/fake/fake_parameters/FakeG1Parameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java @@ -5,14 +5,15 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.fake_parameters; +package algebra.curves.mock.fake_parameters; -import algebra.curves.fake.FakeG1; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeG1Parameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; +import algebra.curves.mock.FakeG1; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG1Parameters; + public class FakeG1Parameters extends AbstractFakeG1Parameters implements Serializable { private FakeFqParameters FqParameters; diff --git a/src/main/java/algebra/curves/fake/fake_parameters/FakeG2Parameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java similarity index 89% rename from src/main/java/algebra/curves/fake/fake_parameters/FakeG2Parameters.java rename to src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java index b47222b..097d5b1 100755 --- a/src/main/java/algebra/curves/fake/fake_parameters/FakeG2Parameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java @@ -5,14 +5,15 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.fake_parameters; +package algebra.curves.mock.fake_parameters; -import algebra.curves.fake.FakeG2; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeG2Parameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; +import algebra.curves.mock.FakeG2; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG2Parameters; + public class FakeG2Parameters extends AbstractFakeG2Parameters implements Serializable { private FakeFqParameters FqParameters; diff --git a/src/main/java/algebra/curves/fake/fake_parameters/FakeGTParameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java similarity index 85% rename from src/main/java/algebra/curves/fake/fake_parameters/FakeGTParameters.java rename to src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java index 22a72b8..f645677 100755 --- a/src/main/java/algebra/curves/fake/fake_parameters/FakeGTParameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java @@ -5,13 +5,14 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves.fake.fake_parameters; +package algebra.curves.mock.fake_parameters; -import algebra.curves.fake.FakeGT; -import algebra.curves.fake.abstract_fake_parameters.AbstractFakeGTParameters; import java.io.Serializable; import java.math.BigInteger; +import algebra.curves.mock.FakeGT; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeGTParameters; + public class FakeGTParameters extends AbstractFakeGTParameters implements Serializable { private FakeFqParameters FqParameters; diff --git a/src/test/java/algebra/curves/BilinearityTest.java b/src/test/java/algebra/curves/BilinearityTest.java index 969ea5f..1aaa8d9 
100755 --- a/src/test/java/algebra/curves/BilinearityTest.java +++ b/src/test/java/algebra/curves/BilinearityTest.java @@ -26,10 +26,10 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bGTParameters; -import algebra.curves.fake.*; -import algebra.curves.fake.fake_parameters.FakeG1Parameters; -import algebra.curves.fake.fake_parameters.FakeG2Parameters; -import algebra.curves.fake.fake_parameters.FakeGTParameters; +import algebra.curves.mock.*; +import algebra.curves.mock.fake_parameters.FakeG1Parameters; +import algebra.curves.mock.fake_parameters.FakeG2Parameters; +import algebra.curves.mock.fake_parameters.FakeGTParameters; import algebra.fields.AbstractFieldElementExpanded; import algebra.fields.Fp; import algebra.fields.mock.fieldparameters.LargeFpParameters; diff --git a/src/test/java/algebra/curves/CurvesTest.java b/src/test/java/algebra/curves/CurvesTest.java index 2cf68db..7fea749 100755 --- a/src/test/java/algebra/curves/CurvesTest.java +++ b/src/test/java/algebra/curves/CurvesTest.java @@ -14,11 +14,11 @@ import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.fake.FakeG1; -import algebra.curves.fake.FakeG2; -import algebra.curves.fake.FakeInitialize; -import algebra.curves.fake.fake_parameters.FakeG1Parameters; -import algebra.curves.fake.fake_parameters.FakeG2Parameters; +import algebra.curves.mock.FakeG1; +import algebra.curves.mock.FakeG2; +import algebra.curves.mock.FakeInitialize; +import algebra.curves.mock.fake_parameters.FakeG1Parameters; +import algebra.curves.mock.fake_parameters.FakeG2Parameters; import algebra.groups.AbstractGroup; import java.math.BigInteger; import org.junit.jupiter.api.Test; diff --git a/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java index 76988c2..6500baf 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/groth16/DistributedzkSNARKTest.java @@ -20,7 +20,7 @@ import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.fake.*; +import algebra.curves.mock.*; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.SparkConf; diff --git a/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java index 6e52e01..f3cd351 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/groth16/SerialzkSNARKTest.java @@ -25,10 +25,10 @@ import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.fake.*; -import algebra.curves.fake.fake_parameters.FakeFqParameters; -import algebra.curves.fake.fake_parameters.FakeG1Parameters; 
-import algebra.curves.fake.fake_parameters.FakeG2Parameters; +import algebra.curves.mock.*; +import algebra.curves.mock.fake_parameters.FakeFqParameters; +import algebra.curves.mock.fake_parameters.FakeG1Parameters; +import algebra.curves.mock.fake_parameters.FakeG2Parameters; import algebra.fields.Fp; import configuration.Configuration; import java.io.Serializable; diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java index 33119d7..4437839 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java @@ -20,7 +20,7 @@ import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.fake.*; +import algebra.curves.mock.*; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.SparkConf; From 898130c382b0d123176b03c2a66a77e1b13689f6 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Thu, 7 Jan 2021 18:18:54 +0000 Subject: [PATCH 57/94] Moved additive group to a mock folder --- .../bls12_377/BLS12_377Fields.java | 14 +++++--------- src/main/java/algebra/curves/mock/FakePairing.java | 5 ++--- src/main/java/algebra/curves/mock/README.md | 3 +++ .../AbstractFakeG1Parameters.java | 3 +-- .../AbstractFakeG2Parameters.java | 3 +-- .../mock/fake_parameters/FakeG1Parameters.java | 5 ++--- .../mock/fake_parameters/FakeG2Parameters.java | 5 ++--- .../mock/fake_parameters/FakeGTParameters.java | 5 ++--- src/main/java/algebra/groups/AbstractGroup.java | 1 + .../groups/{ => mock}/AdditiveIntegerGroup.java | 5 +++-- src/main/java/algebra/groups/mock/README.md | 3 +++ .../AbstractAdditiveIntegerGroupParameters.java | 4 ++-- .../LargeAdditiveIntegerGroupParameters.java | 6 +++--- .../algebra/groups/AdditiveIntegerGroupTest.java | 3 ++- .../algebra/msm/DistributedFixedBaseMSMTest.java | 4 ++-- .../msm/DistributedVariableBaseMSMTest.java | 4 ++-- .../java/algebra/msm/SerialFixedBaseMSMTest.java | 4 ++-- .../algebra/msm/SerialVariableBaseMSMTest.java | 4 ++-- 18 files changed, 40 insertions(+), 41 deletions(-) create mode 100644 src/main/java/algebra/curves/mock/README.md rename src/main/java/algebra/groups/{ => mock}/AdditiveIntegerGroup.java (95%) create mode 100644 src/main/java/algebra/groups/mock/README.md rename src/main/java/algebra/groups/{ => mock}/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java (83%) rename src/main/java/algebra/groups/{ => mock}/integergroupparameters/LargeAdditiveIntegerGroupParameters.java (84%) diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java index 6fcfc4f..8eead20 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java @@ -2,13 +2,8 @@ import algebra.curves.barreto_naehrig.BNFields.*; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.*; -import algebra.fields.Fp; -import algebra.fields.Fp12_2Over3Over2; -import algebra.fields.Fp2; -import algebra.fields.Fp6_3Over2; -import java.math.BigInteger; -//public class BLS12_377Fields { +// public class 
BLS12_377Fields { // /* Scalar field Fr */ // public static class BLS12_377Fr extends BNFr { // @@ -246,7 +241,8 @@ // } // // /* Field Fq12 */ -// public static class BLS12_377Fq12 extends BNFq12 { +// public static class BLS12_377Fq12 extends BNFq12 { // // public static final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); // public static BLS12_377Fq12 ZERO = new BLS12_377Fq12(Fq12Parameters.ZERO()); @@ -286,5 +282,5 @@ // return this.element.toString(); // } // } -//} -// \ No newline at end of file +// } +// diff --git a/src/main/java/algebra/curves/mock/FakePairing.java b/src/main/java/algebra/curves/mock/FakePairing.java index 35ca5c0..f680b75 100755 --- a/src/main/java/algebra/curves/mock/FakePairing.java +++ b/src/main/java/algebra/curves/mock/FakePairing.java @@ -7,11 +7,10 @@ package algebra.curves.mock; -import algebra.curves.AbstractPairing; -import algebra.fields.Fp; - import static algebra.curves.mock.FakeInitialize.GTParameters; +import algebra.curves.AbstractPairing; +import algebra.fields.Fp; import java.math.BigInteger; public class FakePairing extends AbstractPairing { diff --git a/src/main/java/algebra/curves/mock/README.md b/src/main/java/algebra/curves/mock/README.md new file mode 100644 index 0000000..92fd869 --- /dev/null +++ b/src/main/java/algebra/curves/mock/README.md @@ -0,0 +1,3 @@ +# Mock curve + +Curve used for testing purposes. \ No newline at end of file diff --git a/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java index 8fd5707..a692198 100755 --- a/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java +++ b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG1Parameters.java @@ -7,10 +7,9 @@ package algebra.curves.mock.abstract_fake_parameters; -import java.util.ArrayList; - import algebra.curves.mock.FakeG1; import algebra.curves.mock.fake_parameters.FakeFqParameters; +import java.util.ArrayList; public abstract class AbstractFakeG1Parameters { diff --git a/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java index c68d655..c20fc5a 100755 --- a/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java +++ b/src/main/java/algebra/curves/mock/abstract_fake_parameters/AbstractFakeG2Parameters.java @@ -7,10 +7,9 @@ package algebra.curves.mock.abstract_fake_parameters; -import java.util.ArrayList; - import algebra.curves.mock.FakeG2; import algebra.curves.mock.fake_parameters.FakeFqParameters; +import java.util.ArrayList; public abstract class AbstractFakeG2Parameters { diff --git a/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java index 64f5991..deda06c 100755 --- a/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeG1Parameters.java @@ -7,13 +7,12 @@ package algebra.curves.mock.fake_parameters; +import algebra.curves.mock.FakeG1; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG1Parameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; -import algebra.curves.mock.FakeG1; -import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG1Parameters; - public class 
FakeG1Parameters extends AbstractFakeG1Parameters implements Serializable { private FakeFqParameters FqParameters; diff --git a/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java index 097d5b1..e4f343d 100755 --- a/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeG2Parameters.java @@ -7,13 +7,12 @@ package algebra.curves.mock.fake_parameters; +import algebra.curves.mock.FakeG2; +import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG2Parameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; -import algebra.curves.mock.FakeG2; -import algebra.curves.mock.abstract_fake_parameters.AbstractFakeG2Parameters; - public class FakeG2Parameters extends AbstractFakeG2Parameters implements Serializable { private FakeFqParameters FqParameters; diff --git a/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java b/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java index f645677..aa2df7a 100755 --- a/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java +++ b/src/main/java/algebra/curves/mock/fake_parameters/FakeGTParameters.java @@ -7,11 +7,10 @@ package algebra.curves.mock.fake_parameters; -import java.io.Serializable; -import java.math.BigInteger; - import algebra.curves.mock.FakeGT; import algebra.curves.mock.abstract_fake_parameters.AbstractFakeGTParameters; +import java.io.Serializable; +import java.math.BigInteger; public class FakeGTParameters extends AbstractFakeGTParameters implements Serializable { diff --git a/src/main/java/algebra/groups/AbstractGroup.java b/src/main/java/algebra/groups/AbstractGroup.java index cc34d1b..42fa448 100755 --- a/src/main/java/algebra/groups/AbstractGroup.java +++ b/src/main/java/algebra/groups/AbstractGroup.java @@ -12,6 +12,7 @@ import java.math.BigInteger; import java.util.ArrayList; +/** AbstractGroup defines the set of standard operations on group elements */ public abstract class AbstractGroup> implements Serializable { /* Returns self element */ diff --git a/src/main/java/algebra/groups/AdditiveIntegerGroup.java b/src/main/java/algebra/groups/mock/AdditiveIntegerGroup.java similarity index 95% rename from src/main/java/algebra/groups/AdditiveIntegerGroup.java rename to src/main/java/algebra/groups/mock/AdditiveIntegerGroup.java index 814d74f..504146e 100755 --- a/src/main/java/algebra/groups/AdditiveIntegerGroup.java +++ b/src/main/java/algebra/groups/mock/AdditiveIntegerGroup.java @@ -5,9 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.groups; +package algebra.groups.mock; -import algebra.groups.abstractintegergroupparameters.AbstractAdditiveIntegerGroupParameters; +import algebra.groups.AbstractGroup; +import algebra.groups.mock.abstractintegergroupparameters.AbstractAdditiveIntegerGroupParameters; import java.math.BigInteger; import java.security.SecureRandom; import java.util.ArrayList; diff --git a/src/main/java/algebra/groups/mock/README.md b/src/main/java/algebra/groups/mock/README.md new file mode 100644 index 0000000..58e28ff --- /dev/null +++ b/src/main/java/algebra/groups/mock/README.md @@ -0,0 +1,3 @@ +# Mock group + +Additive integer group used as mock data for testing purposes. 
\ No newline at end of file diff --git a/src/main/java/algebra/groups/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java b/src/main/java/algebra/groups/mock/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java similarity index 83% rename from src/main/java/algebra/groups/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java rename to src/main/java/algebra/groups/mock/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java index 014c796..6972acc 100755 --- a/src/main/java/algebra/groups/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java +++ b/src/main/java/algebra/groups/mock/abstractintegergroupparameters/AbstractAdditiveIntegerGroupParameters.java @@ -5,9 +5,9 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.groups.abstractintegergroupparameters; +package algebra.groups.mock.abstractintegergroupparameters; -import algebra.groups.AdditiveIntegerGroup; +import algebra.groups.mock.AdditiveIntegerGroup; import java.math.BigInteger; public abstract class AbstractAdditiveIntegerGroupParameters { diff --git a/src/main/java/algebra/groups/integergroupparameters/LargeAdditiveIntegerGroupParameters.java b/src/main/java/algebra/groups/mock/integergroupparameters/LargeAdditiveIntegerGroupParameters.java similarity index 84% rename from src/main/java/algebra/groups/integergroupparameters/LargeAdditiveIntegerGroupParameters.java rename to src/main/java/algebra/groups/mock/integergroupparameters/LargeAdditiveIntegerGroupParameters.java index d24b871..ee06a86 100755 --- a/src/main/java/algebra/groups/integergroupparameters/LargeAdditiveIntegerGroupParameters.java +++ b/src/main/java/algebra/groups/mock/integergroupparameters/LargeAdditiveIntegerGroupParameters.java @@ -5,10 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.groups.integergroupparameters; +package algebra.groups.mock.integergroupparameters; -import algebra.groups.AdditiveIntegerGroup; -import algebra.groups.abstractintegergroupparameters.AbstractAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.abstractintegergroupparameters.AbstractAdditiveIntegerGroupParameters; import java.io.Serializable; import java.math.BigInteger; diff --git a/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java b/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java index 6858a58..5f42ab3 100755 --- a/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java +++ b/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java @@ -10,7 +10,8 @@ import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; -import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; import java.math.BigInteger; import org.junit.jupiter.api.Test; diff --git a/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java b/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java index fee206d..77cdfe3 100755 --- a/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java +++ b/src/test/java/algebra/msm/DistributedFixedBaseMSMTest.java @@ -15,8 +15,8 @@ import 
algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.fields.Fp; import algebra.fields.mock.fieldparameters.LargeFpParameters; -import algebra.groups.AdditiveIntegerGroup; -import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.integergroupparameters.LargeAdditiveIntegerGroupParameters; import common.Utils; import java.io.Serializable; import java.util.ArrayList; diff --git a/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java b/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java index 77d1cfc..7c86d0a 100755 --- a/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java +++ b/src/test/java/algebra/msm/DistributedVariableBaseMSMTest.java @@ -11,8 +11,8 @@ import algebra.fields.Fp; import algebra.fields.mock.fieldparameters.LargeFpParameters; -import algebra.groups.AdditiveIntegerGroup; -import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; import java.util.ArrayList; import org.apache.spark.api.java.JavaRDD; diff --git a/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java b/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java index 3aa3711..325d24b 100755 --- a/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java +++ b/src/test/java/algebra/msm/SerialFixedBaseMSMTest.java @@ -11,8 +11,8 @@ import algebra.fields.Fp; import algebra.fields.mock.fieldparameters.LargeFpParameters; -import algebra.groups.AdditiveIntegerGroup; -import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; diff --git a/src/test/java/algebra/msm/SerialVariableBaseMSMTest.java b/src/test/java/algebra/msm/SerialVariableBaseMSMTest.java index 2120a62..4cf6894 100755 --- a/src/test/java/algebra/msm/SerialVariableBaseMSMTest.java +++ b/src/test/java/algebra/msm/SerialVariableBaseMSMTest.java @@ -9,8 +9,8 @@ import static org.junit.jupiter.api.Assertions.assertTrue; -import algebra.groups.AdditiveIntegerGroup; -import algebra.groups.integergroupparameters.LargeAdditiveIntegerGroupParameters; +import algebra.groups.mock.AdditiveIntegerGroup; +import algebra.groups.mock.integergroupparameters.LargeAdditiveIntegerGroupParameters; import java.io.Serializable; import java.math.BigInteger; import java.util.ArrayList; From f8355ac4b55efc8809b185544714b17662eef9ba Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 8 Jan 2021 14:49:29 +0000 Subject: [PATCH 58/94] Added README for the groups package --- src/main/java/algebra/groups/README.md | 23 +++++++++++++++++++++++ 1 file changed, 23 insertions(+) create mode 100644 src/main/java/algebra/groups/README.md diff --git a/src/main/java/algebra/groups/README.md b/src/main/java/algebra/groups/README.md new file mode 100644 index 0000000..078f951 --- /dev/null +++ b/src/main/java/algebra/groups/README.md @@ -0,0 +1,23 @@ +# algebra.groups + +This folder contains the definition of `AbstractGroup` which defines the interface that needs to be implemented by groups. + +An example implementation of a group can be found in `mock`. 
+As illustrated there, to implement a new group, one needs to:
+1. Create a new folder (e.g. `mygroup`).
+2. Create `mygroup/MyGroup.java` that holds the generic implementation of the group. To do so, inherit from the `AbstractGroup` class and implement all its functions, e.g.
+```java
+public class MyGroup extends AbstractGroup {
+  // Implement the abstract functions defined in AbstractGroup.java
+  // in order to define the group operations etc.
+}
+```
+3. Instantiate the specific group of interest by setting its parameters. *Note:* if the group is a particular instantiation in a family (e.g. a specific curve in a curve family) it is worth exposing some generic interface to manipulate the group parameters outside of the package. To do so:
+  - Create `mygroup/abstractparameters/parameters.java` to hold the common interface that exposes the parameters of the various group instantiations
+  - Instantiate the specific group's (e.g. `MyGroup1`) parameters in `mygroup/mygroup1parameters/parameters.java` by inheriting from the abstract class `mygroup.AbstractParameters.Parameters` and implementing the appropriate functions, e.g.
+```java
+public class MyGroupParameters extends AbstractGroupParameters {
+  // Implement the abstract functions defined in AbstractGroupParameters.java
+  // in order to define the specific group's parameters etc.
+}
+```

From 316d74ccb6f948db289f9747fb571521a009122d Mon Sep 17 00:00:00 2001
From: Antoine Rondelet
Date: Fri, 8 Jan 2021 15:22:16 +0000
Subject: [PATCH 59/94] Extended README for the fields package

---
 src/main/java/algebra/fields/mock/fieldparameters/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/main/java/algebra/fields/mock/fieldparameters/README.md b/src/main/java/algebra/fields/mock/fieldparameters/README.md
index 6a57988..9cee41e 100644
--- a/src/main/java/algebra/fields/mock/fieldparameters/README.md
+++ b/src/main/java/algebra/fields/mock/fieldparameters/README.md
@@ -1,3 +1,3 @@
 # Mock field parameters
 
-Mock data used for testing purposes.
\ No newline at end of file
+Field implementation used as mock data for testing purposes.
\ No newline at end of file

From d2bf2bc62c8e84e5ad16eea4ef9e400eadc77bec Mon Sep 17 00:00:00 2001
From: Antoine Rondelet
Date: Fri, 8 Jan 2021 15:36:38 +0000
Subject: [PATCH 60/94] Added field and curves READMEs

---
 src/main/java/algebra/curves/README.md |  6 ++++++
 src/main/java/algebra/fields/README.md | 13 +++++++++++++
 2 files changed, 19 insertions(+)
 create mode 100644 src/main/java/algebra/curves/README.md
 create mode 100644 src/main/java/algebra/fields/README.md

diff --git a/src/main/java/algebra/curves/README.md b/src/main/java/algebra/curves/README.md
new file mode 100644
index 0000000..d73f9dd
--- /dev/null
+++ b/src/main/java/algebra/curves/README.md
@@ -0,0 +1,6 @@
+# algebra.curves
+
+The set of points on an elliptic curve, together with the usual addition rule ("chord and tangent"), defines an algebraic group.
+As such, implementing a curve family (and/or a unique curve in a family) can be done by following the process described in [algebra.groups](../groups/README.md).
+However, a few extra functions are added in the `AbstractG1`, `AbstractG2` etc. classes (which extend `AbstractGroup`) to provide useful functionality specific to elliptic curves.
+These are therefore the classes to inherit from and instantiate when implementing a "concrete" elliptic curve.
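For readers skimming the patch series, here is a minimal, self-contained sketch of the layering that the two READMEs above describe: a generic group class plus a "parameters" class that pins down one concrete instantiation. It deliberately does not extend the repository's real `AbstractGroup`/`AbstractG1` classes (whose full abstract method sets are not reproduced in this patch series); the names `ToyGroupParameters` and `ToyAdditiveGroup` are hypothetical and used purely for illustration.

```java
import java.math.BigInteger;

// Hypothetical "parameters" class: fixes one concrete instantiation of the
// group family (here: the additive group of integers modulo `modulus`).
final class ToyGroupParameters {
  private final BigInteger modulus;

  ToyGroupParameters(final BigInteger modulus) {
    this.modulus = modulus;
  }

  BigInteger modulus() {
    return modulus;
  }

  // Identity element of the instantiation described by these parameters.
  ToyAdditiveGroup zero() {
    return new ToyAdditiveGroup(BigInteger.ZERO, this);
  }

  // A distinguished generator (1 always generates Z/nZ additively).
  ToyAdditiveGroup one() {
    return new ToyAdditiveGroup(BigInteger.ONE, this);
  }
}

// Hypothetical generic group class: every element keeps a reference to its
// parameters, and the group law is written once against that interface.
final class ToyAdditiveGroup {
  private final BigInteger value;
  private final ToyGroupParameters parameters;

  ToyAdditiveGroup(final BigInteger value, final ToyGroupParameters parameters) {
    // Reduce the representative into [0, modulus).
    this.value = value.mod(parameters.modulus());
    this.parameters = parameters;
  }

  ToyAdditiveGroup add(final ToyAdditiveGroup other) {
    return new ToyAdditiveGroup(this.value.add(other.value), this.parameters);
  }

  ToyAdditiveGroup negate() {
    return new ToyAdditiveGroup(this.value.negate(), this.parameters);
  }

  ToyAdditiveGroup mul(final BigInteger scalar) {
    // Scalar multiplication in an additive group is repeated addition,
    // i.e. multiplying the representative by the scalar mod n.
    return new ToyAdditiveGroup(this.value.multiply(scalar), this.parameters);
  }

  boolean isZero() {
    return this.value.signum() == 0;
  }

  @Override
  public String toString() {
    return this.value + " mod " + this.parameters.modulus();
  }

  // Quick sanity check of the group law.
  public static void main(final String[] args) {
    final ToyGroupParameters params = new ToyGroupParameters(BigInteger.valueOf(13));
    final ToyAdditiveGroup g = params.one();
    final ToyAdditiveGroup h = g.mul(BigInteger.valueOf(5)).add(g); // 6 mod 13
    System.out.println(h);
    System.out.println(h.add(h.negate()).isZero()); // true
  }
}
```

In the repository itself, this role split is exactly what `AdditiveIntegerGroup` / `LargeAdditiveIntegerGroupParameters` (the mock group moved in PATCH 57 above) and, for curves, `BLS12_377G1` / `BLS12_377G1Parameters` (introduced in PATCH 61 below) implement.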
diff --git a/src/main/java/algebra/fields/README.md b/src/main/java/algebra/fields/README.md new file mode 100644 index 0000000..7eaacaf --- /dev/null +++ b/src/main/java/algebra/fields/README.md @@ -0,0 +1,13 @@ +# algebra.fields + +This folder contains: +- The definition of `AbstractFieldElement` which defines the interface that needs to be implemented by fields (this interface is extended in `AbstractFieldElementExpanded`) +- The implementation of: + - Finite field `Fp`, p prime + - Extension fields `Fp^n`, p prime, n \in \NN + - The field of complex numbers \CC + +In order to instantiate the field implementations above, one needs to provide the "field parameters" for the specific instantiation (e.g. provide a value for `p`, the identities etc.). This is done by inheriting and instantiating the classes in `abstractfieldparameters`. + +An example implementation of fields can be found in `mock`. + From d7e50c49f00a161c8941d0b54a61f9c4253947d8 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 8 Jan 2021 17:22:30 +0000 Subject: [PATCH 61/94] First shot implementation of BLS12-377 --- .../curves/barreto_lynn_scott/BLSFields.java | 423 +++++++++++++ .../curves/barreto_lynn_scott/BLSG1.java | 230 +++++++ .../curves/barreto_lynn_scott/BLSG2.java | 233 +++++++ .../curves/barreto_lynn_scott/BLSGT.java | 60 ++ .../curves/barreto_lynn_scott/BLSPairing.java | 345 +++++++++++ .../BLSPublicParameters.java | 81 +++ .../AbstractBLSFq12Parameters.java | 5 + .../AbstractBLSFq2Parameters.java | 5 + .../AbstractBLSFq6Parameters.java | 5 + .../AbstractBLSFqParameters.java | 5 + .../AbstractBLSFrParameters.java | 5 + .../AbstractBLSG1Parameters.java | 24 + .../AbstractBLSG2Parameters.java | 26 + .../AbstractBLSGTParameters.java | 20 + .../bls12_377/BLS12_377Fields.java | 567 +++++++++--------- .../bls12_377/BLS12_377G1.java | 39 ++ .../bls12_377/BLS12_377G2.java | 31 + .../bls12_377/BLS12_377GT.java | 29 + .../bls12_377/BLS12_377Pairing.java | 42 ++ .../bls12_377/BLS12_377PublicParameters.java | 53 ++ .../BLS12_377Fq12Parameters.java | 143 +++++ .../BLS12_377Fq2Parameters.java | 112 ++++ .../BLS12_377Fq6Parameters.java | 122 ++++ .../BLS12_377FqParameters.java | 107 ++++ .../BLS12_377FrParameters.java | 104 ++++ .../BLS12_377G1Parameters.java | 63 ++ .../BLS12_377G2Parameters.java | 76 +++ .../BLS12_377GTParameters.java | 26 + 28 files changed, 2697 insertions(+), 284 deletions(-) create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/BLSPublicParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq12Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq2Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq6Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFqParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFrParameters.java create mode 100755 
src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG1Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSGTParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java create mode 100755 src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java new file mode 100755 index 0000000..9d54716 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java @@ -0,0 +1,423 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.fields.*; +import java.math.BigInteger; + +public interface BLSFields { + /* Scalar field Fr */ + public abstract class BLSFr> + extends AbstractFieldElementExpanded { + + public abstract Fp element(); + + public abstract BLSFrT zero(); + + public abstract BLSFrT one(); + + public abstract BLSFrT multiplicativeGenerator(); + + public abstract BLSFrT construct(final Fp element); + + public abstract String toString(); + + public BLSFrT add(final BLSFrT other) { + return this.construct(this.element().add(other.element())); + } + + public BLSFrT sub(final BLSFrT other) { + return this.construct(this.element().sub(other.element())); + } + + public BLSFrT mul(final BLSFrT other) { + return this.construct(this.element().mul(other.element())); + } + + public boolean isZero() { + return this.equals(this.zero()); + } + + public boolean isSpecial() { + return this.isZero(); + } + + public boolean isOne() { + return this.equals(this.one()); + } + + public BLSFrT random(final Long seed, final byte[] secureSeed) { + return this.construct(this.element().random(seed, secureSeed)); + } + + public BLSFrT negate() { + return this.construct(this.element().negate()); + } + + public BLSFrT inverse() { + return this.construct(this.element().inverse()); + } + + public BLSFrT square() { + return this.construct(this.element().square()); + } + + public BLSFrT rootOfUnity(final 
long size) { + return this.construct(this.element().rootOfUnity(size)); + } + + public int bitSize() { + return this.element().bitSize(); + } + + public BigInteger toBigInteger() { + return this.element().toBigInteger(); + } + + public boolean equals(final BLSFrT other) { + if (other == null) { + return false; + } + + return this.element().equals(other.element()); + } + } + + /* Base field Fq */ + public abstract class BLSFq> + extends AbstractFieldElementExpanded { + public abstract Fp element(); + + public abstract BLSFqT zero(); + + public abstract BLSFqT one(); + + public abstract BLSFqT multiplicativeGenerator(); + + public abstract BLSFqT construct(final long element); + + public abstract BLSFqT construct(final Fp element); + + public abstract String toString(); + + public BLSFqT add(final BLSFqT other) { + return this.construct(this.element().add(other.element())); + } + + public BLSFqT sub(final BLSFqT other) { + return this.construct(this.element().sub(other.element())); + } + + public BLSFqT mul(final BLSFqT other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFqT mul(final Fp other) { + return this.construct(this.element().mul(other)); + } + + public boolean isZero() { + return this.equals(this.zero()); + } + + public boolean isSpecial() { + return this.isZero(); + } + + public boolean isOne() { + return this.equals(this.one()); + } + + public BLSFqT random(final Long seed, final byte[] secureSeed) { + return this.construct(this.element().random(seed, secureSeed)); + } + + public BLSFqT negate() { + return this.construct(this.element().negate()); + } + + public BLSFqT inverse() { + return this.construct(this.element().inverse()); + } + + public BLSFqT square() { + return this.construct(this.element().square()); + } + + public BLSFqT rootOfUnity(final long size) { + return this.construct(this.element().rootOfUnity(size)); + } + + public int bitSize() { + return this.element().bitSize(); + } + + public BigInteger toBigInteger() { + return this.element().toBigInteger(); + } + + public boolean equals(final BLSFqT other) { + if (other == null) { + return false; + } + + return this.element().equals(other.element()); + } + } + + /* Twist field Fq2 */ + public abstract class BLSFq2, BLSFq2T extends BLSFq2> + extends AbstractFieldElement { + public abstract Fp2 element(); + + public abstract BLSFq2T zero(); + + public abstract BLSFq2T one(); + + public abstract BLSFq2T construct(final Fp2 element); + + public abstract String toString(); + + public BLSFq2T add(final BLSFq2T other) { + return this.construct(this.element().add(other.element())); + } + + public BLSFq2T sub(final BLSFq2T other) { + return this.construct(this.element().sub(other.element())); + } + + public BLSFq2T mul(final BLSFq2T other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq2T mul(final BLSFqT other) { + return this.construct(this.element().mul(other.element())); + } + + public boolean isZero() { + return this.equals(this.zero()); + } + + public boolean isOne() { + return this.equals(this.one()); + } + + public BLSFq2T random(final Long seed, final byte[] secureSeed) { + return this.construct(this.element().random(seed, secureSeed)); + } + + public BLSFq2T negate() { + return this.construct(this.element().negate()); + } + + public BLSFq2T inverse() { + return this.construct(this.element().inverse()); + } + + public BLSFq2T square() { + return this.construct(this.element().square()); + } + + public BLSFq2T FrobeniusMap(long power) { + return 
this.construct(this.element().FrobeniusMap(power)); + } + + public int bitSize() { + return this.element().bitSize(); + } + + public boolean equals(final BLSFq2T other) { + if (other == null) { + return false; + } + + return this.element().equals(other.element()); + } + } + + /* Field Fq6 */ + public abstract class BLSFq6< + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6> + extends AbstractFieldElement { + public abstract Fp6_3Over2 element(); + + public abstract BLSFq6T zero(); + + public abstract BLSFq6T one(); + + public abstract Fp2 mulByNonResidue(final Fp2 other); + + public abstract BLSFq6T construct(final Fp6_3Over2 element); + + public abstract String toString(); + + public BLSFq6T add(final BLSFq6T other) { + return this.construct(this.element().add(other.element())); + } + + public BLSFq6T sub(final BLSFq6T other) { + return this.construct(this.element().sub(other.element())); + } + + public BLSFq6T mul(final BLSFqT other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq6T mul(final BLSFq2T other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq6T mul(final BLSFq6T other) { + return this.construct(this.element().mul(other.element())); + } + + public boolean isZero() { + return this.equals(this.zero()); + } + + public boolean isOne() { + return this.equals(this.one()); + } + + public BLSFq6T random(final Long seed, final byte[] secureSeed) { + return this.construct(this.element().random(seed, secureSeed)); + } + + public BLSFq6T negate() { + return this.construct(this.element().negate()); + } + + public BLSFq6T square() { + return this.construct(this.element().square()); + } + + public BLSFq6T inverse() { + return this.construct(this.element().inverse()); + } + + public BLSFq6T FrobeniusMap(long power) { + return this.construct(this.element().FrobeniusMap(power)); + } + + public int bitSize() { + return this.element().bitSize(); + } + + public boolean equals(final BLSFq6T other) { + if (other == null) { + return false; + } + + return this.element().equals(other.element()); + } + } + + /* Field Fq12 */ + public abstract class BLSFq12< + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T extends BLSFq12> + extends AbstractFieldElement { + + public abstract Fp12_2Over3Over2 element(); + + public abstract BLSFq12T zero(); + + public abstract BLSFq12T one(); + + public abstract BLSFq12T construct(final Fp12_2Over3Over2 element); + + public abstract String toString(); + + public BLSFq12T add(final BLSFq12T other) { + return this.construct(this.element().add(other.element())); + } + + public BLSFq12T sub(final BLSFq12T other) { + return this.construct(this.element().sub(other.element())); + } + + public BLSFq12T mul(final BLSFqT other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq12T mul(final BLSFq2T other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq12T mul(final BLSFq6T other) { + return this.construct(this.element().mul(other.element())); + } + + // public Fp6_3Over2 mulByNonResidue(final Fp6_3Over2 other) { + // return other.construct(Fq12Parameters.nonresidue().mul(other.c2), other.c0, + // other.c1); + // } + + public BLSFq12T mul(final BLSFq12T other) { + return this.construct(this.element().mul(other.element())); + } + + public BLSFq12T pow(final BigInteger other) { + return this.construct(this.element().pow(other)); + } + + public boolean isZero() { + return 
this.equals(this.zero()); + } + + public boolean isOne() { + return this.equals(this.one()); + } + + public BLSFq12T random(final Long seed, final byte[] secureSeed) { + return this.construct(this.element().random(seed, secureSeed)); + } + + public BLSFq12T negate() { + return this.construct(this.element().negate()); + } + + public BLSFq12T square() { + return this.construct(this.element().square()); + } + + public BLSFq12T inverse() { + return this.construct(this.element().inverse()); + } + + public BLSFq12T FrobeniusMap(long power) { + return this.construct(this.element().FrobeniusMap(power)); + } + + public BLSFq12T unitaryInverse() { + return this.construct(this.element().unitaryInverse()); + } + + public BLSFq12T cyclotomicSquared() { + return this.construct(this.element().cyclotomicSquared()); + } + + public BLSFq12T mulBy024(final BLSFq2T ell0, final BLSFq2T ellVW, final BLSFq2T ellVV) { + return this.construct( + this.element().mulBy024(ell0.element(), ellVW.element(), ellVV.element())); + } + + public BLSFq12T cyclotomicExponentiation(final BigInteger exponent) { + return this.construct(this.element().cyclotomicExponentiation(exponent)); + } + + public int bitSize() { + return this.element().bitSize(); + } + + public boolean equals(final BLSFq12T other) { + if (other == null) { + return false; + } + + return this.element().equals(other.element()); + } + } +} // BLSFields diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java new file mode 100755 index 0000000..0e708d1 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java @@ -0,0 +1,230 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.AbstractG1; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFr; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; +import java.util.ArrayList; + +public abstract class BLSG1< + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSG1T extends BLSG1, + BLSG1ParametersT extends AbstractBLSG1Parameters> + extends AbstractG1 { + public final BLSG1ParametersT G1Parameters; + protected final BLSFqT X; + protected final BLSFqT Y; + public final BLSFqT Z; + + public BLSG1(final BLSFqT X, final BLSFqT Y, final BLSFqT Z, final BLSG1ParametersT G1Parameters) { + this.X = X; + this.Y = Y; + this.Z = Z; + this.G1Parameters = G1Parameters; + } + + public abstract BLSG1T construct(BLSFqT X, BLSFqT Y, BLSFqT Z); + + public BLSG1T add(final BLSG1T other) { + assert (other != null); + + // Handle special cases having to do with O + if (isZero()) { + return other; + } + + if (other.isZero()) { + return this.self(); + } + + // No need to handle points of order 2,4 + // (they cannot exist in a modulus-order subgroup) + + // Check for doubling case + + // Using Jacobian coordinates so: + // (X1:Y1:Z1) = (X2:Y2:Z2) + // iff + // X1/Z1^2 == X2/Z2^2 and Y1/Z1^3 == Y2/Z2^3 + // iff + // X1 * Z2^2 == X2 * Z1^2 and Y1 * Z2^3 == Y2 * Z1^3 + + final BLSFqT Z1Z1 = this.Z.square(); + final BLSFqT Z2Z2 = other.Z.square(); + + final BLSFqT U1 = this.X.mul(Z2Z2); + final BLSFqT U2 = other.X.mul(Z1Z1); + + final BLSFqT Z1_cubed = this.Z.mul(Z1Z1); + final BLSFqT Z2_cubed = other.Z.mul(Z2Z2); + + // S1 = Y1 * Z2 * Z2Z2 + final BLSFqT S1 = this.Y.mul(Z2_cubed); + // S2 = Y2 * Z1 * Z1Z1 + final BLSFqT S2 = other.Y.mul(Z1_cubed); + + if (U1.equals(U2) && S1.equals(S2)) { + // Doubling case + // Nothing above can be 
reused + return dbl(); + } + + // Rest of the add case + // H = U2-U1 + final BLSFqT H = U2.sub(U1); + // I = (2 * H)^2 + final BLSFqT I = H.add(H).square(); + // J = H * I + final BLSFqT J = H.mul(I); + // r = 2 * (S2-S1) + final BLSFqT S2MinusS1 = S2.sub(S1); + final BLSFqT r = S2MinusS1.add(S2MinusS1); + // V = U1 * I + final BLSFqT V = U1.mul(I); + // X3 = r^2 - J - 2 * V + final BLSFqT X3 = r.square().sub(J).sub(V.add(V)); + final BLSFqT S1_J = S1.mul(J); + // Y3 = r * (V-X3)-2 * S1_J + final BLSFqT Y3 = r.mul(V.sub(X3)).sub(S1_J.add(S1_J)); + // Z3 = ((Z1+Z2)^2-Z1Z1-Z2Z2) * H + final BLSFqT Z3 = this.Z.add(other.Z).square().sub(Z1Z1).sub(Z2Z2).mul(H); + + return this.construct(X3, Y3, Z3); + } + + public BLSG1T sub(final BLSG1T other) { + return this.add(other.negate()); + } + + public boolean isZero() { + return this.Z.isZero(); + } + + public boolean isOne() { + return this.X.equals(this.one().X) + && this.Y.equals(this.one().Y) + && this.Z.equals(this.one().Z); + } + + public boolean isSpecial() { + return isZero() || isOne(); + } + + public BLSG1T zero() { + return this.G1Parameters.ZERO(); + } + + public BLSG1T one() { + return this.G1Parameters.ONE(); + } + + public BLSG1T random(final Long seed, final byte[] secureSeed) { + return this.one().mul(this.G1Parameters.oneFr().random(seed, secureSeed)); + } + + public BLSG1T negate() { + return this.construct(this.X, this.Y.negate(), this.Z); + } + + public BLSG1T dbl() { + // Handle point at infinity + if (isZero()) { + return this.self(); + } + + // No need to handle points of order 2,4 + // (they cannot exist in a modulus-order subgroup) + + // NOTE: does not handle O and pts of order 2,4 + // http://www.hyperelliptic.org/EFD/g1p/auto-shortw-jacobian-0.html#doubling-dbl-2009-l + + // A = X1^2 + final BLSFqT A = this.X.square(); + // B = Y1^2 + final BLSFqT B = this.Y.square(); + // C = B^2 + final BLSFqT C = B.square(); + // D = 2 * ((X1 + B)^2 - A - C) + BLSFqT D = this.X.add(B).square().sub(A).sub(C); + D = D.add(D); + // E = 3 * A + final BLSFqT E = A.add(A).add(A); + // F = E^2 + final BLSFqT F = E.square(); + // X3 = F - 2 D + final BLSFqT X3 = F.sub(D.add(D)); + // Y3 = E * (D - X3) - 8 * C + BLSFqT eightC = C.add(C); + eightC = eightC.add(eightC); + eightC = eightC.add(eightC); + final BLSFqT Y3 = E.mul(D.sub(X3)).sub(eightC); + // Z3 = 2 * Y1 * Z1 + final BLSFqT Y1Z1 = this.Y.mul(this.Z); + final BLSFqT Z3 = Y1Z1.add(Y1Z1); + + return this.construct(X3, Y3, Z3); + } + + public BLSG1T toAffineCoordinates() { + if (isZero()) { + return this.construct(this.X.zero(), this.Y.one(), this.Z.zero()); + } else { + BLSFqT ZInverse = this.Z.inverse(); + BLSFqT Z2Inverse = ZInverse.square(); + BLSFqT Z3Inverse = Z2Inverse.mul(ZInverse); + return this.construct(this.X.mul(Z2Inverse), this.Y.mul(Z3Inverse), this.Z.one()); + } + } + + public int bitSize() { + return Math.max(this.X.bitSize(), Math.max(this.Y.bitSize(), this.Z.bitSize())); + } + + public ArrayList fixedBaseWindowTable() { + return this.G1Parameters.fixedBaseWindowTable(); + } + + public String toString() { + if (isZero()) { + return "0"; + } + + return this.X.toString() + ", " + this.Y.toString() + ", " + this.Z.toString(); + } + + public boolean equals(final BLSG1T other) { + if (isZero()) { + return other.isZero(); + } + + if (other.isZero()) { + return false; + } + + // Now neither is O. 
+ + // using Jacobian coordinates so: + // (X1:Y1:Z1) = (X2:Y2:Z2) + // iff + // X1/Z1^2 == X2/Z2^2 and Y1/Z1^3 == Y2/Z2^3 + // iff + // X1 * Z2^2 == X2 * Z1^2 and Y1 * Z2^3 == Y2 * Z1^3 + + final BLSFqT Z1_squared = this.Z.square(); + final BLSFqT Z2_squared = other.Z.square(); + + if (!this.X.mul(Z2_squared).equals(other.X.mul(Z1_squared))) { + return false; + } + + final BLSFqT Z1_cubed = this.Z.mul(Z1_squared); + final BLSFqT Z2_cubed = other.Z.mul(Z2_squared); + + if (!this.Y.mul(Z2_cubed).equals(other.Y.mul(Z1_cubed))) { + return false; + } + + return true; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java new file mode 100755 index 0000000..83cdb4e --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java @@ -0,0 +1,233 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.AbstractG2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFr; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; +import java.util.ArrayList; + +public abstract class BLSG2< + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSG2T extends BLSG2, + BLSG2ParametersT extends + AbstractBLSG2Parameters> + extends AbstractG2 { + protected final BLSG2ParametersT G2Parameters; + protected BLSFq2T X; + protected BLSFq2T Y; + protected BLSFq2T Z; + + public BLSG2(final BLSFq2T X, final BLSFq2T Y, final BLSFq2T Z, final BLSG2ParametersT G2Parameters) { + this.X = X; + this.Y = Y; + this.Z = Z; + this.G2Parameters = G2Parameters; + } + + public abstract BLSG2T construct(BLSFq2T X, BLSFq2T Y, BLSFq2T Z); + + public BLSG2T add(final BLSG2T other) { + // Handle special cases having to do with O + if (isZero()) { + return other; + } + + if (other.isZero()) { + return this.self(); + } + + // No need to handle points of order 2,4 + // (they cannot exist in a modulus-order subgroup) + + // Check for doubling case + + // Using Jacobian coordinates so: + // (X1:Y1:Z1) = (X2:Y2:Z2) + // iff + // X1/Z1^2 == X2/Z2^2 and Y1/Z1^3 == Y2/Z2^3 + // iff + // X1 * Z2^2 == X2 * Z1^2 and Y1 * Z2^3 == Y2 * Z1^3 + + final BLSFq2T Z1Z1 = this.Z.square(); + final BLSFq2T Z2Z2 = other.Z.square(); + + final BLSFq2T U1 = this.X.mul(Z2Z2); + final BLSFq2T U2 = other.X.mul(Z1Z1); + + final BLSFq2T Z1Cubed = this.Z.mul(Z1Z1); + final BLSFq2T Z2Cubed = other.Z.mul(Z2Z2); + + // S1 = Y1 * Z2 * Z2Z2 + final BLSFq2T S1 = this.Y.mul(Z2Cubed); + // S2 = Y2 * Z1 * Z1Z1 + final BLSFq2T S2 = other.Y.mul(Z1Cubed); + + if (U1.equals(U2) && S1.equals(S2)) { + // Double case + // Nothing of above can be reused + return dbl(); + } + + // Rest of the add case + // H = U2-U1 + final BLSFq2T H = U2.sub(U1); + // I = (2 * H)^2 + final BLSFq2T I = H.add(H).square(); + // J = H * I + final BLSFq2T J = H.mul(I); + // r = 2 * (S2-S1) + final BLSFq2T S2MinusS1 = S2.sub(S1); + final BLSFq2T r = S2MinusS1.add(S2MinusS1); + // V = U1 * I + final BLSFq2T V = U1.mul(I); + // X3 = r^2 - J - 2 * V + final BLSFq2T X3 = r.square().sub(J).sub(V.add(V)); + // Y3 = r * (V-X3)-2 S1 J + final BLSFq2T S1_J = S1.mul(J); + final BLSFq2T Y3 = r.mul(V.sub(X3)).sub(S1_J.add(S1_J)); + // Z3 = ((Z1+Z2)^2-Z1Z1-Z2Z2) * H + final BLSFq2T Z3 = this.Z.add(other.Z).square().sub(Z1Z1).sub(Z2Z2).mul(H); + + return this.construct(X3, Y3, Z3); + } + + public BLSG2T sub(final BLSG2T other) { + 
return this.add(other.negate()); + } + + public BLSG2T dbl() { + if (isZero()) { + return this.self(); + } + + // NOTE: does not handle O and pts of order 2,4 + // http://www.hyperelliptic.org/EFD/g1p/auto-shortw-projective.html#doubling-dbl-2007-bl + + // A = X1^2 + final BLSFq2T A = this.X.square(); + // B = Y1^2 + final BLSFq2T B = this.Y.square(); + // C = B^2 + final BLSFq2T C = B.square(); + // D = 2 * ((X1 + B)^2 - A - C) + BLSFq2T D = this.X.add(B).square().sub(A).sub(C); + D = D.add(D); + // E = 3 * A + final BLSFq2T E = A.add(A).add(A); + // F = E^2 + final BLSFq2T F = E.square(); + // X3 = F - 2 D + final BLSFq2T X3 = F.sub(D.add(D)); + // Y3 = E * (D - X3) - 8 * C + BLSFq2T eightC = C.add(C); + eightC = eightC.add(eightC); + eightC = eightC.add(eightC); + final BLSFq2T Y3 = E.mul(D.sub(X3)).sub(eightC); + final BLSFq2T Y1Z1 = this.Y.mul(this.Z); + // Z3 = 2 * Y1 * Z1 + final BLSFq2T Z3 = Y1Z1.add(Y1Z1); + + return this.construct(X3, Y3, Z3); + } + + public boolean isZero() { + return this.Z.isZero(); + } + + public boolean isSpecial() { + return isZero() || this.Z.isOne(); + } + + public boolean isOne() { + return this.X.isOne() && this.Y.isOne() && this.Z.isOne(); + } + + public BLSG2T zero() { + return this.G2Parameters.ZERO(); + } + + public BLSG2T one() { + return this.G2Parameters.ONE(); + } + + public BLSG2T random(final Long seed, final byte[] secureSeed) { + return this.one().mul(this.G2Parameters.oneFr().random(seed, secureSeed).toBigInteger()); + } + + public BLSG2T negate() { + return this.construct(this.X, this.Y.negate(), this.Z); + } + + public void setX(final BLSFq2T X) { + this.X = X; + } + + public void setY(final BLSFq2T Y) { + this.Y = Y; + } + + public void setZ(final BLSFq2T Z) { + this.Z = Z; + } + + public BLSG2T toAffineCoordinates() { + if (isZero()) { + return this.construct(this.X.zero(), this.Y.one(), this.Z.zero()); + } else { + final BLSFq2T ZInverse = this.Z.inverse(); + final BLSFq2T Z2Inverse = ZInverse.square(); + final BLSFq2T Z3Inverse = Z2Inverse.mul(ZInverse); + return this.construct(this.X.mul(Z2Inverse), this.Y.mul(Z3Inverse), this.Z.one()); + } + } + + public int bitSize() { + return Math.max(this.X.bitSize(), Math.max(this.Y.bitSize(), this.Z.bitSize())); + } + + public ArrayList fixedBaseWindowTable() { + return this.G2Parameters.fixedBaseWindowTable(); + } + + public String toString() { + if (isZero()) { + return "0"; + } + + return this.X.toString() + ", " + this.Y.toString() + ", " + this.Z.toString(); + } + + public boolean equals(final BLSG2T other) { + if (isZero()) { + return other.isZero(); + } + + if (other.isZero()) { + return false; + } + + // Now neither is O. 
+ + // using Jacobian coordinates so: + // (X1:Y1:Z1) = (X2:Y2:Z2) + // iff + // X1/Z1^2 == X2/Z2^2 and Y1/Z1^3 == Y2/Z2^3 + // iff + // X1 * Z2^2 == X2 * Z1^2 and Y1 * Z2^3 == Y2 * Z1^3 + + final BLSFq2T Z1Squared = this.Z.square(); + final BLSFq2T Z2Squared = other.Z.square(); + + if (!this.X.mul(Z2Squared).equals(other.X.mul(Z1Squared))) { + return false; + } + + final BLSFq2T Z1Cubed = this.Z.mul(Z1Squared); + final BLSFq2T Z2Cubed = other.Z.mul(Z2Squared); + + return this.Y.mul(Z2Cubed).equals(other.Y.mul(Z1Cubed)); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java new file mode 100755 index 0000000..f4759e9 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java @@ -0,0 +1,60 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.AbstractGT; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq12; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq6; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import java.math.BigInteger; + +public abstract class BLSGT< + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T extends BLSFq12, + BLSGTT extends BLSGT, + BLSGTParametersT extends + AbstractBLSGTParameters> + extends AbstractGT { + public final BLSGTParametersT GTParameters; + public final BLSFq12T element; + + public BLSGT(final BLSFq12T value, final BLSGTParametersT GTParameters) { + this.element = value; + this.GTParameters = GTParameters; + } + + public abstract BLSGTT construct(final BLSFq12T element); + + public BLSGTT add(final BLSGTT other) { + return this.construct(this.element.mul(other.element)); + } + + public BLSGTT mul(final BigInteger other) { + return this.construct(this.element.pow(other)); + } + + public BLSGTT one() { + return this.GTParameters.ONE(); + } + + public BLSGTT negate() { + return this.construct(this.element.unitaryInverse()); + } + + public boolean equals(final BLSGTT other) { + return this.element.equals(other.element); + } + + public String toString() { + return this.element.toString(); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java new file mode 100755 index 0000000..71f2f60 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java @@ -0,0 +1,345 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.AbstractPairing; +import algebra.curves.barreto_lynn_scott.BLSFields.*; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import java.math.BigInteger; +import java.util.ArrayList; +import java.util.List; + +public abstract class BLSPairing< + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T 
extends BLSFq12, + BLSG1T extends BLSG1, + BLSG2T extends BLSG2, + BLSGTT extends BLSGT, + BLSG1ParametersT extends AbstractBLSG1Parameters, + BLSG2ParametersT extends + AbstractBLSG2Parameters, + BLSGTParametersT extends + AbstractBLSGTParameters, + BLSPublicParametersT extends BLSPublicParameters> + extends AbstractPairing { + + public abstract BLSPublicParametersT publicParameters(); + + protected class AteG1Precompute extends G1Precompute { + final BLSFqT PX; + final BLSFqT PY; + + private AteG1Precompute(final BLSFqT PX, final BLSFqT PY) { + this.PX = PX; + this.PY = PY; + } + + public boolean equals(final AteG1Precompute other) { + return PX.equals(other.PX) && PY.equals(other.PY); + } + } + + protected class AteEllCoefficients { + BLSFq2T ell0; + BLSFq2T ellVW; + BLSFq2T ellVV; + + public boolean equals(final AteEllCoefficients other) { + return this.ell0.equals(other.ell0) + && this.ellVW.equals(other.ellVW) + && this.ellVV.equals(other.ellVV); + } + } + + protected class AteG2Precompute extends G2Precompute { + final BLSFq2T QX; + final BLSFq2T QY; + final List coefficients; + + private AteG2Precompute( + final BLSFq2T QX, final BLSFq2T QY, final List coefficients) { + this.QX = QX; + this.QY = QY; + this.coefficients = coefficients; + } + + public boolean equals(final AteG2Precompute other) { + return this.QX.equals(other.QX) + && this.QY.equals(other.QY) + && this.coefficients.equals(other.coefficients); + } + } + + private AteEllCoefficients doublingStepForFlippedMillerLoop( + final BLSFqT twoInverse, BLSG2T current) { + final BLSFq2T X = current.X, Y = current.Y, Z = current.Z; + + // A = (X * Y) / 2 + final BLSFq2T A = X.mul(Y).mul(twoInverse); + // B = Y^2 + final BLSFq2T B = Y.square(); + // C = Z^2 + final BLSFq2T C = Z.square(); + // D = 3 * C + final BLSFq2T D = C.add(C).add(C); + // E = twist_b * D + final BLSFq2T E = this.publicParameters().twistCoefficientB().mul(D); + // F = 3 * E + final BLSFq2T F = E.add(E).add(E); + // G = (B + F)/2 + final BLSFq2T G = B.add(F).mul(twoInverse); + // H = (Y + Z)^2 - (B + C) + final BLSFq2T H = (Y.add(Z)).square().sub(B.add(C)); + // I = E - B + final BLSFq2T I = E.sub(B); + // J = X^2 + final BLSFq2T J = X.square(); + // E_squared = E^2 + final BLSFq2T ESquare = E.square(); + + // X3 = A * (B - F) + current.setX(A.mul(B.sub(F))); + // Y3 = G^2 - 3*E^2 + current.setY(G.square().sub(ESquare.add(ESquare).add(ESquare))); + // Z3 = B * H + current.setZ(B.mul(H)); + + AteEllCoefficients c = new AteEllCoefficients(); + // ell_0 = xi * I + c.ell0 = this.publicParameters().twist().mul(I); + // ell_VW = - H (later: * yP) + c.ellVW = H.negate(); + // ell_VV = 3*J (later: * xP) + c.ellVV = J.add(J).add(J); + + return c; + } + + private AteEllCoefficients mixedAdditionStepForFlippedMillerLoop( + final BLSG2T base, BLSG2T current) { + final BLSFq2T X1 = current.X, Y1 = current.Y, Z1 = current.Z; + final BLSFq2T X2 = base.X, Y2 = base.Y; + + final BLSFq2T A = Y2.mul(Z1); + final BLSFq2T B = X2.mul(Z1); + final BLSFq2T theta = Y1.sub(A); + final BLSFq2T lambda = X1.sub(B); + final BLSFq2T C = theta.square(); + final BLSFq2T D = lambda.square(); + final BLSFq2T E = lambda.mul(D); + final BLSFq2T F = Z1.mul(C); + final BLSFq2T G = X1.mul(D); + final BLSFq2T H = E.add(F).sub(G.add(G)); + final BLSFq2T I = Y1.mul(E); + final BLSFq2T J = theta.mul(X2).sub(lambda.mul(Y2)); + + current.setX(lambda.mul(H)); + current.setY(theta.mul(G.sub(H)).sub(I)); + current.setZ(Z1.mul(E)); + + AteEllCoefficients c = new AteEllCoefficients(); + c.ell0 = 
this.publicParameters().twist().mul(J); + // VV gets multiplied to xP during line evaluation at P + c.ellVV = theta.negate(); + // VW gets multiplied to yP during line evaluation at P + c.ellVW = lambda; + + return c; + } + + private BLSG2T mulByQ(final BLSG2T element) { + return element.construct( + publicParameters().qXMulTwist().mul(element.X.FrobeniusMap(1)), + publicParameters().qYMulTwist().mul(element.Y.FrobeniusMap(1)), + element.Z.FrobeniusMap(1)); + } + + private BLSFq12T ExpByZ(final BLSFq12T elt) { + BLSFq12T result = elt.cyclotomicExponentiation(this.publicParameters().finalExponentZ()); + if (!this.publicParameters().isFinalExponentZNegative()) { + result = result.unitaryInverse(); + } + return result; + } + + private BLSFq12T finalExponentiationFirstChunk(final BLSFq12T elt) { + // Computes result = elt^((q^6-1)*(q^2+1)). + // Follows, e.g., Beuchat et al page 9, by computing + // result as follows: + // elt^((q^6-1)*(q^2+1)) = (conj(elt) * elt^(-1))^(q^2+1) + // More precisely: + // A = conj(elt), B = elt.inverse(), C = A * B, D = C.Frobenius_map(2) + // result = D * C + final BLSFq12T A = elt.unitaryInverse(); + final BLSFq12T B = elt.inverse(); + final BLSFq12T C = A.mul(B); + final BLSFq12T D = C.FrobeniusMap(2); + return D.mul(C); + } + + private BLSFq12T finalExponentiationLastChunk(final BLSFq12T elt) { + // In the following, we follow the Algorithm 1 described in Table 1 of: + // https://eprint.iacr.org/2016/130.pdf in order to compute the + // hard part of the final exponentiation + // + // Note: As shown Table 3: https://eprint.iacr.org/2016/130.pdf this algorithm + // isn't optimal since Algorithm 2 allows to have less temp. variables and has + // a better complexity. + // + // In the following we denote by [x] = elt^(x): + // A = [-2] + final BLSFq12T A = elt.cyclotomicSquared().unitaryInverse(); + // B = [z] + final BLSFq12T B = ExpByZ(elt); + // C = [2z] + final BLSFq12T C = B.cyclotomicSquared(); + // D = [z-2] + final BLSFq12T D = A.mul(B); + // E = [z^2-2z] + final BLSFq12T E = ExpByZ(D); + // F = [z^3-2z^2] + final BLSFq12T F = ExpByZ(E); + // G = [z^4-2z^3] + final BLSFq12T G = ExpByZ(F); + // H = [z^4-2z^3+2z] + final BLSFq12T H = G.mul(C); + // I = [z^5-2z^4+2z^2] + final BLSFq12T I = ExpByZ(H); + // J = [-z+2] + final BLSFq12T J = D.unitaryInverse(); + // K = [z^5-2z^4+2z^2-z+2] + final BLSFq12T K = I.mul(J); + // L = [z^5-2z^4+2z^2-z+3] = [\lambda_0] + final BLSFq12T L = K.mul(elt); + // M = [-1] + final BLSFq12T M = elt.unitaryInverse(); + // N = [z^2-2z+1] = [\lambda_3] + final BLSFq12T N = E.mul(elt); + // O = [(z^2-2z+1) * (q^3)] + final BLSFq12T O = N.FrobeniusMap(3); + // P = [z^4-2z^3+2z-1] = [\lambda_1] + final BLSFq12T P = H.mul(M); + // Q = [(z^4-2z^3+2z-1) * q] + final BLSFq12T Q = P.FrobeniusMap(1); + // R = [z^3-2z^2+z] = [\lambda_2] + final BLSFq12T R = F.mul(B); + // S = [(z^3-2z^2+z) * (q^2)] + final BLSFq12T S = R.FrobeniusMap(2); + // T = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2)] + final BLSFq12T T = O.mul(S); + // U = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2) + (z^4-2z^3+2z-1) * q] + final BLSFq12T U = T.mul(Q); + // result = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2) + (z^4-2z^3+2z-1) * q + z^5-2z^4+2z^2-z+3] + // = [(p^4 - p^2 + 1)/r]. 
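+    // i.e. result = L * Q * S * O
+    //             = [lambda_0 + lambda_1*q + lambda_2*q^2 + lambda_3*q^3], with [x] = elt^(x) as above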
+ final BLSFq12T result = U.mul(L); + + return result; + } + + // Implementation of the Miller loop for BLS12_377 curve + // See https://eprint.iacr.org/2019/077.pdf for more info and potential optimizations + private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute QPrec) { + // blsFq12Factory = BLS12_377Fq12.ONE; + BLSFq12T f = this.publicParameters().blsFq12Factory(); + + boolean found = false; + int idx = 0; + + final BigInteger loopCount = this.publicParameters().ateLoopCount(); + AteEllCoefficients c; + + for (int i = loopCount.bitLength(); i >= 0; --i) { + final boolean bit = loopCount.testBit(i); + if (!found) { + // This skips the MSB itself. + found |= bit; + continue; + } + + // Code below gets executed for all bits (EXCEPT the MSB itself) of + // loopCount (skipping leading zeros) in MSB to LSB order. + c = QPrec.coefficients.get(idx++); + f = f.square(); + f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + + if (bit) { + c = QPrec.coefficients.get(idx++); + f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + } + } + + // Not executed for BLS12_377 + if (this.publicParameters().isAteLoopCountNegative()) { + f = f.inverse(); + } + + //c = QPrec.coefficients.get(idx++); + //f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + + //c = QPrec.coefficients.get(idx); + //f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + + return f; + } + + protected AteG1Precompute precomputeG1(final BLSG1T P) { + BLSG1T PAffine = P.construct(P.X, P.Y, P.Z).toAffineCoordinates(); + + return new AteG1Precompute(PAffine.X, PAffine.Y); + } + + protected AteG2Precompute precomputeG2(final BLSG2T Q) { + BLSG2T QAffine = Q.construct(Q.X, Q.Y, Q.Z).toAffineCoordinates(); + + BLSFqT fqFactory = this.publicParameters().coefficientB(); + BLSFqT twoInverse = fqFactory.construct(2).inverse(); + + BLSG2T R = Q.construct(QAffine.X, QAffine.Y, QAffine.Y.one()); + final BigInteger loopCount = this.publicParameters().ateLoopCount(); + boolean found = false; + + final List coeffs = new ArrayList<>(); + + for (int i = loopCount.bitLength(); i >= 0; --i) { + final boolean bit = loopCount.testBit(i); + if (!found) { + // This skips the MSB itself. 
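+        // (Same convention as in millerLoop above: leading zero bits and the MSB itself
+        // contribute no line-function coefficients.)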
+ found |= bit; + continue; + } + + coeffs.add(doublingStepForFlippedMillerLoop(twoInverse, R)); + + if (bit) { + coeffs.add(mixedAdditionStepForFlippedMillerLoop(QAffine, R)); + } + } + + //BLSG2T Q1 = this.mulByQ(QAffine); + //assert (Q1.Z.equals(QAffine.X.one())); + //BLSG2T Q2 = this.mulByQ(Q1); + //assert (Q2.Z.equals(QAffine.X.one())); + + return new AteG2Precompute(QAffine.X, QAffine.Y, coeffs); + } + + protected BLSFq12T atePairing(final BLSG1T P, final BLSG2T Q) { + final AteG1Precompute PPrec = precomputeG1(P); + final AteG2Precompute QPrec = precomputeG2(Q); + return millerLoop(PPrec, QPrec); + } + + public BLSFq12T finalExponentiation(final BLSFq12T elt) { + // We know that: + // (p^12 - 1) / r = (p^6 - 1) (p^2 + 1) ((p^4 - p^2 + 1) / r) + // |_________________| |__________________| + // easy part hard part + // where: + // sage: cyclotomic_polynomial(12) # = x^4 - x^2 + 1 + final BLSFq12T A = finalExponentiationFirstChunk(elt); + return finalExponentiationLastChunk(A); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPublicParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPublicParameters.java new file mode 100755 index 0000000..60502a4 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPublicParameters.java @@ -0,0 +1,81 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq12; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq6; +import java.math.BigInteger; + +public abstract class BLSPublicParameters< + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T extends BLSFq12> { + protected BLSFqT coefficientB; + protected BLSFq2T twist; + protected BLSFq2T twistCoefficientB; + protected BLSFqT bC0MulTwist; + protected BLSFqT bC1MulTwist; + protected BLSFq2T qXMulTwist; + protected BLSFq2T qYMulTwist; + + protected BigInteger ateLoopCount; + protected boolean isAteLoopCountNegative; + protected BigInteger finalExponent; + protected BigInteger finalExponentZ; + protected boolean isFinalExponentZNegative; + + protected BLSFq12T blsFq12Factory; + + BLSFqT coefficientB() { + return coefficientB; + } + + BLSFq2T twist() { + return twist; + } + + BLSFq2T twistCoefficientB() { + return twistCoefficientB; + } + + BLSFqT bC0MulTwist() { + return bC0MulTwist; + } + + BLSFqT bC1MulTwist() { + return bC1MulTwist; + } + + BLSFq2T qXMulTwist() { + return qXMulTwist; + } + + BLSFq2T qYMulTwist() { + return qYMulTwist; + } + + BigInteger ateLoopCount() { + return ateLoopCount; + } + + boolean isAteLoopCountNegative() { + return isAteLoopCountNegative; + } + + public BigInteger finalExponent() { + return finalExponent; + } + + public BigInteger finalExponentZ() { + return finalExponentZ; + } + + boolean isFinalExponentZNegative() { + return isFinalExponentZNegative; + } + + BLSFq12T blsFq12Factory() { + return blsFq12Factory; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq12Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq12Parameters.java new file mode 100755 index 0000000..4248e2d --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq12Parameters.java @@ -0,0 +1,5 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import 
algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; + +public abstract class AbstractBLSFq12Parameters extends AbstractFp12_2Over3Over2_Parameters {} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq2Parameters.java new file mode 100755 index 0000000..9b7023f --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq2Parameters.java @@ -0,0 +1,5 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; + +public abstract class AbstractBLSFq2Parameters extends AbstractFp2Parameters {} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq6Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq6Parameters.java new file mode 100755 index 0000000..00a2bdc --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFq6Parameters.java @@ -0,0 +1,5 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; + +public abstract class AbstractBLSFq6Parameters extends AbstractFp6_3Over2_Parameters {} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFqParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFqParameters.java new file mode 100755 index 0000000..ed917c4 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFqParameters.java @@ -0,0 +1,5 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.fields.abstractfieldparameters.AbstractFpParameters; + +public abstract class AbstractBLSFqParameters extends AbstractFpParameters {} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFrParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFrParameters.java new file mode 100755 index 0000000..796e734 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSFrParameters.java @@ -0,0 +1,5 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.fields.abstractfieldparameters.AbstractFpParameters; + +public abstract class AbstractBLSFrParameters extends AbstractFpParameters {} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG1Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG1Parameters.java new file mode 100755 index 0000000..4dcd1f8 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG1Parameters.java @@ -0,0 +1,24 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFr; +import algebra.curves.barreto_lynn_scott.BLSG1; +import java.util.ArrayList; + +/** Generic class to represent the parameters defining a given BLS G1 group */ +public abstract class AbstractBLSG1Parameters< + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSG1T extends BLSG1, + 
BLSG1ParametersT extends AbstractBLSG1Parameters> { + + public abstract BLSG1T ZERO(); + + public abstract BLSG1T ONE(); + + public abstract BLSFrT zeroFr(); + + public abstract BLSFrT oneFr(); + + public abstract ArrayList fixedBaseWindowTable(); +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java new file mode 100755 index 0000000..5864e9f --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java @@ -0,0 +1,26 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFr; +import algebra.curves.barreto_lynn_scott.BLSG2; +import java.util.ArrayList; + +/** Generic class to represent the parameters defining a given BLS G2 group */ +public abstract class AbstractBLSG2Parameters< + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSG2T extends BLSG2, + BLSG2ParametersT extends AbstractBLSG2Parameters> { + + public abstract BLSG2T ZERO(); + + public abstract BLSG2T ONE(); + + public abstract BLSFrT zeroFr(); + + public abstract BLSFrT oneFr(); + + public abstract ArrayList fixedBaseWindowTable(); +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSGTParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSGTParameters.java new file mode 100755 index 0000000..8a73569 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSGTParameters.java @@ -0,0 +1,20 @@ +package algebra.curves.barreto_lynn_scott.abstract_bls_parameters; + +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq12; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq6; +import algebra.curves.barreto_lynn_scott.BLSGT; + +/** Generic class to represent the parameters defining a given BLS GT group */ +public abstract class AbstractBLSGTParameters< + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T extends BLSFq12, + BLSGTT extends BLSGT, + BLSGTParametersT extends + AbstractBLSGTParameters> { + + public abstract BLSGTT ONE(); +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java index 8eead20..f74803a 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java @@ -1,286 +1,285 @@ package algebra.curves.barreto_lynn_scott.bls12_377; -import algebra.curves.barreto_naehrig.BNFields.*; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.*; - -// public class BLS12_377Fields { -// /* Scalar field Fr */ -// public static class BLS12_377Fr extends BNFr { -// -// public static final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); -// public static final BLS12_377Fr ZERO = new BLS12_377Fr(FrParameters.ZERO()); -// public static final BLS12_377Fr ONE = new BLS12_377Fr(FrParameters.ONE()); -// public static final BLS12_377Fr 
MULTIPLICATIVE_GENERATOR = -// new BLS12_377Fr(FrParameters.multiplicativeGenerator()); -// -// public Fp element; -// -// public BLS12_377Fr(final BigInteger number) { -// this.element = new Fp(number, FrParameters); -// } -// -// public BLS12_377Fr(final Fp number) { -// this(number.toBigInteger()); -// } -// -// public BLS12_377Fr(final String number) { -// this(new BigInteger(number)); -// } -// -// public BLS12_377Fr(final long number) { -// this(BigInteger.valueOf(number)); -// } -// -// public BLS12_377Fr self() { -// return this; -// } -// -// public Fp element() { -// return element; -// } -// -// public BLS12_377Fr zero() { -// return ZERO; -// } -// -// public BLS12_377Fr one() { -// return ONE; -// } -// -// public BLS12_377Fr multiplicativeGenerator() { -// return MULTIPLICATIVE_GENERATOR; -// } -// -// public BLS12_377Fr construct(final BigInteger number) { -// return new BLS12_377Fr(number); -// } -// -// public BLS12_377Fr construct(final long number) { -// return new BLS12_377Fr(number); -// } -// -// public BLS12_377Fr construct(final Fp element) { -// return new BLS12_377Fr(element); -// } -// -// public String toString() { -// return this.element.toString(); -// } -// } -// -// /* Base field Fq */ -// public static class BLS12_377Fq extends BNFq { -// -// public static final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); -// public static final BLS12_377Fq ZERO = new BLS12_377Fq(FqParameters.ZERO()); -// public static final BLS12_377Fq ONE = new BLS12_377Fq(FqParameters.ONE()); -// public static final BLS12_377Fq MULTIPLICATIVE_GENERATOR = -// new BLS12_377Fq(FqParameters.multiplicativeGenerator()); -// -// public Fp element; -// -// public BLS12_377Fq(final Fp element) { -// this.element = element; -// } -// -// public BLS12_377Fq(final BigInteger number) { -// this.element = new Fp(number, FqParameters); -// } -// -// public BLS12_377Fq(final String number) { -// this(new BigInteger(number)); -// } -// -// public BLS12_377Fq(final long number) { -// this(BigInteger.valueOf(number)); -// } -// -// public BLS12_377Fq self() { -// return this; -// } -// -// public Fp element() { -// return element; -// } -// -// public BLS12_377Fq zero() { -// return ZERO; -// } -// -// public BLS12_377Fq one() { -// return ONE; -// } -// -// public BLS12_377Fq multiplicativeGenerator() { -// return MULTIPLICATIVE_GENERATOR; -// } -// -// public BLS12_377Fq construct(final BigInteger number) { -// return new BLS12_377Fq(number); -// } -// -// public BLS12_377Fq construct(final Fp element) { -// return new BLS12_377Fq(element); -// } -// -// public BLS12_377Fq construct(final String element) { -// return new BLS12_377Fq(element); -// } -// -// public BLS12_377Fq construct(final long number) { -// return new BLS12_377Fq(number); -// } -// -// public String toString() { -// return this.element.toString(); -// } -// } -// -// /* Twist field Fq2 */ -// public static class BLS12_377Fq2 extends BNFq2 { -// -// public static final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); -// public static BLS12_377Fq2 ZERO = new BLS12_377Fq2(Fq2Parameters.ZERO()); -// public static BLS12_377Fq2 ONE = new BLS12_377Fq2(Fq2Parameters.ONE()); -// -// public Fp2 element; -// -// public BLS12_377Fq2(final Fp2 element) { -// this.element = element; -// } -// -// public BLS12_377Fq2(final BigInteger c0, final BigInteger c1) { -// this.element = new Fp2(c0, c1, Fq2Parameters); -// } -// -// public BLS12_377Fq2(final BLS12_377Fq c0, final BLS12_377Fq c1) { -// 
this(c0.toBigInteger(), c1.toBigInteger()); -// } -// -// public BLS12_377Fq2(final long c0, final long c1) { -// this(BigInteger.valueOf(c0), BigInteger.valueOf(c1)); -// } -// -// public BLS12_377Fq2 self() { -// return this; -// } -// -// public Fp2 element() { -// return this.element; -// } -// -// public BLS12_377Fq2 zero() { -// return ZERO; -// } -// -// public BLS12_377Fq2 one() { -// return ONE; -// } -// -// public BLS12_377Fq2 construct(final Fp2 element) { -// return new BLS12_377Fq2(element); -// } -// -// public BLS12_377Fq2 construct(final BLS12_377Fq c0, final BLS12_377Fq c1) { -// return new BLS12_377Fq2(c0, c1); -// } -// -// public BLS12_377Fq2 construct(final long c0, final long c1) { -// return new BLS12_377Fq2(c0, c1); -// } -// -// public String toString() { -// return this.element.toString(); -// } -// } -// -// /* Field Fq6 */ -// public static class BLS12_377Fq6 extends BNFq6 { -// -// public static final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); -// public static BLS12_377Fq6 ZERO = new BLS12_377Fq6(Fq6Parameters.ZERO()); -// public static BLS12_377Fq6 ONE = new BLS12_377Fq6(Fq6Parameters.ONE()); -// -// public Fp6_3Over2 element; -// -// public BLS12_377Fq6(final Fp6_3Over2 element) { -// this.element = element; -// } -// -// public BLS12_377Fq6(final BLS12_377Fq2 c0, final BLS12_377Fq2 c1, final BLS12_377Fq2 c2) { -// this.element = new Fp6_3Over2(c0.element, c1.element, c2.element, Fq6Parameters); -// } -// -// public BLS12_377Fq6 self() { -// return this; -// } -// -// public Fp6_3Over2 element() { -// return this.element; -// } -// -// public BLS12_377Fq6 zero() { -// return ZERO; -// } -// -// public BLS12_377Fq6 one() { -// return ONE; -// } -// -// public Fp2 mulByNonResidue(final Fp2 other) { -// return Fq6Parameters.nonresidue().mul(other); -// } -// -// public BLS12_377Fq6 construct(final Fp6_3Over2 element) { -// return new BLS12_377Fq6(element); -// } -// -// public String toString() { -// return this.element.toString(); -// } -// } -// -// /* Field Fq12 */ -// public static class BLS12_377Fq12 extends BNFq12 { -// -// public static final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); -// public static BLS12_377Fq12 ZERO = new BLS12_377Fq12(Fq12Parameters.ZERO()); -// public static BLS12_377Fq12 ONE = new BLS12_377Fq12(Fq12Parameters.ONE()); -// -// public Fp12_2Over3Over2 element; -// -// public BLS12_377Fq12(final Fp12_2Over3Over2 element) { -// this.element = element; -// } -// -// public BLS12_377Fq12(final BLS12_377Fq6 c0, final BLS12_377Fq6 c1) { -// this.element = new Fp12_2Over3Over2(c0.element, c1.element, Fq12Parameters); -// } -// -// public BLS12_377Fq12 self() { -// return this; -// } -// -// public Fp12_2Over3Over2 element() { -// return this.element; -// } -// -// public BLS12_377Fq12 zero() { -// return ZERO; -// } -// -// public BLS12_377Fq12 one() { -// return ONE; -// } -// -// public BLS12_377Fq12 construct(final Fp12_2Over3Over2 element) { -// return new BLS12_377Fq12(element); -// } -// -// public String toString() { -// return this.element.toString(); -// } -// } -// } -// +import algebra.curves.barreto_lynn_scott.BLSFields.*; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.*; + +import algebra.fields.Fp; +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.Fp2; +import algebra.fields.Fp6_3Over2; +import java.math.BigInteger; + +public class BLS12_377Fields { + /* Scalar field Fr */ + public static class BLS12_377Fr extends BLSFr { + public 
static final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); + public static final BLS12_377Fr ZERO = new BLS12_377Fr(FrParameters.ZERO()); + public static final BLS12_377Fr ONE = new BLS12_377Fr(FrParameters.ONE()); + public static final BLS12_377Fr MULTIPLICATIVE_GENERATOR = + new BLS12_377Fr(FrParameters.multiplicativeGenerator()); + public Fp element; + + public BLS12_377Fr(final BigInteger number) { + this.element = new Fp(number, FrParameters); + } + + public BLS12_377Fr(final Fp number) { + this(number.toBigInteger()); + } + + public BLS12_377Fr(final String number) { + this(new BigInteger(number)); + } + + public BLS12_377Fr(final long number) { + this(BigInteger.valueOf(number)); + } + + public BLS12_377Fr self() { + return this; + } + + public Fp element() { + return element; + } + + public BLS12_377Fr zero() { + return ZERO; + } + + public BLS12_377Fr one() { + return ONE; + } + + public BLS12_377Fr multiplicativeGenerator() { + return MULTIPLICATIVE_GENERATOR; + } + + public BLS12_377Fr construct(final BigInteger number) { + return new BLS12_377Fr(number); + } + + public BLS12_377Fr construct(final long number) { + return new BLS12_377Fr(number); + } + + public BLS12_377Fr construct(final Fp element) { + return new BLS12_377Fr(element); + } + + public String toString() { + return this.element.toString(); + } + } + + /* Base field Fq */ + public static class BLS12_377Fq extends BLSFq { + public static final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); + public static final BLS12_377Fq ZERO = new BLS12_377Fq(FqParameters.ZERO()); + public static final BLS12_377Fq ONE = new BLS12_377Fq(FqParameters.ONE()); + public static final BLS12_377Fq MULTIPLICATIVE_GENERATOR = + new BLS12_377Fq(FqParameters.multiplicativeGenerator()); + + public Fp element; + + public BLS12_377Fq(final Fp element) { + this.element = element; + } + + public BLS12_377Fq(final BigInteger number) { + this.element = new Fp(number, FqParameters); + } + + public BLS12_377Fq(final String number) { + this(new BigInteger(number)); + } + + public BLS12_377Fq(final long number) { + this(BigInteger.valueOf(number)); + } + + public BLS12_377Fq self() { + return this; + } + + public Fp element() { + return element; + } + + public BLS12_377Fq zero() { + return ZERO; + } + + public BLS12_377Fq one() { + return ONE; + } + + public BLS12_377Fq multiplicativeGenerator() { + return MULTIPLICATIVE_GENERATOR; + } + + public BLS12_377Fq construct(final BigInteger number) { + return new BLS12_377Fq(number); + } + + public BLS12_377Fq construct(final Fp element) { + return new BLS12_377Fq(element); + } + + public BLS12_377Fq construct(final String element) { + return new BLS12_377Fq(element); + } + + public BLS12_377Fq construct(final long number) { + return new BLS12_377Fq(number); + } + + public String toString() { + return this.element.toString(); + } + } + + /* Twist field Fq2 */ + public static class BLS12_377Fq2 extends BLSFq2 { + public static final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); + public static BLS12_377Fq2 ZERO = new BLS12_377Fq2(Fq2Parameters.ZERO()); + public static BLS12_377Fq2 ONE = new BLS12_377Fq2(Fq2Parameters.ONE()); + + public Fp2 element; + + public BLS12_377Fq2(final Fp2 element) { + this.element = element; + } + + public BLS12_377Fq2(final BigInteger c0, final BigInteger c1) { + this.element = new Fp2(c0, c1, Fq2Parameters); + } + + public BLS12_377Fq2(final BLS12_377Fq c0, final BLS12_377Fq c1) { + this(c0.toBigInteger(), 
c1.toBigInteger()); + } + + public BLS12_377Fq2(final long c0, final long c1) { + this(BigInteger.valueOf(c0), BigInteger.valueOf(c1)); + } + + public BLS12_377Fq2 self() { + return this; + } + + public Fp2 element() { + return this.element; + } + + public BLS12_377Fq2 zero() { + return ZERO; + } + + public BLS12_377Fq2 one() { + return ONE; + } + + public BLS12_377Fq2 construct(final Fp2 element) { + return new BLS12_377Fq2(element); + } + + public BLS12_377Fq2 construct(final BLS12_377Fq c0, final BLS12_377Fq c1) { + return new BLS12_377Fq2(c0, c1); + } + + public BLS12_377Fq2 construct(final long c0, final long c1) { + return new BLS12_377Fq2(c0, c1); + } + + public String toString() { + return this.element.toString(); + } + } + + /* Field Fq6 */ + public static class BLS12_377Fq6 extends BLSFq6 { + public static final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); + public static BLS12_377Fq6 ZERO = new BLS12_377Fq6(Fq6Parameters.ZERO()); + public static BLS12_377Fq6 ONE = new BLS12_377Fq6(Fq6Parameters.ONE()); + + public Fp6_3Over2 element; + + public BLS12_377Fq6(final Fp6_3Over2 element) { + this.element = element; + } + + public BLS12_377Fq6(final BLS12_377Fq2 c0, final BLS12_377Fq2 c1, final BLS12_377Fq2 c2) { + this.element = new Fp6_3Over2(c0.element, c1.element, c2.element, Fq6Parameters); + } + + public BLS12_377Fq6 self() { + return this; + } + + public Fp6_3Over2 element() { + return this.element; + } + + public BLS12_377Fq6 zero() { + return ZERO; + } + + public BLS12_377Fq6 one() { + return ONE; + } + + public Fp2 mulByNonResidue(final Fp2 other) { + return Fq6Parameters.nonresidue().mul(other); + } + + public BLS12_377Fq6 construct(final Fp6_3Over2 element) { + return new BLS12_377Fq6(element); + } + + public String toString() { + return this.element.toString(); + } +} + + /* Field Fq12 */ + public static class BLS12_377Fq12 extends BLSFq12 { + public static final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); + public static BLS12_377Fq12 ZERO = new BLS12_377Fq12(Fq12Parameters.ZERO()); + public static BLS12_377Fq12 ONE = new BLS12_377Fq12(Fq12Parameters.ONE()); + + public Fp12_2Over3Over2 element; + + public BLS12_377Fq12(final Fp12_2Over3Over2 element) { + this.element = element; + } + + public BLS12_377Fq12(final BLS12_377Fq6 c0, final BLS12_377Fq6 c1) { + this.element = new Fp12_2Over3Over2(c0.element, c1.element, Fq12Parameters); + } + + public BLS12_377Fq12 self() { + return this; + } + + public Fp12_2Over3Over2 element() { + return this.element; + } + + public BLS12_377Fq12 zero() { + return ZERO; + } + + public BLS12_377Fq12 one() { + return ONE; + } + + public BLS12_377Fq12 construct(final Fp12_2Over3Over2 element) { + return new BLS12_377Fq12(element); + } + + public String toString() { + return this.element.toString(); + } + } +} // BLS12_377Fields diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java new file mode 100755 index 0000000..6626933 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java @@ -0,0 +1,39 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file)
+ *****************************************************************************/
+
+package algebra.curves.barreto_lynn_scott.bls12_377;
+
+import algebra.curves.barreto_lynn_scott.BLSG1;
+import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq;
+import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr;
+import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters;
+
+/**
+ * Class representing a specific BLS group. This class represents BOTH the group and a point in
+ * the group. In fact, each point requires the X, Y, Z coordinates; and the group definition
+ * additionally requires the `G1Parameters` attribute. Here all group elements are constructed by
+ * `construct()` which is a wrapper around the "Group class" constructor.
+ *
+ *
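+ * <p>Illustrative sketch only (it assumes `mul(BigInteger)` is inherited from the generic
+ * group classes, as used by `random()` in {@link algebra.curves.barreto_lynn_scott.BLSG1}):
+ *
+ * <pre>{@code
+ * BLS12_377G1 generator = BLS12_377G1.G1Parameters.ONE();
+ * BLS12_377G1 point = generator.mul(java.math.BigInteger.TEN);
+ * }</pre>
+ *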

That's why `BN254bG1Parameters.ONE` is treated as a "Group factory" here: + * https://github.com/clearmatics/dizk/blob/develop/src/test/java/algebra/curves/CurvesTest.java#L93 + */ +public class BLS12_377G1 extends BLSG1 { + + public static final BLS12_377G1Parameters G1Parameters = new BLS12_377G1Parameters(); + + public BLS12_377G1(final BLS12_377Fq X, final BLS12_377Fq Y, final BLS12_377Fq Z) { + super(X, Y, Z, G1Parameters); + } + + public BLS12_377G1 self() { + return this; + } + + public BLS12_377G1 construct(final BLS12_377Fq X, final BLS12_377Fq Y, final BLS12_377Fq Z) { + return new BLS12_377G1(X, Y, Z); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java new file mode 100755 index 0000000..feab4e6 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java @@ -0,0 +1,31 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377; + +import algebra.curves.barreto_lynn_scott.BLSG2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; + +public class BLS12_377G2 extends BLSG2 { + + private static final BLS12_377G2Parameters G2Parameters = new BLS12_377G2Parameters(); + + public BLS12_377G2(final BLS12_377Fq2 X, final BLS12_377Fq2 Y, final BLS12_377Fq2 Z) { + super(X, Y, Z, G2Parameters); + } + + public BLS12_377G2 self() { + return this; + } + + public BLS12_377G2 construct(final BLS12_377Fq2 X, final BLS12_377Fq2 Y, final BLS12_377Fq2 Z) { + return new BLS12_377G2(X, Y, Z); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java new file mode 100755 index 0000000..dfb02d7 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java @@ -0,0 +1,29 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377; + +import algebra.curves.barreto_lynn_scott.BLSGT; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq12; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq6; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; + +public class BLS12_377GT + extends BLSGT { + + private static final BLS12_377GTParameters GTParameters = new BLS12_377GTParameters(); + + public BLS12_377GT(final BLS12_377Fq12 value) { + super(value, GTParameters); + } + + public BLS12_377GT construct(final BLS12_377Fq12 element) { + return new BLS12_377GT(element); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java new file mode 100755 index 0000000..7396f2e --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java @@ -0,0 +1,42 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377; + +import algebra.curves.barreto_lynn_scott.BLSPairing; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.*; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; + +public class BLS12_377Pairing + extends BLSPairing< + BLS12_377Fr, + BLS12_377Fq, + BLS12_377Fq2, + BLS12_377Fq6, + BLS12_377Fq12, + BLS12_377G1, + BLS12_377G2, + BLS12_377GT, + BLS12_377G1Parameters, + BLS12_377G2Parameters, + BLS12_377GTParameters, + BLS12_377PublicParameters> { + + private static final BLS12_377PublicParameters publicParameters = new BLS12_377PublicParameters(); + + public BLS12_377PublicParameters publicParameters() { + return publicParameters; + } + + public BLS12_377GT reducedPairing(final BLS12_377G1 P, final BLS12_377G2 Q) { + final BLS12_377Fq12 f = atePairing(P, Q); + final BLS12_377Fq12 result = finalExponentiation(f); + return new BLS12_377GT(result); + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java new file mode 100755 index 0000000..7fa5d41 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java @@ -0,0 +1,53 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377; + +import algebra.curves.barreto_lynn_scott.BLSPublicParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq12; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq6; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq2Parameters; +import java.math.BigInteger; + +public class BLS12_377PublicParameters + extends BLSPublicParameters { + + public BLS12_377PublicParameters() { + final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); + + coefficientB = new BLS12_377Fq("3"); + twist = new BLS12_377Fq2(new BLS12_377Fq("9"), new BLS12_377Fq("1")); + twistCoefficientB = twist.inverse().mul(coefficientB); + bC0MulTwist = coefficientB.mul(new BLS12_377Fq(Fq2Parameters.nonresidue())); + bC1MulTwist = coefficientB.mul(new BLS12_377Fq(Fq2Parameters.nonresidue())); + qXMulTwist = + new BLS12_377Fq2( + new BLS12_377Fq( + "21575463638280843010398324269430826099269044274347216827212613867836435027261"), + new BLS12_377Fq( + "10307601595873709700152284273816112264069230130616436755625194854815875713954")); + qYMulTwist = + new BLS12_377Fq2( + new BLS12_377Fq( + "2821565182194536844548159561693502659359617185244120367078079554186484126554"), + new BLS12_377Fq( + "3505843767911556378687030309984248845540243509899259641013678093033130930403")); + + // Pairing parameters + ateLoopCount = new BigInteger("29793968203157093288"); + isAteLoopCountNegative = false; + finalExponent = + new BigInteger( + "552484233613224096312617126783173147097382103762957654188882734314196910839907541213974502761540629817009608548654680343627701153829446747810907373256841551006201639677726139946029199968412598804882391702273019083653272047566316584365559776493027495458238373902875937659943504873220554161550525926302303331747463515644711876653177129578303191095900909191624817826566688241804408081892785725967931714097716709526092261278071952560171111444072049229123565057483750161460024353346284167282452756217662335528813519139808291170539072125381230815729071544861602750936964829313608137325426383735122175229541155376346436093930287402089517426973178917569713384748081827255472576937471496195752727188261435633271238710131736096299798168852925540549342330775279877006784354801422249722573783561685179618816480037695005515426162362431072245638324744480"); + finalExponentZ = new BigInteger("4965661367192848881"); + isFinalExponentZNegative = false; + + blsFq12Factory = BLS12_377Fq12.ONE; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java new file mode 100755 index 0000000..8dea9a6 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java @@ -0,0 +1,143 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFq12Parameters; +import algebra.fields.Fp; +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.Fp2; +import java.io.Serializable; + +public class BLS12_377Fq12Parameters extends AbstractBLSFq12Parameters implements Serializable { + public BLS12_377FqParameters FqParameters; + public BLS12_377Fq2Parameters Fq2Parameters; + public BLS12_377Fq6Parameters Fq6Parameters; + + public Fp12_2Over3Over2 ZERO; + public Fp12_2Over3Over2 ONE; + + public Fp2 nonresidue; + + public Fp2[] FrobeniusCoefficientsC1; + + public BLS12_377Fq12Parameters() { + this.FqParameters = new BLS12_377FqParameters(); + this.Fq2Parameters = new BLS12_377Fq2Parameters(); + this.Fq6Parameters = new BLS12_377Fq6Parameters(); + + this.ZERO = new Fp12_2Over3Over2(Fq6Parameters.ZERO(), Fq6Parameters.ZERO(), this); + this.ONE = new Fp12_2Over3Over2(Fq6Parameters.ONE(), Fq6Parameters.ZERO(), this); + + this.nonresidue = new Fp2(FqParameters.ZERO(), FqParameters.ONE(), Fq2Parameters); + + this.FrobeniusCoefficientsC1 = new Fp2[12]; + this.FrobeniusCoefficientsC1[0] = + new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + this.FrobeniusCoefficientsC1[1] = + new Fp2( + new Fp( + "92949345220277864758624960506473182677953048909283248980960104381795901929519566951595905490535835115111760994353", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[2] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410946", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[3] = + new Fp2( + new Fp( + "216465761340224619389371505802605247630151569547285782856803747159100223055385581585702401816380679166954762214499", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[4] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410945", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[5] = + new Fp2( + new Fp( + "123516416119946754630746545296132064952198520638002533875843642777304321125866014634106496325844844051843001220146", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[6] = + new Fp2( + new Fp( + "258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458176", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[7] = + new Fp2( + new Fp( + "165715080792691229252027773188420350858440463845631411558924158284924566418821255823372982649037525009328560463824", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[8] = + new Fp2( + new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[9] = + new Fp2( + new Fp( + "42198664672744474621281227892288285906241943207628877683080515507620245292955241189266486323192680957485559243678", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + 
this.FrobeniusCoefficientsC1[10] = + new Fp2( + new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047232", FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[11] = + new Fp2( + new Fp( + "135148009893022339379906188398761468584194992116912126664040619889416147222474808140862391813728516072597320238031", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + } + + public BLS12_377FqParameters FpParameters() { + return FqParameters; + } + + public BLS12_377Fq2Parameters Fp2Parameters() { + return Fq2Parameters; + } + + public BLS12_377Fq6Parameters Fp6Parameters() { + return Fq6Parameters; + } + + public Fp12_2Over3Over2 ZERO() { + return ZERO; + } + + public Fp12_2Over3Over2 ONE() { + return ONE; + } + + public Fp2 nonresidue() { + return nonresidue; + } + + public Fp2[] FrobeniusMapCoefficientsC1() { + return FrobeniusCoefficientsC1; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java new file mode 100755 index 0000000..e6de300 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java @@ -0,0 +1,112 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFq2Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.fields.Fp; +import algebra.fields.Fp2; +import java.io.Serializable; +import java.math.BigInteger; + + +public class BLS12_377Fq2Parameters extends AbstractBLSFq2Parameters implements Serializable { + public BLS12_377FqParameters FqParameters; + public BigInteger euler; + public long s; + public BigInteger t; + public BigInteger tMinus1Over2; + public Fp nonresidue; + public Fp2 nqr; + public Fp2 nqrTot; + public Fp[] FrobeniusCoefficientsC1; + + public Fp2 ZERO; + public Fp2 ONE; + + public BLS12_377Fq2Parameters() { + this.FqParameters = new BLS12_377FqParameters(); + this.euler = + new BigInteger( + "239547588008311421220994022608339370399626158265550411218223901127035046843189118723920525909718935985594116157406550130918127817069793474323196511433944"); + this.s = 4; + this.t = + new BigInteger( + "29943448501038927652624252826042421299953269783193801402277987640879380855398639840490065738714866998199264519675818766364765977133724184290399563929243"); + this.tMinus1Over2 = + new BigInteger( + "14971724250519463826312126413021210649976634891596900701138993820439690427699319920245032869357433499099632259837909383182382988566862092145199781964621"); + this.nonresidue = + new BLS12_377Fq( + "21888242871839275222246405745257275088696311157297823662689037894645226208582") + .element(); + this.nqr = new Fp2(new Fp("2", FqParameters), new Fp("1", FqParameters), this); + this.nqrTot = + new Fp2( + new Fp( + "5033503716262624267312492558379982687175200734934877598599011485707452665730", + FqParameters), + new Fp( + 
"314498342015008975724433667930697407966947188435857772134235984660852259084", + FqParameters), + this); + this.FrobeniusCoefficientsC1 = new Fp[2]; + this.FrobeniusCoefficientsC1[0] = new Fp("1", FqParameters); + this.FrobeniusCoefficientsC1[1] = + new Fp( + "21888242871839275222246405745257275088696311157297823662689037894645226208582", + FqParameters); + + this.ZERO = new Fp2(BigInteger.ZERO, BigInteger.ZERO, this); + this.ONE = new Fp2(BigInteger.ONE, BigInteger.ZERO, this); + } + + public BLS12_377FqParameters FpParameters() { + return FqParameters; + } + + public Fp2 ZERO() { + return ZERO; + } + + public Fp2 ONE() { + return ONE; + } + + public BigInteger euler() { + return euler; + } + + public long s() { + return s; + } + + public BigInteger t() { + return t; + } + + public BigInteger tMinus1Over2() { + return tMinus1Over2; + } + + public Fp nonresidue() { + return nonresidue; + } + + public Fp2 nqr() { + return nqr; + } + + public Fp2 nqrTot() { + return nqrTot; + } + + public Fp[] FrobeniusMapCoefficientsC1() { + return FrobeniusCoefficientsC1; + } +} \ No newline at end of file diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java new file mode 100755 index 0000000..b8bb74b --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java @@ -0,0 +1,122 @@ +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFq6Parameters; +import algebra.fields.Fp; +import algebra.fields.Fp2; +import algebra.fields.Fp6_3Over2; +import java.io.Serializable; + +public class BLS12_377Fq6Parameters extends AbstractBLSFq6Parameters implements Serializable { + public BLS12_377Fq2Parameters Fq2Parameters; + + public Fp6_3Over2 ZERO; + public Fp6_3Over2 ONE; + + public Fp2 nonresidue; + public Fp2[] FrobeniusCoefficientsC1; + public Fp2[] FrobeniusCoefficientsC2; + + public BLS12_377Fq6Parameters() { + final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); + this.Fq2Parameters = new BLS12_377Fq2Parameters(); + + this.ZERO = + new Fp6_3Over2(Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), this); + this.ONE = + new Fp6_3Over2(Fq2Parameters.ONE(), Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), this); + this.nonresidue = new Fp2(FqParameters.ZERO(), FqParameters.ONE(), Fq2Parameters); + + this.FrobeniusCoefficientsC1 = new Fp2[6]; + this.FrobeniusCoefficientsC1[0] = + new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + this.FrobeniusCoefficientsC1[1] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410946", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[2] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410945", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[3] = + new Fp2( + new Fp( + "258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458176", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[4] = + new Fp2( + new 
Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC1[5] = + new Fp2( + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047232", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + + this.FrobeniusCoefficientsC2 = new Fp2[6]; + this.FrobeniusCoefficientsC2[0] = + new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + this.FrobeniusCoefficientsC2[1] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410945", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC2[2] = + new Fp2( + new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC2[3] = + new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + this.FrobeniusCoefficientsC2[4] = + new Fp2( + new Fp( + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410945", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + this.FrobeniusCoefficientsC2[5] = + new Fp2( + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", + FqParameters), + new Fp("0", FqParameters), + Fq2Parameters); + } + + public BLS12_377Fq2Parameters Fp2Parameters() { + return Fq2Parameters; + } + + public Fp6_3Over2 ZERO() { + return ZERO; + } + + public Fp6_3Over2 ONE() { + return ONE; + } + + public Fp2 nonresidue() { + return nonresidue; + } + + public Fp2[] FrobeniusMapCoefficientsC1() { + return FrobeniusCoefficientsC1; + } + + public Fp2[] FrobeniusMapCoefficientsC2() { + return FrobeniusCoefficientsC2; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java new file mode 100755 index 0000000..5df2701 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java @@ -0,0 +1,107 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFqParameters; +import algebra.fields.Fp; +import java.io.Serializable; +import java.math.BigInteger; + +public class BLS12_377FqParameters extends AbstractBLSFqParameters implements Serializable { + public BigInteger modulus; + public BigInteger root; + public Fp multiplicativeGenerator; + public long numBits; + + public BigInteger euler; + public long s; + public BigInteger t; + public BigInteger tMinus1Over2; + public Fp nqr; + public Fp nqrTot; + + public Fp ZERO; + public Fp ONE; + + public BLS12_377FqParameters() { + this.modulus = + new BigInteger( + "258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458177"); + this.root = + new BigInteger( + "32863578547254505029601261939868325669770508939375122462904745766352256812585773382134936404344547323199885654433"); + this.multiplicativeGenerator = new Fp("15", this); + this.numBits = 377; + + this.euler = + new BigInteger( + "129332213006484547005326366847446766768196756377457330269942131333360234174170411387484444069786680062220160729088"); + this.s = 46; + this.t = + new BigInteger( + "3675842578061421676390135839012792950148785745837396071634149488243117337281387659330802195819009059"); + this.tMinus1Over2 = + new BigInteger( + "1837921289030710838195067919506396475074392872918698035817074744121558668640693829665401097909504529"); + this.nqr = new Fp("5", this); + this.nqrTot = + new Fp( + "33774956008227656219775876656288133547078610493828613777258829345740556592044969439504850374928261397247202212840", this); + + this.ZERO = new Fp(BigInteger.ZERO, this); + this.ONE = new Fp(BigInteger.ONE, this); + } + + public BigInteger modulus() { + return modulus; + } + + public BigInteger root() { + return root; + } + + public Fp multiplicativeGenerator() { + return multiplicativeGenerator; + } + + public long numBits() { + return numBits; + } + + public BigInteger euler() { + return euler; + } + + public long s() { + return s; + } + + public BigInteger t() { + return t; + } + + public BigInteger tMinus1Over2() { + return tMinus1Over2; + } + + public Fp nqr() { + return nqr; + } + + public Fp nqrTot() { + return nqrTot; + } + + public Fp ZERO() { + return ZERO; + } + + public Fp ONE() { + return ONE; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java new file mode 100755 index 0000000..5074d9e --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java @@ -0,0 +1,104 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFrParameters; +import algebra.fields.Fp; +import java.io.Serializable; +import java.math.BigInteger; + +public class BLS12_377FrParameters extends AbstractBLSFrParameters implements Serializable { + public BigInteger modulus; + public BigInteger root; + public Fp multiplicativeGenerator; + public long numBits; + + public BigInteger euler; + public long s; + public BigInteger t; + public BigInteger tMinus1Over2; + public Fp nqr; + public Fp nqrTot; + + public Fp ZERO; + public Fp ONE; + + public BLS12_377FrParameters() { + this.modulus = + new BigInteger( + "8444461749428370424248824938781546531375899335154063827935233455917409239041"); + this.root = + new BigInteger( + "8065159656716812877374967518403273466521432693661810619979959746626482506078"); + this.multiplicativeGenerator = new Fp("22", this); + this.numBits = 253; + + this.euler = + new BigInteger( + "4222230874714185212124412469390773265687949667577031913967616727958704619520"); + this.s = 47; + this.t = new BigInteger("60001509534603559531609739528203892656505753216962260608619555"); + this.tMinus1Over2 = + new BigInteger("30000754767301779765804869764101946328252876608481130304309777"); + this.nqr = new Fp("11", this); + this.nqrTot = + new Fp( + "6924886788847882060123066508223519077232160750698452411071850219367055984476", this); + + this.ZERO = new Fp(BigInteger.ZERO, this); + this.ONE = new Fp(BigInteger.ONE, this); + } + + public BigInteger modulus() { + return modulus; + } + + public BigInteger root() { + return root; + } + + public Fp multiplicativeGenerator() { + return multiplicativeGenerator; + } + + public long numBits() { + return numBits; + } + + public BigInteger euler() { + return euler; + } + + public long s() { + return s; + } + + public BigInteger t() { + return t; + } + + public BigInteger tMinus1Over2() { + return tMinus1Over2; + } + + public Fp nqr() { + return nqr; + } + + public Fp nqrTot() { + return nqrTot; + } + + public Fp ZERO() { + return ZERO; + } + + public Fp ONE() { + return ONE; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java new file mode 100755 index 0000000..e0f47d4 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java @@ -0,0 +1,63 @@ +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; +import java.io.Serializable; +import java.util.ArrayList; +import java.util.Arrays; + +public class BLS12_377G1Parameters + extends AbstractBLSG1Parameters + implements Serializable { + + public static final BLS12_377G1 ZERO = new BLS12_377G1(BLS12_377Fq.ZERO, BLS12_377Fq.ONE, BLS12_377Fq.ZERO); + public static final BLS12_377G1 ONE = new BLS12_377G1(BLS12_377Fq.ONE, new BLS12_377Fq(2), BLS12_377Fq.ONE); + public static final ArrayList 
fixedBaseWindowTable = + new ArrayList<>( + Arrays.asList( + 1, // window 1 is unbeaten in [-inf, 4.99] + 5, // window 2 is unbeaten in [4.99, 10.99] + 11, // window 3 is unbeaten in [10.99, 32.29] + 32, // window 4 is unbeaten in [32.29, 55.23] + 55, // window 5 is unbeaten in [55.23, 162.03] + 162, // window 6 is unbeaten in [162.03, 360.15] + 360, // window 7 is unbeaten in [360.15, 815.44] + 815, // window 8 is unbeaten in [815.44, 2373.07] + 2373, // window 9 is unbeaten in [2373.07, 6977.75] + 6978, // window 10 is unbeaten in [6977.75, 7122.23] + 7122, // window 11 is unbeaten in [7122.23, 57818.46] + 0, // window 12 is never the best + 57818, // window 13 is unbeaten in [57818.46, 169679.14] + 0, // window 14 is never the best + 169679, // window 15 is unbeaten in [169679.14, 439758.91] + 439759, // window 16 is unbeaten in [439758.91, 936073.41] + 936073, // window 17 is unbeaten in [936073.41, 4666554.74] + 0, // window 18 is never the best + 4666555, // window 19 is unbeaten in [4666554.74, 7580404.42] + 7580404, // window 20 is unbeaten in [7580404.42, 34552892.20] + 0, // window 21 is never the best + 34552892 // window 22 is unbeaten in [34552892.20, inf] + )); + + public BLS12_377G1 ZERO() { + return ZERO; + } + + public BLS12_377G1 ONE() { + return ONE; + } + + public BLS12_377Fr zeroFr() { + return BLS12_377Fr.ZERO; + } + + public BLS12_377Fr oneFr() { + return BLS12_377Fr.ONE; + } + + public ArrayList fixedBaseWindowTable() { + return fixedBaseWindowTable; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java new file mode 100755 index 0000000..5ad5d39 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java @@ -0,0 +1,76 @@ +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; +import java.io.Serializable; +import java.util.ArrayList; +import java.util.Arrays; + +public class BLS12_377G2Parameters + extends AbstractBLSG2Parameters + implements Serializable { + + public static final BLS12_377G2 ZERO = new BLS12_377G2(BLS12_377Fq2.ZERO, BLS12_377Fq2.ONE, BLS12_377Fq2.ZERO); + public static final BLS12_377G2 ONE = + new BLS12_377G2( + new BLS12_377Fq2( + new BLS12_377Fq( + "10857046999023057135944570762232829481370756359578518086990519993285655852781"), + new BLS12_377Fq( + "11559732032986387107991004021392285783925812861821192530917403151452391805634")), + new BLS12_377Fq2( + new BLS12_377Fq( + "8495653923123431417604973247489272438418190587263600148770280649306958101930"), + new BLS12_377Fq( + "4082367875863433681332203403145435568316851327593401208105741076214120093531")), + new BLS12_377Fq2(1, 0)); + public static final ArrayList fixedBaseWindowTable = + new ArrayList<>( + Arrays.asList( + 1, // window 1 is unbeaten in [-inf, 5.10] + 5, // window 2 is unbeaten in [5.10, 10.43] + 10, // window 3 is unbeaten in [10.43, 25.28] + 25, // window 4 is unbeaten in [25.28, 59.00] + 59, // window 5 is unbeaten in [59.00, 
154.03] + 154, // window 6 is unbeaten in [154.03, 334.25] + 334, // window 7 is unbeaten in [334.25, 742.58] + 743, // window 8 is unbeaten in [742.58, 2034.40] + 2034, // window 9 is unbeaten in [2034.40, 4987.56] + 4988, // window 10 is unbeaten in [4987.56, 8888.27] + 8888, // window 11 is unbeaten in [8888.27, 26271.13] + 26271, // window 12 is unbeaten in [26271.13, 39768.20] + 39768, // window 13 is unbeaten in [39768.20, 106275.75] + 106276, // window 14 is unbeaten in [106275.75, 141703.40] + 141703, // window 15 is unbeaten in [141703.40, 462422.97] + 462423, // window 16 is unbeaten in [462422.97, 926871.84] + 926872, // window 17 is unbeaten in [926871.84, 4873049.17] + 0, // window 18 is never the best + 4873049, // window 19 is unbeaten in [4873049.17, 5706707.88] + 5706708, // window 20 is unbeaten in [5706707.88, 31673814.95] + 0, // window 21 is never the best + 31673815 // window 22 is unbeaten in [31673814.95, inf] + )); + + public BLS12_377G2 ZERO() { + return ZERO; + } + + public BLS12_377G2 ONE() { + return ONE; + } + + public BLS12_377Fr zeroFr() { + return BLS12_377Fr.ZERO; + } + + public BLS12_377Fr oneFr() { + return BLS12_377Fr.ONE; + } + + public ArrayList fixedBaseWindowTable() { + return fixedBaseWindowTable; + } +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java new file mode 100755 index 0000000..647cd40 --- /dev/null +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java @@ -0,0 +1,26 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; + +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq12; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fq6; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377GT; + +public class BLS12_377GTParameters + extends AbstractBLSGTParameters< + BLS12_377Fq, BLS12_377Fq2, BLS12_377Fq6, BLS12_377Fq12, BLS12_377GT, BLS12_377GTParameters> { + + public static final BLS12_377GT ONE = new BLS12_377GT(BLS12_377Fq12.ONE); + + public BLS12_377GT ONE() { + return ONE; + } +} From 24e6b6aab4a2ff14bd7ea2ce88e81e5441d20cad Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Mon, 11 Jan 2021 17:39:26 +0000 Subject: [PATCH 62/94] Started to re-arrange tests --- .../curves/GenericBilinearityTest.java | 39 +++++++++++ .../BLSBilinearityTest.java | 31 +++++++++ .../BNBilinearityTest.java} | 67 +++---------------- .../curves/mock/MockBilinearityTest.java | 26 +++++++ 4 files changed, 105 insertions(+), 58 deletions(-) create mode 100644 src/test/java/algebra/curves/GenericBilinearityTest.java create mode 100644 src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java rename src/test/java/algebra/curves/{BilinearityTest.java => barreto_naehrig/BNBilinearityTest.java} (53%) create mode 100644 src/test/java/algebra/curves/mock/MockBilinearityTest.java diff --git a/src/test/java/algebra/curves/GenericBilinearityTest.java b/src/test/java/algebra/curves/GenericBilinearityTest.java new file mode 100644 index 0000000..98c29b8 --- /dev/null +++ b/src/test/java/algebra/curves/GenericBilinearityTest.java @@ -0,0 +1,39 @@ +package algebra.curves; + +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.fields.AbstractFieldElementExpanded; + +public class GenericBilinearityTest { + public < + G1T extends AbstractG1, + G2T extends AbstractG2, + GTT extends AbstractGT, + PairingT extends AbstractPairing, + FieldT extends AbstractFieldElementExpanded> + void PairingTest( + final G1T P, + final G2T Q, + final GTT gtOne, + final FieldT fieldFactory, + final PairingT pairing) { + final long seed1 = 4; + final long seed2 = 7; + + final GTT one = gtOne; + final FieldT s = fieldFactory.random(seed1 + seed2, null); + + G1T sP = P.mul(s); + G2T sQ = Q.mul(s); + + GTT ans1 = pairing.reducedPairing(sP, Q); + GTT ans2 = pairing.reducedPairing(P, sQ); + GTT ans3 = pairing.reducedPairing(P, Q).mul(s.toBigInteger()); + + assertTrue(ans1.equals(ans2)); + assertTrue(ans2.equals(ans3)); + assertFalse(ans1.equals(one)); + } + +} diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java new file mode 100644 index 0000000..19f6662 --- /dev/null +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -0,0 +1,31 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; +import 
algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377GT; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; + +import org.junit.jupiter.api.Test; +import algebra.curves.GenericBilinearityTest; + +public class BLSBilinearityTest { + @Test + public void BLS12_377Test() { + final BLS12_377G1 g1One = BLS12_377G1Parameters.ONE; + final BLS12_377G2 g2One = BLS12_377G2Parameters.ONE; + final BLS12_377GT gtOne = BLS12_377GTParameters.ONE; + final BLS12_377Fr fieldFactory = new BLS12_377Fr(6); + final BLS12_377Pairing pairing = new BLS12_377Pairing(); + + final BLS12_377G1 P = g1One.mul(fieldFactory.random(5L, null)); + final BLS12_377G2 Q = g2One.mul(fieldFactory.random(6L, null)); + + GenericBilinearityTest gTest = new GenericBilinearityTest(); + gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); + gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + } +} diff --git a/src/test/java/algebra/curves/BilinearityTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java similarity index 53% rename from src/test/java/algebra/curves/BilinearityTest.java rename to src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java index 1aaa8d9..5a51697 100755 --- a/src/test/java/algebra/curves/BilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java @@ -5,10 +5,7 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves; - -import static org.junit.jupiter.api.Assertions.assertFalse; -import static org.junit.jupiter.api.Assertions.assertTrue; +package algebra.curves.barreto_naehrig; import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; @@ -26,46 +23,11 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bGTParameters; -import algebra.curves.mock.*; -import algebra.curves.mock.fake_parameters.FakeG1Parameters; -import algebra.curves.mock.fake_parameters.FakeG2Parameters; -import algebra.curves.mock.fake_parameters.FakeGTParameters; -import algebra.fields.AbstractFieldElementExpanded; -import algebra.fields.Fp; -import algebra.fields.mock.fieldparameters.LargeFpParameters; import org.junit.jupiter.api.Test; -public class BilinearityTest { - private < - G1T extends AbstractG1, - G2T extends AbstractG2, - GTT extends AbstractGT, - PairingT extends AbstractPairing, - FieldT extends AbstractFieldElementExpanded> - void PairingTest( - final G1T P, - final G2T Q, - final GTT gtOne, - final FieldT fieldFactory, - final PairingT pairing) { - final long seed1 = 4; - final long seed2 = 7; - - final GTT one = gtOne; - final FieldT s = fieldFactory.random(seed1 + seed2, null); - - G1T sP = P.mul(s); - G2T sQ = Q.mul(s); - - GTT ans1 = pairing.reducedPairing(sP, Q); - GTT ans2 = pairing.reducedPairing(P, sQ); - GTT ans3 = pairing.reducedPairing(P, Q).mul(s.toBigInteger()); - - assertTrue(ans1.equals(ans2)); - 
assertTrue(ans2.equals(ans3)); - assertFalse(ans1.equals(one)); - } +import algebra.curves.GenericBilinearityTest; +public class BNBilinearityTest { @Test public void BN254aTest() { final BN254aG1 g1One = BN254aG1Parameters.ONE; @@ -77,8 +39,9 @@ public void BN254aTest() { final BN254aG1 P = g1One.mul(fieldFactory.random(5L, null)); final BN254aG2 Q = g2One.mul(fieldFactory.random(6L, null)); - PairingTest(P, Q, gtOne, fieldFactory, pairing); - PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + GenericBilinearityTest gTest = new GenericBilinearityTest(); + gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); + gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); } @Test @@ -92,20 +55,8 @@ public void BN254bTest() { final BN254bG1 P = g1One.mul(fieldFactory.random(5L, null)); final BN254bG2 Q = g2One.mul(fieldFactory.random(6L, null)); - PairingTest(P, Q, gtOne, fieldFactory, pairing); - PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); - } - - @Test - public void FakeTest() { - FakeInitialize.init(); - final FakeG1 g1Factory = new FakeG1Parameters().ONE(); - final FakeG2 g2Factory = new FakeG2Parameters().ONE(); - final FakeGT gTFactory = new FakeGTParameters().ONE(); - final Fp fieldFactory = new LargeFpParameters().ONE(); - - FakePairing pairing = new FakePairing(); - - PairingTest(g1Factory, g2Factory, gTFactory, fieldFactory, pairing); + GenericBilinearityTest gTest = new GenericBilinearityTest(); + gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); + gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); } } diff --git a/src/test/java/algebra/curves/mock/MockBilinearityTest.java b/src/test/java/algebra/curves/mock/MockBilinearityTest.java new file mode 100644 index 0000000..013b35c --- /dev/null +++ b/src/test/java/algebra/curves/mock/MockBilinearityTest.java @@ -0,0 +1,26 @@ +package algebra.curves.mock; + +import algebra.curves.mock.fake_parameters.FakeG1Parameters; +import algebra.curves.mock.fake_parameters.FakeG2Parameters; +import algebra.curves.mock.fake_parameters.FakeGTParameters; +import algebra.fields.Fp; +import algebra.fields.mock.fieldparameters.LargeFpParameters; +import org.junit.jupiter.api.Test; + +import algebra.curves.GenericBilinearityTest; + +public class MockBilinearityTest { + @Test + public void FakeTest() { + FakeInitialize.init(); + final FakeG1 g1Factory = new FakeG1Parameters().ONE(); + final FakeG2 g2Factory = new FakeG2Parameters().ONE(); + final FakeGT gTFactory = new FakeGTParameters().ONE(); + final Fp fieldFactory = new LargeFpParameters().ONE(); + + FakePairing pairing = new FakePairing(); + + GenericBilinearityTest gTest = new GenericBilinearityTest(); + gTest.PairingTest(g1Factory, g2Factory, gTFactory, fieldFactory, pairing); + } +} From 91fcd71104589f32f6b770a357b990d697f71574 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 12 Jan 2021 09:32:58 +0000 Subject: [PATCH 63/94] Refactored fields tests --- .../java/algebra/curves/BNFieldsTest.java | 233 ------------------ .../algebra/curves/GenericFieldsTest.java | 120 +++++++++ .../BLSBilinearityTest.java | 31 +-- .../barreto_lynn_scott/BLSFieldsTest.java | 53 ++++ .../curves/barreto_naehrig/BNFieldsTest.java | 105 ++++++++ 5 files changed, 294 insertions(+), 248 deletions(-) delete mode 100755 src/test/java/algebra/curves/BNFieldsTest.java create mode 100755 src/test/java/algebra/curves/GenericFieldsTest.java create mode 100644 src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java create mode 100755 
src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java diff --git a/src/test/java/algebra/curves/BNFieldsTest.java b/src/test/java/algebra/curves/BNFieldsTest.java deleted file mode 100755 index 9205787..0000000 --- a/src/test/java/algebra/curves/BNFieldsTest.java +++ /dev/null @@ -1,233 +0,0 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - -package algebra.curves; - -import static org.junit.jupiter.api.Assertions.assertFalse; -import static org.junit.jupiter.api.Assertions.assertTrue; - -import algebra.curves.barreto_naehrig.bn254a.BN254aFields.*; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq12Parameters; -import algebra.curves.barreto_naehrig.bn254b.BN254bFields.*; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq12Parameters; -import algebra.fields.*; -import algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; -import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; -import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; -import org.junit.jupiter.api.Test; - -public class BNFieldsTest { - private > void verify( - final FieldT a, final FieldT b) { - final FieldT zero = a.zero(); - assertTrue(zero.equals(zero)); - assertTrue(zero.isZero()); - assertFalse(zero.isOne()); - final FieldT one = a.one(); - assertTrue(one.equals(one)); - assertTrue(one.isOne()); - assertFalse(one.isZero()); - - // FieldT.random() != FieldT.random() - assertTrue(a.random(4L, null).equals(a.random(4L, null))); - assertFalse(a.random(5L, null).equals(a.random(7L, null))); - assertFalse(a.random(null, "zc".getBytes()).equals(a.random(null, "ash".getBytes()))); - - // a == a - assertTrue(a.equals(a)); - // a != b - assertFalse(a.equals(b)); - // a-a = 0 - assertTrue(a.sub(a).equals(zero)); - // a+0 = a - assertTrue(a.add(zero).equals(a)); - // a*0 = 0 - assertTrue(a.mul(zero).equals(zero)); - // a*1 = a - assertTrue(a.mul(one).equals(a)); - // a*a^-1 = 1 - assertTrue(a.mul(a.inverse()).equals(one)); - // a*a = a^2 - assertTrue(a.mul(a).equals(a.square())); - // a*a*a = a^3 - assertTrue(a.mul(a).mul(a).equals(a.pow(3))); - // a-b = -(b-a) - assertTrue(a.sub(b).equals(b.sub(a).negate())); - // (a+b)+a = a+(b+a) - assertTrue((a.add(b)).add(a).equals(a.add(b.add(a)))); - // (a*b)*a = a*(b*a) - assertTrue((a.mul(b)).mul(a).equals(a.mul(b.mul(a)))); - // (a+b)^2 = a^2 + 2ab + b^2 - assertTrue((a.add(b)).square().equals(a.square().add(a.mul(b).add(a.mul(b))).add(b.square()))); - } - - private > void verifyExtended( - final FieldT a) { - final FieldT one = a.one(); - assertTrue(one.equals(one)); - // (w_8)^8 = 1 - assertTrue(a.rootOfUnity(8).pow(8).equals(one)); - } - - private void verifyMulBy024( - final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { - final AbstractFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); - final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); - - final Fp2 c0 = new Fp2(7, 18, Fp2Parameters); - final Fp2 c2 = new Fp2(23, 5, Fp2Parameters); - final Fp2 c4 = new Fp2(192, 73, Fp2Parameters); - - final Fp12_2Over3Over2 naiveResult = - a.mul( - new Fp12_2Over3Over2( - new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), - new 
Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), - Fp12Parameters)); - final Fp12_2Over3Over2 mulBy024Result = a.mulBy024(c0, c2, c4); - - assertTrue(naiveResult.equals(mulBy024Result)); - } - - private void verifyFrobeniusMap( - final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { - assert (a.FrobeniusMap(0).equals(a)); - Fp12_2Over3Over2 a_q = a.pow(Fp12Parameters.FpParameters().modulus()); - for (int power = 1; power < 10; ++power) { - final Fp12_2Over3Over2 a_qi = a.FrobeniusMap(power); - assert (a_qi.equals(a_q)); - - a_q = a_q.pow(Fp12Parameters.FpParameters().modulus()); - } - } - - @Test - public void BN254aFrTest() { - final BN254aFr a = new BN254aFr("6"); - final BN254aFr b = new BN254aFr("13"); - - verify(a, b); - verifyExtended(a); - } - - @Test - public void BN254aFqTest() { - final BN254aFq a = new BN254aFq("6"); - final BN254aFq b = new BN254aFq("13"); - - verify(a, b); - verifyExtended(a); - } - - @Test - public void BN254aFq2Test() { - final BN254aFq c0 = new BN254aFq("6"); - final BN254aFq c1 = new BN254aFq("13"); - - final BN254aFq2 a = new BN254aFq2(c0, c1); - final BN254aFq2 b = new BN254aFq2(c1, c0); - - verify(a, b); - } - - @Test - public void BN254aFq6Test() { - final BN254aFq b0 = new BN254aFq("6"); - final BN254aFq b1 = new BN254aFq("13"); - - final BN254aFq2 c0 = new BN254aFq2(b0, b1); - final BN254aFq2 c1 = new BN254aFq2(b1, b0); - - final BN254aFq6 a = new BN254aFq6(c0, c1, c0); - final BN254aFq6 b = new BN254aFq6(c1, c0, c1); - - verify(a, b); - } - - @Test - public void BN254aFq12Test() { - final BN254aFq12Parameters Fq12Parameters = new BN254aFq12Parameters(); - - final BN254aFq b0 = new BN254aFq("6"); - final BN254aFq b1 = new BN254aFq("13"); - - final BN254aFq2 c0 = new BN254aFq2(b0, b1); - final BN254aFq2 c1 = new BN254aFq2(b1, b0); - - final BN254aFq6 d0 = new BN254aFq6(c0, c1, c0); - final BN254aFq6 d1 = new BN254aFq6(c1, c0, c1); - - final BN254aFq12 a = new BN254aFq12(d0, d1); - final BN254aFq12 b = new BN254aFq12(d1, d0); - - verify(a, b); - verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); - } - - @Test - public void BN254bFrTest() { - final BN254bFr a = new BN254bFr("6"); - final BN254bFr b = new BN254bFr("13"); - - verify(a, b); - verifyExtended(a); - } - - @Test - public void BN254bFqTest() { - final BN254bFq a = new BN254bFq("6"); - final BN254bFq b = new BN254bFq("13"); - - verify(a, b); - verifyExtended(a); - } - - @Test - public void BN254bFq2Test() { - final BN254bFq c0 = new BN254bFq("6"); - final BN254bFq c1 = new BN254bFq("13"); - - final BN254bFq2 a = new BN254bFq2(c0, c1); - final BN254bFq2 b = new BN254bFq2(c1, c0); - - verify(a, b); - } - - @Test - public void BN254bFq6Test() { - final BN254bFq b0 = new BN254bFq("6"); - final BN254bFq b1 = new BN254bFq("13"); - - final BN254bFq2 c0 = new BN254bFq2(b0, b1); - final BN254bFq2 c1 = new BN254bFq2(b1, b0); - - final BN254bFq6 a = new BN254bFq6(c0, c1, c0); - final BN254bFq6 b = new BN254bFq6(c1, c0, c1); - - verify(a, b); - } - - @Test - public void BN254bFq12Test() { - final BN254bFq12Parameters Fq12Parameters = new BN254bFq12Parameters(); - - final BN254bFq b0 = new BN254bFq("6"); - final BN254bFq b1 = new BN254bFq("13"); - - final BN254bFq2 c0 = new BN254bFq2(b0, b1); - final BN254bFq2 c1 = new BN254bFq2(b1, b0); - - final BN254bFq6 d0 = new BN254bFq6(c0, c1, c0); - final BN254bFq6 d1 = new BN254bFq6(c1, c0, c1); - - final BN254bFq12 a = new BN254bFq12(d0, d1); - final BN254bFq12 b = new BN254bFq12(d1, d0); - - 
verify(a, b); - verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); - } -} diff --git a/src/test/java/algebra/curves/GenericFieldsTest.java b/src/test/java/algebra/curves/GenericFieldsTest.java new file mode 100755 index 0000000..0df6f2f --- /dev/null +++ b/src/test/java/algebra/curves/GenericFieldsTest.java @@ -0,0 +1,120 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). + * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves; + +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.fields.*; +import algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; +import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; +import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; + +public class GenericFieldsTest { + public > void testFieldOperations( + final FieldT fieldFactory) { + final FieldT zero = fieldFactory.zero(); + assertTrue(zero.equals(zero)); + assertTrue(zero.isZero()); + assertFalse(zero.isOne()); + + final FieldT one = fieldFactory.one(); + assertTrue(one.equals(one)); + assertTrue(one.isOne()); + assertFalse(one.isZero()); + + // FieldT.random() != FieldT.random() + assertTrue(fieldFactory.random(4L, null).equals(fieldFactory.random(4L, null))); + assertFalse(fieldFactory.random(5L, null).equals(fieldFactory.random(7L, null))); + assertFalse(fieldFactory.random(null, "clear".getBytes()).equals(fieldFactory.random(null, "matics".getBytes()))); + + // Select 3 distinct field elements for the test + final FieldT a = fieldFactory.random(4L, null); + assertFalse(a.isZero()); + assertFalse(a.isOne()); + final FieldT b = fieldFactory.random(7L, null); + assertFalse(b.isZero()); + assertFalse(b.isOne()); + final FieldT c = fieldFactory.random(12L, null); + assertFalse(c.isZero()); + assertFalse(c.isOne()); + // Make sure the elements are distinct + assertFalse(a.equals(b)); + assertFalse(a.equals(c)); + assertFalse(b.equals(c)); + + // a == a + assertTrue(a.equals(a)); + // a != b + assertFalse(a.equals(b)); + // a-a = 0 + assertTrue(a.sub(a).equals(zero)); + // a+0 = a + assertTrue(a.add(zero).equals(a)); + // a*0 = 0 + assertTrue(a.mul(zero).equals(zero)); + // a*1 = a + assertTrue(a.mul(one).equals(a)); + // a*a^-1 = 1 + assertTrue(a.mul(a.inverse()).equals(one)); + // a*a = a^2 + assertTrue(a.mul(a).equals(a.square())); + // a*a*a = a^3 + assertTrue(a.mul(a).mul(a).equals(a.pow(3))); + // a-b = -(b-a) + assertTrue(a.sub(b).equals(b.sub(a).negate())); + // (a+b)+c = a+(b+c) + assertTrue((a.add(b)).add(c).equals(a.add(b.add(c)))); + // (a*b)*c = a*(b*c) + assertTrue((a.mul(b)).mul(c).equals(a.mul(b.mul(c)))); + // (a+b)*c = a*c + b*c + assertTrue((a.add(b)).mul(c).equals(a.mul(c).add(b.mul(c)))); + // (a+b)^2 = a^2 + 2ab + b^2 + assertTrue((a.add(b)).square().equals(a.square().add(a.mul(b).add(a.mul(b))).add(b.square()))); + } + + public > void testFieldExpandedOperations( + final FieldT a) { + final FieldT one = a.one(); + assertTrue(one.equals(one)); + // (w_8)^8 = 1 + assertTrue(a.rootOfUnity(8).pow(8).equals(one)); + } + + public void verifyMulBy024( + final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { + final AbstractFp2Parameters Fp2Parameters = 
Fp12Parameters.Fp2Parameters(); + final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); + + final Fp2 c0 = new Fp2(7, 18, Fp2Parameters); + final Fp2 c2 = new Fp2(23, 5, Fp2Parameters); + final Fp2 c4 = new Fp2(192, 73, Fp2Parameters); + + final Fp12_2Over3Over2 naiveResult = + a.mul( + new Fp12_2Over3Over2( + new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), + new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), + Fp12Parameters)); + final Fp12_2Over3Over2 mulBy024Result = a.mulBy024(c0, c2, c4); + + assertTrue(naiveResult.equals(mulBy024Result)); + } + + public void verifyFrobeniusMap( + final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { + assert (a.FrobeniusMap(0).equals(a)); + Fp12_2Over3Over2 a_q = a.pow(Fp12Parameters.FpParameters().modulus()); + for (int power = 1; power < 10; ++power) { + final Fp12_2Over3Over2 a_qi = a.FrobeniusMap(power); + assert (a_qi.equals(a_q)); + + a_q = a_q.pow(Fp12Parameters.FpParameters().modulus()); + } + } +} diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java index 19f6662..16aaca9 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -13,19 +13,20 @@ import algebra.curves.GenericBilinearityTest; public class BLSBilinearityTest { - @Test - public void BLS12_377Test() { - final BLS12_377G1 g1One = BLS12_377G1Parameters.ONE; - final BLS12_377G2 g2One = BLS12_377G2Parameters.ONE; - final BLS12_377GT gtOne = BLS12_377GTParameters.ONE; - final BLS12_377Fr fieldFactory = new BLS12_377Fr(6); - final BLS12_377Pairing pairing = new BLS12_377Pairing(); - - final BLS12_377G1 P = g1One.mul(fieldFactory.random(5L, null)); - final BLS12_377G2 Q = g2One.mul(fieldFactory.random(6L, null)); - - GenericBilinearityTest gTest = new GenericBilinearityTest(); - gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); - gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); - } + GenericBilinearityTest gTest = new GenericBilinearityTest(); + + @Test + public void BLS12_377Test() { + final BLS12_377G1 g1One = BLS12_377G1Parameters.ONE; + final BLS12_377G2 g2One = BLS12_377G2Parameters.ONE; + final BLS12_377GT gtOne = BLS12_377GTParameters.ONE; + final BLS12_377Fr fieldFactory = new BLS12_377Fr(6); + final BLS12_377Pairing pairing = new BLS12_377Pairing(); + + final BLS12_377G1 P = g1One.mul(fieldFactory.random(5L, null)); + final BLS12_377G2 Q = g2One.mul(fieldFactory.random(6L, null)); + + gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); + gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + } } diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java new file mode 100644 index 0000000..2ce4d10 --- /dev/null +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java @@ -0,0 +1,53 @@ +package algebra.curves.barreto_lynn_scott; + +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FqParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq2Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq6Parameters; +import 
algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq12Parameters; + +import org.junit.jupiter.api.Test; +import algebra.curves.GenericFieldsTest; + +public class BLSFieldsTest { + final GenericFieldsTest gTest = new GenericFieldsTest(); + + // BLS12_377 test cases + @Test + public void BLS12_377FqTest() { + final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); + + gTest.testFieldOperations(FqParameters.ONE()); + gTest.testFieldExpandedOperations(FqParameters.ONE()); + } + + @Test + public void BLS12_377FrTest() { + final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); + + gTest.testFieldOperations(FrParameters.ONE()); + gTest.testFieldExpandedOperations(FrParameters.ONE()); + } + + @Test + public void BLS12_377Fq2Test() { + final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); + + gTest.testFieldOperations(Fq2Parameters.ONE()); + } + + @Test + public void BLS12_377Fq6Test() { + final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); + + gTest.testFieldOperations(Fq6Parameters.ONE()); + } + + @Test + public void BLS12_377Fq12Test() { + final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); + + gTest.testFieldOperations(Fq12Parameters.ONE()); + gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + } +} diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java new file mode 100755 index 0000000..3e72454 --- /dev/null +++ b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java @@ -0,0 +1,105 @@ +/* @file + ***************************************************************************** + * @author This file is part of zkspark, developed by SCIPR Lab + * and contributors (see AUTHORS). 
+ * @copyright MIT license (see LICENSE file) + *****************************************************************************/ + +package algebra.curves.barreto_naehrig; + +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFqParameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFrParameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq2Parameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq6Parameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq12Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFqParameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFrParameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq2Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq6Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq12Parameters; + +import org.junit.jupiter.api.Test; +import algebra.curves.GenericFieldsTest; + +public class BNFieldsTest { + final GenericFieldsTest gTest = new GenericFieldsTest(); + + // BN254a test cases + @Test + public void BN254aFqTest() { + final BN254aFqParameters FqParameters = new BN254aFqParameters(); + + gTest.testFieldOperations(FqParameters.ONE()); + gTest.testFieldExpandedOperations(FqParameters.ONE()); + } + + @Test + public void BN254aFrTest() { + final BN254aFrParameters FrParameters = new BN254aFrParameters(); + + gTest.testFieldOperations(FrParameters.ONE()); + gTest.testFieldExpandedOperations(FrParameters.ONE()); + } + + @Test + public void BN254aFq2Test() { + final BN254aFq2Parameters Fq2Parameters = new BN254aFq2Parameters(); + + gTest.testFieldOperations(Fq2Parameters.ONE()); + } + + @Test + public void BN254aFq6Test() { + final BN254aFq6Parameters Fq6Parameters = new BN254aFq6Parameters(); + + gTest.testFieldOperations(Fq6Parameters.ONE()); + } + + @Test + public void BN254aFq12Test() { + final BN254aFq12Parameters Fq12Parameters = new BN254aFq12Parameters(); + + gTest.testFieldOperations(Fq12Parameters.ONE()); + gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + } + + // BN254b test cases + + @Test + public void BN254bFqTest() { + final BN254bFqParameters FqParameters = new BN254bFqParameters(); + + gTest.testFieldOperations(FqParameters.ONE()); + gTest.testFieldExpandedOperations(FqParameters.ONE()); + } + + @Test + public void BN254bFrTest() { + final BN254bFrParameters FrParameters = new BN254bFrParameters(); + + gTest.testFieldOperations(FrParameters.ONE()); + gTest.testFieldExpandedOperations(FrParameters.ONE()); + } + + @Test + public void BN254bFq2Test() { + final BN254bFq2Parameters Fq2Parameters = new BN254bFq2Parameters(); + + gTest.testFieldOperations(Fq2Parameters.ONE()); + } + + @Test + public void BN254bFq6Test() { + final BN254bFq6Parameters Fq6Parameters = new BN254bFq6Parameters(); + + gTest.testFieldOperations(Fq6Parameters.ONE()); + } + + @Test + public void BN254bFq12Test() { + final BN254bFq12Parameters Fq12Parameters = new BN254bFq12Parameters(); + + gTest.testFieldOperations(Fq12Parameters.ONE()); + gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + } +} From a16e2e4592ed62081b3c3b0505578c1c627132f7 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 12 Jan 2021 09:58:33 +0000 Subject: [PATCH 64/94] Used inheritance between test classes for clarity --- src/test/java/algebra/curves/CurvesTest.java | 111 
------------------ .../curves/GenericBilinearityTest.java | 3 +- .../algebra/curves/GenericCurvesTest.java | 88 ++++++++++++++ .../algebra/curves/GenericFieldsTest.java | 15 +-- .../BLSBilinearityTest.java | 8 +- .../barreto_lynn_scott/BLSFieldsTest.java | 20 ++-- .../barreto_naehrig/BNBilinearityTest.java | 19 +-- .../curves/barreto_naehrig/BNCurvesTest.java | 27 +++++ .../curves/barreto_naehrig/BNFieldsTest.java | 42 +++---- .../curves/mock/MockBilinearityTest.java | 5 +- .../algebra/curves/mock/MockCurvesTest.java | 19 +++ 11 files changed, 175 insertions(+), 182 deletions(-) delete mode 100755 src/test/java/algebra/curves/CurvesTest.java create mode 100755 src/test/java/algebra/curves/GenericCurvesTest.java create mode 100755 src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java create mode 100755 src/test/java/algebra/curves/mock/MockCurvesTest.java diff --git a/src/test/java/algebra/curves/CurvesTest.java b/src/test/java/algebra/curves/CurvesTest.java deleted file mode 100755 index 7fea749..0000000 --- a/src/test/java/algebra/curves/CurvesTest.java +++ /dev/null @@ -1,111 +0,0 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - -package algebra.curves; - -import static org.junit.jupiter.api.Assertions.assertFalse; -import static org.junit.jupiter.api.Assertions.assertTrue; - -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.mock.FakeG1; -import algebra.curves.mock.FakeG2; -import algebra.curves.mock.FakeInitialize; -import algebra.curves.mock.fake_parameters.FakeG1Parameters; -import algebra.curves.mock.fake_parameters.FakeG2Parameters; -import algebra.groups.AbstractGroup; -import java.math.BigInteger; -import org.junit.jupiter.api.Test; - -public class CurvesTest { - private > void GroupTest(final GroupT groupFactory) { - final BigInteger rand1 = new BigInteger("76749407"); - final BigInteger rand2 = new BigInteger("44410867"); - final BigInteger randsum = new BigInteger("121160274"); - - final GroupT zero = groupFactory.zero(); - assertTrue(zero.equals(zero)); - final GroupT one = groupFactory.one(); - assertTrue(one.equals(one)); - final GroupT two = one.mul(new BigInteger("2")); - assertTrue(two.equals(two)); - final GroupT three = one.mul(new BigInteger("3")); - final GroupT four = one.mul(new BigInteger("4")); - final GroupT five = one.mul(new BigInteger("5")); - assertTrue(two.add(five).equals(three.add(four))); - - final long seed1 = 4; - final long seed2 = 7; - final byte[] secureSeed1 = "xz8f5j".getBytes(); - final byte[] secureSeed2 = "f5gh9c".getBytes(); - assertTrue(groupFactory.random(seed1, null).equals(groupFactory.random(seed1, null))); - assertFalse(groupFactory.random(seed1, null).equals(groupFactory.random(seed2, null))); - assertFalse( - groupFactory.random(null, secureSeed1).equals(groupFactory.random(null, secureSeed1))); - assertFalse( - groupFactory.random(null, secureSeed1).equals(groupFactory.random(null, secureSeed2))); - - final GroupT a = groupFactory.random(seed1, null); - 
final GroupT b = groupFactory.random(seed2, null); - - assertFalse(one.equals(zero)); - assertFalse(a.equals(zero)); - assertFalse(a.equals(one)); - - assertFalse(b.equals(zero)); - assertFalse(b.equals(one)); - - assertTrue(a.dbl().equals(a.add(a))); - assertTrue(b.dbl().equals(b.add(b))); - assertTrue(one.add(two).equals(three)); - assertTrue(two.add(one).equals(three)); - assertTrue(a.add(b).equals(b.add(a))); - assertTrue(a.sub(a).equals(zero)); - assertTrue(a.sub(b).equals(a.add(b.negate()))); - assertTrue(a.sub(b).equals(b.negate().add(a))); - - // handle special cases - assertTrue(zero.add(a.negate()).equals(a.negate())); - assertTrue(zero.sub(a).equals(a.negate())); - assertTrue(a.sub(zero).equals(a)); - assertTrue(a.add(zero).equals(a)); - assertTrue(zero.add(a).equals(a)); - - assertTrue(a.add(b).dbl().equals(a.add(b).add(b.add(a)))); - assertTrue(a.add(b).mul(new BigInteger("2")).equals(a.add(b).add(b.add(a)))); - - assertTrue(a.mul(rand1).add(a.mul(rand2)).equals(a.mul(randsum))); - } - - @Test - public void BN254aTest() { - // Test G1 of BN254a - GroupTest(BN254aG1Parameters.ONE); - // Test G2 of BN254a - GroupTest(BN254aG2Parameters.ONE); - } - - @Test - public void BN254bTest() { - // Test G1 of BN254b - GroupTest(BN254bG1Parameters.ONE); - // Test G2 of BN254b - GroupTest(BN254bG2Parameters.ONE); - } - - @Test - public void FakeTest() { - FakeInitialize.init(); - final FakeG1 g1Factory = new FakeG1Parameters().ONE(); - final FakeG2 g2Factory = new FakeG2Parameters().ONE(); - - GroupTest(g1Factory); - GroupTest(g2Factory); - } -} diff --git a/src/test/java/algebra/curves/GenericBilinearityTest.java b/src/test/java/algebra/curves/GenericBilinearityTest.java index 98c29b8..96a8e48 100644 --- a/src/test/java/algebra/curves/GenericBilinearityTest.java +++ b/src/test/java/algebra/curves/GenericBilinearityTest.java @@ -6,7 +6,7 @@ import algebra.fields.AbstractFieldElementExpanded; public class GenericBilinearityTest { - public < + protected < G1T extends AbstractG1, G2T extends AbstractG2, GTT extends AbstractGT, @@ -35,5 +35,4 @@ void PairingTest( assertTrue(ans2.equals(ans3)); assertFalse(ans1.equals(one)); } - } diff --git a/src/test/java/algebra/curves/GenericCurvesTest.java b/src/test/java/algebra/curves/GenericCurvesTest.java new file mode 100755 index 0000000..54a6ccf --- /dev/null +++ b/src/test/java/algebra/curves/GenericCurvesTest.java @@ -0,0 +1,88 @@ +package algebra.curves; + +import static org.junit.jupiter.api.Assertions.assertFalse; +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.groups.AbstractGroup; +import java.math.BigInteger; +import org.junit.jupiter.api.Test; + +public class GenericCurvesTest { + protected > void GroupTest(final GroupT groupFactory) { + final BigInteger rand1 = new BigInteger("76749407"); + final BigInteger rand2 = new BigInteger("44410867"); + final BigInteger randsum = new BigInteger("121160274"); + + final GroupT zero = groupFactory.zero(); + assertTrue(zero.equals(zero)); + + final GroupT one = groupFactory.one(); + assertTrue(one.equals(one)); + + final GroupT two = one.mul(new BigInteger("2")); + assertTrue(two.equals(two)); + + final GroupT three = one.mul(new BigInteger("3")); + final GroupT four = one.mul(new BigInteger("4")); + final GroupT five = one.mul(new BigInteger("5")); + assertTrue(two.add(five).equals(three.add(four))); + + final long seed1 = 4; + final long seed2 = 7; + assertTrue(groupFactory.random(seed1, null).equals(groupFactory.random(seed1, null))); + 
assertFalse(groupFactory.random(seed1, null).equals(groupFactory.random(seed2, null))); + final byte[] secureSeed1 = "xz8f5j".getBytes(); + final byte[] secureSeed2 = "f5gh9c".getBytes(); + assertFalse( + groupFactory.random(null, secureSeed1).equals(groupFactory.random(null, secureSeed1))); + assertFalse( + groupFactory.random(null, secureSeed1).equals(groupFactory.random(null, secureSeed2))); + + final GroupT A = groupFactory.random(seed1, null); + final GroupT B = groupFactory.random(seed2, null); + + assertFalse(one.equals(zero)); + assertFalse(A.equals(zero)); + assertFalse(A.equals(one)); + + assertFalse(B.equals(zero)); + assertFalse(B.equals(one)); + + // Point doubling + assertTrue(A.dbl().equals(A.add(A))); + assertTrue(B.dbl().equals(B.add(B))); + // Addition + assertTrue(one.add(two).equals(three)); + assertTrue(two.add(one).equals(three)); + assertTrue(A.add(B).equals(B.add(A))); + // Subtraction (addition by inverses) + assertTrue(A.sub(A).equals(zero)); + assertTrue(A.sub(B).equals(A.add(B.negate()))); + assertTrue(A.sub(B).equals(B.negate().add(A))); + // Handle special cases + assertTrue(zero.add(A.negate()).equals(A.negate())); + assertTrue(zero.sub(A).equals(A.negate())); + assertTrue(A.sub(zero).equals(A)); + assertTrue(A.add(zero).equals(A)); + assertTrue(zero.add(A).equals(A)); + + // (A+B)*2 = (A+B) + (A+B) + assertTrue(A.add(B).dbl().equals(A.add(B).add(B.add(A)))); + assertTrue(A.add(B).mul(new BigInteger("2")).equals(A.add(B).add(B.add(A)))); + + // A*s + A*r = A*(r+s) + assertTrue(A.mul(rand1).add(A.mul(rand2)).equals(A.mul(randsum))); + } + + /* + @Test + public void FakeTest() { + FakeInitialize.init(); + final FakeG1 g1Factory = new FakeG1Parameters().ONE(); + final FakeG2 g2Factory = new FakeG2Parameters().ONE(); + + GroupTest(g1Factory); + GroupTest(g2Factory); + } + */ +} diff --git a/src/test/java/algebra/curves/GenericFieldsTest.java b/src/test/java/algebra/curves/GenericFieldsTest.java index 0df6f2f..0c50308 100755 --- a/src/test/java/algebra/curves/GenericFieldsTest.java +++ b/src/test/java/algebra/curves/GenericFieldsTest.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves; import static org.junit.jupiter.api.Assertions.assertFalse; @@ -16,7 +9,7 @@ import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; public class GenericFieldsTest { - public > void testFieldOperations( + protected > void FieldTest( final FieldT fieldFactory) { final FieldT zero = fieldFactory.zero(); assertTrue(zero.equals(zero)); @@ -78,7 +71,7 @@ public > void testFieldOperations( assertTrue((a.add(b)).square().equals(a.square().add(a.mul(b).add(a.mul(b))).add(b.square()))); } - public > void testFieldExpandedOperations( + protected > void FieldExpandedTest( final FieldT a) { final FieldT one = a.one(); assertTrue(one.equals(one)); @@ -86,7 +79,7 @@ public > void testFieldExpan assertTrue(a.rootOfUnity(8).pow(8).equals(one)); } - public void verifyMulBy024( + protected void verifyMulBy024( final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { final AbstractFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); @@ -106,7 +99,7 @@ public void verifyMulBy024( assertTrue(naiveResult.equals(mulBy024Result)); } - public void verifyFrobeniusMap( + protected void FrobeniusMapTest( final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { assert (a.FrobeniusMap(0).equals(a)); Fp12_2Over3Over2 a_q = a.pow(Fp12Parameters.FpParameters().modulus()); diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java index 16aaca9..8d5e506 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -12,9 +12,7 @@ import org.junit.jupiter.api.Test; import algebra.curves.GenericBilinearityTest; -public class BLSBilinearityTest { - GenericBilinearityTest gTest = new GenericBilinearityTest(); - +public class BLSBilinearityTest extends GenericBilinearityTest { @Test public void BLS12_377Test() { final BLS12_377G1 g1One = BLS12_377G1Parameters.ONE; @@ -26,7 +24,7 @@ public void BLS12_377Test() { final BLS12_377G1 P = g1One.mul(fieldFactory.random(5L, null)); final BLS12_377G2 Q = g2One.mul(fieldFactory.random(6L, null)); - gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); - gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, gtOne, fieldFactory, pairing); + PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); } } diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java index 2ce4d10..6e0105c 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java @@ -9,45 +9,43 @@ import org.junit.jupiter.api.Test; import algebra.curves.GenericFieldsTest; -public class BLSFieldsTest { - final GenericFieldsTest gTest = new GenericFieldsTest(); - +public class BLSFieldsTest extends GenericFieldsTest { // BLS12_377 test cases @Test public void BLS12_377FqTest() { final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); - gTest.testFieldOperations(FqParameters.ONE()); - gTest.testFieldExpandedOperations(FqParameters.ONE()); + FieldTest(FqParameters.ONE()); + 
FieldExpandedTest(FqParameters.ONE()); } @Test public void BLS12_377FrTest() { final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); - gTest.testFieldOperations(FrParameters.ONE()); - gTest.testFieldExpandedOperations(FrParameters.ONE()); + FieldTest(FrParameters.ONE()); + FieldExpandedTest(FrParameters.ONE()); } @Test public void BLS12_377Fq2Test() { final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); - gTest.testFieldOperations(Fq2Parameters.ONE()); + FieldTest(Fq2Parameters.ONE()); } @Test public void BLS12_377Fq6Test() { final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); - gTest.testFieldOperations(Fq6Parameters.ONE()); + FieldTest(Fq6Parameters.ONE()); } @Test public void BLS12_377Fq12Test() { final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); - gTest.testFieldOperations(Fq12Parameters.ONE()); - gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + FieldTest(Fq12Parameters.ONE()); + FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); } } diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java index 5a51697..023e259 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_naehrig; import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; @@ -27,7 +20,7 @@ import algebra.curves.GenericBilinearityTest; -public class BNBilinearityTest { +public class BNBilinearityTest extends GenericBilinearityTest { @Test public void BN254aTest() { final BN254aG1 g1One = BN254aG1Parameters.ONE; @@ -39,9 +32,8 @@ public void BN254aTest() { final BN254aG1 P = g1One.mul(fieldFactory.random(5L, null)); final BN254aG2 Q = g2One.mul(fieldFactory.random(6L, null)); - GenericBilinearityTest gTest = new GenericBilinearityTest(); - gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); - gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, gtOne, fieldFactory, pairing); + PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); } @Test @@ -55,8 +47,7 @@ public void BN254bTest() { final BN254bG1 P = g1One.mul(fieldFactory.random(5L, null)); final BN254bG2 Q = g2One.mul(fieldFactory.random(6L, null)); - GenericBilinearityTest gTest = new GenericBilinearityTest(); - gTest.PairingTest(P, Q, gtOne, fieldFactory, pairing); - gTest.PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, gtOne, fieldFactory, pairing); + PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); } } diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java new file mode 100755 index 0000000..c8f55ad --- /dev/null +++ b/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java @@ -0,0 +1,27 @@ +package algebra.curves.barreto_naehrig; + +import algebra.curves.GenericCurvesTest; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; +import 
algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; + +import org.junit.jupiter.api.Test; + +public class BNCurvesTest extends GenericCurvesTest { + @Test + public void BN254aTest() { + // Test G1 of BN254a + GroupTest(BN254aG1Parameters.ONE); + // Test G2 of BN254a + GroupTest(BN254aG2Parameters.ONE); + } + + @Test + public void BN254bTest() { + // Test G1 of BN254b + GroupTest(BN254bG1Parameters.ONE); + // Test G2 of BN254b + GroupTest(BN254bG2Parameters.ONE); + } +} diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java index 3e72454..26b38da 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_naehrig; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFqParameters; @@ -21,46 +14,45 @@ import org.junit.jupiter.api.Test; import algebra.curves.GenericFieldsTest; -public class BNFieldsTest { - final GenericFieldsTest gTest = new GenericFieldsTest(); +public class BNFieldsTest extends GenericFieldsTest { // BN254a test cases @Test public void BN254aFqTest() { final BN254aFqParameters FqParameters = new BN254aFqParameters(); - gTest.testFieldOperations(FqParameters.ONE()); - gTest.testFieldExpandedOperations(FqParameters.ONE()); + FieldTest(FqParameters.ONE()); + FieldExpandedTest(FqParameters.ONE()); } @Test public void BN254aFrTest() { final BN254aFrParameters FrParameters = new BN254aFrParameters(); - gTest.testFieldOperations(FrParameters.ONE()); - gTest.testFieldExpandedOperations(FrParameters.ONE()); + FieldTest(FrParameters.ONE()); + FieldExpandedTest(FrParameters.ONE()); } @Test public void BN254aFq2Test() { final BN254aFq2Parameters Fq2Parameters = new BN254aFq2Parameters(); - gTest.testFieldOperations(Fq2Parameters.ONE()); + FieldTest(Fq2Parameters.ONE()); } @Test public void BN254aFq6Test() { final BN254aFq6Parameters Fq6Parameters = new BN254aFq6Parameters(); - gTest.testFieldOperations(Fq6Parameters.ONE()); + FieldTest(Fq6Parameters.ONE()); } @Test public void BN254aFq12Test() { final BN254aFq12Parameters Fq12Parameters = new BN254aFq12Parameters(); - gTest.testFieldOperations(Fq12Parameters.ONE()); - gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + FieldTest(Fq12Parameters.ONE()); + FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); } // BN254b test cases @@ -69,37 +61,37 @@ public void BN254aFq12Test() { public void BN254bFqTest() { final BN254bFqParameters FqParameters = new BN254bFqParameters(); - gTest.testFieldOperations(FqParameters.ONE()); - gTest.testFieldExpandedOperations(FqParameters.ONE()); + FieldTest(FqParameters.ONE()); + FieldExpandedTest(FqParameters.ONE()); } @Test public void BN254bFrTest() { final BN254bFrParameters FrParameters = new BN254bFrParameters(); - gTest.testFieldOperations(FrParameters.ONE()); - gTest.testFieldExpandedOperations(FrParameters.ONE()); + FieldTest(FrParameters.ONE()); + 
FieldExpandedTest(FrParameters.ONE()); } @Test public void BN254bFq2Test() { final BN254bFq2Parameters Fq2Parameters = new BN254bFq2Parameters(); - gTest.testFieldOperations(Fq2Parameters.ONE()); + FieldTest(Fq2Parameters.ONE()); } @Test public void BN254bFq6Test() { final BN254bFq6Parameters Fq6Parameters = new BN254bFq6Parameters(); - gTest.testFieldOperations(Fq6Parameters.ONE()); + FieldTest(Fq6Parameters.ONE()); } @Test public void BN254bFq12Test() { final BN254bFq12Parameters Fq12Parameters = new BN254bFq12Parameters(); - gTest.testFieldOperations(Fq12Parameters.ONE()); - gTest.verifyFrobeniusMap(Fq12Parameters.ONE(), Fq12Parameters); + FieldTest(Fq12Parameters.ONE()); + FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); } } diff --git a/src/test/java/algebra/curves/mock/MockBilinearityTest.java b/src/test/java/algebra/curves/mock/MockBilinearityTest.java index 013b35c..0e7406c 100644 --- a/src/test/java/algebra/curves/mock/MockBilinearityTest.java +++ b/src/test/java/algebra/curves/mock/MockBilinearityTest.java @@ -9,7 +9,7 @@ import algebra.curves.GenericBilinearityTest; -public class MockBilinearityTest { +public class MockBilinearityTest extends GenericBilinearityTest { @Test public void FakeTest() { FakeInitialize.init(); @@ -20,7 +20,6 @@ public void FakeTest() { FakePairing pairing = new FakePairing(); - GenericBilinearityTest gTest = new GenericBilinearityTest(); - gTest.PairingTest(g1Factory, g2Factory, gTFactory, fieldFactory, pairing); + PairingTest(g1Factory, g2Factory, gTFactory, fieldFactory, pairing); } } diff --git a/src/test/java/algebra/curves/mock/MockCurvesTest.java b/src/test/java/algebra/curves/mock/MockCurvesTest.java new file mode 100755 index 0000000..ab89053 --- /dev/null +++ b/src/test/java/algebra/curves/mock/MockCurvesTest.java @@ -0,0 +1,19 @@ +package algebra.curves.mock; + +import algebra.curves.GenericCurvesTest; +import algebra.curves.mock.fake_parameters.FakeG1Parameters; +import algebra.curves.mock.fake_parameters.FakeG2Parameters; + +import org.junit.jupiter.api.Test; + +public class MockCurvesTest extends GenericCurvesTest { + @Test + public void FakeTest() { + FakeInitialize.init(); + final FakeG1 g1Factory = new FakeG1Parameters().ONE(); + final FakeG2 g2Factory = new FakeG2Parameters().ONE(); + + GroupTest(g1Factory); + GroupTest(g2Factory); + } +} From 8cc25b329e17eb428b5fd071efc0974bfe3d4796 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 12 Jan 2021 10:00:03 +0000 Subject: [PATCH 65/94] Moved BNFinalExp test to the BN test folder --- .../{ => barreto_naehrig}/BNFinalExponentiationTest.java | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) rename src/test/java/algebra/curves/{ => barreto_naehrig}/BNFinalExponentiationTest.java (98%) diff --git a/src/test/java/algebra/curves/BNFinalExponentiationTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFinalExponentiationTest.java similarity index 98% rename from src/test/java/algebra/curves/BNFinalExponentiationTest.java rename to src/test/java/algebra/curves/barreto_naehrig/BNFinalExponentiationTest.java index e271362..deb8da4 100755 --- a/src/test/java/algebra/curves/BNFinalExponentiationTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNFinalExponentiationTest.java @@ -5,11 +5,10 @@ * @copyright MIT license (see LICENSE file) *****************************************************************************/ -package algebra.curves; +package algebra.curves.barreto_naehrig; import static org.junit.jupiter.api.Assertions.assertTrue; -import 
algebra.curves.barreto_naehrig.*; import algebra.curves.barreto_naehrig.BNFields.*; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; From 09a05e9cdfcfc0acfe9c1a05aa9c52be5ce2e12f Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 12 Jan 2021 12:40:06 +0000 Subject: [PATCH 66/94] Cleaned up tests --- .../curves/GenericBilinearityTest.java | 4 ++-- .../algebra/curves/GenericCurvesTest.java | 13 ------------ .../BLSBilinearityTest.java | 20 ++++++++++--------- src/test/java/algebra/fields/FieldsTest.java | 2 ++ .../groups/AdditiveIntegerGroupTest.java | 2 ++ 5 files changed, 17 insertions(+), 24 deletions(-) diff --git a/src/test/java/algebra/curves/GenericBilinearityTest.java b/src/test/java/algebra/curves/GenericBilinearityTest.java index 96a8e48..3603e1e 100644 --- a/src/test/java/algebra/curves/GenericBilinearityTest.java +++ b/src/test/java/algebra/curves/GenericBilinearityTest.java @@ -15,13 +15,13 @@ public class GenericBilinearityTest { void PairingTest( final G1T P, final G2T Q, - final GTT gtOne, + final GTT oneGT, final FieldT fieldFactory, final PairingT pairing) { final long seed1 = 4; final long seed2 = 7; - final GTT one = gtOne; + final GTT one = oneGT; final FieldT s = fieldFactory.random(seed1 + seed2, null); G1T sP = P.mul(s); diff --git a/src/test/java/algebra/curves/GenericCurvesTest.java b/src/test/java/algebra/curves/GenericCurvesTest.java index 54a6ccf..2ab6124 100755 --- a/src/test/java/algebra/curves/GenericCurvesTest.java +++ b/src/test/java/algebra/curves/GenericCurvesTest.java @@ -5,7 +5,6 @@ import algebra.groups.AbstractGroup; import java.math.BigInteger; -import org.junit.jupiter.api.Test; public class GenericCurvesTest { protected > void GroupTest(final GroupT groupFactory) { @@ -73,16 +72,4 @@ protected > void GroupTest(final GroupT gro // A*s + A*r = A*(r+s) assertTrue(A.mul(rand1).add(A.mul(rand2)).equals(A.mul(randsum))); } - - /* - @Test - public void FakeTest() { - FakeInitialize.init(); - final FakeG1 g1Factory = new FakeG1Parameters().ONE(); - final FakeG2 g2Factory = new FakeG2Parameters().ONE(); - - GroupTest(g1Factory); - GroupTest(g2Factory); - } - */ } diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java index 8d5e506..a0d9309 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -3,11 +3,12 @@ import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377GT; -import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; +import algebra.fields.Fp; import org.junit.jupiter.api.Test; import algebra.curves.GenericBilinearityTest; @@ -15,16 +16,17 @@ public class BLSBilinearityTest extends GenericBilinearityTest { @Test public 
void BLS12_377Test() { - final BLS12_377G1 g1One = BLS12_377G1Parameters.ONE; - final BLS12_377G2 g2One = BLS12_377G2Parameters.ONE; - final BLS12_377GT gtOne = BLS12_377GTParameters.ONE; - final BLS12_377Fr fieldFactory = new BLS12_377Fr(6); + final BLS12_377G1 oneG1 = BLS12_377G1Parameters.ONE; + final BLS12_377G2 oneG2 = BLS12_377G2Parameters.ONE; + final BLS12_377GT oneGT = BLS12_377GTParameters.ONE; final BLS12_377Pairing pairing = new BLS12_377Pairing(); + final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); + final Fp factoryFr = FrParameters.ONE(); - final BLS12_377G1 P = g1One.mul(fieldFactory.random(5L, null)); - final BLS12_377G2 Q = g2One.mul(fieldFactory.random(6L, null)); + final BLS12_377G1 P = oneG1.mul(factoryFr.random(5L, null)); + final BLS12_377G2 Q = oneG2.mul(factoryFr.random(6L, null)); - PairingTest(P, Q, gtOne, fieldFactory, pairing); - PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, oneGT, factoryFr, pairing); + PairingTest(oneG1, oneG2, oneGT, factoryFr, pairing); } } diff --git a/src/test/java/algebra/fields/FieldsTest.java b/src/test/java/algebra/fields/FieldsTest.java index da49054..33d6506 100755 --- a/src/test/java/algebra/fields/FieldsTest.java +++ b/src/test/java/algebra/fields/FieldsTest.java @@ -16,6 +16,8 @@ import algebra.fields.mock.fieldparameters.*; import org.junit.jupiter.api.Test; +// TODO: Refactor using the GenericFieldsTest class + public class FieldsTest { private > void verify( final FieldT a, final FieldT b) { diff --git a/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java b/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java index 5f42ab3..881bade 100755 --- a/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java +++ b/src/test/java/algebra/groups/AdditiveIntegerGroupTest.java @@ -16,6 +16,8 @@ import java.math.BigInteger; import org.junit.jupiter.api.Test; +// TODO: Refactor using the GenericCurveTest (to be renamed as GenericGroupTest) class + public class AdditiveIntegerGroupTest implements Serializable { private > void verify(final GroupT a, final GroupT b) { final GroupT zero = a.zero(); From e6d3c740830c37b7f9b97d6920ae443e2d8dcedc Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Tue, 12 Jan 2021 17:33:06 +0000 Subject: [PATCH 67/94] Working BLS12 implementation --- .../curves/barreto_lynn_scott/BLSG1.java | 12 +- .../curves/barreto_lynn_scott/BLSG2.java | 33 ++- .../curves/barreto_lynn_scott/BLSGT.java | 7 - .../curves/barreto_lynn_scott/BLSPairing.java | 246 ++++++++++-------- .../bls12_377/BLS12_377G1.java | 16 -- .../bls12_377/BLS12_377Pairing.java | 7 - .../bls12_377/BLS12_377PublicParameters.java | 38 +-- .../BLS12_377Fq12Parameters.java | 14 +- .../BLS12_377Fq2Parameters.java | 28 +- .../BLS12_377Fq6Parameters.java | 11 +- .../BLS12_377FqParameters.java | 9 +- .../BLS12_377FrParameters.java | 9 +- .../BLS12_377G1Parameters.java | 4 +- .../BLS12_377G2Parameters.java | 12 +- .../BLS12_377GTParameters.java | 7 - .../java/algebra/fields/Fp12_2Over3Over2.java | 15 +- src/main/java/algebra/fields/Fp2.java | 6 +- src/main/java/algebra/fields/Fp3.java | 2 +- src/main/java/algebra/fields/Fp6_2Over3.java | 2 +- src/main/java/algebra/fields/Fp6_3Over2.java | 2 +- 20 files changed, 231 insertions(+), 249 deletions(-) diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java index 0e708d1..fe31f6a 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java +++ 
b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java @@ -30,7 +30,7 @@ public BLSG1T add(final BLSG1T other) { assert (other != null); // Handle special cases having to do with O - if (isZero()) { + if (this.isZero()) { return other; } @@ -39,10 +39,10 @@ public BLSG1T add(final BLSG1T other) { } // No need to handle points of order 2,4 - // (they cannot exist in a modulus-order subgroup) + // (they cannot exist in a prime-order subgroup) // Check for doubling case - + // // Using Jacobian coordinates so: // (X1:Y1:Z1) = (X2:Y2:Z2) // iff @@ -129,7 +129,7 @@ public BLSG1T negate() { public BLSG1T dbl() { // Handle point at infinity - if (isZero()) { + if (this.isZero()) { return this.self(); } @@ -167,7 +167,7 @@ public BLSG1T dbl() { } public BLSG1T toAffineCoordinates() { - if (isZero()) { + if (this.isZero()) { return this.construct(this.X.zero(), this.Y.one(), this.Z.zero()); } else { BLSFqT ZInverse = this.Z.inverse(); @@ -194,7 +194,7 @@ public String toString() { } public boolean equals(final BLSG1T other) { - if (isZero()) { + if (this.isZero()) { return other.isZero(); } diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java index 83cdb4e..1263c06 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java @@ -31,7 +31,7 @@ public BLSG2(final BLSFq2T X, final BLSFq2T Y, final BLSFq2T Z, final BLSG2Param public BLSG2T add(final BLSG2T other) { // Handle special cases having to do with O - if (isZero()) { + if (this.isZero()) { return other; } @@ -40,10 +40,10 @@ public BLSG2T add(final BLSG2T other) { } // No need to handle points of order 2,4 - // (they cannot exist in a modulus-order subgroup) + // (they cannot exist in a prime-order subgroup) // Check for doubling case - + // // Using Jacobian coordinates so: // (X1:Y1:Z1) = (X2:Y2:Z2) // iff @@ -51,10 +51,14 @@ public BLSG2T add(final BLSG2T other) { // iff // X1 * Z2^2 == X2 * Z1^2 and Y1 * Z2^3 == Y2 * Z1^3 + // Z1Z1 = Z1*Z1 final BLSFq2T Z1Z1 = this.Z.square(); + // Z2Z2 = Z2*Z2 final BLSFq2T Z2Z2 = other.Z.square(); + // U1 = X1*Z2Z2 final BLSFq2T U1 = this.X.mul(Z2Z2); + // U2 = X2*Z1Z1 final BLSFq2T U2 = other.X.mul(Z1Z1); final BLSFq2T Z1Cubed = this.Z.mul(Z1Z1); @@ -65,13 +69,15 @@ public BLSG2T add(final BLSG2T other) { // S2 = Y2 * Z1 * Z1Z1 final BLSFq2T S2 = other.Y.mul(Z1Cubed); + // Check if the 2 points are equal, in which can we do a point doubling (i.e. P + P) if (U1.equals(U2) && S1.equals(S2)) { // Double case // Nothing of above can be reused - return dbl(); + return this.dbl(); } - // Rest of the add case + // Point addition (i.e. 
P + Q, P =/= Q) + // https://www.hyperelliptic.org/EFD/g1p/data/shortw/jacobian-0/addition/add-2007-bl // H = U2-U1 final BLSFq2T H = U2.sub(U1); // I = (2 * H)^2 @@ -85,11 +91,11 @@ public BLSG2T add(final BLSG2T other) { final BLSFq2T V = U1.mul(I); // X3 = r^2 - J - 2 * V final BLSFq2T X3 = r.square().sub(J).sub(V.add(V)); - // Y3 = r * (V-X3)-2 S1 J + // Y3 = r * (V-X3)-2*S1*J final BLSFq2T S1_J = S1.mul(J); final BLSFq2T Y3 = r.mul(V.sub(X3)).sub(S1_J.add(S1_J)); // Z3 = ((Z1+Z2)^2-Z1Z1-Z2Z2) * H - final BLSFq2T Z3 = this.Z.add(other.Z).square().sub(Z1Z1).sub(Z2Z2).mul(H); + final BLSFq2T Z3 = (this.Z.add(other.Z).square().sub(Z1Z1).sub(Z2Z2)).mul(H); return this.construct(X3, Y3, Z3); } @@ -99,13 +105,12 @@ public BLSG2T sub(final BLSG2T other) { } public BLSG2T dbl() { - if (isZero()) { + if (this.isZero()) { return this.self(); } // NOTE: does not handle O and pts of order 2,4 - // http://www.hyperelliptic.org/EFD/g1p/auto-shortw-projective.html#doubling-dbl-2007-bl - + // https://www.hyperelliptic.org/EFD/g1p/data/shortw/jacobian-0/doubling/dbl-2009-l // A = X1^2 final BLSFq2T A = this.X.square(); // B = Y1^2 @@ -126,8 +131,8 @@ public BLSG2T dbl() { eightC = eightC.add(eightC); eightC = eightC.add(eightC); final BLSFq2T Y3 = E.mul(D.sub(X3)).sub(eightC); - final BLSFq2T Y1Z1 = this.Y.mul(this.Z); // Z3 = 2 * Y1 * Z1 + final BLSFq2T Y1Z1 = this.Y.mul(this.Z); final BLSFq2T Z3 = Y1Z1.add(Y1Z1); return this.construct(X3, Y3, Z3); @@ -174,7 +179,7 @@ public void setZ(final BLSFq2T Z) { } public BLSG2T toAffineCoordinates() { - if (isZero()) { + if (this.isZero()) { return this.construct(this.X.zero(), this.Y.one(), this.Z.zero()); } else { final BLSFq2T ZInverse = this.Z.inverse(); @@ -193,7 +198,7 @@ public ArrayList fixedBaseWindowTable() { } public String toString() { - if (isZero()) { + if (this.isZero()) { return "0"; } @@ -201,7 +206,7 @@ public String toString() { } public boolean equals(final BLSG2T other) { - if (isZero()) { + if (this.isZero()) { return other.isZero(); } diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java index f4759e9..72e76b8 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSGT.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott; import algebra.curves.AbstractGT; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java index 71f2f60..e055f2d 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java @@ -40,7 +40,7 @@ private AteG1Precompute(final BLSFqT PX, final BLSFqT PY) { public boolean equals(final AteG1Precompute other) { return PX.equals(other.PX) && PY.equals(other.PY); } - } + } // AteG1Precompute protected class AteEllCoefficients { BLSFq2T ell0; @@ -52,7 +52,7 @@ public boolean equals(final AteEllCoefficients other) { && this.ellVW.equals(other.ellVW) && this.ellVV.equals(other.ellVV); } - } + } // AteEllCoefficients protected class AteG2Precompute extends G2Precompute { final BLSFq2T QX; @@ -71,10 +71,12 @@ public boolean equals(final AteG2Precompute other) { && this.QY.equals(other.QY) && this.coefficients.equals(other.coefficients); } - } + } // AteG2Precompute private AteEllCoefficients doublingStepForFlippedMillerLoop( final BLSFqT twoInverse, BLSG2T current) { + // Below we assume that `current` = (X, Y, Z) \in E'(Fp2) is a point + // in homogeneous projective coordinates. final BLSFq2T X = current.X, Y = current.Y, Z = current.Z; // A = (X * Y) / 2 @@ -108,11 +110,11 @@ private AteEllCoefficients doublingStepForFlippedMillerLoop( current.setZ(B.mul(H)); AteEllCoefficients c = new AteEllCoefficients(); - // ell_0 = xi * I + // ell_0 = xi * I = xi * (3 * twist_b * Z^2 - Y^2) c.ell0 = this.publicParameters().twist().mul(I); - // ell_VW = - H (later: * yP) + // ell_VW = -H = -2YZ (later: * yP) c.ellVW = H.negate(); - // ell_VV = 3*J (later: * xP) + // ell_VV = 3*J = 3*X^2 (later: * xP) c.ellVV = J.add(J).add(J); return c; @@ -150,6 +152,98 @@ private AteEllCoefficients mixedAdditionStepForFlippedMillerLoop( return c; } + protected AteG1Precompute precomputeG1(final BLSG1T P) { + BLSG1T PAffine = P.construct(P.X, P.Y, P.Z).toAffineCoordinates(); + + return new AteG1Precompute(PAffine.X, PAffine.Y); + } + + protected AteG2Precompute precomputeG2(final BLSG2T Q) { + BLSG2T QAffine = Q.construct(Q.X, Q.Y, Q.Z).toAffineCoordinates(); + // Used for debug only + BLSG2T QAffineSave = Q.construct(Q.X, Q.Y, Q.Z).toAffineCoordinates(); + + BLSFqT fqFactory = this.publicParameters().coefficientB(); + BLSFqT twoInverse = fqFactory.construct(2).inverse(); + + BLSG2T R = Q.construct(QAffine.X, QAffine.Y, QAffine.Y.one()); + + // ateLoopCount = new BigInteger("9586122913090633729"); + final BigInteger loopCount = this.publicParameters().ateLoopCount(); + boolean found = false; + + final List coeffs = new ArrayList<>(); + + for (int i = loopCount.bitLength(); i >= 0; --i) { + final boolean bit = loopCount.testBit(i); + if (!found) { + // This skips the MSB itself. 
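+        // R was initialised to Q above, so the accumulator already accounts for the leading
+        // set bit of ateLoopCount; the doubling/addition steps below run over the lower bits.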
+ found |= bit; + continue; + } + + coeffs.add(doublingStepForFlippedMillerLoop(twoInverse, R)); + + if (bit) { + coeffs.add(mixedAdditionStepForFlippedMillerLoop(QAffine, R)); + } + } + + //BLSG2T Q1 = this.mulByQ(QAffine); + //assert (Q1.Z.equals(QAffine.X.one())); + //BLSG2T Q2 = this.mulByQ(Q1); + //assert (Q2.Z.equals(QAffine.X.one())); + + return new AteG2Precompute(QAffineSave.X, QAffineSave.Y, coeffs); + } + + // Implementation of the Miller loop for BLS12_377 curve + // See https://eprint.iacr.org/2019/077.pdf for more info and potential optimizations + private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute QPrec) { + // blsFq12Factory = BLS12_377Fq12.ONE; + BLSFq12T f = this.publicParameters().blsFq12Factory(); + + boolean found = false; + int idx = 0; + + // ateLoopCount = new BigInteger("9586122913090633729"); + final BigInteger loopCount = this.publicParameters().ateLoopCount(); + AteEllCoefficients c; + + for (int i = loopCount.bitLength(); i >= 0; --i) { + final boolean bit = loopCount.testBit(i); + if (!found) { + // This skips the MSB itself. + found |= bit; + continue; + } + + // Code below gets executed for all bits (EXCEPT the MSB itself) of + // loopCount (skipping leading zeros) in MSB to LSB order. + c = QPrec.coefficients.get(idx++); + // Note: This squaring in Fq12 can be eliminated for the first loop + // (since f is initialized with ONE in Fq12) + // See Algo 6: https://www.iacr.org/archive/eurocrypt2011/66320047/66320047.pdf + f = f.square(); + // Note: For the first iteration, f is ONE in Fq12 and is thus sparse. + // Hence we can do a sparse/sparse multiplication for the line accumulation here. + f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + + if (bit) { + c = QPrec.coefficients.get(idx++); + f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); + } + } + + // Not executed for BLS12_377 + //if (this.publicParameters().isAteLoopCountNegative()) { + // f = f.inverse(); + //} + + return f; + } + + // TODO: Move in the G2 group implementation private BLSG2T mulByQ(final BLSG2T element) { return element.construct( publicParameters().qXMulTwist().mul(element.X.FrobeniusMap(1)), @@ -159,25 +253,47 @@ private BLSG2T mulByQ(final BLSG2T element) { private BLSFq12T ExpByZ(final BLSFq12T elt) { BLSFq12T result = elt.cyclotomicExponentiation(this.publicParameters().finalExponentZ()); - if (!this.publicParameters().isFinalExponentZNegative()) { + if (this.publicParameters().isFinalExponentZNegative()) { result = result.unitaryInverse(); } return result; } + public BLSFq12T finalExponentiation(final BLSFq12T elt) { + // We know that: + // (p^12 - 1) / r = (p^6 - 1) (p^2 + 1) ((p^4 - p^2 + 1) / r) + // |_________________| |__________________| + // easy part hard part + // where: + // sage: cyclotomic_polynomial(12) # = x^4 - x^2 + 1 + final BLSFq12T easyPart = finalExponentiationFirstChunk(elt); + return finalExponentiationLastChunk(easyPart); + } + private BLSFq12T finalExponentiationFirstChunk(final BLSFq12T elt) { - // Computes result = elt^((q^6-1)*(q^2+1)). 
- // Follows, e.g., Beuchat et al page 9, by computing - // result as follows: - // elt^((q^6-1)*(q^2+1)) = (conj(elt) * elt^(-1))^(q^2+1) - // More precisely: - // A = conj(elt), B = elt.inverse(), C = A * B, D = C.Frobenius_map(2) - // result = D * C - final BLSFq12T A = elt.unitaryInverse(); + // elt^(q^6) + final BLSFq12T A = elt.FrobeniusMap(6); + // elt^(-1) final BLSFq12T B = elt.inverse(); + // elt^(q^6 - 1) final BLSFq12T C = A.mul(B); + // (elt^(q^6 - 1))^(q^2) = elt^((q^6 - 1) * (q^2)) final BLSFq12T D = C.FrobeniusMap(2); - return D.mul(C); + // elt^((q^6 - 1) * (q^2) + (q^6 - 1)) = elt^((q^6 - 1) * (q^2 + 1)) + final BLSFq12T result = D.mul(C); + + // // Computes result = elt^((q^6-1)*(q^2+1)). + // // Follows Beuchat et al page 9: https://eprint.iacr.org/2010/354.pdf + // // by computing result as follows: + // // elt^((q^6-1)*(q^2+1)) = (conj(elt) * elt^(-1))^(q^2+1) + // final BLSFq12T A = elt.unitaryInverse(); + // final BLSFq12T B = elt.inverse(); + // final BLSFq12T C = A.mul(B); + // final BLSFq12T D = C.FrobeniusMap(2); + // final BLSFq12T result = D.mul(C); + // return result; + + return result; } private BLSFq12T finalExponentiationLastChunk(final BLSFq12T elt) { @@ -239,107 +355,9 @@ private BLSFq12T finalExponentiationLastChunk(final BLSFq12T elt) { return result; } - // Implementation of the Miller loop for BLS12_377 curve - // See https://eprint.iacr.org/2019/077.pdf for more info and potential optimizations - private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute QPrec) { - // blsFq12Factory = BLS12_377Fq12.ONE; - BLSFq12T f = this.publicParameters().blsFq12Factory(); - - boolean found = false; - int idx = 0; - - final BigInteger loopCount = this.publicParameters().ateLoopCount(); - AteEllCoefficients c; - - for (int i = loopCount.bitLength(); i >= 0; --i) { - final boolean bit = loopCount.testBit(i); - if (!found) { - // This skips the MSB itself. - found |= bit; - continue; - } - - // Code below gets executed for all bits (EXCEPT the MSB itself) of - // loopCount (skipping leading zeros) in MSB to LSB order. - c = QPrec.coefficients.get(idx++); - f = f.square(); - f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); - - if (bit) { - c = QPrec.coefficients.get(idx++); - f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); - } - } - - // Not executed for BLS12_377 - if (this.publicParameters().isAteLoopCountNegative()) { - f = f.inverse(); - } - - //c = QPrec.coefficients.get(idx++); - //f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); - - //c = QPrec.coefficients.get(idx); - //f = f.mulBy024(c.ell0, c.ellVW.mul(PPrec.PY), c.ellVV.mul(PPrec.PX)); - - return f; - } - - protected AteG1Precompute precomputeG1(final BLSG1T P) { - BLSG1T PAffine = P.construct(P.X, P.Y, P.Z).toAffineCoordinates(); - - return new AteG1Precompute(PAffine.X, PAffine.Y); - } - - protected AteG2Precompute precomputeG2(final BLSG2T Q) { - BLSG2T QAffine = Q.construct(Q.X, Q.Y, Q.Z).toAffineCoordinates(); - - BLSFqT fqFactory = this.publicParameters().coefficientB(); - BLSFqT twoInverse = fqFactory.construct(2).inverse(); - - BLSG2T R = Q.construct(QAffine.X, QAffine.Y, QAffine.Y.one()); - final BigInteger loopCount = this.publicParameters().ateLoopCount(); - boolean found = false; - - final List coeffs = new ArrayList<>(); - - for (int i = loopCount.bitLength(); i >= 0; --i) { - final boolean bit = loopCount.testBit(i); - if (!found) { - // This skips the MSB itself. 
- found |= bit; - continue; - } - - coeffs.add(doublingStepForFlippedMillerLoop(twoInverse, R)); - - if (bit) { - coeffs.add(mixedAdditionStepForFlippedMillerLoop(QAffine, R)); - } - } - - //BLSG2T Q1 = this.mulByQ(QAffine); - //assert (Q1.Z.equals(QAffine.X.one())); - //BLSG2T Q2 = this.mulByQ(Q1); - //assert (Q2.Z.equals(QAffine.X.one())); - - return new AteG2Precompute(QAffine.X, QAffine.Y, coeffs); - } - protected BLSFq12T atePairing(final BLSG1T P, final BLSG2T Q) { final AteG1Precompute PPrec = precomputeG1(P); final AteG2Precompute QPrec = precomputeG2(Q); return millerLoop(PPrec, QPrec); } - - public BLSFq12T finalExponentiation(final BLSFq12T elt) { - // We know that: - // (p^12 - 1) / r = (p^6 - 1) (p^2 + 1) ((p^4 - p^2 + 1) / r) - // |_________________| |__________________| - // easy part hard part - // where: - // sage: cyclotomic_polynomial(12) # = x^4 - x^2 + 1 - final BLSFq12T A = finalExponentiationFirstChunk(elt); - return finalExponentiationLastChunk(A); - } } diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java index 6626933..140f976 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G1.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377; import algebra.curves.barreto_lynn_scott.BLSG1; @@ -12,15 +5,6 @@ import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; -/** - * Class representing a specific BN group. This class is representing BOTH the group and a point in - * the group. In fact, each point necessitates the X,Y,Z coordinates; and the group definition - * additionally requires the `G1Parameters` attribute. Here all group elements are constructed by - * `construct()` which is a wrapper around the "Group class" constructor. - * - *

That's why `BN254bG1Parameters.ONE` is treated as a "Group factory" here: - * https://github.com/clearmatics/dizk/blob/develop/src/test/java/algebra/curves/CurvesTest.java#L93 - */ public class BLS12_377G1 extends BLSG1 { public static final BLS12_377G1Parameters G1Parameters = new BLS12_377G1Parameters(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java index 7396f2e..d3d4fbe 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Pairing.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377; import algebra.curves.barreto_lynn_scott.BLSPairing; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java index 7fa5d41..ace7922 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377; import algebra.curves.barreto_lynn_scott.BLSPublicParameters; @@ -15,37 +8,46 @@ import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq2Parameters; import java.math.BigInteger; +// Checked + public class BLS12_377PublicParameters extends BLSPublicParameters { public BLS12_377PublicParameters() { final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); - coefficientB = new BLS12_377Fq("3"); - twist = new BLS12_377Fq2(new BLS12_377Fq("9"), new BLS12_377Fq("1")); + // Choice of short Weierstrass curve and its twist + // E(Fq): y^2 = x^3 + 1 + coefficientB = new BLS12_377Fq("1"); + // We use a type-D twist here, E'(Fq2): y^2 = x^3 + 1/u + twist = new BLS12_377Fq2(new BLS12_377Fq("0"), new BLS12_377Fq("1")); twistCoefficientB = twist.inverse().mul(coefficientB); bC0MulTwist = coefficientB.mul(new BLS12_377Fq(Fq2Parameters.nonresidue())); bC1MulTwist = coefficientB.mul(new BLS12_377Fq(Fq2Parameters.nonresidue())); qXMulTwist = new BLS12_377Fq2( new BLS12_377Fq( - "21575463638280843010398324269430826099269044274347216827212613867836435027261"), - new BLS12_377Fq( - "10307601595873709700152284273816112264069230130616436755625194854815875713954")); + "80949648264912719408558363140637477264845294720710499478137287262712535938301461879813459410946"), + new BLS12_377Fq("0")); qYMulTwist = new BLS12_377Fq2( new BLS12_377Fq( - "2821565182194536844548159561693502659359617185244120367078079554186484126554"), - new BLS12_377Fq( - "3505843767911556378687030309984248845540243509899259641013678093033130930403")); + 
"216465761340224619389371505802605247630151569547285782856803747159100223055385581585702401816380679166954762214499"), + new BLS12_377Fq("0")); // Pairing parameters - ateLoopCount = new BigInteger("29793968203157093288"); + // sage: u = 9586122913090633729 + // sage: ceil(log(u, 2)) # = 64 + // sage: bin(u) # = '0b1000010100001000110000000000000000000000000000000000000000000001' + // The Hamming weight of u is: HW(u) = 7 + // u = 2**63 + 2**58 + 2**56 + 2**51 + 2**47 + 2**46 + 1 + // Based on the power-2 decomposition of u, we should have 63 doubling steps and 7 addition steps in the Miller Loop. + ateLoopCount = new BigInteger("9586122913090633729"); isAteLoopCountNegative = false; finalExponent = new BigInteger( - "552484233613224096312617126783173147097382103762957654188882734314196910839907541213974502761540629817009608548654680343627701153829446747810907373256841551006201639677726139946029199968412598804882391702273019083653272047566316584365559776493027495458238373902875937659943504873220554161550525926302303331747463515644711876653177129578303191095900909191624817826566688241804408081892785725967931714097716709526092261278071952560171111444072049229123565057483750161460024353346284167282452756217662335528813519139808291170539072125381230815729071544861602750936964829313608137325426383735122175229541155376346436093930287402089517426973178917569713384748081827255472576937471496195752727188261435633271238710131736096299798168852925540549342330775279877006784354801422249722573783561685179618816480037695005515426162362431072245638324744480"); - finalExponentZ = new BigInteger("4965661367192848881"); + "10623521018019860488254031663707568428798032905123811199571213965079129114663661236359849629341526275899063345613340067081670062620727617884137487754739150147491204559514205186492385590272208934467461444944652711005169371168250068790820776124772095630237102189827733019989835063334551453893534663070786533932633573962932272563471643288531959637300817070265537429506484880990981069041269405383502889677357082012807298529931118124428569059822346289745077401570134157444973271520981774047146918354408632568723153146248333028827919406785654402107153546667815607201488590832478225403444136409349877481268154817904541340614173261949772403060924324366861723245182619859389254985008236007465814273361497134138868945580557938161335670207544906643574043606819537336472235809927599628123275314288006170804044560238676463931639339711913111080974582593228138704154320599775683095604041309000197025419968125718018311805959315220036948621879242495199408833915486421612374480018459896018440926235261824654956932384859260479372776022979736734221629097297890154692194441528462770218811795624471108972377573690833913231260547835550851256817740247389770320334698430697237343583761719223414894063451411431859122738488311580005412765070251810159991897110936324943232526870280724876946523218213525646968094720"); + finalExponentZ = new BigInteger("9586122913090633729"); isFinalExponentZNegative = false; blsFq12Factory = BLS12_377Fq12.ONE; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java index 8dea9a6..5642177 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java @@ -1,10 +1,3 
@@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFq12Parameters; @@ -13,6 +6,9 @@ import algebra.fields.Fp2; import java.io.Serializable; +// Checked + +/** Parameters for Fq12 = ((Fq2)^3)^2 */ public class BLS12_377Fq12Parameters extends AbstractBLSFq12Parameters implements Serializable { public BLS12_377FqParameters FqParameters; public BLS12_377Fq2Parameters Fq2Parameters; @@ -33,11 +29,11 @@ public BLS12_377Fq12Parameters() { this.ZERO = new Fp12_2Over3Over2(Fq6Parameters.ZERO(), Fq6Parameters.ZERO(), this); this.ONE = new Fp12_2Over3Over2(Fq6Parameters.ONE(), Fq6Parameters.ZERO(), this); - this.nonresidue = new Fp2(FqParameters.ZERO(), FqParameters.ONE(), Fq2Parameters); + this.nonresidue = new Fp2(new Fp("0", FqParameters), new Fp("1", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1 = new Fp2[12]; this.FrobeniusCoefficientsC1[0] = - new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + new Fp2(new Fp("1", FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1[1] = new Fp2( new Fp( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java index e6de300..5a19755 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFq2Parameters; @@ -14,6 +7,7 @@ import java.io.Serializable; import java.math.BigInteger; +// Checked public class BLS12_377Fq2Parameters extends AbstractBLSFq2Parameters implements Serializable { public BLS12_377FqParameters FqParameters; @@ -33,33 +27,31 @@ public BLS12_377Fq2Parameters() { this.FqParameters = new BLS12_377FqParameters(); this.euler = new BigInteger( - "239547588008311421220994022608339370399626158265550411218223901127035046843189118723920525909718935985594116157406550130918127817069793474323196511433944"); - this.s = 4; + "33453642642309381258089625946249069288005760010886479253070957453297957116339370141113413635838485065209570299254148838549585056123015878375022724998041828785227090063466658233059433323033772513321990316560167027213559780081664"); + this.s = 47; this.t = new BigInteger( - "29943448501038927652624252826042421299953269783193801402277987640879380855398639840490065738714866998199264519675818766364765977133724184290399563929243"); + "475404855284145089315325463221726483993816145966867441829193658311651761271425728823393990805904040047516478740222806302278755994777496288961383541476974255391881599499962735436887347234371823579436839914935817251"); this.tMinus1Over2 = new BigInteger( - "14971724250519463826312126413021210649976634891596900701138993820439690427699319920245032869357433499099632259837909383182382988566862092145199781964621"); + "237702427642072544657662731610863241996908072983433720914596829155825880635712864411696995402952020023758239370111403151139377997388748144480691770738487127695940799749981367718443673617185911789718419957467908625"); this.nonresidue = new BLS12_377Fq( - "21888242871839275222246405745257275088696311157297823662689037894645226208582") + "258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458172") .element(); - this.nqr = new Fp2(new Fp("2", FqParameters), new Fp("1", FqParameters), this); + this.nqr = new Fp2(new Fp("0", FqParameters), new Fp("1", FqParameters), this); this.nqrTot = new Fp2( + new Fp("0", FqParameters), new Fp( - "5033503716262624267312492558379982687175200734934877598599011485707452665730", - FqParameters), - new Fp( - "314498342015008975724433667930697407966947188435857772134235984660852259084", + "257286236321774568987262729980034669694531728092793737444525294935421142460394028155736019924956637466133519652786", FqParameters), this); this.FrobeniusCoefficientsC1 = new Fp[2]; this.FrobeniusCoefficientsC1[0] = new Fp("1", FqParameters); this.FrobeniusCoefficientsC1[1] = new Fp( - "21888242871839275222246405745257275088696311157297823662689037894645226208582", + "258664426012969094010652733694893533536393512754914660539884262666720468348340822774968888139573360124440321458176", FqParameters); this.ZERO = new Fp2(BigInteger.ZERO, BigInteger.ZERO, this); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java index b8bb74b..0f2ac5c 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java +++ 
b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java @@ -6,6 +6,9 @@ import algebra.fields.Fp6_3Over2; import java.io.Serializable; +// Checked + +/** Parameters for Fq6 = (Fq2)^3 */ public class BLS12_377Fq6Parameters extends AbstractBLSFq6Parameters implements Serializable { public BLS12_377Fq2Parameters Fq2Parameters; @@ -24,11 +27,11 @@ public BLS12_377Fq6Parameters() { new Fp6_3Over2(Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), this); this.ONE = new Fp6_3Over2(Fq2Parameters.ONE(), Fq2Parameters.ZERO(), Fq2Parameters.ZERO(), this); - this.nonresidue = new Fp2(FqParameters.ZERO(), FqParameters.ONE(), Fq2Parameters); + this.nonresidue = new Fp2(new Fp("0", FqParameters), new Fp("1", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1 = new Fp2[6]; this.FrobeniusCoefficientsC1[0] = - new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + new Fp2(new Fp("1", FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1[1] = new Fp2( new Fp( @@ -65,7 +68,7 @@ public BLS12_377Fq6Parameters() { this.FrobeniusCoefficientsC2 = new Fp2[6]; this.FrobeniusCoefficientsC2[0] = - new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + new Fp2(new Fp("1", FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC2[1] = new Fp2( new Fp( @@ -79,7 +82,7 @@ public BLS12_377Fq6Parameters() { new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC2[3] = - new Fp2(FqParameters.ONE(), FqParameters.ZERO(), Fq2Parameters); + new Fp2(new Fp("1", FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC2[4] = new Fp2( new Fp( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java index 5df2701..b1a384f 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFqParameters; @@ -12,6 +5,8 @@ import java.io.Serializable; import java.math.BigInteger; +// Checked + public class BLS12_377FqParameters extends AbstractBLSFqParameters implements Serializable { public BigInteger modulus; public BigInteger root; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java index 5074d9e..e964b49 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSFrParameters; @@ -12,6 +5,8 @@ import java.io.Serializable; import java.math.BigInteger; +// Checked + public class BLS12_377FrParameters extends AbstractBLSFrParameters implements Serializable { public BigInteger modulus; public BigInteger root; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java index e0f47d4..42885a4 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java @@ -8,12 +8,14 @@ import java.util.ArrayList; import java.util.Arrays; +// Checked + public class BLS12_377G1Parameters extends AbstractBLSG1Parameters implements Serializable { public static final BLS12_377G1 ZERO = new BLS12_377G1(BLS12_377Fq.ZERO, BLS12_377Fq.ONE, BLS12_377Fq.ZERO); - public static final BLS12_377G1 ONE = new BLS12_377G1(BLS12_377Fq.ONE, new BLS12_377Fq(2), BLS12_377Fq.ONE); + public static final BLS12_377G1 ONE = new BLS12_377G1(new BLS12_377Fq("81937999373150964239938255573465948239988671502647976594219695644855304257327692006745978603320413799295628339695"), new BLS12_377Fq("241266749859715473739788878240585681733927191168601896383759122102112907357779751001206799952863815012735208165030"), BLS12_377Fq.ONE); public static final ArrayList fixedBaseWindowTable = new ArrayList<>( Arrays.asList( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java index 5ad5d39..b3f7182 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java @@ -9,6 +9,8 @@ import 
java.util.ArrayList; import java.util.Arrays; +// Checked + public class BLS12_377G2Parameters extends AbstractBLSG2Parameters implements Serializable { @@ -18,15 +20,15 @@ public class BLS12_377G2Parameters new BLS12_377G2( new BLS12_377Fq2( new BLS12_377Fq( - "10857046999023057135944570762232829481370756359578518086990519993285655852781"), + "111583945774695116443911226257823823434468740249883042837745151039122196680777376765707574547389190084887628324746"), new BLS12_377Fq( - "11559732032986387107991004021392285783925812861821192530917403151452391805634")), + "129066980656703085518157301154335215886082112524378686555873161080604845924984124025594590925548060469686767592854")), new BLS12_377Fq2( new BLS12_377Fq( - "8495653923123431417604973247489272438418190587263600148770280649306958101930"), + "168863299724668977183029941347596462608978380503965103341003918678547611204475537878680436662916294540335494194722"), new BLS12_377Fq( - "4082367875863433681332203403145435568316851327593401208105741076214120093531")), - new BLS12_377Fq2(1, 0)); + "233892497287475762251335351893618429603672921469864392767514552093535653615809913098097380147379993375817193725968")), + BLS12_377Fq2.ONE); public static final ArrayList fixedBaseWindowTable = new ArrayList<>( Arrays.asList( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java index 647cd40..615b30d 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). 
- * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; diff --git a/src/main/java/algebra/fields/Fp12_2Over3Over2.java b/src/main/java/algebra/fields/Fp12_2Over3Over2.java index 162639b..824d88d 100755 --- a/src/main/java/algebra/fields/Fp12_2Over3Over2.java +++ b/src/main/java/algebra/fields/Fp12_2Over3Over2.java @@ -57,11 +57,12 @@ public Fp12_2Over3Over2 mul(final Fp12_2Over3Over2 other) { // Devegili OhEig, Scott Dahab // "Multiplication and Squaring on Pairing-Friendly Fields" // Section 3 (Karatsuba) - final Fp6_3Over2 c0C0 = c0.mul(other.c0); - final Fp6_3Over2 c1C1 = c1.mul(other.c1); + // https://eprint.iacr.org/2006/471.pdf + final Fp6_3Over2 v0 = c0.mul(other.c0); + final Fp6_3Over2 v1 = c1.mul(other.c1); return new Fp12_2Over3Over2( - c0C0.add(mulByNonResidue(c1C1)), - (c0.add(c1)).mul(other.c0.add(other.c1)).sub(c0C0).sub(c1C1), + v0.add(mulByNonResidue(v1)), + (c0.add(c1)).mul(other.c0.add(other.c1)).sub(v0).sub(v1), Fp12Parameters); } @@ -94,6 +95,7 @@ public Fp12_2Over3Over2 square() { // Devegili OhEig, Scott Dahab // "Multiplication and Squaring on Pairing-Friendly Fields" // Section 3 (Complex squaring) + // https://eprint.iacr.org/2006/471.pdf final Fp6_3Over2 c0c1 = c0.mul(c1); final Fp6_3Over2 factor = (c0.add(c1)).mul(c0.add(mulByNonResidue(c1))); return new Fp12_2Over3Over2( @@ -202,6 +204,8 @@ public Fp12_2Over3Over2 cyclotomicSquared() { Fp12Parameters); } + // Sparse multiplication used during the accumulation of line evaluations into the + // Miller variable during a Miller loop. public Fp12_2Over3Over2 mulBy024(final Fp2 ell0, final Fp2 ellVW, final Fp2 ellVV) { final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); @@ -277,6 +281,7 @@ public Fp12_2Over3Over2 mulBy024(final Fp2 ell0, final Fp2 ellVW, final Fp2 ellV new Fp6_3Over2(z0, z1, z2, Fp6Parameters), new Fp6_3Over2(z3, z4, z5, Fp6Parameters), Fp12Parameters); + } public Fp12_2Over3Over2 cyclotomicExponentiation(final BigInteger exponent) { @@ -302,7 +307,7 @@ public int bitSize() { } public String toString() { - return c0.toString() + " / " + c1.toString(); + return "c0/c1: " + c0.toString() + " / " + c1.toString(); } public boolean equals(final Fp12_2Over3Over2 other) { diff --git a/src/main/java/algebra/fields/Fp2.java b/src/main/java/algebra/fields/Fp2.java index fa5e1a1..16aed54 100755 --- a/src/main/java/algebra/fields/Fp2.java +++ b/src/main/java/algebra/fields/Fp2.java @@ -124,8 +124,12 @@ public Fp2 construct(final long c0, final long c1) { return new Fp2(c0, c1, Fp2Parameters); } + // Note: When we display G2 points in libff, we inverse the coefficient vector to put the + // highest power first in the polynomial (for human readability). We do not do that here, + // hence the coeffs in the output is inversed. 
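+  // e.g. an Fq2 element c1*x + c0 is printed below with c0 first, then c1, whereas libff
+  // would print the x coefficient (c1) first.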
+ // See: https://github.com/clearmatics/libff/blob/develop/libff/algebra/curves/bls12_377/bls12_377_g2.cpp#L63-L70 public String toString() { - return c0.toString() + ", " + c1.toString(); + return "c0/c1 (c1*x + c0): " + c0.toString() + ", " + c1.toString(); } public boolean equals(final Fp2 other) { diff --git a/src/main/java/algebra/fields/Fp3.java b/src/main/java/algebra/fields/Fp3.java index 2a3df38..55928fa 100755 --- a/src/main/java/algebra/fields/Fp3.java +++ b/src/main/java/algebra/fields/Fp3.java @@ -153,7 +153,7 @@ public Fp3 construct(final long c0, final long c1, final long c2) { } public String toString() { - return c0.toString() + ", " + c1.toString() + ", " + c2.toString(); + return "c0/c1/c2: " + c0.toString() + ", " + c1.toString() + ", " + c2.toString(); } public boolean equals(final Fp3 other) { diff --git a/src/main/java/algebra/fields/Fp6_2Over3.java b/src/main/java/algebra/fields/Fp6_2Over3.java index 9775c79..49a15e5 100755 --- a/src/main/java/algebra/fields/Fp6_2Over3.java +++ b/src/main/java/algebra/fields/Fp6_2Over3.java @@ -118,7 +118,7 @@ public Fp6_2Over3 construct(final Fp3 c0, final Fp3 c1) { } public String toString() { - return c0.toString() + " / " + c1.toString(); + return "c0/c1: " + c0.toString() + " / " + c1.toString(); } public boolean equals(final Fp6_2Over3 other) { diff --git a/src/main/java/algebra/fields/Fp6_3Over2.java b/src/main/java/algebra/fields/Fp6_3Over2.java index f8d4789..c18e2ab 100755 --- a/src/main/java/algebra/fields/Fp6_3Over2.java +++ b/src/main/java/algebra/fields/Fp6_3Over2.java @@ -149,7 +149,7 @@ public Fp6_3Over2 construct(final Fp2 c0, final Fp2 c1, final Fp2 c2) { } public String toString() { - return c0.toString() + " / " + c1.toString() + " / " + c2.toString(); + return "c0/c1/c2: " + c0.toString() + " / " + c1.toString() + " / " + c2.toString(); } public boolean equals(final Fp6_3Over2 other) { From 4ff5edc33434795beedbb1bf6b254fe6fefa4359 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 13 Jan 2021 17:54:14 +0000 Subject: [PATCH 68/94] Extended pairing tests --- .../curves/GenericBilinearityTest.java | 21 ++++++++++++--- .../BLSBilinearityTest.java | 7 +++-- .../barreto_naehrig/BNBilinearityTest.java | 26 ++++++++++++------- .../curves/mock/MockBilinearityTest.java | 5 ++-- 4 files changed, 40 insertions(+), 19 deletions(-) diff --git a/src/test/java/algebra/curves/GenericBilinearityTest.java b/src/test/java/algebra/curves/GenericBilinearityTest.java index 3603e1e..1ad2459 100644 --- a/src/test/java/algebra/curves/GenericBilinearityTest.java +++ b/src/test/java/algebra/curves/GenericBilinearityTest.java @@ -4,6 +4,8 @@ import static org.junit.jupiter.api.Assertions.assertTrue; import algebra.fields.AbstractFieldElementExpanded; +import algebra.fields.Fp; +import algebra.fields.abstractfieldparameters.AbstractFpParameters; public class GenericBilinearityTest { protected < @@ -16,13 +18,13 @@ void PairingTest( final G1T P, final G2T Q, final GTT oneGT, - final FieldT fieldFactory, + final AbstractFpParameters ffpp, final PairingT pairing) { + Fp fieldFactory = ffpp.ONE(); final long seed1 = 4; final long seed2 = 7; - final GTT one = oneGT; - final FieldT s = fieldFactory.random(seed1 + seed2, null); + final Fp s = fieldFactory.random(seed1, null); G1T sP = P.mul(s); G2T sQ = Q.mul(s); @@ -33,6 +35,17 @@ void PairingTest( assertTrue(ans1.equals(ans2)); assertTrue(ans2.equals(ans3)); - assertFalse(ans1.equals(one)); + // Test no-degeneracy + assertFalse(ans1.equals(oneGT)); + // G1, G2, GT are order r + 
assert(ans1.mul(ffpp.modulus()).equals(oneGT)); + + final Fp r = fieldFactory.random(seed2, null); + final Fp oneFr = fieldFactory.construct(1); + G1T rP = P.mul(r); + G2T rMinus1Q = Q.mul(r.sub(oneFr)); + GTT res1 = pairing.reducedPairing(rP, Q); + GTT res2 = pairing.reducedPairing(P, rMinus1Q); + assertFalse(res1.equals(res2)); } } diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java index a0d9309..6ce339d 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -26,7 +26,10 @@ public void BLS12_377Test() { final BLS12_377G1 P = oneG1.mul(factoryFr.random(5L, null)); final BLS12_377G2 Q = oneG2.mul(factoryFr.random(6L, null)); - PairingTest(P, Q, oneGT, factoryFr, pairing); - PairingTest(oneG1, oneG2, oneGT, factoryFr, pairing); + // Below we pass the parameters (and not the factory) + // because the parameters expose more things (like the modulus) + // which is helpful to tighten the tests. + PairingTest(P, Q, oneGT, FrParameters, pairing); + PairingTest(oneG1, oneG2, oneGT, FrParameters, pairing); } } diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java index 023e259..4352032 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java @@ -5,6 +5,7 @@ import algebra.curves.barreto_naehrig.bn254a.BN254aG2; import algebra.curves.barreto_naehrig.bn254a.BN254aGT; import algebra.curves.barreto_naehrig.bn254a.BN254aPairing; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFrParameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aGTParameters; @@ -13,9 +14,12 @@ import algebra.curves.barreto_naehrig.bn254b.BN254bG2; import algebra.curves.barreto_naehrig.bn254b.BN254bGT; import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFrParameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bGTParameters; +import algebra.fields.Fp; + import org.junit.jupiter.api.Test; import algebra.curves.GenericBilinearityTest; @@ -26,14 +30,15 @@ public void BN254aTest() { final BN254aG1 g1One = BN254aG1Parameters.ONE; final BN254aG2 g2One = BN254aG2Parameters.ONE; final BN254aGT gtOne = BN254aGTParameters.ONE; - final BN254aFr fieldFactory = new BN254aFr(6); final BN254aPairing pairing = new BN254aPairing(); + final BN254aFrParameters FrParameters = new BN254aFrParameters(); + final Fp factoryFr = FrParameters.ONE(); - final BN254aG1 P = g1One.mul(fieldFactory.random(5L, null)); - final BN254aG2 Q = g2One.mul(fieldFactory.random(6L, null)); + final BN254aG1 P = g1One.mul(factoryFr.random(5L, null)); + final BN254aG2 Q = g2One.mul(factoryFr.random(6L, null)); - PairingTest(P, Q, gtOne, fieldFactory, pairing); - PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, gtOne, FrParameters, pairing); + PairingTest(g1One, g2One, 
gtOne, FrParameters, pairing); } @Test @@ -41,13 +46,14 @@ public void BN254bTest() { final BN254bG1 g1One = BN254bG1Parameters.ONE; final BN254bG2 g2One = BN254bG2Parameters.ONE; final BN254bGT gtOne = BN254bGTParameters.ONE; - final BN254bFr fieldFactory = new BN254bFr(6); final BN254bPairing pairing = new BN254bPairing(); + final BN254bFrParameters FrParameters = new BN254bFrParameters(); + final Fp factoryFr = FrParameters.ONE(); - final BN254bG1 P = g1One.mul(fieldFactory.random(5L, null)); - final BN254bG2 Q = g2One.mul(fieldFactory.random(6L, null)); + final BN254bG1 P = g1One.mul(factoryFr.random(5L, null)); + final BN254bG2 Q = g2One.mul(factoryFr.random(6L, null)); - PairingTest(P, Q, gtOne, fieldFactory, pairing); - PairingTest(g1One, g2One, gtOne, fieldFactory, pairing); + PairingTest(P, Q, gtOne, FrParameters, pairing); + PairingTest(g1One, g2One, gtOne, FrParameters, pairing); } } diff --git a/src/test/java/algebra/curves/mock/MockBilinearityTest.java b/src/test/java/algebra/curves/mock/MockBilinearityTest.java index 0e7406c..cc83a49 100644 --- a/src/test/java/algebra/curves/mock/MockBilinearityTest.java +++ b/src/test/java/algebra/curves/mock/MockBilinearityTest.java @@ -16,10 +16,9 @@ public void FakeTest() { final FakeG1 g1Factory = new FakeG1Parameters().ONE(); final FakeG2 g2Factory = new FakeG2Parameters().ONE(); final FakeGT gTFactory = new FakeGTParameters().ONE(); - final Fp fieldFactory = new LargeFpParameters().ONE(); + final LargeFpParameters fieldParameters = new LargeFpParameters(); FakePairing pairing = new FakePairing(); - - PairingTest(g1Factory, g2Factory, gTFactory, fieldFactory, pairing); + PairingTest(g1Factory, g2Factory, gTFactory, fieldParameters, pairing); } } From eb8d0c34cc6b300d10711c22723251e15290dcbe Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 13 Jan 2021 18:58:12 +0000 Subject: [PATCH 69/94] Switched to Beuchat et al method for exponentiation and added more comments related to the perfs --- .../curves/barreto_lynn_scott/BLSPairing.java | 42 ++++++++++--------- 1 file changed, 22 insertions(+), 20 deletions(-) diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java index e055f2d..a5cb906 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java @@ -206,10 +206,13 @@ private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute Q boolean found = false; int idx = 0; - // ateLoopCount = new BigInteger("9586122913090633729"); final BigInteger loopCount = this.publicParameters().ateLoopCount(); AteEllCoefficients c; + // Note: Remember, in libff the loop length is defined using `max_bits` + // https://github.com/clearmatics/libff/blob/develop/libff/algebra/curves/bls12_377/bls12_377_pairing.cpp#L421 + // While this seems inefficient (many useless iterations dealing with 0's), this is more efficient + // than taking the `num_bits` which is an inefficient function and destroys the benchmarks! for (int i = loopCount.bitLength(); i >= 0; --i) { final boolean bit = loopCount.testBit(i); if (!found) { @@ -218,12 +221,12 @@ private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute Q continue; } - // Code below gets executed for all bits (EXCEPT the MSB itself) of + // Code below gets executed for all bits (EXCEPT the MSB) of // loopCount (skipping leading zeros) in MSB to LSB order. 
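To make the control flow above concrete, here is a stripped-down sketch of just the bit iteration (no precomputation, no line evaluations); the class name and the printout are illustrative and not part of the patch. It scans loopCount from the most significant bit downwards, skips the leading zeros and the MSB itself, then performs one squaring of the Miller variable per remaining bit plus one extra line accumulation whenever the bit is set:

import java.math.BigInteger;

public class MillerLoopSkeleton {
  public static void main(final String[] args) {
    // ateLoopCount of BLS12-377, as set in BLS12_377PublicParameters
    final BigInteger loopCount = new BigInteger("9586122913090633729");
    boolean found = false;
    for (int i = loopCount.bitLength(); i >= 0; --i) {
      final boolean bit = loopCount.testBit(i);
      if (!found) {
        // Skip leading zeros and the MSB itself, exactly as in millerLoop above.
        found = bit;
        continue;
      }
      // One doubling step per iteration: f = f.square(), then accumulate the doubling line.
      // One extra addition step when the bit is set: accumulate the addition line.
      System.out.println("i=" + i + ": doubling step" + (bit ? " + addition step" : ""));
    }
  }
}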
c = QPrec.coefficients.get(idx++); + // See Algo 6: https://www.iacr.org/archive/eurocrypt2011/66320047/66320047.pdf // Note: This squaring in Fq12 can be eliminated for the first loop // (since f is initialized with ONE in Fq12) - // See Algo 6: https://www.iacr.org/archive/eurocrypt2011/66320047/66320047.pdf f = f.square(); // Note: For the first iteration, f is ONE in Fq12 and is thus sparse. // Hence we can do a sparse/sparse multiplication for the line accumulation here. @@ -271,27 +274,26 @@ public BLSFq12T finalExponentiation(final BLSFq12T elt) { } private BLSFq12T finalExponentiationFirstChunk(final BLSFq12T elt) { - // elt^(q^6) - final BLSFq12T A = elt.FrobeniusMap(6); - // elt^(-1) - final BLSFq12T B = elt.inverse(); - // elt^(q^6 - 1) - final BLSFq12T C = A.mul(B); - // (elt^(q^6 - 1))^(q^2) = elt^((q^6 - 1) * (q^2)) - final BLSFq12T D = C.FrobeniusMap(2); - // elt^((q^6 - 1) * (q^2) + (q^6 - 1)) = elt^((q^6 - 1) * (q^2 + 1)) - final BLSFq12T result = D.mul(C); - - // // Computes result = elt^((q^6-1)*(q^2+1)). - // // Follows Beuchat et al page 9: https://eprint.iacr.org/2010/354.pdf - // // by computing result as follows: - // // elt^((q^6-1)*(q^2+1)) = (conj(elt) * elt^(-1))^(q^2+1) - // final BLSFq12T A = elt.unitaryInverse(); + // // elt^(q^6) + // final BLSFq12T A = elt.FrobeniusMap(6); + // // elt^(-1) // final BLSFq12T B = elt.inverse(); + // // elt^(q^6 - 1) // final BLSFq12T C = A.mul(B); + // // (elt^(q^6 - 1))^(q^2) = elt^((q^6 - 1) * (q^2)) // final BLSFq12T D = C.FrobeniusMap(2); + // // elt^((q^6 - 1) * (q^2) + (q^6 - 1)) = elt^((q^6 - 1) * (q^2 + 1)) // final BLSFq12T result = D.mul(C); - // return result; + + // Computes result = elt^((q^6-1)*(q^2+1)). + // Follows Beuchat et al page 9: https://eprint.iacr.org/2010/354.pdf + // by computing result as follows: + // elt^((q^6-1)*(q^2+1)) = (conj(elt) * elt^(-1))^(q^2+1) + final BLSFq12T A = elt.unitaryInverse(); + final BLSFq12T B = elt.inverse(); + final BLSFq12T C = A.mul(B); + final BLSFq12T D = C.FrobeniusMap(2); + final BLSFq12T result = D.mul(C); return result; } From 394ffd22dfc84113c87c090eae5390615fdd01de Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 13 Jan 2021 19:08:59 +0000 Subject: [PATCH 70/94] Commented out bilinearity test for mock curve --- .../curves/mock/MockBilinearityTest.java | 23 +++++++++---------- 1 file changed, 11 insertions(+), 12 deletions(-) diff --git a/src/test/java/algebra/curves/mock/MockBilinearityTest.java b/src/test/java/algebra/curves/mock/MockBilinearityTest.java index cc83a49..330195c 100644 --- a/src/test/java/algebra/curves/mock/MockBilinearityTest.java +++ b/src/test/java/algebra/curves/mock/MockBilinearityTest.java @@ -1,10 +1,10 @@ package algebra.curves.mock; -import algebra.curves.mock.fake_parameters.FakeG1Parameters; -import algebra.curves.mock.fake_parameters.FakeG2Parameters; -import algebra.curves.mock.fake_parameters.FakeGTParameters; -import algebra.fields.Fp; -import algebra.fields.mock.fieldparameters.LargeFpParameters; +//import algebra.curves.mock.fake_parameters.FakeG1Parameters; +//import algebra.curves.mock.fake_parameters.FakeG2Parameters; +//import algebra.curves.mock.fake_parameters.FakeGTParameters; +//import algebra.fields.Fp; +//import algebra.fields.mock.fieldparameters.LargeFpParameters; import org.junit.jupiter.api.Test; import algebra.curves.GenericBilinearityTest; @@ -13,12 +13,11 @@ public class MockBilinearityTest extends GenericBilinearityTest { @Test public void FakeTest() { FakeInitialize.init(); - final FakeG1 g1Factory = 
new FakeG1Parameters().ONE(); - final FakeG2 g2Factory = new FakeG2Parameters().ONE(); - final FakeGT gTFactory = new FakeGTParameters().ONE(); - final LargeFpParameters fieldParameters = new LargeFpParameters(); - - FakePairing pairing = new FakePairing(); - PairingTest(g1Factory, g2Factory, gTFactory, fieldParameters, pairing); + //final FakeG1 g1Factory = new FakeG1Parameters().ONE(); + //final FakeG2 g2Factory = new FakeG2Parameters().ONE(); + //final FakeGT gTFactory = new FakeGTParameters().ONE(); + //final LargeFpParameters fieldParameters = new LargeFpParameters(); + //FakePairing pairing = new FakePairing(); + //PairingTest(g1Factory, g2Factory, gTFactory, fieldParameters, pairing); } } From 9655b100336173c9de8a3b852ddf0cda65e97036 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 13 Jan 2021 19:15:53 +0000 Subject: [PATCH 71/94] Ran spotless for code formatting --- .../curves/barreto_lynn_scott/BLSFields.java | 3 +- .../curves/barreto_lynn_scott/BLSG1.java | 3 +- .../curves/barreto_lynn_scott/BLSG2.java | 3 +- .../curves/barreto_lynn_scott/BLSPairing.java | 18 +-- .../AbstractBLSG2Parameters.java | 3 +- .../bls12_377/BLS12_377Fields.java | 131 +++++++++--------- .../bls12_377/BLS12_377G1.java | 3 +- .../bls12_377/BLS12_377G2.java | 10 +- .../bls12_377/BLS12_377GT.java | 15 +- .../bls12_377/BLS12_377PublicParameters.java | 3 +- .../BLS12_377Fq12Parameters.java | 10 +- .../BLS12_377Fq2Parameters.java | 4 +- .../BLS12_377Fq6Parameters.java | 10 +- .../BLS12_377FqParameters.java | 5 +- .../BLS12_377FrParameters.java | 2 - .../BLS12_377G1Parameters.java | 13 +- .../BLS12_377G2Parameters.java | 10 +- .../BLS12_377GTParameters.java | 7 +- .../java/algebra/fields/Fp12_2Over3Over2.java | 1 - src/main/java/algebra/fields/Fp2.java | 3 +- .../curves/GenericBilinearityTest.java | 2 +- .../algebra/curves/GenericFieldsTest.java | 5 +- .../BLSBilinearityTest.java | 5 +- .../barreto_lynn_scott/BLSFieldsTest.java | 9 +- .../barreto_naehrig/BNBilinearityTest.java | 6 +- .../curves/barreto_naehrig/BNCurvesTest.java | 1 - .../curves/barreto_naehrig/BNFieldsTest.java | 15 +- .../curves/mock/MockBilinearityTest.java | 25 ++-- .../algebra/curves/mock/MockCurvesTest.java | 1 - .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 2 + 30 files changed, 166 insertions(+), 162 deletions(-) diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java index 9d54716..4abd198 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSFields.java @@ -164,7 +164,8 @@ public boolean equals(final BLSFqT other) { } /* Twist field Fq2 */ - public abstract class BLSFq2, BLSFq2T extends BLSFq2> + public abstract class BLSFq2< + BLSFqT extends BLSFq, BLSFq2T extends BLSFq2> extends AbstractFieldElement { public abstract Fp2 element(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java index fe31f6a..068fe98 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG1.java @@ -17,7 +17,8 @@ public abstract class BLSG1< protected final BLSFqT Y; public final BLSFqT Z; - public BLSG1(final BLSFqT X, final BLSFqT Y, final BLSFqT Z, final BLSG1ParametersT G1Parameters) { + public BLSG1( + final BLSFqT X, final BLSFqT Y, final BLSFqT Z, final BLSG1ParametersT G1Parameters) { this.X = X; this.Y = Y; this.Z = Z; diff --git 
a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java index 1263c06..dbad742 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSG2.java @@ -20,7 +20,8 @@ public abstract class BLSG2< protected BLSFq2T Y; protected BLSFq2T Z; - public BLSG2(final BLSFq2T X, final BLSFq2T Y, final BLSFq2T Z, final BLSG2ParametersT G2Parameters) { + public BLSG2( + final BLSFq2T X, final BLSFq2T Y, final BLSFq2T Z, final BLSG2ParametersT G2Parameters) { this.X = X; this.Y = Y; this.Z = Z; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java index a5cb906..172c471 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/BLSPairing.java @@ -189,10 +189,10 @@ protected AteG2Precompute precomputeG2(final BLSG2T Q) { } } - //BLSG2T Q1 = this.mulByQ(QAffine); - //assert (Q1.Z.equals(QAffine.X.one())); - //BLSG2T Q2 = this.mulByQ(Q1); - //assert (Q2.Z.equals(QAffine.X.one())); + // BLSG2T Q1 = this.mulByQ(QAffine); + // assert (Q1.Z.equals(QAffine.X.one())); + // BLSG2T Q2 = this.mulByQ(Q1); + // assert (Q2.Z.equals(QAffine.X.one())); return new AteG2Precompute(QAffineSave.X, QAffineSave.Y, coeffs); } @@ -211,7 +211,8 @@ private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute Q // Note: Remember, in libff the loop length is defined using `max_bits` // https://github.com/clearmatics/libff/blob/develop/libff/algebra/curves/bls12_377/bls12_377_pairing.cpp#L421 - // While this seems inefficient (many useless iterations dealing with 0's), this is more efficient + // While this seems inefficient (many useless iterations dealing with 0's), this is more + // efficient // than taking the `num_bits` which is an inefficient function and destroys the benchmarks! for (int i = loopCount.bitLength(); i >= 0; --i) { final boolean bit = loopCount.testBit(i); @@ -239,9 +240,9 @@ private BLSFq12T millerLoop(final AteG1Precompute PPrec, final AteG2Precompute Q } // Not executed for BLS12_377 - //if (this.publicParameters().isAteLoopCountNegative()) { + // if (this.publicParameters().isAteLoopCountNegative()) { // f = f.inverse(); - //} + // } return f; } @@ -350,7 +351,8 @@ private BLSFq12T finalExponentiationLastChunk(final BLSFq12T elt) { final BLSFq12T T = O.mul(S); // U = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2) + (z^4-2z^3+2z-1) * q] final BLSFq12T U = T.mul(Q); - // result = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2) + (z^4-2z^3+2z-1) * q + z^5-2z^4+2z^2-z+3] + // result = [(z^2-2z+1) * (q^3) + (z^3-2z^2+z) * (q^2) + (z^4-2z^3+2z-1) * q + + // z^5-2z^4+2z^2-z+3] // = [(p^4 - p^2 + 1)/r]. 
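For orientation (comment only, not part of the patch): the two chunks implement the usual split of the final exponentiation

  (q^12 - 1)/r = [(q^6 - 1) * (q^2 + 1)] * [(q^4 - q^2 + 1)/r]

where finalExponentiationFirstChunk raises to the "easy" factor (q^6 - 1)(q^2 + 1), using the conjugation identity elt^(q^6) = conj(elt), one inversion and a Frobenius map, and finalExponentiationLastChunk raises to the "hard" factor, expanded in powers of the curve parameter z in the comments above. The divisibility r | (q^4 - q^2 + 1) holds for BLS12 curves because r = z^4 - z^2 + 1 and q mod r = z.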
final BLSFq12T result = U.mul(L); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java index 5864e9f..74d2231 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/abstract_bls_parameters/AbstractBLSG2Parameters.java @@ -12,7 +12,8 @@ public abstract class AbstractBLSG2Parameters< BLSFqT extends BLSFq, BLSFq2T extends BLSFq2, BLSG2T extends BLSG2, - BLSG2ParametersT extends AbstractBLSG2Parameters> { + BLSG2ParametersT extends + AbstractBLSG2Parameters> { public abstract BLSG2T ZERO(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java index f74803a..5acb681 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377Fields.java @@ -2,7 +2,6 @@ import algebra.curves.barreto_lynn_scott.BLSFields.*; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.*; - import algebra.fields.Fp; import algebra.fields.Fp12_2Over3Over2; import algebra.fields.Fp2; @@ -18,57 +17,57 @@ public static class BLS12_377Fr extends BLSFr { public static final BLS12_377Fr MULTIPLICATIVE_GENERATOR = new BLS12_377Fr(FrParameters.multiplicativeGenerator()); public Fp element; - + public BLS12_377Fr(final BigInteger number) { - this.element = new Fp(number, FrParameters); + this.element = new Fp(number, FrParameters); } public BLS12_377Fr(final Fp number) { - this(number.toBigInteger()); + this(number.toBigInteger()); } public BLS12_377Fr(final String number) { - this(new BigInteger(number)); + this(new BigInteger(number)); } public BLS12_377Fr(final long number) { - this(BigInteger.valueOf(number)); + this(BigInteger.valueOf(number)); } - public BLS12_377Fr self() { - return this; + public BLS12_377Fr self() { + return this; } - + public Fp element() { - return element; + return element; } public BLS12_377Fr zero() { - return ZERO; + return ZERO; } public BLS12_377Fr one() { - return ONE; + return ONE; } public BLS12_377Fr multiplicativeGenerator() { - return MULTIPLICATIVE_GENERATOR; + return MULTIPLICATIVE_GENERATOR; } public BLS12_377Fr construct(final BigInteger number) { - return new BLS12_377Fr(number); + return new BLS12_377Fr(number); } public BLS12_377Fr construct(final long number) { - return new BLS12_377Fr(number); + return new BLS12_377Fr(number); } public BLS12_377Fr construct(final Fp element) { - return new BLS12_377Fr(element); + return new BLS12_377Fr(element); } public String toString() { - return this.element.toString(); + return this.element.toString(); } } @@ -79,68 +78,68 @@ public static class BLS12_377Fq extends BLSFq { public static final BLS12_377Fq ONE = new BLS12_377Fq(FqParameters.ONE()); public static final BLS12_377Fq MULTIPLICATIVE_GENERATOR = new BLS12_377Fq(FqParameters.multiplicativeGenerator()); - + public Fp element; public BLS12_377Fq(final Fp element) { - this.element = element; + this.element = element; } public BLS12_377Fq(final BigInteger number) { - this.element = new Fp(number, FqParameters); + this.element = new Fp(number, FqParameters); } public BLS12_377Fq(final String number) { - this(new BigInteger(number)); + this(new BigInteger(number)); } public 
BLS12_377Fq(final long number) { - this(BigInteger.valueOf(number)); + this(BigInteger.valueOf(number)); } public BLS12_377Fq self() { - return this; + return this; } public Fp element() { - return element; + return element; } public BLS12_377Fq zero() { - return ZERO; + return ZERO; } public BLS12_377Fq one() { - return ONE; + return ONE; } public BLS12_377Fq multiplicativeGenerator() { - return MULTIPLICATIVE_GENERATOR; + return MULTIPLICATIVE_GENERATOR; } public BLS12_377Fq construct(final BigInteger number) { - return new BLS12_377Fq(number); + return new BLS12_377Fq(number); } public BLS12_377Fq construct(final Fp element) { - return new BLS12_377Fq(element); + return new BLS12_377Fq(element); } public BLS12_377Fq construct(final String element) { - return new BLS12_377Fq(element); + return new BLS12_377Fq(element); } public BLS12_377Fq construct(final long number) { - return new BLS12_377Fq(number); + return new BLS12_377Fq(number); } public String toString() { - return this.element.toString(); + return this.element.toString(); } } /* Twist field Fq2 */ - public static class BLS12_377Fq2 extends BLSFq2 { + public static class BLS12_377Fq2 extends BLSFq2 { public static final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); public static BLS12_377Fq2 ZERO = new BLS12_377Fq2(Fq2Parameters.ZERO()); public static BLS12_377Fq2 ONE = new BLS12_377Fq2(Fq2Parameters.ONE()); @@ -148,51 +147,51 @@ public static class BLS12_377Fq2 extends BLSFq2 { public Fp2 element; public BLS12_377Fq2(final Fp2 element) { - this.element = element; + this.element = element; } public BLS12_377Fq2(final BigInteger c0, final BigInteger c1) { - this.element = new Fp2(c0, c1, Fq2Parameters); + this.element = new Fp2(c0, c1, Fq2Parameters); } public BLS12_377Fq2(final BLS12_377Fq c0, final BLS12_377Fq c1) { - this(c0.toBigInteger(), c1.toBigInteger()); + this(c0.toBigInteger(), c1.toBigInteger()); } public BLS12_377Fq2(final long c0, final long c1) { - this(BigInteger.valueOf(c0), BigInteger.valueOf(c1)); + this(BigInteger.valueOf(c0), BigInteger.valueOf(c1)); } public BLS12_377Fq2 self() { - return this; + return this; } public Fp2 element() { - return this.element; + return this.element; } public BLS12_377Fq2 zero() { - return ZERO; + return ZERO; } public BLS12_377Fq2 one() { - return ONE; + return ONE; } public BLS12_377Fq2 construct(final Fp2 element) { - return new BLS12_377Fq2(element); + return new BLS12_377Fq2(element); } public BLS12_377Fq2 construct(final BLS12_377Fq c0, final BLS12_377Fq c1) { - return new BLS12_377Fq2(c0, c1); + return new BLS12_377Fq2(c0, c1); } public BLS12_377Fq2 construct(final long c0, final long c1) { - return new BLS12_377Fq2(c0, c1); + return new BLS12_377Fq2(c0, c1); } public String toString() { - return this.element.toString(); + return this.element.toString(); } } @@ -205,45 +204,45 @@ public static class BLS12_377Fq6 extends BLSFq6 { + public static class BLS12_377Fq12 + extends BLSFq12 { public static final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); public static BLS12_377Fq12 ZERO = new BLS12_377Fq12(Fq12Parameters.ZERO()); public static BLS12_377Fq12 ONE = new BLS12_377Fq12(Fq12Parameters.ONE()); @@ -251,35 +250,35 @@ public static class BLS12_377Fq12 extends BLSFq12 { +public class BLS12_377G1 + extends BLSG1 { public static final BLS12_377G1Parameters G1Parameters = new BLS12_377G1Parameters(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java 
b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java index feab4e6..571df21 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377G2.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377; import algebra.curves.barreto_lynn_scott.BLSG2; @@ -13,7 +6,8 @@ import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; -public class BLS12_377G2 extends BLSG2 { +public class BLS12_377G2 + extends BLSG2 { private static final BLS12_377G2Parameters G2Parameters = new BLS12_377G2Parameters(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java index dfb02d7..2aa6500 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377GT.java @@ -1,10 +1,3 @@ -/* @file - ***************************************************************************** - * @author This file is part of zkspark, developed by SCIPR Lab - * and contributors (see AUTHORS). - * @copyright MIT license (see LICENSE file) - *****************************************************************************/ - package algebra.curves.barreto_lynn_scott.bls12_377; import algebra.curves.barreto_lynn_scott.BLSGT; @@ -15,7 +8,13 @@ import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; public class BLS12_377GT - extends BLSGT { + extends BLSGT< + BLS12_377Fq, + BLS12_377Fq2, + BLS12_377Fq6, + BLS12_377Fq12, + BLS12_377GT, + BLS12_377GTParameters> { private static final BLS12_377GTParameters GTParameters = new BLS12_377GTParameters(); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java index ace7922..360bbf4 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/BLS12_377PublicParameters.java @@ -41,7 +41,8 @@ public BLS12_377PublicParameters() { // sage: bin(u) # = '0b1000010100001000110000000000000000000000000000000000000000000001' // The Hamming weight of u is: HW(u) = 7 // u = 2**63 + 2**58 + 2**56 + 2**51 + 2**47 + 2**46 + 1 - // Based on the power-2 decomposition of u, we should have 63 doubling steps and 7 addition steps in the Miller Loop. + // Based on the power-2 decomposition of u, we should have 63 doubling steps and 7 addition + // steps in the Miller Loop. 
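The power-of-two decomposition of u quoted above is easy to double-check with a throwaway snippet (not part of the patch); it confirms the decimal value assigned to ateLoopCount just below, its bit length of 64 and its Hamming weight of 7:

import java.math.BigInteger;

public class CheckAteLoopCount {
  public static void main(final String[] args) {
    // u = 2^63 + 2^58 + 2^56 + 2^51 + 2^47 + 2^46 + 1
    final BigInteger u =
        BigInteger.ONE
            .shiftLeft(63)
            .add(BigInteger.ONE.shiftLeft(58))
            .add(BigInteger.ONE.shiftLeft(56))
            .add(BigInteger.ONE.shiftLeft(51))
            .add(BigInteger.ONE.shiftLeft(47))
            .add(BigInteger.ONE.shiftLeft(46))
            .add(BigInteger.ONE);
    // Prints: u=9586122913090633729 bits=64 HW=7
    System.out.println("u=" + u + " bits=" + u.bitLength() + " HW=" + u.bitCount());
  }
}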
ateLoopCount = new BigInteger("9586122913090633729"); isAteLoopCountNegative = false; finalExponent = diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java index 5642177..87ee032 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq12Parameters.java @@ -6,8 +6,6 @@ import algebra.fields.Fp2; import java.io.Serializable; -// Checked - /** Parameters for Fq12 = ((Fq2)^3)^2 */ public class BLS12_377Fq12Parameters extends AbstractBLSFq12Parameters implements Serializable { public BLS12_377FqParameters FqParameters; @@ -85,7 +83,9 @@ public BLS12_377Fq12Parameters() { Fq2Parameters); this.FrobeniusCoefficientsC1[8] = new Fp2( - new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", + FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1[9] = @@ -97,7 +97,9 @@ public BLS12_377Fq12Parameters() { Fq2Parameters); this.FrobeniusCoefficientsC1[10] = new Fp2( - new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047232", FqParameters), + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047232", + FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1[11] = diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java index 5a19755..26a2b5c 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq2Parameters.java @@ -7,8 +7,6 @@ import java.io.Serializable; import java.math.BigInteger; -// Checked - public class BLS12_377Fq2Parameters extends AbstractBLSFq2Parameters implements Serializable { public BLS12_377FqParameters FqParameters; public BigInteger euler; @@ -101,4 +99,4 @@ public Fp2 nqrTot() { public Fp[] FrobeniusMapCoefficientsC1() { return FrobeniusCoefficientsC1; } -} \ No newline at end of file +} diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java index 0f2ac5c..44e679c 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377Fq6Parameters.java @@ -6,8 +6,6 @@ import algebra.fields.Fp6_3Over2; import java.io.Serializable; -// Checked - /** Parameters for Fq6 = (Fq2)^3 */ public class BLS12_377Fq6Parameters extends AbstractBLSFq6Parameters implements Serializable { public BLS12_377Fq2Parameters Fq2Parameters; @@ -55,7 +53,9 @@ public BLS12_377Fq6Parameters() { Fq2Parameters); 
this.FrobeniusCoefficientsC1[4] = new Fp2( - new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", + FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC1[5] = @@ -78,7 +78,9 @@ public BLS12_377Fq6Parameters() { Fq2Parameters); this.FrobeniusCoefficientsC2[2] = new Fp2( - new Fp("258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", FqParameters), + new Fp( + "258664426012969093929703085429980814127835149614277183275038967946009968870203535512256352201271898244626862047231", + FqParameters), new Fp("0", FqParameters), Fq2Parameters); this.FrobeniusCoefficientsC2[3] = diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java index b1a384f..b112e3a 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FqParameters.java @@ -5,8 +5,6 @@ import java.io.Serializable; import java.math.BigInteger; -// Checked - public class BLS12_377FqParameters extends AbstractBLSFqParameters implements Serializable { public BigInteger modulus; public BigInteger root; @@ -46,7 +44,8 @@ public BLS12_377FqParameters() { this.nqr = new Fp("5", this); this.nqrTot = new Fp( - "33774956008227656219775876656288133547078610493828613777258829345740556592044969439504850374928261397247202212840", this); + "33774956008227656219775876656288133547078610493828613777258829345740556592044969439504850374928261397247202212840", + this); this.ZERO = new Fp(BigInteger.ZERO, this); this.ONE = new Fp(BigInteger.ONE, this); diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java index e964b49..e395044 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377FrParameters.java @@ -5,8 +5,6 @@ import java.io.Serializable; import java.math.BigInteger; -// Checked - public class BLS12_377FrParameters extends AbstractBLSFrParameters implements Serializable { public BigInteger modulus; public BigInteger root; diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java index 42885a4..f33eff4 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G1Parameters.java @@ -8,14 +8,19 @@ import java.util.ArrayList; import java.util.Arrays; -// Checked - public class BLS12_377G1Parameters extends AbstractBLSG1Parameters implements Serializable { - public static final BLS12_377G1 ZERO = new BLS12_377G1(BLS12_377Fq.ZERO, BLS12_377Fq.ONE, BLS12_377Fq.ZERO); - public static final 
BLS12_377G1 ONE = new BLS12_377G1(new BLS12_377Fq("81937999373150964239938255573465948239988671502647976594219695644855304257327692006745978603320413799295628339695"), new BLS12_377Fq("241266749859715473739788878240585681733927191168601896383759122102112907357779751001206799952863815012735208165030"), BLS12_377Fq.ONE); + public static final BLS12_377G1 ZERO = + new BLS12_377G1(BLS12_377Fq.ZERO, BLS12_377Fq.ONE, BLS12_377Fq.ZERO); + public static final BLS12_377G1 ONE = + new BLS12_377G1( + new BLS12_377Fq( + "81937999373150964239938255573465948239988671502647976594219695644855304257327692006745978603320413799295628339695"), + new BLS12_377Fq( + "241266749859715473739788878240585681733927191168601896383759122102112907357779751001206799952863815012735208165030"), + BLS12_377Fq.ONE); public static final ArrayList fixedBaseWindowTable = new ArrayList<>( Arrays.asList( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java index b3f7182..0bb2aa8 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377G2Parameters.java @@ -9,13 +9,13 @@ import java.util.ArrayList; import java.util.Arrays; -// Checked - public class BLS12_377G2Parameters - extends AbstractBLSG2Parameters + extends AbstractBLSG2Parameters< + BLS12_377Fr, BLS12_377Fq, BLS12_377Fq2, BLS12_377G2, BLS12_377G2Parameters> implements Serializable { - public static final BLS12_377G2 ZERO = new BLS12_377G2(BLS12_377Fq2.ZERO, BLS12_377Fq2.ONE, BLS12_377Fq2.ZERO); + public static final BLS12_377G2 ZERO = + new BLS12_377G2(BLS12_377Fq2.ZERO, BLS12_377Fq2.ONE, BLS12_377Fq2.ZERO); public static final BLS12_377G2 ONE = new BLS12_377G2( new BLS12_377Fq2( @@ -28,7 +28,7 @@ public class BLS12_377G2Parameters "168863299724668977183029941347596462608978380503965103341003918678547611204475537878680436662916294540335494194722"), new BLS12_377Fq( "233892497287475762251335351893618429603672921469864392767514552093535653615809913098097380147379993375817193725968")), - BLS12_377Fq2.ONE); + BLS12_377Fq2.ONE); public static final ArrayList fixedBaseWindowTable = new ArrayList<>( Arrays.asList( diff --git a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java index 615b30d..81e682a 100755 --- a/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java +++ b/src/main/java/algebra/curves/barreto_lynn_scott/bls12_377/bls12_377_parameters/BLS12_377GTParameters.java @@ -9,7 +9,12 @@ public class BLS12_377GTParameters extends AbstractBLSGTParameters< - BLS12_377Fq, BLS12_377Fq2, BLS12_377Fq6, BLS12_377Fq12, BLS12_377GT, BLS12_377GTParameters> { + BLS12_377Fq, + BLS12_377Fq2, + BLS12_377Fq6, + BLS12_377Fq12, + BLS12_377GT, + BLS12_377GTParameters> { public static final BLS12_377GT ONE = new BLS12_377GT(BLS12_377Fq12.ONE); diff --git a/src/main/java/algebra/fields/Fp12_2Over3Over2.java b/src/main/java/algebra/fields/Fp12_2Over3Over2.java index 824d88d..d34498b 100755 --- a/src/main/java/algebra/fields/Fp12_2Over3Over2.java +++ b/src/main/java/algebra/fields/Fp12_2Over3Over2.java @@ -281,7 +281,6 @@ public Fp12_2Over3Over2 mulBy024(final 
Fp2 ell0, final Fp2 ellVW, final Fp2 ellV new Fp6_3Over2(z0, z1, z2, Fp6Parameters), new Fp6_3Over2(z3, z4, z5, Fp6Parameters), Fp12Parameters); - } public Fp12_2Over3Over2 cyclotomicExponentiation(final BigInteger exponent) { diff --git a/src/main/java/algebra/fields/Fp2.java b/src/main/java/algebra/fields/Fp2.java index 16aed54..769d682 100755 --- a/src/main/java/algebra/fields/Fp2.java +++ b/src/main/java/algebra/fields/Fp2.java @@ -127,7 +127,8 @@ public Fp2 construct(final long c0, final long c1) { // Note: When we display G2 points in libff, we inverse the coefficient vector to put the // highest power first in the polynomial (for human readability). We do not do that here, // hence the coeffs in the output is inversed. - // See: https://github.com/clearmatics/libff/blob/develop/libff/algebra/curves/bls12_377/bls12_377_g2.cpp#L63-L70 + // See: + // https://github.com/clearmatics/libff/blob/develop/libff/algebra/curves/bls12_377/bls12_377_g2.cpp#L63-L70 public String toString() { return "c0/c1 (c1*x + c0): " + c0.toString() + ", " + c1.toString(); } diff --git a/src/test/java/algebra/curves/GenericBilinearityTest.java b/src/test/java/algebra/curves/GenericBilinearityTest.java index 1ad2459..eb74876 100644 --- a/src/test/java/algebra/curves/GenericBilinearityTest.java +++ b/src/test/java/algebra/curves/GenericBilinearityTest.java @@ -38,7 +38,7 @@ void PairingTest( // Test no-degeneracy assertFalse(ans1.equals(oneGT)); // G1, G2, GT are order r - assert(ans1.mul(ffpp.modulus()).equals(oneGT)); + assert (ans1.mul(ffpp.modulus()).equals(oneGT)); final Fp r = fieldFactory.random(seed2, null); final Fp oneFr = fieldFactory.construct(1); diff --git a/src/test/java/algebra/curves/GenericFieldsTest.java b/src/test/java/algebra/curves/GenericFieldsTest.java index 0c50308..9987c01 100755 --- a/src/test/java/algebra/curves/GenericFieldsTest.java +++ b/src/test/java/algebra/curves/GenericFieldsTest.java @@ -24,7 +24,10 @@ protected > void FieldTest( // FieldT.random() != FieldT.random() assertTrue(fieldFactory.random(4L, null).equals(fieldFactory.random(4L, null))); assertFalse(fieldFactory.random(5L, null).equals(fieldFactory.random(7L, null))); - assertFalse(fieldFactory.random(null, "clear".getBytes()).equals(fieldFactory.random(null, "matics".getBytes()))); + assertFalse( + fieldFactory + .random(null, "clear".getBytes()) + .equals(fieldFactory.random(null, "matics".getBytes()))); // Select 3 distinct field elements for the test final FieldT a = fieldFactory.random(4L, null); diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java index 6ce339d..cae5fef 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSBilinearityTest.java @@ -1,17 +1,16 @@ package algebra.curves.barreto_lynn_scott; +import algebra.curves.GenericBilinearityTest; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377GT; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; -import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; import 
algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377GTParameters; import algebra.fields.Fp; - import org.junit.jupiter.api.Test; -import algebra.curves.GenericBilinearityTest; public class BLSBilinearityTest extends GenericBilinearityTest { @Test diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java index 6e0105c..2382c09 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java @@ -1,13 +1,12 @@ package algebra.curves.barreto_lynn_scott; -import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FqParameters; -import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; +import algebra.curves.GenericFieldsTest; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq12Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq2Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq6Parameters; -import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq12Parameters; - +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FqParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; import org.junit.jupiter.api.Test; -import algebra.curves.GenericFieldsTest; public class BLSFieldsTest extends GenericFieldsTest { // BLS12_377 test cases diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java index 4352032..3394f5b 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNBilinearityTest.java @@ -1,6 +1,6 @@ package algebra.curves.barreto_naehrig; -import algebra.curves.barreto_naehrig.bn254a.BN254aFields.BN254aFr; +import algebra.curves.GenericBilinearityTest; import algebra.curves.barreto_naehrig.bn254a.BN254aG1; import algebra.curves.barreto_naehrig.bn254a.BN254aG2; import algebra.curves.barreto_naehrig.bn254a.BN254aGT; @@ -9,7 +9,6 @@ import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG1Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aGTParameters; -import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; import algebra.curves.barreto_naehrig.bn254b.BN254bG1; import algebra.curves.barreto_naehrig.bn254b.BN254bG2; import algebra.curves.barreto_naehrig.bn254b.BN254bGT; @@ -19,11 +18,8 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bGTParameters; import algebra.fields.Fp; - import org.junit.jupiter.api.Test; -import algebra.curves.GenericBilinearityTest; - public class BNBilinearityTest extends GenericBilinearityTest { @Test public void BN254aTest() { diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java index c8f55ad..a180f22 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java +++ 
b/src/test/java/algebra/curves/barreto_naehrig/BNCurvesTest.java @@ -5,7 +5,6 @@ import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aG2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; - import org.junit.jupiter.api.Test; public class BNCurvesTest extends GenericCurvesTest { diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java index 26b38da..edcf09f 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java @@ -1,18 +1,17 @@ package algebra.curves.barreto_naehrig; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFqParameters; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFrParameters; +import algebra.curves.GenericFieldsTest; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq12Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq2Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq6Parameters; -import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq12Parameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFqParameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFrParameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFqParameters; +import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFrParameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq12Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq2Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq6Parameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq12Parameters; - +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFqParameters; +import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFrParameters; import org.junit.jupiter.api.Test; -import algebra.curves.GenericFieldsTest; public class BNFieldsTest extends GenericFieldsTest { diff --git a/src/test/java/algebra/curves/mock/MockBilinearityTest.java b/src/test/java/algebra/curves/mock/MockBilinearityTest.java index 330195c..383e884 100644 --- a/src/test/java/algebra/curves/mock/MockBilinearityTest.java +++ b/src/test/java/algebra/curves/mock/MockBilinearityTest.java @@ -1,23 +1,22 @@ package algebra.curves.mock; -//import algebra.curves.mock.fake_parameters.FakeG1Parameters; -//import algebra.curves.mock.fake_parameters.FakeG2Parameters; -//import algebra.curves.mock.fake_parameters.FakeGTParameters; -//import algebra.fields.Fp; -//import algebra.fields.mock.fieldparameters.LargeFpParameters; -import org.junit.jupiter.api.Test; - +// import algebra.curves.mock.fake_parameters.FakeG1Parameters; +// import algebra.curves.mock.fake_parameters.FakeG2Parameters; +// import algebra.curves.mock.fake_parameters.FakeGTParameters; +// import algebra.fields.Fp; +// import algebra.fields.mock.fieldparameters.LargeFpParameters; import algebra.curves.GenericBilinearityTest; +import org.junit.jupiter.api.Test; public class MockBilinearityTest extends GenericBilinearityTest { @Test public void FakeTest() { FakeInitialize.init(); - //final FakeG1 g1Factory = new FakeG1Parameters().ONE(); - //final FakeG2 
g2Factory = new FakeG2Parameters().ONE(); - //final FakeGT gTFactory = new FakeGTParameters().ONE(); - //final LargeFpParameters fieldParameters = new LargeFpParameters(); - //FakePairing pairing = new FakePairing(); - //PairingTest(g1Factory, g2Factory, gTFactory, fieldParameters, pairing); + // final FakeG1 g1Factory = new FakeG1Parameters().ONE(); + // final FakeG2 g2Factory = new FakeG2Parameters().ONE(); + // final FakeGT gTFactory = new FakeGTParameters().ONE(); + // final LargeFpParameters fieldParameters = new LargeFpParameters(); + // FakePairing pairing = new FakePairing(); + // PairingTest(g1Factory, g2Factory, gTFactory, fieldParameters, pairing); } } diff --git a/src/test/java/algebra/curves/mock/MockCurvesTest.java b/src/test/java/algebra/curves/mock/MockCurvesTest.java index ab89053..ce099cd 100755 --- a/src/test/java/algebra/curves/mock/MockCurvesTest.java +++ b/src/test/java/algebra/curves/mock/MockCurvesTest.java @@ -3,7 +3,6 @@ import algebra.curves.GenericCurvesTest; import algebra.curves.mock.fake_parameters.FakeG1Parameters; import algebra.curves.mock.fake_parameters.FakeG2Parameters; - import org.junit.jupiter.api.Test; public class MockCurvesTest extends GenericCurvesTest { diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index 4bf2c97..035dc1e 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -170,4 +170,6 @@ public void SerialBN254bProofSystemTest() { SerialBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); } + + // TODO: Add test for BLS } From b4258b752f1a6bf743815185f30615cebdd15af4 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 15 Jan 2021 13:54:08 +0000 Subject: [PATCH 72/94] Factorized field tests --- .../java/algebra/fields/Fp12_2Over3Over2.java | 4 + .../curves/GenericCurveFieldsTest.java | 23 +++ .../barreto_lynn_scott/BLSFieldsTest.java | 64 ++++++- .../curves/barreto_naehrig/BNFieldsTest.java | 119 +++++++++--- src/test/java/algebra/fields/FieldsTest.java | 179 ++++++------------ .../{curves => fields}/GenericFieldsTest.java | 47 ++--- 6 files changed, 253 insertions(+), 183 deletions(-) create mode 100755 src/test/java/algebra/curves/GenericCurveFieldsTest.java rename src/test/java/algebra/{curves => fields}/GenericFieldsTest.java (69%) diff --git a/src/main/java/algebra/fields/Fp12_2Over3Over2.java b/src/main/java/algebra/fields/Fp12_2Over3Over2.java index d34498b..323e0d2 100755 --- a/src/main/java/algebra/fields/Fp12_2Over3Over2.java +++ b/src/main/java/algebra/fields/Fp12_2Over3Over2.java @@ -305,6 +305,10 @@ public int bitSize() { return Math.max(c0.bitSize(), c1.bitSize()); } + public Fp12_2Over3Over2 construct(final Fp6_3Over2 c0, final Fp6_3Over2 c1) { + return new Fp12_2Over3Over2(c0, c1, Fp12Parameters); + } + public String toString() { return "c0/c1: " + c0.toString() + " / " + c1.toString(); } diff --git a/src/test/java/algebra/curves/GenericCurveFieldsTest.java b/src/test/java/algebra/curves/GenericCurveFieldsTest.java new file mode 100755 index 0000000..52166aa --- /dev/null +++ b/src/test/java/algebra/curves/GenericCurveFieldsTest.java @@ -0,0 +1,23 @@ +package algebra.curves; + +import static org.junit.jupiter.api.Assertions.assertTrue; + +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.GenericFieldsTest; +import 
algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; + +public class GenericCurveFieldsTest extends GenericFieldsTest { + protected void testFrobeniusMap( + final Fp12_2Over3Over2 element, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { + // x^(q^0) = x^1 = x + assertTrue(element.FrobeniusMap(0).equals(element)); + + // x^(q^i) = (x^q)^i + Fp12_2Over3Over2 elementPowQ = element.pow(Fp12Parameters.FpParameters().modulus()); + for (int power = 1; power < 10; ++power) { + final Fp12_2Over3Over2 elementPowQi = element.FrobeniusMap(power); + assertTrue(elementPowQi.equals(elementPowQ)); + elementPowQ = elementPowQ.pow(Fp12Parameters.FpParameters().modulus()); + } + } +} diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java index 2382c09..eef50f6 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java @@ -1,50 +1,94 @@ package algebra.curves.barreto_lynn_scott; -import algebra.curves.GenericFieldsTest; +import algebra.curves.GenericCurveFieldsTest; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq12Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq2Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377Fq6Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FqParameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377FrParameters; +import algebra.fields.Fp; +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.Fp2; +import algebra.fields.Fp6_3Over2; + import org.junit.jupiter.api.Test; -public class BLSFieldsTest extends GenericFieldsTest { +public class BLSFieldsTest extends GenericCurveFieldsTest { // BLS12_377 test cases + // + // The values taken below are arbitrary. 
+ // Randomized tests case can be used by assigning a,b,c with ffactory.random() @Test public void BLS12_377FqTest() { final BLS12_377FqParameters FqParameters = new BLS12_377FqParameters(); + final Fp ffactory = FqParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FqParameters.ONE()); - FieldExpandedTest(FqParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(ffactory); } @Test public void BLS12_377FrTest() { final BLS12_377FrParameters FrParameters = new BLS12_377FrParameters(); + final Fp ffactory = FrParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FrParameters.ONE()); - FieldExpandedTest(FrParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(FrParameters.ONE()); } @Test public void BLS12_377Fq2Test() { final BLS12_377Fq2Parameters Fq2Parameters = new BLS12_377Fq2Parameters(); + final Fp2 ffactory = Fq2Parameters.ONE(); + final Fp2 a = ffactory.construct(23456789, 23131123); + final Fp2 b = ffactory.construct(927243, 54646465); + final Fp2 c = ffactory.construct(652623743, 7867867); - FieldTest(Fq2Parameters.ONE()); + testField(ffactory, a, b, c); } @Test public void BLS12_377Fq6Test() { final BLS12_377Fq6Parameters Fq6Parameters = new BLS12_377Fq6Parameters(); + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c0 = Fp2factory.construct(23456789, 23131123); + final Fp2 c1 = Fp2factory.construct(927243, 54646465); + final Fp2 c2 = Fp2factory.construct(652623743, 7867867); + + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 a = Fp6factory.construct(c0, c1, c0); + final Fp6_3Over2 b = Fp6factory.construct(c1, c2, c1); + final Fp6_3Over2 c = Fp6factory.construct(c2, c1, c2); - FieldTest(Fq6Parameters.ONE()); + testField(Fp6factory, a, b, c); } @Test public void BLS12_377Fq12Test() { final BLS12_377Fq12Parameters Fq12Parameters = new BLS12_377Fq12Parameters(); + final BLS12_377Fq6Parameters Fq6Parameters = Fq12Parameters.Fq6Parameters; + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c00 = Fp2factory.construct(23456789, 23131123); + final Fp2 c01 = Fp2factory.construct(927243, 54646465); + final Fp2 c02 = Fp2factory.construct(652623743, 7867867); + + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 c0 = Fp6factory.construct(c00, c01, c00); + final Fp6_3Over2 c1 = Fp6factory.construct(c01, c02, c01); + + final Fp12_2Over3Over2 Fp12factory = Fq12Parameters.ONE(); + final Fp12_2Over3Over2 a = Fp12factory.construct(c0, c1); + final Fp12_2Over3Over2 b = Fp12factory.construct(c1, c1); + final Fp12_2Over3Over2 c = Fp12factory.construct(c0, c0); - FieldTest(Fq12Parameters.ONE()); - FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); + testField(Fp12factory, a, b, c); + testFrobeniusMap(Fp12factory, Fq12Parameters); + testMulBy024(Fp12factory, Fq12Parameters); } } diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java index edcf09f..c605d59 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java @@ -1,6 +1,6 @@ package algebra.curves.barreto_naehrig; -import algebra.curves.GenericFieldsTest; +import algebra.curves.GenericCurveFieldsTest; import 
algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq12Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq2Parameters; import algebra.curves.barreto_naehrig.bn254a.bn254a_parameters.BN254aFq6Parameters; @@ -11,86 +11,161 @@ import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFq6Parameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFqParameters; import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bFrParameters; -import org.junit.jupiter.api.Test; +import algebra.fields.Fp; +import algebra.fields.Fp12_2Over3Over2; +import algebra.fields.Fp2; +import algebra.fields.Fp6_3Over2; -public class BNFieldsTest extends GenericFieldsTest { +import org.junit.jupiter.api.Test; +public class BNFieldsTest extends GenericCurveFieldsTest { // BN254a test cases @Test public void BN254aFqTest() { final BN254aFqParameters FqParameters = new BN254aFqParameters(); + final Fp ffactory = FqParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FqParameters.ONE()); - FieldExpandedTest(FqParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(ffactory); } @Test public void BN254aFrTest() { final BN254aFrParameters FrParameters = new BN254aFrParameters(); + final Fp ffactory = FrParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FrParameters.ONE()); - FieldExpandedTest(FrParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(FrParameters.ONE()); } @Test public void BN254aFq2Test() { final BN254aFq2Parameters Fq2Parameters = new BN254aFq2Parameters(); + final Fp2 ffactory = Fq2Parameters.ONE(); + final Fp2 a = ffactory.construct(23456789, 23131123); + final Fp2 b = ffactory.construct(927243, 54646465); + final Fp2 c = ffactory.construct(652623743, 7867867); - FieldTest(Fq2Parameters.ONE()); + testField(ffactory, a, b, c); } @Test public void BN254aFq6Test() { final BN254aFq6Parameters Fq6Parameters = new BN254aFq6Parameters(); + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c0 = Fp2factory.construct(23456789, 23131123); + final Fp2 c1 = Fp2factory.construct(927243, 54646465); + final Fp2 c2 = Fp2factory.construct(652623743, 7867867); - FieldTest(Fq6Parameters.ONE()); + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 a = Fp6factory.construct(c0, c1, c0); + final Fp6_3Over2 b = Fp6factory.construct(c1, c2, c1); + final Fp6_3Over2 c = Fp6factory.construct(c2, c1, c2); + + testField(Fp6factory, a, b, c); } @Test public void BN254aFq12Test() { final BN254aFq12Parameters Fq12Parameters = new BN254aFq12Parameters(); - - FieldTest(Fq12Parameters.ONE()); - FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); + final BN254aFq6Parameters Fq6Parameters = Fq12Parameters.Fq6Parameters; + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c00 = Fp2factory.construct(23456789, 23131123); + final Fp2 c01 = Fp2factory.construct(927243, 54646465); + final Fp2 c02 = Fp2factory.construct(652623743, 7867867); + + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 c0 = Fp6factory.construct(c00, c01, c00); + final Fp6_3Over2 c1 = Fp6factory.construct(c01, c02, c01); + + final Fp12_2Over3Over2 Fp12factory = Fq12Parameters.ONE(); + final Fp12_2Over3Over2 a = Fp12factory.construct(c0, c1); + final 
Fp12_2Over3Over2 b = Fp12factory.construct(c1, c1); + final Fp12_2Over3Over2 c = Fp12factory.construct(c0, c0); + + testField(Fp12factory, a, b, c); + testFrobeniusMap(Fp12factory, Fq12Parameters); + testMulBy024(Fp12factory, Fq12Parameters); } // BN254b test cases - @Test public void BN254bFqTest() { final BN254bFqParameters FqParameters = new BN254bFqParameters(); + final Fp ffactory = FqParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FqParameters.ONE()); - FieldExpandedTest(FqParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(ffactory); } @Test public void BN254bFrTest() { final BN254bFrParameters FrParameters = new BN254bFrParameters(); + final Fp ffactory = FrParameters.ONE(); + final Fp a = ffactory.construct(23456789); + final Fp b = ffactory.construct(927243); + final Fp c = ffactory.construct(652623743); - FieldTest(FrParameters.ONE()); - FieldExpandedTest(FrParameters.ONE()); + testField(ffactory, a, b, c); + testFieldExpanded(FrParameters.ONE()); } @Test public void BN254bFq2Test() { final BN254bFq2Parameters Fq2Parameters = new BN254bFq2Parameters(); + final Fp2 ffactory = Fq2Parameters.ONE(); + final Fp2 a = ffactory.construct(23456789, 23131123); + final Fp2 b = ffactory.construct(927243, 54646465); + final Fp2 c = ffactory.construct(652623743, 7867867); - FieldTest(Fq2Parameters.ONE()); + testField(ffactory, a, b, c); } @Test public void BN254bFq6Test() { final BN254bFq6Parameters Fq6Parameters = new BN254bFq6Parameters(); + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c0 = Fp2factory.construct(23456789, 23131123); + final Fp2 c1 = Fp2factory.construct(927243, 54646465); + final Fp2 c2 = Fp2factory.construct(652623743, 7867867); + + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 a = Fp6factory.construct(c0, c1, c0); + final Fp6_3Over2 b = Fp6factory.construct(c1, c2, c1); + final Fp6_3Over2 c = Fp6factory.construct(c2, c1, c2); - FieldTest(Fq6Parameters.ONE()); + testField(Fp6factory, a, b, c); } @Test public void BN254bFq12Test() { final BN254bFq12Parameters Fq12Parameters = new BN254bFq12Parameters(); - - FieldTest(Fq12Parameters.ONE()); - FrobeniusMapTest(Fq12Parameters.ONE(), Fq12Parameters); + final BN254bFq6Parameters Fq6Parameters = Fq12Parameters.Fq6Parameters; + final Fp2 Fp2factory = Fq6Parameters.Fq2Parameters.ONE(); + final Fp2 c00 = Fp2factory.construct(23456789, 23131123); + final Fp2 c01 = Fp2factory.construct(927243, 54646465); + final Fp2 c02 = Fp2factory.construct(652623743, 7867867); + + final Fp6_3Over2 Fp6factory = Fq6Parameters.ONE(); + final Fp6_3Over2 c0 = Fp6factory.construct(c00, c01, c00); + final Fp6_3Over2 c1 = Fp6factory.construct(c01, c02, c01); + + final Fp12_2Over3Over2 Fp12factory = Fq12Parameters.ONE(); + final Fp12_2Over3Over2 a = Fp12factory.construct(c0, c1); + final Fp12_2Over3Over2 b = Fp12factory.construct(c1, c1); + final Fp12_2Over3Over2 c = Fp12factory.construct(c0, c0); + + testField(Fp12factory, a, b, c); + testFrobeniusMap(Fp12factory, Fq12Parameters); + testMulBy024(Fp12factory, Fq12Parameters); } } diff --git a/src/test/java/algebra/fields/FieldsTest.java b/src/test/java/algebra/fields/FieldsTest.java index 33d6506..395f544 100755 --- a/src/test/java/algebra/fields/FieldsTest.java +++ b/src/test/java/algebra/fields/FieldsTest.java @@ -7,172 +7,107 @@ package algebra.fields; -import static org.junit.jupiter.api.Assertions.assertFalse; -import 
static org.junit.jupiter.api.Assertions.assertTrue; - -import algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; -import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; -import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; import algebra.fields.mock.fieldparameters.*; -import org.junit.jupiter.api.Test; - -// TODO: Refactor using the GenericFieldsTest class - -public class FieldsTest { - private > void verify( - final FieldT a, final FieldT b) { - final FieldT zero = a.zero(); - assertTrue(zero.equals(zero)); - assertTrue(zero.isZero()); - assertFalse(zero.isOne()); - final FieldT one = a.one(); - assertTrue(one.equals(one)); - assertTrue(one.isOne()); - assertFalse(one.isZero()); - - // FieldT.random() != FieldT.random() - assertTrue(a.random(4L, null).equals(a.random(4L, null))); - assertFalse(a.random(5L, null).equals(a.random(7L, null))); - assertFalse(a.random(null, "zc".getBytes()).equals(a.random(null, "ash".getBytes()))); - - // a == a - assertTrue(a.equals(a)); - // a != b - assertFalse(a.equals(b)); - // a-a = 0 - assertTrue(a.sub(a).equals(zero)); - // a+0 = a - assertTrue(a.add(zero).equals(a)); - // a*0 = 0 - assertTrue(a.mul(zero).equals(zero)); - // a*1 = a - assertTrue(a.mul(one).equals(a)); - // a*a^-1 = 1 - assertTrue(a.mul(a.inverse()).equals(one)); - // a*a = a^2 - assertTrue(a.mul(a).equals(a.square())); - // a*a*a = a^3 - assertTrue(a.mul(a).mul(a).equals(a.pow(3))); - // a-b = -(b-a) - assertTrue(a.sub(b).equals(b.sub(a).negate())); - // (a+b)+a = a+(b+a) - assertTrue((a.add(b)).add(a).equals(a.add(b.add(a)))); - // (a*b)*a = a*(b*a) - assertTrue((a.mul(b)).mul(a).equals(a.mul(b.mul(a)))); - // (a+b)^2 = a^2 + 2ab + b^2 - assertTrue((a.add(b)).square().equals(a.square().add(a.mul(b).add(a.mul(b))).add(b.square()))); - } - - private > void verifyExtended( - final FieldT a) { - final FieldT one = a.one(); - assertTrue(one.equals(one)); - // (w_8)^8 = 1 - assertTrue(a.rootOfUnity(8).pow(8).equals(one)); - } - - private void verifyMulBy024( - final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { - final AbstractFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); - final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); - - final Fp2 c0 = new Fp2(7, 18, Fp2Parameters); - final Fp2 c2 = new Fp2(23, 5, Fp2Parameters); - final Fp2 c4 = new Fp2(192, 73, Fp2Parameters); - final Fp12_2Over3Over2 naiveResult = - a.mul( - new Fp12_2Over3Over2( - new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), - new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), - Fp12Parameters)); - final Fp12_2Over3Over2 mulBy024Result = a.mulBy024(c0, c2, c4); +import org.junit.jupiter.api.Test; - assertTrue(naiveResult.equals(mulBy024Result)); - } +public class FieldsTest extends GenericFieldsTest { @Test public void ComplexFieldTest() { - final ComplexField a = new ComplexField(2); - final ComplexField b = new ComplexField(5); + final ComplexField ffactory = new ComplexField(); + final ComplexField a = new ComplexField(2, 1); + final ComplexField b = new ComplexField(5, 4); + final ComplexField c = new ComplexField(6, 2); - verify(a, b); - verifyExtended(a); + testField(ffactory, a, b, c); + testFieldExpanded(ffactory); } @Test public void FpTest() { final LargeFpParameters FpParameters = new LargeFpParameters(); - + final Fp ffactory = FpParameters.ONE(); final Fp a = new Fp(6, FpParameters); - final Fp b = new Fp(13, FpParameters); + 
final Fp b = new Fp(5, FpParameters); + final Fp c = new Fp(2, FpParameters); - verify(a, b); - verifyExtended(a); + testField(ffactory, a, b, c); + testFieldExpanded(ffactory); } @Test public void Fp2Test() { final SmallFp2Parameters Fp2Parameters = new SmallFp2Parameters(); + final Fp2 ffactory = Fp2Parameters.ONE(); + final Fp2 a = new Fp2(6, 4, Fp2Parameters); + final Fp2 b = new Fp2(5, 2, Fp2Parameters); + final Fp2 c = new Fp2(2, 3, Fp2Parameters); - final Fp2 a = new Fp2(1, 2, Fp2Parameters); - final Fp2 b = new Fp2(2, 1, Fp2Parameters); - - verify(a, b); + testField(ffactory, a, b, c); } @Test public void Fp3Test() { final SmallFp3Parameters Fp3Parameters = new SmallFp3Parameters(); + final Fp3 ffactory = Fp3Parameters.ONE(); + final Fp3 a = new Fp3(6, 4, 7, Fp3Parameters); + final Fp3 b = new Fp3(5, 2, 5, Fp3Parameters); + final Fp3 c = new Fp3(2, 3, 2, Fp3Parameters); - final Fp3 a = new Fp3(1, 2, 3, Fp3Parameters); - final Fp3 b = new Fp3(2, 1, 7, Fp3Parameters); - - verify(a, b); + testField(ffactory, a, b, c); } @Test public void Fp6_2Over3Test() { final SmallFp6_2Over3_Parameters Fp6Parameters = new SmallFp6_2Over3_Parameters(); + final SmallFp3Parameters Fp3Parameters = Fp6Parameters.Fp3Parameters(); + final Fp3 c0 = new Fp3(6, 4, 7, Fp3Parameters); + final Fp3 c1 = new Fp3(5, 2, 5, Fp3Parameters); - final Fp3 c0 = new Fp3(1, 2, 3, Fp6Parameters.Fp3Parameters()); - final Fp3 c1 = new Fp3(2, 1, 7, Fp6Parameters.Fp3Parameters()); + final Fp6_2Over3 ffactory = Fp6Parameters.ONE(); + final Fp6_2Over3 a = new Fp6_2Over3(c0, c0, Fp6Parameters); + final Fp6_2Over3 b = new Fp6_2Over3(c0, c1, Fp6Parameters); + final Fp6_2Over3 c = new Fp6_2Over3(c1, c1, Fp6Parameters); - final Fp6_2Over3 a = new Fp6_2Over3(c0, c1, Fp6Parameters); - final Fp6_2Over3 b = new Fp6_2Over3(c1, c0, Fp6Parameters); - - verify(a, b); + testField(ffactory, a, b, c); } @Test public void Fp6_3Over2Test() { final SmallFp6_3Over2_Parameters Fp6Parameters = new SmallFp6_3Over2_Parameters(); + final SmallFp2Parameters Fp2Parameters = Fp6Parameters.Fp2Parameters(); + final Fp2 c0 = new Fp2(5, 4, Fp2Parameters); + final Fp2 c1 = new Fp2(5, 2, Fp2Parameters); + final Fp2 c2 = new Fp2(4, 3, Fp2Parameters); - final Fp2 c0 = new Fp2(1, 2, Fp6Parameters.Fp2Parameters()); - final Fp2 c1 = new Fp2(2, 1, Fp6Parameters.Fp2Parameters()); - final Fp2 c2 = new Fp2(2, 5, Fp6Parameters.Fp2Parameters()); - - final Fp6_3Over2 a = new Fp6_3Over2(c0, c1, c2, Fp6Parameters); - final Fp6_3Over2 b = new Fp6_3Over2(c2, c1, c0, Fp6Parameters); + final Fp6_3Over2 ffactory = Fp6Parameters.ONE(); + final Fp6_3Over2 a = new Fp6_3Over2(c0, c1, c0, Fp6Parameters); + final Fp6_3Over2 b = new Fp6_3Over2(c1, c2, c1, Fp6Parameters); + final Fp6_3Over2 c = new Fp6_3Over2(c2, c1, c2, Fp6Parameters); - verify(a, b); + testField(ffactory, a, b, c); } @Test public void Fp12_2Over3Over2Test() { final SmallFp12_2Over3Over2_Parameters Fp12Parameters = new SmallFp12_2Over3Over2_Parameters(); - - final Fp2 c0 = new Fp2(1, 2, Fp12Parameters.Fp2Parameters()); - final Fp2 c1 = new Fp2(2, 1, Fp12Parameters.Fp2Parameters()); - final Fp2 c2 = new Fp2(2, 5, Fp12Parameters.Fp2Parameters()); - - final Fp6_3Over2 c00 = new Fp6_3Over2(c0, c1, c2, Fp12Parameters.Fp6Parameters()); - final Fp6_3Over2 c01 = new Fp6_3Over2(c2, c1, c0, Fp12Parameters.Fp6Parameters()); - - final Fp12_2Over3Over2 a = new Fp12_2Over3Over2(c00, c01, Fp12Parameters); - final Fp12_2Over3Over2 b = new Fp12_2Over3Over2(c01, c00, Fp12Parameters); - - verify(a, b); - verifyMulBy024(a, Fp12Parameters); + 
final SmallFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); + final Fp2 c00 = new Fp2(6, 4, Fp2Parameters); + final Fp2 c01 = new Fp2(5, 2, Fp2Parameters); + final Fp2 c02 = new Fp2(8, 3, Fp2Parameters); + + final SmallFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); + final Fp6_3Over2 c0 = new Fp6_3Over2(c00, c01, c00, Fp6Parameters); + final Fp6_3Over2 c1 = new Fp6_3Over2(c01, c02, c01, Fp6Parameters); + final Fp6_3Over2 c2 = new Fp6_3Over2(c02, c01, c02, Fp6Parameters); + + final Fp12_2Over3Over2 ffactory = Fp12Parameters.ONE(); + final Fp12_2Over3Over2 a = new Fp12_2Over3Over2(c0, c1, Fp12Parameters); + final Fp12_2Over3Over2 b = new Fp12_2Over3Over2(c1, c2, Fp12Parameters); + final Fp12_2Over3Over2 c = new Fp12_2Over3Over2(c2, c0, Fp12Parameters); + + testField(ffactory, a, b, c); + testMulBy024(Fp12Parameters.ONE(), Fp12Parameters); } } diff --git a/src/test/java/algebra/curves/GenericFieldsTest.java b/src/test/java/algebra/fields/GenericFieldsTest.java similarity index 69% rename from src/test/java/algebra/curves/GenericFieldsTest.java rename to src/test/java/algebra/fields/GenericFieldsTest.java index 9987c01..398dc84 100755 --- a/src/test/java/algebra/curves/GenericFieldsTest.java +++ b/src/test/java/algebra/fields/GenericFieldsTest.java @@ -1,16 +1,18 @@ -package algebra.curves; +package algebra.fields; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; -import algebra.fields.*; import algebra.fields.abstractfieldparameters.AbstractFp12_2Over3Over2_Parameters; import algebra.fields.abstractfieldparameters.AbstractFp2Parameters; import algebra.fields.abstractfieldparameters.AbstractFp6_3Over2_Parameters; public class GenericFieldsTest { - protected > void FieldTest( - final FieldT fieldFactory) { + protected > void testField( + final FieldT fieldFactory, + final FieldT a, + final FieldT b, + final FieldT c) { final FieldT zero = fieldFactory.zero(); assertTrue(zero.equals(zero)); assertTrue(zero.isZero()); @@ -29,14 +31,11 @@ protected > void FieldTest( .random(null, "clear".getBytes()) .equals(fieldFactory.random(null, "matics".getBytes()))); - // Select 3 distinct field elements for the test - final FieldT a = fieldFactory.random(4L, null); + // Check that the 3 input are distinct field elements for the test assertFalse(a.isZero()); assertFalse(a.isOne()); - final FieldT b = fieldFactory.random(7L, null); assertFalse(b.isZero()); assertFalse(b.isOne()); - final FieldT c = fieldFactory.random(12L, null); assertFalse(c.isZero()); assertFalse(c.isOne()); // Make sure the elements are distinct @@ -74,7 +73,7 @@ protected > void FieldTest( assertTrue((a.add(b)).square().equals(a.square().add(a.mul(b).add(a.mul(b))).add(b.square()))); } - protected > void FieldExpandedTest( + protected > void testFieldExpanded( final FieldT a) { final FieldT one = a.one(); assertTrue(one.equals(one)); @@ -82,35 +81,25 @@ protected > void FieldExpand assertTrue(a.rootOfUnity(8).pow(8).equals(one)); } - protected void verifyMulBy024( - final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { + protected void testMulBy024( + final Fp12_2Over3Over2 fieldFactory, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { final AbstractFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); + final Fp12_2Over3Over2 element = fieldFactory.random(4L, null); + final Fp2 c0 = new Fp2(7, 18, 
Fp2Parameters); final Fp2 c2 = new Fp2(23, 5, Fp2Parameters); final Fp2 c4 = new Fp2(192, 73, Fp2Parameters); final Fp12_2Over3Over2 naiveResult = - a.mul( - new Fp12_2Over3Over2( - new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), - new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), - Fp12Parameters)); - final Fp12_2Over3Over2 mulBy024Result = a.mulBy024(c0, c2, c4); + element.mul( + new Fp12_2Over3Over2( + new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), + new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), + Fp12Parameters)); + final Fp12_2Over3Over2 mulBy024Result = element.mulBy024(c0, c2, c4); assertTrue(naiveResult.equals(mulBy024Result)); } - - protected void FrobeniusMapTest( - final Fp12_2Over3Over2 a, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { - assert (a.FrobeniusMap(0).equals(a)); - Fp12_2Over3Over2 a_q = a.pow(Fp12Parameters.FpParameters().modulus()); - for (int power = 1; power < 10; ++power) { - final Fp12_2Over3Over2 a_qi = a.FrobeniusMap(power); - assert (a_qi.equals(a_q)); - - a_q = a_q.pow(Fp12Parameters.FpParameters().modulus()); - } - } } From 37beeb92b0c78cfc6719525eabfbf95d447ac0d4 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 15 Jan 2021 14:15:42 +0000 Subject: [PATCH 73/94] Added SerialSNARK test for BLS curve --- .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 86 +++++++++++++++++++ 1 file changed, 86 insertions(+) diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index 035dc1e..a82b9c7 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -10,6 +10,16 @@ import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; +import algebra.curves.barreto_lynn_scott.*; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; import algebra.curves.barreto_naehrig.*; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; @@ -172,4 +182,80 @@ public void SerialBN254bProofSystemTest() { } // TODO: Add test for BLS + private < + BLSFrT extends BLSFields.BLSFr, + BLSFqT extends BLSFields.BLSFq, + BLSFq2T extends BLSFields.BLSFq2, + BLSFq6T extends BLSFields.BLSFq6, + BLSFq12T extends BLSFields.BLSFq12, + BLSG1T extends BLSG1, + BLSG2T extends BLSG2, + BLSGTT extends BLSGT, + BLSG1ParametersT extends AbstractBLSG1Parameters, + BLSG2ParametersT extends + AbstractBLSG2Parameters, + BLSGTParametersT extends + AbstractBLSGTParameters, + BLSPublicParametersT extends BLSPublicParameters, + BLSPairingT extends + 
BLSPairing< + BLSFrT, + BLSFqT, + BLSFq2T, + BLSFq6T, + BLSFq12T, + BLSG1T, + BLSG2T, + BLSGTT, + BLSG1ParametersT, + BLSG2ParametersT, + BLSGTParametersT, + BLSPublicParametersT>> +void SerialBLSProofSystemTest( + final int numInputs, + final int numConstraints, + final BLSFrT fieldFactory, + final BLSG1T g1Factory, + final BLSG2T g2Factory, + final BLSPairingT pairing) { + final Tuple3, Assignment, Assignment> construction = + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelation r1cs = construction._1(); + final Assignment primary = construction._2(); + final Assignment auxiliary = construction._3(); + + final CRS CRS = + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); + + // Make sure that a valid proof verifies + final Proof proofValid = + SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); + final boolean isValidProofValid = + Verifier.verify(CRS.verificationKey(), primary, proofValid, pairing, config); + System.out.println("Verification bit of valid proof: " + isValidProofValid); + assertTrue(isValidProofValid); + + // Make sure that an invalid/random proof does NOT verify + final Proof proofInvalid = + new Proof( + g1Factory.random(config.seed(), config.secureSeed()), + g2Factory.random(config.seed(), config.secureSeed()), + g1Factory.random(config.seed(), config.secureSeed())); + final boolean isInvalidProofValid = + Verifier.verify(CRS.verificationKey(), primary, proofInvalid, pairing, config); + System.out.println("Verification bit of invalid proof: " + isInvalidProofValid); + assertFalse(isInvalidProofValid); + } + + @Test + public void SerialBLSProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BLS12_377Fr fieldFactory = BLS12_377Fr.ONE; + final BLS12_377G1 g1Factory = BLS12_377G1Parameters.ONE; + final BLS12_377G2 g2Factory = BLS12_377G2Parameters.ONE; + final BLS12_377Pairing pairing = new BLS12_377Pairing(); + + SerialBLSProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } } From ec6141107375e3cdbdbfbfbbac20b01e1aecb8ec Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 15 Jan 2021 15:11:45 +0000 Subject: [PATCH 74/94] Ran spotless and commented distributedSNARK test for BN curves --- .../barreto_lynn_scott/BLSFieldsTest.java | 1 - .../curves/barreto_naehrig/BNFieldsTest.java | 1 - src/test/java/algebra/fields/FieldsTest.java | 2 - .../algebra/fields/GenericFieldsTest.java | 18 +- .../grothBGM17/DistributedzkSNARKTest.java | 299 ++++++++++++------ .../zkSNARK/grothBGM17/SerialzkSNARKTest.java | 97 +++--- 6 files changed, 253 insertions(+), 165 deletions(-) diff --git a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java index eef50f6..d641955 100644 --- a/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java +++ b/src/test/java/algebra/curves/barreto_lynn_scott/BLSFieldsTest.java @@ -10,7 +10,6 @@ import algebra.fields.Fp12_2Over3Over2; import algebra.fields.Fp2; import algebra.fields.Fp6_3Over2; - import org.junit.jupiter.api.Test; public class BLSFieldsTest extends GenericCurveFieldsTest { diff --git a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java index c605d59..1c335ca 100755 --- a/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java +++ 
b/src/test/java/algebra/curves/barreto_naehrig/BNFieldsTest.java @@ -15,7 +15,6 @@ import algebra.fields.Fp12_2Over3Over2; import algebra.fields.Fp2; import algebra.fields.Fp6_3Over2; - import org.junit.jupiter.api.Test; public class BNFieldsTest extends GenericCurveFieldsTest { diff --git a/src/test/java/algebra/fields/FieldsTest.java b/src/test/java/algebra/fields/FieldsTest.java index 395f544..217aa07 100755 --- a/src/test/java/algebra/fields/FieldsTest.java +++ b/src/test/java/algebra/fields/FieldsTest.java @@ -8,10 +8,8 @@ package algebra.fields; import algebra.fields.mock.fieldparameters.*; - import org.junit.jupiter.api.Test; - public class FieldsTest extends GenericFieldsTest { @Test public void ComplexFieldTest() { diff --git a/src/test/java/algebra/fields/GenericFieldsTest.java b/src/test/java/algebra/fields/GenericFieldsTest.java index 398dc84..70c7b6e 100755 --- a/src/test/java/algebra/fields/GenericFieldsTest.java +++ b/src/test/java/algebra/fields/GenericFieldsTest.java @@ -9,10 +9,7 @@ public class GenericFieldsTest { protected > void testField( - final FieldT fieldFactory, - final FieldT a, - final FieldT b, - final FieldT c) { + final FieldT fieldFactory, final FieldT a, final FieldT b, final FieldT c) { final FieldT zero = fieldFactory.zero(); assertTrue(zero.equals(zero)); assertTrue(zero.isZero()); @@ -82,7 +79,8 @@ protected > void testFieldEx } protected void testMulBy024( - final Fp12_2Over3Over2 fieldFactory, final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { + final Fp12_2Over3Over2 fieldFactory, + final AbstractFp12_2Over3Over2_Parameters Fp12Parameters) { final AbstractFp2Parameters Fp2Parameters = Fp12Parameters.Fp2Parameters(); final AbstractFp6_3Over2_Parameters Fp6Parameters = Fp12Parameters.Fp6Parameters(); @@ -93,11 +91,11 @@ protected void testMulBy024( final Fp2 c4 = new Fp2(192, 73, Fp2Parameters); final Fp12_2Over3Over2 naiveResult = - element.mul( - new Fp12_2Over3Over2( - new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), - new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), - Fp12Parameters)); + element.mul( + new Fp12_2Over3Over2( + new Fp6_3Over2(c0, Fp2Parameters.ZERO(), c4, Fp6Parameters), + new Fp6_3Over2(Fp2Parameters.ZERO(), c2, Fp2Parameters.ZERO(), Fp6Parameters), + Fp12Parameters)); final Fp12_2Over3Over2 mulBy024Result = element.mulBy024(c0, c2, c4); assertTrue(naiveResult.equals(mulBy024Result)); diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java index 4437839..7ce215e 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/DistributedzkSNARKTest.java @@ -9,18 +9,37 @@ import static org.junit.jupiter.api.Assertions.assertTrue; -import algebra.curves.barreto_naehrig.*; -import algebra.curves.barreto_naehrig.BNFields.*; -import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; -import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; -import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; -import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; -import algebra.curves.barreto_naehrig.bn254b.BN254bG1; -import algebra.curves.barreto_naehrig.bn254b.BN254bG2; -import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; -import 
algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; -import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; -import algebra.curves.mock.*; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq12; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq2; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFq6; +import algebra.curves.barreto_lynn_scott.BLSFields.BLSFr; +import algebra.curves.barreto_lynn_scott.BLSG1; +import algebra.curves.barreto_lynn_scott.BLSG2; +import algebra.curves.barreto_lynn_scott.BLSGT; +import algebra.curves.barreto_lynn_scott.BLSPairing; +import algebra.curves.barreto_lynn_scott.BLSPublicParameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; +import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; +import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; +// import algebra.curves.barreto_naehrig.*; +// import algebra.curves.barreto_naehrig.BNFields.*; +// import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG1Parameters; +// import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNG2Parameters; +// import algebra.curves.barreto_naehrig.abstract_bn_parameters.AbstractBNGTParameters; +// import algebra.curves.barreto_naehrig.bn254b.BN254bFields.BN254bFr; +// import algebra.curves.barreto_naehrig.bn254b.BN254bG1; +// import algebra.curves.barreto_naehrig.bn254b.BN254bG2; +// import algebra.curves.barreto_naehrig.bn254b.BN254bPairing; +// import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG1Parameters; +// import algebra.curves.barreto_naehrig.bn254b.bn254b_parameters.BN254bG2Parameters; +// import algebra.curves.mock.*; import configuration.Configuration; import java.io.Serializable; import org.apache.spark.SparkConf; @@ -62,52 +81,169 @@ public void tearDown() { sc = null; } + /* + private < + BNFrT extends BNFr, + BNFqT extends BNFq, + BNFq2T extends BNFq2, + BNFq6T extends BNFq6, + BNFq12T extends BNFq12, + BNG1T extends BNG1, + BNG2T extends BNG2, + BNGTT extends BNGT, + BNG1ParametersT extends AbstractBNG1Parameters, + BNG2ParametersT extends + AbstractBNG2Parameters, + BNGTParametersT extends + AbstractBNGTParameters, + BNPublicParametersT extends BNPublicParameters, + BNPairingT extends + BNPairing< + BNFrT, + BNFqT, + BNFq2T, + BNFq6T, + BNFq12T, + BNG1T, + BNG2T, + BNGTT, + BNG1ParametersT, + BNG2ParametersT, + BNGTParametersT, + BNPublicParametersT>> + void DistributedBNProofSystemTest( + final int numInputs, + final int numConstraints, + final BNFrT fieldFactory, + final BNG1T g1Factory, + final BNG2T g2Factory, + final BNPairingT pairing) { + final Tuple3, Assignment, JavaPairRDD> construction = + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelationRDD r1cs = construction._1(); + final Assignment primary = construction._2(); + final JavaPairRDD fullAssignment 
= construction._3(); + + final CRS CRS = + DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); + + final Proof proof = + DistributedProver.prove(CRS.provingKeyRDD(), primary, fullAssignment, fieldFactory, config); + + final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); + + System.out.println(isValid); + assertTrue(isValid); + } + + // TODO: + // Remove this comment when: https://github.com/clearmatics/dizk/issues/1 + // is fixed. + //@Test + //public void DistributedFakeProofSystemTest() { + // final int numInputs = 1023; + // final int numConstraints = 1024; + // + // FakeInitialize.init(); + // final Fp fieldFactory = new FakeFqParameters().ONE(); + // final FakeG1 fakeG1Factory = new FakeG1Parameters().ONE(); + // final FakeG2 fakeG2Factory = new FakeG2Parameters().ONE(); + // final FakePairing fakePairing = new FakePairing(); + // + // final Tuple3, Assignment, JavaPairRDD> construction = + // R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + // final R1CSRelationRDD r1cs = construction._1(); + // final Assignment primary = construction._2(); + // final JavaPairRDD fullAssignment = construction._3(); + // + // final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, + // fakeG1Factory, fakeG2Factory, fakePairing, config); + // final Proof proof = DistributedProver.prove(CRS.provingKeyRDD(), primary, + // fullAssignment, fieldFactory, config); + // final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, + // fakePairing, config); + // + // System.out.println(isValid); + // assertTrue(isValid); + //} + // + + @Test + public void DistributedBN254aProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254aFr fieldFactory = new BN254aFr(1); + final BN254aG1 g1Factory = BN254aG1Parameters.ONE; + final BN254aG2 g2Factory = BN254aG2Parameters.ONE; + final BN254aPairing pairing = new BN254aPairing(); + + DistributedBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } + + // This test is also commented because running it along with `DistributedBLSProofSystemTest` + // below triggers a NullPtr exception as reported in https://github.com/clearmatics/dizk/issues/1 + @Test + public void DistributedBN254bProofSystemTest() { + final int numInputs = 1023; + final int numConstraints = 1024; + final BN254bFr fieldFactory = new BN254bFr(1); + final BN254bG1 g1Factory = BN254bG1Parameters.ONE; + final BN254bG2 g2Factory = BN254bG2Parameters.ONE; + final BN254bPairing pairing = new BN254bPairing(); + + DistributedBNProofSystemTest( + numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + } + */ + private < - BNFrT extends BNFr, - BNFqT extends BNFq, - BNFq2T extends BNFq2, - BNFq6T extends BNFq6, - BNFq12T extends BNFq12, - BNG1T extends BNG1, - BNG2T extends BNG2, - BNGTT extends BNGT, - BNG1ParametersT extends AbstractBNG1Parameters, - BNG2ParametersT extends - AbstractBNG2Parameters, - BNGTParametersT extends - AbstractBNGTParameters, - BNPublicParametersT extends BNPublicParameters, - BNPairingT extends - BNPairing< - BNFrT, - BNFqT, - BNFq2T, - BNFq6T, - BNFq12T, - BNG1T, - BNG2T, - BNGTT, - BNG1ParametersT, - BNG2ParametersT, - BNGTParametersT, - BNPublicParametersT>> - void DistributedBNProofSystemTest( + BLSFrT extends BLSFr, + BLSFqT extends BLSFq, + BLSFq2T extends BLSFq2, + BLSFq6T extends BLSFq6, + BLSFq12T extends BLSFq12, + BLSG1T extends BLSG1, + 
BLSG2T extends BLSG2, + BLSGTT extends BLSGT, + BLSG1ParametersT extends + AbstractBLSG1Parameters, + BLSG2ParametersT extends + AbstractBLSG2Parameters, + BLSGTParametersT extends + AbstractBLSGTParameters, + BLSPublicParametersT extends BLSPublicParameters, + BLSPairingT extends + BLSPairing< + BLSFrT, + BLSFqT, + BLSFq2T, + BLSFq6T, + BLSFq12T, + BLSG1T, + BLSG2T, + BLSGTT, + BLSG1ParametersT, + BLSG2ParametersT, + BLSGTParametersT, + BLSPublicParametersT>> + void DistributedBLSProofSystemTest( final int numInputs, final int numConstraints, - final BNFrT fieldFactory, - final BNG1T g1Factory, - final BNG2T g2Factory, - final BNPairingT pairing) { - final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); - final R1CSRelationRDD r1cs = construction._1(); - final Assignment primary = construction._2(); - final JavaPairRDD fullAssignment = construction._3(); - - final CRS CRS = + final BLSFrT fieldFactory, + final BLSG1T g1Factory, + final BLSG2T g2Factory, + final BLSPairingT pairing) { + final Tuple3, Assignment, JavaPairRDD> + construction = + R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); + final R1CSRelationRDD r1cs = construction._1(); + final Assignment primary = construction._2(); + final JavaPairRDD fullAssignment = construction._3(); + + final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, pairing, config); - final Proof proof = + final Proof proof = DistributedProver.prove(CRS.provingKeyRDD(), primary, fullAssignment, fieldFactory, config); final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, pairing, config); @@ -116,61 +252,16 @@ void DistributedBNProofSystemTest( assertTrue(isValid); } - // TODO: - // Remove this comment when: https://github.com/clearmatics/dizk/issues/1 - // is fixed. 
- /* - @Test - public void DistributedFakeProofSystemTest() { - final int numInputs = 1023; - final int numConstraints = 1024; - - FakeInitialize.init(); - final Fp fieldFactory = new FakeFqParameters().ONE(); - final FakeG1 fakeG1Factory = new FakeG1Parameters().ONE(); - final FakeG2 fakeG2Factory = new FakeG2Parameters().ONE(); - final FakePairing fakePairing = new FakePairing(); - - final Tuple3, Assignment, JavaPairRDD> construction = - R1CSConstructor.parallelConstruct(numConstraints, numInputs, fieldFactory, config); - final R1CSRelationRDD r1cs = construction._1(); - final Assignment primary = construction._2(); - final JavaPairRDD fullAssignment = construction._3(); - - final CRS CRS = DistributedSetup.generate(r1cs, fieldFactory, - fakeG1Factory, fakeG2Factory, fakePairing, config); - final Proof proof = DistributedProver.prove(CRS.provingKeyRDD(), primary, - fullAssignment, fieldFactory, config); - final boolean isValid = Verifier.verify(CRS.verificationKey(), primary, proof, - fakePairing, config); - - System.out.println(isValid); - assertTrue(isValid); - } - - @Test - public void DistributedBN254aProofSystemTest() { - final int numInputs = 1023; - final int numConstraints = 1024; - final BN254aFr fieldFactory = new BN254aFr(1); - final BN254aG1 g1Factory = BN254aG1Parameters.ONE; - final BN254aG2 g2Factory = BN254aG2Parameters.ONE; - final BN254aPairing pairing = new BN254aPairing(); - - DistributedBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); - } - */ - @Test - public void DistributedBN254bProofSystemTest() { + public void DistributedBLS12_377ProofSystemTest() { final int numInputs = 1023; final int numConstraints = 1024; - final BN254bFr fieldFactory = new BN254bFr(1); - final BN254bG1 g1Factory = BN254bG1Parameters.ONE; - final BN254bG2 g2Factory = BN254bG2Parameters.ONE; - final BN254bPairing pairing = new BN254bPairing(); + final BLS12_377Fr fieldFactory = new BLS12_377Fr(1); + final BLS12_377G1 g1Factory = BLS12_377G1Parameters.ONE; + final BLS12_377G2 g2Factory = BLS12_377G2Parameters.ONE; + final BLS12_377Pairing pairing = new BLS12_377Pairing(); - DistributedBNProofSystemTest( + DistributedBLSProofSystemTest( numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); } } diff --git a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java index a82b9c7..5ab7290 100755 --- a/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java +++ b/src/test/java/zk_proof_systems/zkSNARK/grothBGM17/SerialzkSNARKTest.java @@ -14,10 +14,10 @@ import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG1Parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSG2Parameters; import algebra.curves.barreto_lynn_scott.abstract_bls_parameters.AbstractBLSGTParameters; +import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G1; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377G2; import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Pairing; -import algebra.curves.barreto_lynn_scott.bls12_377.BLS12_377Fields.BLS12_377Fr; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G1Parameters; import algebra.curves.barreto_lynn_scott.bls12_377.bls12_377_parameters.BLS12_377G2Parameters; import algebra.curves.barreto_naehrig.*; @@ -181,68 +181,70 @@ public void 
SerialBN254bProofSystemTest() { SerialBNProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); } - // TODO: Add test for BLS + // TODO: Factorize the code below to make it work over arbitrary curves + // (i.e. abstract curve over) private < - BLSFrT extends BLSFields.BLSFr, - BLSFqT extends BLSFields.BLSFq, - BLSFq2T extends BLSFields.BLSFq2, - BLSFq6T extends BLSFields.BLSFq6, - BLSFq12T extends BLSFields.BLSFq12, - BLSG1T extends BLSG1, - BLSG2T extends BLSG2, - BLSGTT extends BLSGT, - BLSG1ParametersT extends AbstractBLSG1Parameters, - BLSG2ParametersT extends - AbstractBLSG2Parameters, - BLSGTParametersT extends - AbstractBLSGTParameters, - BLSPublicParametersT extends BLSPublicParameters, - BLSPairingT extends - BLSPairing< - BLSFrT, - BLSFqT, - BLSFq2T, - BLSFq6T, - BLSFq12T, - BLSG1T, - BLSG2T, - BLSGTT, - BLSG1ParametersT, - BLSG2ParametersT, - BLSGTParametersT, - BLSPublicParametersT>> -void SerialBLSProofSystemTest( - final int numInputs, - final int numConstraints, - final BLSFrT fieldFactory, - final BLSG1T g1Factory, - final BLSG2T g2Factory, - final BLSPairingT pairing) { + BLSFrT extends BLSFields.BLSFr, + BLSFqT extends BLSFields.BLSFq, + BLSFq2T extends BLSFields.BLSFq2, + BLSFq6T extends BLSFields.BLSFq6, + BLSFq12T extends BLSFields.BLSFq12, + BLSG1T extends BLSG1, + BLSG2T extends BLSG2, + BLSGTT extends BLSGT, + BLSG1ParametersT extends + AbstractBLSG1Parameters, + BLSG2ParametersT extends + AbstractBLSG2Parameters, + BLSGTParametersT extends + AbstractBLSGTParameters, + BLSPublicParametersT extends BLSPublicParameters, + BLSPairingT extends + BLSPairing< + BLSFrT, + BLSFqT, + BLSFq2T, + BLSFq6T, + BLSFq12T, + BLSG1T, + BLSG2T, + BLSGTT, + BLSG1ParametersT, + BLSG2ParametersT, + BLSGTParametersT, + BLSPublicParametersT>> + void SerialBLSProofSystemTest( + final int numInputs, + final int numConstraints, + final BLSFrT fieldFactory, + final BLSG1T g1Factory, + final BLSG2T g2Factory, + final BLSPairingT pairing) { final Tuple3, Assignment, Assignment> construction = - R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); + R1CSConstructor.serialConstruct(numConstraints, numInputs, fieldFactory, config); final R1CSRelation r1cs = construction._1(); final Assignment primary = construction._2(); final Assignment auxiliary = construction._3(); final CRS CRS = - SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); + SerialSetup.generate(r1cs, fieldFactory, g1Factory, g2Factory, config); // Make sure that a valid proof verifies final Proof proofValid = - SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); + SerialProver.prove(CRS.provingKey(), primary, auxiliary, fieldFactory, config); final boolean isValidProofValid = - Verifier.verify(CRS.verificationKey(), primary, proofValid, pairing, config); + Verifier.verify(CRS.verificationKey(), primary, proofValid, pairing, config); System.out.println("Verification bit of valid proof: " + isValidProofValid); assertTrue(isValidProofValid); // Make sure that an invalid/random proof does NOT verify final Proof proofInvalid = - new Proof( - g1Factory.random(config.seed(), config.secureSeed()), - g2Factory.random(config.seed(), config.secureSeed()), - g1Factory.random(config.seed(), config.secureSeed())); + new Proof( + g1Factory.random(config.seed(), config.secureSeed()), + g2Factory.random(config.seed(), config.secureSeed()), + g1Factory.random(config.seed(), config.secureSeed())); final boolean isInvalidProofValid = - 
Verifier.verify(CRS.verificationKey(), primary, proofInvalid, pairing, config); + Verifier.verify(CRS.verificationKey(), primary, proofInvalid, pairing, config); System.out.println("Verification bit of invalid proof: " + isInvalidProofValid); assertFalse(isInvalidProofValid); } @@ -256,6 +258,7 @@ public void SerialBLSProofSystemTest() { final BLS12_377G2 g2Factory = BLS12_377G2Parameters.ONE; final BLS12_377Pairing pairing = new BLS12_377Pairing(); - SerialBLSProofSystemTest(numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); + SerialBLSProofSystemTest( + numInputs, numConstraints, fieldFactory, g1Factory, g2Factory, pairing); } } From 55859d5c019dd92e1541a81dd2f44d3d84c1c6d3 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Fri, 22 Jan 2021 11:24:24 +0000 Subject: [PATCH 75/94] Removed spark-ec2 submodule and added flintrock --- .gitmodules | 7 +++---- depends/flintrock | 1 + depends/spark-ec2 | 1 - 3 files changed, 4 insertions(+), 5 deletions(-) create mode 160000 depends/flintrock delete mode 160000 depends/spark-ec2 diff --git a/.gitmodules b/.gitmodules index 7b37d31..7da053d 100755 --- a/.gitmodules +++ b/.gitmodules @@ -1,4 +1,3 @@ -[submodule "depends/spark-ec2"] - path = depends/spark-ec2 - url = https://github.com/amplab/spark-ec2.git - branch = branch-2.0 +[submodule "depends/flintrock"] + path = depends/flintrock + url = git@github.com:nchammas/flintrock.git diff --git a/depends/flintrock b/depends/flintrock new file mode 160000 index 0000000..6792626 --- /dev/null +++ b/depends/flintrock @@ -0,0 +1 @@ +Subproject commit 6792626956412e61db7c266305a2a0cce7ece7dd diff --git a/depends/spark-ec2 b/depends/spark-ec2 deleted file mode 160000 index e6c4e09..0000000 --- a/depends/spark-ec2 +++ /dev/null @@ -1 +0,0 @@ -Subproject commit e6c4e099d17b97b336298b2543534e5b0be5b7df From d913b2fdfc9b3441ea5e746ab589e3ebfd513059 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 27 Jan 2021 13:56:21 +0000 Subject: [PATCH 76/94] Removed submodules --- .gitmodules | 3 --- depends/flintrock | 1 - 2 files changed, 4 deletions(-) delete mode 160000 depends/flintrock diff --git a/.gitmodules b/.gitmodules index 7da053d..e69de29 100755 --- a/.gitmodules +++ b/.gitmodules @@ -1,3 +0,0 @@ -[submodule "depends/flintrock"] - path = depends/flintrock - url = git@github.com:nchammas/flintrock.git diff --git a/depends/flintrock b/depends/flintrock deleted file mode 160000 index 6792626..0000000 --- a/depends/flintrock +++ /dev/null @@ -1 +0,0 @@ -Subproject commit 6792626956412e61db7c266305a2a0cce7ece7dd From d516de01fc04d4f7b60c09faae4f1d3bfcacb9cb Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 27 Jan 2021 13:57:21 +0000 Subject: [PATCH 77/94] Added steps to run test cluster with Flinctrock --- README.md | 136 ++++++++++++++++++++++++++++-------------------------- 1 file changed, 70 insertions(+), 66 deletions(-) diff --git a/README.md b/README.md index 2804104..fd54161 100755 --- a/README.md +++ b/README.md @@ -19,7 +19,7 @@ The library is developed as part of a paper called *"[DIZK: A Distributed Zero K - [Directory structure](#directory-structure) - [Overview](#overview) - [Build guide](#build-guide) -- [Profiler](#profiler) +- [Configuring AWS and using Flintrock to manage a testing cluster](#configuring-aws-and-using-flintrock-to-manage-a-testing-cluster) - [Benchmarks](#benchmarks) - [References](#references) - [License](#license) @@ -79,8 +79,6 @@ The library has the following dependencies: - [Spark SQL 
3.0.1](https://mvnrepository.com/artifact/org.apache.spark/spark-sql) - [JUnit 5.7.0](https://mvnrepository.com/artifact/org.junit.jupiter) - [Spotless with Google Java Format](https://github.com/diffplug/spotless/tree/main/plugin-maven#google-java-format) -- Fetched via Git submodules: - - [spark-ec2](https://github.com/amplab/spark-ec2/tree/branch-2.0) More information about compilation options can be found [here](http://maven.apache.org/plugins/maven-compiler-plugin/compile-mojo.html) @@ -94,17 +92,12 @@ While other libraries for zero knowledge proof systems are written in low-level Start by cloning this repository and entering the repository working directory: ```bash -git clone https://github.com/clearmatics/dizk.git -cd dizk +git clone https://github.com/clearmatics/neodizk.git +cd neodizk # Set up your environment . ./setup_env ``` -Next, fetch the dependency modules: -```bash -git submodule init && git submodule update -``` - Finally, compile the source code: ```bash mvn compile @@ -113,13 +106,13 @@ mvn compile ### Docker ```bash -docker build -t dizk-base . -docker run -it --name dizk-container dizk-base +docker build -t neodizk-base . +docker run -it --name neodizk-container neodizk-base ``` **Note**: For development purpose, you may want to develop from inside a docker container (to avoid touching your local system's configuration). To do so, you can run the following command: ```bash -docker run -ti -v "$(pwd)":/home/dizk-dev dizk-base +docker run -ti -v "$(pwd)":/home/dizk-dev neodizk-base ``` and run `cd /home/dizk-dev` in the container to start developing. @@ -150,76 +143,87 @@ Run: mvn spotless:check ``` -## Profiler +## Configuring AWS and using Flintrock to manage a testing cluster -Using Amazon EC2, the profiler benchmarks the performance of serial and distributed zero-knowledge proof systems, as well as its underlying primitives. -The profiler uses `spark-ec2` to manage the cluster compute environment and a set of provided scripts for launch, profiling, and shutdown. +### Create and configure an AWS account -### Spark EC2 +1. Create an AWS account +2. Follow the set-up instructions [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/get-set-up-for-amazon-ec2.html) + - Select the region + - Create an EC2 keypair (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html#retrieving-the-public-key) for more info) + - Create a security group -To manage the cluster compute environment, DIZK uses [`spark-ec2@branch-2.0`](https://github.com/amplab/spark-ec2/tree/branch-2.0). -`spark-ec2` is a tool to launch, maintain, and terminate [Apache Spark](https://spark.apache.org/docs/latest/) clusters on Amazon EC2. +Both the security group and keypair are used to secure the EC2 instances launched, as indicated [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html). AWS takes care of creating a [default VPC](https://docs.aws.amazon.com/vpc/latest/userguide/default-vpc.html). -To setup `spark-ec2`, run the following commands: -```bash -git clone https://github.com/amplab/spark-ec2.git -cd spark-ec2 -git checkout branch-2.0 -pwd -``` +3. Create the appropriate set of IAM users + - Create an `Administrator` as documented [here](https://docs.aws.amazon.com/IAM/latest/UserGuide/getting-started_create-admin-group.html) + - Create an IAM user for programmatic use with Flintrock. 
This user needs to have the following permissions: + - `AmazonEC2FullAccess `, + - `IAM.GetInstanceProfile` and `IAM.PassRole` (as documented [here](https://datawookie.dev/blog/2018/09/refining-an-aws-iam-policy-for-flintrock/)) -Remember where the directory for `spark-ec2` is located, as this will need to be provided as an environment variable for the scripts as part of the next step. - -### Profiling scripts - -To begin, set the environment variables required to initialize the profiler in [init.sh](src/main/java/profiler/scripts/init.sh). -The profiling infrastructure will require access to an AWS account access key and secret key, which can be created with -the [instructions provided by AWS](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys). - -```bash -export AWS_ACCESS_KEY_ID={Insert your AWS account access key} -export AWS_SECRET_ACCESS_KEY={Insert your AWS account secret key} +### Using Flintrock -export AWS_KEYPAIR_NAME="{Insert your AWS keypair name, e.g. spark-ec2-oregon}" -export AWS_KEYPAIR_PATH="{Insert the path to your AWS keypair .pem file, e.g. /Users/johndoe/Downloads/spark-ec2-oregon.pem}" +#### Installation -export AWS_REGION_ID={Insert your AWS cluster region choice, e.g. us-west-2} -export AWS_CLUSTER_NAME={Insert your AWS cluster name, e.g. spark-ec2} +```console +python3.7 -m venv env +source env/bin/activate +pip install --upgrade pip +pip install flintrock -export SPOT_PRICE={Insert your spot price for summoning an EC2 instance, e.g. 0.1} -export SLAVES_COUNT={Insert the number of EC2 instances to summon for the cluster, e.g. 2} -export INSTANCE_TYPE={Insert the instance type you would like to summon, e.g. r3.large} - -export DIZK_REPO_PATH="{Insert the path to your local DIZK repository, e.g. /Users/johndoe/dizk}" -export SPARK_EC2_PATH="{Insert the path to your local spark-ec2 repository, e.g. /Users/johndoe/dizk/depends/spark-ec2}" +# Now the flintrock CLI is available +flintrock --help ``` -Next, start the profiler by running: -```bash -./launch.sh +*Note 1:* Flintrock uses [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) which is the Python SDK for AWS. + +*Note 2:* The `flintrock launch` command truly corresponds to clicking the `"Launch instance"` button on the EC2 dashboard. The values of the flags of the `flintrock launch` command correspond to the values that one needs to provide at the various steps of the "Launch instance" process (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html#step-7-review-instance-launch)) + +#### Example + +Below is an example to demonstrate how to launch a test cluster `test-cluster`. +Before doing so, we assume that: +- the private key (`.pem`) file of the created EC2 keypair (see [this step](#create-and-configure-your-aws-account)) is stored on your computer at: `~/.ssh/ec2-key.pem` +- the desired instance type is: `m4.large` +- the choosen AMI is one of the AMIs of either Amazon Linux 2 or the Amazon Linux AMI (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/finding-an-ami.html) to find an AMI). In fact, as documented [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/connection-prereqs.html) - the default username one can use to connect to the EC2 instance depends on the choosen AMI. For Amazon Linux (2) AMIs, this default username is `ec2-user`. 
For the sake of this example, we assume that the choosen AMI is: `ami-00b882ac5193044e4` +- the region is `us-east-1` + +Furthermore, before instantiating a cluster with Flintrock, it is necessary to configure the environment with the credentials ("access key ID" and "secret access key") of the IAM programmatic user created in [previous steps](#create-and-configure-your-aws-account). This can either be done by configuring environment variables, or using a configuration file (as documented [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#configuring-credentials).) + +Once the environment is configured, and assuming the example values above, the command to launch the cluster becomes: +```console +flintrock launch test-cluster \ + --num-slaves 1 \ + --java-version 11 \ + --spark-version 3.0.0 \ + --ec2-instance-type m4.large \ + --ec2-region us-east-1 \ + --ec2-key-name ec2-key \ + --ec2-identity-file ~/.ssh/ec2-key.pem \ + --ec2-ami ami-00b882ac5193044e4 \ + --ec2-instance-initiated-shutdown-behavior terminate \ + --ec2-user ec2-user ``` -The launch script uses `spark-ec2` and the environment variables to setup the initial cluster environment. -This process takes around 20-30 minutes depending on the choice of cluster configuration. +------------------- -After the launch is complete, upload the DIZK JAR file to the master node and SSH into the cluster with the following command: -```bash -./upload_and_login.sh +**TROUBLESHOOTING:** For debug purposes, it is possible to use the `aws` CLI directly. The CLI is available as a [docker container](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-docker.html), however, while running the command like `docker run --rm -ti amazon/aws-cli ` is equivalent to running `aws ` on the host, one needs to remember that no state is preserved across the commands because the containers are removed as soon as the command stops executing. Hence, for a more stateful interaction, it is possible to override the `ENTRYPOINT` of the container by doing: +```console +docker run -it --entrypoint /bin/bash amazon/aws-cli ``` -Once you have successfully logged in to the cluster, navigate to the uploaded `scripts` folder and setup the initial cluster environment. - -```bash -cd ../scripts -./setup_environment.sh -``` +Then, in the container, the `aws` CLI can be used by running `aws `. Note, that credentials need to be configured first via `aws configure`. To check the configured credentials, use `aws iam get-user"` +- If the access is denied, check: + - The aws config (in `~/.aws` or your access key credentials in the container's environment) + - The [time of your machine](https://stackoverflow.com/questions/24744205/ec2-api-error-validating-access-credential), and adjust to the same time of the AWS servers. On Debian-based distributions, this can be done via: + ```console + sudo apt-get --yes install ntpdate + sudo ntpdate 0.amazon.pool.ntp.org + ``` -This creates a logging directory for Spark events and installs requisite dependencies, such as Java 8. +------------------- -Lastly, with the cluster environment fully setup, set the desired parameters for benchmarking in [profile.sh](src/main/java/profiler/scripts/profile.sh) and run the following command to begin profiling: -```bash -./profile.sh -``` +Upon succesful deployment of the cluster, make sure to persist the Flintrock configuration in a configuration file. Then, the cluster can be inspected/stopped/started/destroyed/scaled etc by using the Flintrock commands (e.g. 
`flintrock describe test-cluster`, `flintrock destroy test-cluster` etc.) ## Benchmarks From 59bc3436470d2dda07074716fe139bfe0079587b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 27 Jan 2021 14:04:27 +0000 Subject: [PATCH 78/94] Fixed broken link in README --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index fd54161..9ae055f 100755 --- a/README.md +++ b/README.md @@ -183,12 +183,12 @@ flintrock --help Below is an example to demonstrate how to launch a test cluster `test-cluster`. Before doing so, we assume that: -- the private key (`.pem`) file of the created EC2 keypair (see [this step](#create-and-configure-your-aws-account)) is stored on your computer at: `~/.ssh/ec2-key.pem` +- the private key (`.pem`) file of the created EC2 keypair (see [this step](#create-and-configure-an-aws-account)) is stored on your computer at: `~/.ssh/ec2-key.pem` - the desired instance type is: `m4.large` - the choosen AMI is one of the AMIs of either Amazon Linux 2 or the Amazon Linux AMI (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/finding-an-ami.html) to find an AMI). In fact, as documented [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/connection-prereqs.html) - the default username one can use to connect to the EC2 instance depends on the choosen AMI. For Amazon Linux (2) AMIs, this default username is `ec2-user`. For the sake of this example, we assume that the choosen AMI is: `ami-00b882ac5193044e4` - the region is `us-east-1` -Furthermore, before instantiating a cluster with Flintrock, it is necessary to configure the environment with the credentials ("access key ID" and "secret access key") of the IAM programmatic user created in [previous steps](#create-and-configure-your-aws-account). This can either be done by configuring environment variables, or using a configuration file (as documented [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#configuring-credentials).) +Furthermore, before instantiating a cluster with Flintrock, it is necessary to configure the environment with the credentials ("access key ID" and "secret access key") of the IAM programmatic user created in [previous steps](#create-and-configure-an-aws-account). This can either be done by configuring environment variables, or using a configuration file (as documented [here](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#configuring-credentials).) 
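For reference, a minimal sketch of the environment-variable route (the values below are placeholders for the credentials of the IAM user created above, not real keys):
```console
# Credentials of the IAM programmatic user created for Flintrock (placeholders)
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
# Optional: default region picked up by boto3 when none is passed on the command line
export AWS_DEFAULT_REGION="us-east-1"
```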
Once the environment is configured, and assuming the example values above, the command to launch the cluster becomes: ```console From 7850df5cf25b9241e1fe5c934767295765f590c5 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 15:49:49 +0000 Subject: [PATCH 79/94] Modified install instructions to fetch latest flintrock version supporting custom JDKs config --- README.md | 9 ++++++--- 1 file changed, 6 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index 9ae055f..dd1d6f9 100755 --- a/README.md +++ b/README.md @@ -169,15 +169,18 @@ Both the security group and keypair are used to secure the EC2 instances launche python3.7 -m venv env source env/bin/activate pip install --upgrade pip -pip install flintrock +# Install the latest develop version of flintrock +pip install git+https://github.com/nchammas/flintrock # Now the flintrock CLI is available flintrock --help ``` -*Note 1:* Flintrock uses [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) which is the Python SDK for AWS. +*Note 1:* The latest stable version of Flintrock can be installed by simply running `pip install flintrock`. However, improvements have been added (and not yet packaged in a release) since the `1.0.0` release. In the following, we make the assumption that the [support for configurable JDKs](https://github.com/nchammas/flintrock/commit/6792626956412e61db7c266305a2a0cce7ece7dd) is available in the Flintrock CLI. -*Note 2:* The `flintrock launch` command truly corresponds to clicking the `"Launch instance"` button on the EC2 dashboard. The values of the flags of the `flintrock launch` command correspond to the values that one needs to provide at the various steps of the "Launch instance" process (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html#step-7-review-instance-launch)) +*Note 2:* Flintrock uses [boto3](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html) which is the Python SDK for AWS. + +*Note 3:* The `flintrock launch` command truly corresponds to clicking the `"Launch instance"` button on the EC2 dashboard. The values of the flags of the `flintrock launch` command correspond to the values that one needs to provide at the various steps of the "Launch instance" process (see [here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/launching-instance.html#step-7-review-instance-launch)) #### Example From 10660fa5261814852cbc19bf195f3ab285acdd18 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 16:01:39 +0000 Subject: [PATCH 80/94] Renamed artifiactId --- pom.xml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/pom.xml b/pom.xml index af00305..668cd5a 100644 --- a/pom.xml +++ b/pom.xml @@ -5,8 +5,8 @@ 4.0.0 - dizk - dizk + neodizk + neodizk 0.1.0 jar From 6a417ffbf2fef7ef48a3c3449096097d8a43262c Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 16:02:12 +0000 Subject: [PATCH 81/94] Added instructions to submit spark application to the cluster --- README.md | 26 +++++++++++++++++++++++++- 1 file changed, 25 insertions(+), 1 deletion(-) diff --git a/README.md b/README.md index dd1d6f9..52b675c 100755 --- a/README.md +++ b/README.md @@ -226,7 +226,31 @@ Then, in the container, the `aws` CLI can be used by running `aws `. No ------------------- -Upon succesful deployment of the cluster, make sure to persist the Flintrock configuration in a configuration file. 
Then, the cluster can be inspected/stopped/started/destroyed/scaled etc by using the Flintrock commands (e.g. `flintrock describe test-cluster`, `flintrock destroy test-cluster` etc.) +Upon successful deployment of the cluster, make sure to persist the Flintrock configuration in a configuration file (with `flintrock configure`). Then, the cluster can be inspected/stopped/started/destroyed/scaled etc by using the Flintrock commands (e.g. `flintrock describe test-cluster`, `flintrock destroy test-cluster` etc.) + +#### Running an application on the cluster + +Upon successful instantiation of the cluster, the steps to deploy an application are: +1. Package your application (create a `.jar`): +```console +mvn package +``` +2. As documented [here](https://medium.com/@jon.froiland/apache-spark-and-hadoop-on-an-aws-cluster-with-flintrock-part-4-42cf55787928): + - Move the `.jar` to the cluster via `flintrock copy-file`, e.g.: + ```console + flintrock copy-file test-cluster $DIZK/target/neodizk-0.1.0.jar /home/ec2-user/ + ``` + - Login to the cluster via `flintrock login`, e.g.: + ```console + flintrock login test-cluster + ``` + - Start the application from the master node with `spark-submit`, e.g.: + ```console + spark-submit --class profiler.Profiler /home/ec2-user/neodizk-0.1.0.jar 2 1 8G zksnark-large 15 4 + ``` +3. (Optional) Access SparkUI from your host machine: + - `:8080` + - `:4040`, where `` can be obtained by running `flintrock describe` ## Benchmarks From 444f8e19e5e7b9eb6f581be6f3bb5cb1c2bf366e Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 16:11:45 +0000 Subject: [PATCH 82/94] Added note on configuration flags for spark-submit --- README.md | 18 ++++++++++++++++++ 1 file changed, 18 insertions(+) diff --git a/README.md b/README.md index 52b675c..64bf03a 100755 --- a/README.md +++ b/README.md @@ -252,6 +252,24 @@ mvn package - `:8080` - `:4040`, where `` can be obtained by running `flintrock describe` +**Note:** Additional configuration parameters can be passed to the `spark-submit` command, e.g.: +```console +--conf spark.memory.fraction +--conf spark.memory.storageFraction +... +--conf spark.rdd.compress +... +--conf spark.speculation +--conf spark.speculation.interval +--conf spark.speculation.multiplier +... +--conf spark.logConf +--conf spark.eventLog.enabled +--conf spark.eventLog.dir +... +``` +See [here](https://spark.apache.org/docs/latest/configuration.html) for more information on the configuration, and see [this blog post](https://yousry.medium.com/spark-speculative-execution-in-10-lines-of-code-3c6e4815875b) for an introduction to speculative execution in Spark. + ## Benchmarks We evaluate the distributed implementation of the zkSNARK setup and prover. 
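For illustration, a hypothetical profiler run combining some of the `--conf` flags listed above with the invocation used earlier might look as follows (the flag values are example settings taken from the legacy profiling script, not tuned recommendations):
```console
spark-submit --class profiler.Profiler \
  --conf spark.memory.fraction=0.9 \
  --conf spark.memory.storageFraction=0.3 \
  --conf spark.rdd.compress=true \
  --conf spark.speculation=true \
  --conf spark.speculation.interval=5000ms \
  --conf spark.speculation.multiplier=2 \
  /home/ec2-user/neodizk-0.1.0.jar 2 1 8G zksnark-large 15 4
```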
From cbdcb710ee20633c2d9e7527320bca5288b04327 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 17:00:56 +0000 Subject: [PATCH 83/94] Added instructions to fetch logs from cluster --- README.md | 14 +++++++++----- 1 file changed, 9 insertions(+), 5 deletions(-) diff --git a/README.md b/README.md index 64bf03a..fb3a350 100755 --- a/README.md +++ b/README.md @@ -246,12 +246,20 @@ mvn package ``` - Start the application from the master node with `spark-submit`, e.g.: ```console - spark-submit --class profiler.Profiler /home/ec2-user/neodizk-0.1.0.jar 2 1 8G zksnark-large 15 4 + # Create a location to store the logs of the application and pass it to the spark-submit command + mkdir /tmp/spark-events + spark-submit --class profiler.Profiler --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=/tmp/spark-events /home/ec2-user/neodizk-0.1.0.jar 2 1 8G zksnark-large 15 4 ``` + **Note:** The above can also be carried out directly from the host (without login to the master node of the cluster) via the `flintrock run-command` command. 3. (Optional) Access SparkUI from your host machine: - `:8080` - `:4040`, where `` can be obtained by running `flintrock describe` +4. (Optional) If the `spark-submit` command is used along with the `--conf spark.eventLog.enabled=true` and `--conf spark.eventLog.dir=/tmp/spark-events` flags, the logs can be recovered on the host by running: +```console +scp -i -r ec2-user@:/tmp/spark-events/src/main/resources/logs/ $DIZK/out/ +``` + **Note:** Additional configuration parameters can be passed to the `spark-submit` command, e.g.: ```console --conf spark.memory.fraction @@ -263,10 +271,6 @@ mvn package --conf spark.speculation.interval --conf spark.speculation.multiplier ... ---conf spark.logConf ---conf spark.eventLog.enabled ---conf spark.eventLog.dir -... ``` See [here](https://spark.apache.org/docs/latest/configuration.html) for more information on the configuration, and see [this blog post](https://yousry.medium.com/spark-speculative-execution-in-10-lines-of-code-3c6e4815875b) for an introduction to speculative execution in Spark. From 24428ab46973305279dd46221e522b326d564e9b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 17:02:37 +0000 Subject: [PATCH 84/94] Removed broken legacy profiler scripts --- src/main/java/profiler/scripts/destroy.sh | 9 ---- src/main/java/profiler/scripts/fetch_logs.sh | 9 ---- src/main/java/profiler/scripts/init.sh | 19 -------- src/main/java/profiler/scripts/launch.sh | 16 ------- src/main/java/profiler/scripts/login.sh | 10 ----- src/main/java/profiler/scripts/profile.sh | 43 ------------------- .../profiler/scripts/setup_environment.sh | 16 ------- .../java/profiler/scripts/upload_and_login.sh | 16 ------- 8 files changed, 138 deletions(-) delete mode 100755 src/main/java/profiler/scripts/destroy.sh delete mode 100755 src/main/java/profiler/scripts/fetch_logs.sh delete mode 100755 src/main/java/profiler/scripts/init.sh delete mode 100755 src/main/java/profiler/scripts/launch.sh delete mode 100755 src/main/java/profiler/scripts/login.sh delete mode 100755 src/main/java/profiler/scripts/profile.sh delete mode 100755 src/main/java/profiler/scripts/setup_environment.sh delete mode 100755 src/main/java/profiler/scripts/upload_and_login.sh diff --git a/src/main/java/profiler/scripts/destroy.sh b/src/main/java/profiler/scripts/destroy.sh deleted file mode 100755 index dfc937c..0000000 --- a/src/main/java/profiler/scripts/destroy.sh +++ /dev/null @@ -1,9 +0,0 @@ -#!/usr/bin/env bash - -. 
init.sh - -$SPARK_EC2_PATH \ - --key-pair=$AWS_KEYPAIR_NAME \ - --identity-file=$AWS_KEYPAIR_PATH \ - --region=$AWS_REGION_ID \ - destroy $AWS_CLUSTER_NAME diff --git a/src/main/java/profiler/scripts/fetch_logs.sh b/src/main/java/profiler/scripts/fetch_logs.sh deleted file mode 100755 index 06c3bae..0000000 --- a/src/main/java/profiler/scripts/fetch_logs.sh +++ /dev/null @@ -1,9 +0,0 @@ -#!/usr/bin/env bash - -. init.sh - -# Get master node URL -readonly MASTER=`$SPARK_EC2_PATH -k $AWS_KEYPAIR_NAME -i $AWS_KEYPAIR_PATH --region=$AWS_REGION_ID get-master $AWS_CLUSTER_NAME | grep amazonaws.com` - -# Transfer logs back to local -scp -i $AWS_KEYPAIR_PATH -r ec2-user@$MASTER:/tmp/spark-events/src/main/resources/logs/ $DIZK_REPO_PATH/out/ \ No newline at end of file diff --git a/src/main/java/profiler/scripts/init.sh b/src/main/java/profiler/scripts/init.sh deleted file mode 100755 index 6b1b35e..0000000 --- a/src/main/java/profiler/scripts/init.sh +++ /dev/null @@ -1,19 +0,0 @@ -#!/usr/bin/env bash - -export AWS_ACCESS_KEY_ID= -export AWS_SECRET_ACCESS_KEY= - -export AWS_KEYPAIR_NAME="" -export AWS_KEYPAIR_PATH="" - -export AWS_REGION_ID= -export AWS_CLUSTER_NAME= - -export SPOT_PRICE= -export SLAVES_COUNT= -export INSTANCE_TYPE= - -export DIZK_REPO_PATH="" -export SPARK_EC2_PATH="" - - diff --git a/src/main/java/profiler/scripts/launch.sh b/src/main/java/profiler/scripts/launch.sh deleted file mode 100755 index 0e43933..0000000 --- a/src/main/java/profiler/scripts/launch.sh +++ /dev/null @@ -1,16 +0,0 @@ -#!/usr/bin/env bash - -. init.sh - -# Instantiate EC2 cluster -$SPARK_EC2_PATH \ - --use-existing-master \ - --key-pair=$AWS_KEYPAIR_NAME \ - --identity-file=$AWS_KEYPAIR_PATH \ - --region=$AWS_REGION_ID \ - --spot-price=$SPOT_PRICE \ - --slaves=$SLAVES_COUNT \ - --instance-type=$INSTANCE_TYPE \ - --deploy-root-dir=$DIZK_REPO_PATH/src/main/java/profiler/scripts \ - --spark-version=2.1.0 \ - launch $AWS_CLUSTER_NAME diff --git a/src/main/java/profiler/scripts/login.sh b/src/main/java/profiler/scripts/login.sh deleted file mode 100755 index f9d2ad7..0000000 --- a/src/main/java/profiler/scripts/login.sh +++ /dev/null @@ -1,10 +0,0 @@ -#!/usr/bin/env bash - -. 
init.sh - -# Login -$SPARK_EC2_PATH \ - --key-pair=$AWS_KEYPAIR_NAME \ - --identity-file=$AWS_KEYPAIR_PATH \ - --region=$AWS_REGION_ID \ - login $AWS_CLUSTER_NAME \ No newline at end of file diff --git a/src/main/java/profiler/scripts/profile.sh b/src/main/java/profiler/scripts/profile.sh deleted file mode 100755 index b988f33..0000000 --- a/src/main/java/profiler/scripts/profile.sh +++ /dev/null @@ -1,43 +0,0 @@ -#!/usr/bin/env bash - -# Copy project to worker nodes -./spark-ec2/copy-dir /home/ec2-user/ - -export JAVA_HOME="/usr/lib/jvm/java-1.8.0" - -for TOTAL_CORES in 8; do - for SIZE in `seq 15 25`; do - - export APP=dizk-large - export MEMORY=16G - export MULTIPLIER=2 - - export CORES=1 - export NUM_EXECUTORS=$((TOTAL_CORES / CORES)) - export NUM_PARTITIONS=$((TOTAL_CORES * MULTIPLIER)) - - /root/spark/bin/spark-submit \ - --conf spark.driver.memory=$MEMORY \ - --conf spark.driver.maxResultSize=$MEMORY \ - --conf spark.executor.cores=$CORES \ - --total-executor-cores $TOTAL_CORES \ - --conf spark.executor.memory=$MEMORY \ - --conf spark.memory.fraction=0.95 \ - --conf spark.memory.storageFraction=0.3 \ - --conf spark.kryoserializer.buffer.max=1g \ - --conf spark.rdd.compress=true \ - --conf spark.rpc.message.maxSize=1024 \ - --conf spark.executor.heartbeatInterval=30s \ - --conf spark.network.timeout=300s\ - --conf spark.speculation=true \ - --conf spark.speculation.interval=5000ms \ - --conf spark.speculation.multiplier=2 \ - --conf spark.local.dir=/mnt/spark \ - --conf spark.logConf=true \ - --conf spark.eventLog.dir=/tmp/spark-events \ - --conf spark.eventLog.enabled=false \ - --class "profiler.Profiler" \ - /home/ec2-user/dizk-1.0.jar $NUM_EXECUTORS $CORES $MEMORY $APP $SIZE $NUM_PARTITIONS - done -done - diff --git a/src/main/java/profiler/scripts/setup_environment.sh b/src/main/java/profiler/scripts/setup_environment.sh deleted file mode 100755 index f3a6ebd..0000000 --- a/src/main/java/profiler/scripts/setup_environment.sh +++ /dev/null @@ -1,16 +0,0 @@ -#!/usr/bin/env bash - -# Create logging directory -mkdir /tmp/spark-events -export SPARK_EVENTS_DIR="/tmp/spark-events" -/root/spark-ec2/copy-dir /tmp/spark-events - -# Install dependencies -sudo yum install -y tmux -sudo yum install -y java-1.8.0-openjdk-devel -sudo yum remove -y java-1.7.0-openjdk -export JAVA_HOME="/usr/lib/jvm/java-1.8.0" - -pssh -h /root/spark-ec2/slaves yum install -y java-1.8.0-openjdk-devel -pssh -h /root/spark-ec2/slaves yum remove -y java-1.7.0-openjdk -pssh -h /root/spark-ec2/slaves export JAVA_HOME="/usr/lib/jvm/java-1.8.0" diff --git a/src/main/java/profiler/scripts/upload_and_login.sh b/src/main/java/profiler/scripts/upload_and_login.sh deleted file mode 100755 index 6829d76..0000000 --- a/src/main/java/profiler/scripts/upload_and_login.sh +++ /dev/null @@ -1,16 +0,0 @@ -#!/usr/bin/env bash - -. 
init.sh - -# Get master node URL -readonly MASTER=`$SPARK_EC2_PATH -k $AWS_KEYPAIR_NAME -i $AWS_KEYPAIR_PATH --region=$AWS_REGION_ID get-master $AWS_CLUSTER_NAME | grep amazonaws.com` - -# Transfer project to master node -scp -i $AWS_KEYPAIR_PATH $DIZK_REPO_PATH/target/dizk-1.0.jar ec2-user@$MASTER:/home/ec2-user/ - -# Login -$SPARK_EC2_PATH \ - --key-pair=$AWS_KEYPAIR_NAME \ - --identity-file=$AWS_KEYPAIR_PATH \ - --region=$AWS_REGION_ID \ - login $AWS_CLUSTER_NAME \ No newline at end of file From 1b97a722a52a90f1f0fdd4fa02e86e404a96c2d4 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 3 Feb 2021 17:08:08 +0000 Subject: [PATCH 85/94] Added TODO to rename class --- src/main/java/profiler/utils/SparkUtils.java | 1 + 1 file changed, 1 insertion(+) diff --git a/src/main/java/profiler/utils/SparkUtils.java b/src/main/java/profiler/utils/SparkUtils.java index 31f1150..5e91337 100755 --- a/src/main/java/profiler/utils/SparkUtils.java +++ b/src/main/java/profiler/utils/SparkUtils.java @@ -56,6 +56,7 @@ public static String appName(final String s) { } } + // TODO: Rename this class public static Class[] zksparkClasses() { return new Class[] { Fp.class, From 1af7023720249d8b064825b0e1a48d9600f1b7ff Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:03:57 +0000 Subject: [PATCH 86/94] Added ganglia configuration scripts --- scripts/ganglia_setup_master.sh | 51 +++++++++++++++++++++++++++++++++ scripts/ganglia_setup_worker.sh | 28 ++++++++++++++++++ 2 files changed, 79 insertions(+) create mode 100755 scripts/ganglia_setup_master.sh create mode 100755 scripts/ganglia_setup_worker.sh diff --git a/scripts/ganglia_setup_master.sh b/scripts/ganglia_setup_master.sh new file mode 100755 index 0000000..bfa1bdf --- /dev/null +++ b/scripts/ganglia_setup_master.sh @@ -0,0 +1,51 @@ +#!/usr/bin/env bash + +# Expect 2 non-empty arguments +if [ "$#" -ne 1 ] || [ "$1" == "" ] ; then + echo "Error: invalid number of arguments" + echo "Usage: $0 " + echo "" + exit 1 +fi + +CLUSTER_NAME=$1 + +# Install ganglia dependencies +echo "INFO: install ganglia dependencies" +yum update -y +amazon-linux-extras install -y epel +yum install -y ganglia rrdtool ganglia-gmetad ganglia-gmond ganglia-web + +echo "INFO: edit /etc/ganglia/gmetad.conf" +# Keep the default port (8649) and default polling interval (15s) +sed -i "s/my cluster/$CLUSTER_NAME/g" /etc/ganglia/gmetad.conf + +echo "INFO: edit /etc/ganglia/gmond.conf" +sed -i 's/ name = "unspecified"/ name = '\"$CLUSTER_NAME\"'/g' /etc/ganglia/gmond.conf +# Ports in gmond are already set by default so no need to change +sed -i "0,/ mcast_join = /! {0,/ mcast_join = / s/ mcast_join =/ #mcast_join = /}" /etc/ganglia/gmond.conf +sed -i "s/ mcast_join = .*/ host = localhost/g" /etc/ganglia/gmond.conf +sed -i "s/ bind = / #bind = /g" /etc/ganglia/gmond.conf +sed -i "s/ retry_bind = / #retry_bind = /g" /etc/ganglia/gmond.conf + +# Restart the services after editing the config files +echo "INFO: restarting the services" +service httpd restart +service gmond restart +service gmetad restart + +# Edit the default config `/etc/httpd/conf.d/ganglia.conf` +echo "INFO: make sure to correctly configure the webserver (then restart the httpd service)" +# E.g. 
set up some auth to access the ganglia dashboard +# $ htpasswd -c /etc/httpd/auth.basic adminganglia +# $ vi /etc/httpd/conf.d/ganglia.conf +## Alias /ganglia /usr/share/ganglia +## +## AuthType basic +## AuthName "Ganglia web UI" +## AuthBasicProvider file +## AuthUserFile "/etc/httpd/auth.basic" +## Require user adminganglia +## + +echo "INFO: make sure to correctly configure the AWS security-group to be able to access the ganglia dashboard" diff --git a/scripts/ganglia_setup_worker.sh b/scripts/ganglia_setup_worker.sh new file mode 100755 index 0000000..1583546 --- /dev/null +++ b/scripts/ganglia_setup_worker.sh @@ -0,0 +1,28 @@ +#!/usr/bin/env bash + +# Expect 2 non-empty arguments +if [ "$#" -ne 2 ] || [ "$1" == "" ] || [ "$2" == "" ] ; then + echo "Error: invalid number of arguments" + echo "Usage: $0 " + echo "" + exit 1 +fi + +CLUSTER_NAME=$1 +GMETAD_HOST=$2 + +# Install ganglia dependencies +echo "INFO: install ganglia dependencies" +yum update -y +amazon-linux-extras install -y epel +yum install -y ganglia ganglia-gmond + +echo "INFO: edit gmond.conf" +sed -i 's/ name = "unspecified"/ name = '\"$CLUSTER_NAME\"'/g' /etc/ganglia/gmond.conf +sed -i "0,/ mcast_join = /! {0,/ mcast_join = / s/ mcast_join = /#mcast_join = /}" /etc/ganglia/gmond.conf +sed -i "s/ mcast_join = .*/ host = $GMETAD_HOST/g" /etc/ganglia/gmond.conf +sed -i "s/ bind = / #bind = /g" /etc/ganglia/gmond.conf +sed -i "s/ retry_bind = / #retry_bind = /g" /etc/ganglia/gmond.conf + +echo "INFO: restarting the services" +service gmond restart From 708d79c5d65d67b53e62b3d5fba665db4b725ae9 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:04:10 +0000 Subject: [PATCH 87/94] Added ganglia configuration instructions --- README.md | 63 +++++++++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 63 insertions(+) diff --git a/README.md b/README.md index fb3a350..ad94b9e 100755 --- a/README.md +++ b/README.md @@ -274,6 +274,69 @@ scp -i -r ec2-user@:/tmp/spark-events/src/main/res ``` See [here](https://spark.apache.org/docs/latest/configuration.html) for more information on the configuration, and see [this blog post](https://yousry.medium.com/spark-speculative-execution-in-10-lines-of-code-3c6e4815875b) for an introduction to speculative execution in Spark. +### Setup monitoring infrastructure + +To have more metrics about the cluster's health and usage, monitoring tools like [ganglia](http://ganglia.sourceforge.net/) can be used. This is particularly important to carry out meaningful optimizations. + +To use ganglia with apache spark, spark needs to be compiled from source due to [license mismatch](https://github.com/apache/spark/blob/master/pom.xml#L3162-L3168). To install a ganglia-compatible version of spark on the cluster, you can modify flintrock [here](https://github.com/nchammas/flintrock/blob/master/flintrock/services.py#L342) as follows: +```diff +-./dev/make-distribution.sh -Phadoop-{hadoop_short_version} ++./dev/make-distribution.sh -Pspark-ganglia-lgpl -Phadoop-{hadoop_short_version} +``` + +and make sure to build spark from a specific commit, by using: `flintrock launch --spark-git-commit 97340c1e34cfd84de445b6b7545cfa466a1baaf6 [other flags]` (here commit `97340c1e34cfd84de445b6b7545cfa466a1baaf6` corresponds to apache version [3.1.0](https://github.com/apache/spark/releases/tag/v3.1.0)). + +#### Configure the master node + +Once the cluster is started: +1. 
Configure the master node to run ganglia: +```console +flintrock copy-file scripts/ganglia_setup_master.sh /home/ec2-user/ --master-only +flintrock run-command test-cluster --master-only 'sudo /home/ec2-user/ganglia_setup_master.sh test-cluster' +``` +2. Make sure to configure the webserver appropriately by editing `/etc/httpd/conf/httpd.conf` as desired (e.g. change default port) +3. Edit `/etc/httpd/conf.d/ganglia.conf` as desired (for e.g. write the auth configuration to access the dashboard) +4. Restart the `httpd` service: `service httpd restart` +5. Double check the AWS rules of the relevant security groups and make sure they align with the configuration above. + +#### Configure the worker nodes + +After configuring the master node, configure the worker nodes to send their metrics information to the master/reporting node (since flintrock only has a flag `--master-only` for the `copy-file` and `run-command` commands - and no flag `--workers-only`, we use ssh/scp commands to achieve the same thing below): +```console +# Copy the configuration script to the each worker node +scp -i $AWS_KEYPAIR_PATH scripts/ganglia_setup_worker.sh ec2-user@:/home/ec2-user/ +# Connect to each worker node +ssh -i $AWS_KEYPAIR_PATH ec2-user@ +# On the node execute the following commands +sudo ./ganglia_setup_worker.sh +``` + +#### Configure Spark to use the GangliaSink + +Write a spark metrics configuration file. To do so, paste the following configuration +```console +*.sink.ganglia.class = org.apache.spark.metrics.sink.GangliaSink +*.sink.ganglia.host = +*.sink.ganglia.port = 8649 +*.sink.ganglia.period = 10 +*.sink.ganglia.unit = seconds +*.sink.ganglia.ttl = 1 +*.sink.ganglia.mode = unicast +*.sink.ganglia.name = Spark-name + +*.sink.console.class = org.apache.spark.metrics.sink.ConsoleSink +*.sink.console.period = 10 +*.sink.console.unit = seconds + +master.source.jvm.class = org.apache.spark.metrics.source.JvmSource +worker.source.jvm.class = org.apache.spark.metrics.source.JvmSource +driver.source.jvm.class = org.apache.spark.metrics.source.JvmSource +executor.source.jvm.class = org.apache.spark.metrics.source.JvmSource +``` +in `$SPARK_HOME/conf/metrics.properties` on all nodes of the cluster (make sure to replace `` by the actual host node IP in the cluster). + +After these steps, one can access the ganglia dashboard from the master/host node. Upon submission of a job on the cluster via `spark-submit`, the metrics of the various spark cluster nodes can be monitored on the dashboard - in addition to the SparkUI. + ## Benchmarks We evaluate the distributed implementation of the zkSNARK setup and prover. From d8d1f130c5af81bcd753ce182db61a7950119a22 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:12:06 +0000 Subject: [PATCH 88/94] Fixed typo in README --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index ad94b9e..32921b5 100755 --- a/README.md +++ b/README.md @@ -311,7 +311,7 @@ ssh -i $AWS_KEYPAIR_PATH ec2-user@ sudo ./ganglia_setup_worker.sh ``` -#### Configure Spark to use the GangliaSink +#### Configure Spark to use GangliaSink Write a spark metrics configuration file. 
To do so, paste the following configuration ```console From d424da6a41e473d02e62b92d899af233ae661195 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:14:52 +0000 Subject: [PATCH 89/94] Fixed typo in script ganglia master --- scripts/ganglia_setup_master.sh | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/scripts/ganglia_setup_master.sh b/scripts/ganglia_setup_master.sh index bfa1bdf..cb67032 100755 --- a/scripts/ganglia_setup_master.sh +++ b/scripts/ganglia_setup_master.sh @@ -1,6 +1,6 @@ #!/usr/bin/env bash -# Expect 2 non-empty arguments +# Expect 1 non-empty argument if [ "$#" -ne 1 ] || [ "$1" == "" ] ; then echo "Error: invalid number of arguments" echo "Usage: $0 " From 62b2f6e241918e4b083f3945bbeda4745a18351d Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:20:36 +0000 Subject: [PATCH 90/94] Fixed typo in command --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 32921b5..4b23640 100755 --- a/README.md +++ b/README.md @@ -292,7 +292,7 @@ Once the cluster is started: 1. Configure the master node to run ganglia: ```console flintrock copy-file scripts/ganglia_setup_master.sh /home/ec2-user/ --master-only -flintrock run-command test-cluster --master-only 'sudo /home/ec2-user/ganglia_setup_master.sh test-cluster' +flintrock run-command --master-only 'sudo /home/ec2-user/ganglia_setup_master.sh ' ``` 2. Make sure to configure the webserver appropriately by editing `/etc/httpd/conf/httpd.conf` as desired (e.g. change default port) 3. Edit `/etc/httpd/conf.d/ganglia.conf` as desired (for e.g. write the auth configuration to access the dashboard) From 17d843c9655a5718e43179da57144a2b95566087 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:41:44 +0000 Subject: [PATCH 91/94] Added Clearmatics copyright --- LICENSE | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/LICENSE b/LICENSE index 1afe90f..17c2579 100755 --- a/LICENSE +++ b/LICENSE @@ -1,6 +1,7 @@ The DIZK library is developed by SCIPR Lab (http://scipr-lab.org) and contributors. +Copyright (c) 2020-2021 Clearmatics Copyright (c) 2018 SCIPR Lab and contributors (see AUTHORS file). All files, with the exceptions below, are released under the MIT License: @@ -21,4 +22,4 @@ All files, with the exceptions below, are released under the MIT License: AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN - THE SOFTWARE. \ No newline at end of file + THE SOFTWARE. From 4a7745ac160b817cae3ee73174225b0c1e5a1910 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 14:47:46 +0000 Subject: [PATCH 92/94] Removed dated sections of the README --- README.md | 57 ++++--------------------------------------------------- 1 file changed, 4 insertions(+), 53 deletions(-) diff --git a/README.md b/README.md index 4b23640..71dedf0 100755 --- a/README.md +++ b/README.md @@ -1,18 +1,11 @@ -

-DIZK
+(NEO)DIZK
+Java library for DIstributed Zero Knowledge proof systems

-___DIZK___ (pronounced */'dizək/*) is a Java library for distributed zero knowledge proof systems. The library implements distributed polynomial evaluation/interpolation, computation of Lagrange polynomials, and multi-scalar multiplication. Using these scalable arithmetic subroutines, the library provides a distributed zkSNARK proof system that enables verifiable computations of up to billions of logical gates, far exceeding the scale of previous state-of-the-art solutions. +Java library for distributed zero knowledge proof systems. The library implements distributed polynomial evaluation/interpolation, computation of Lagrange polynomials, and multi-scalar multiplication. Using these scalable arithmetic subroutines, the library provides a distributed zkSNARK proof system that enables verifiable computations of up to billions of logical gates, far exceeding the scale of previous state-of-the-art solutions. -The library is developed by [SCIPR Lab](http://www.scipr-lab.org/) and contributors (see [AUTHORS](AUTHORS) file) and is released under the MIT License (see [LICENSE](LICENSE) file). +:rotating_light: **WARNING:** This is an academic proof-of-concept prototype. This implementation is not ready for production use. It does not yet contain all the features, careful code review, tests, and integration that are needed for a deployment! -The library is developed as part of a paper called *"[DIZK: A Distributed Zero Knowledge Proof System](https://eprint.iacr.org/2018/691)"*. - -**WARNING:** This is an academic proof-of-concept prototype. This implementation is not ready for production use. It does not yet contain all the features, careful code review, tests, and integration that are needed for a deployment! +**Disclaimer:** This work is derived from the SCIPR-Lab's library [DIZK](https://github.com/scipr-lab/dizk) which was developed as part of a paper called *"[DIZK: A Distributed Zero Knowledge Proof System](https://eprint.iacr.org/2018/691)"*. ## Table of contents @@ -20,7 +13,6 @@ The library is developed as part of a paper called *"[DIZK: A Distributed Zero K - [Overview](#overview) - [Build guide](#build-guide) - [Configuring AWS and using Flintrock to manage a testing cluster](#configuring-aws-and-using-flintrock-to-manage-a-testing-cluster) -- [Benchmarks](#benchmarks) - [References](#references) - [License](#license) @@ -337,40 +329,6 @@ in `$SPARK_HOME/conf/metrics.properties` on all nodes of the cluster (make sure After these steps, one can access the ganglia dashboard from the master/host node. Upon submission of a job on the cluster via `spark-submit`, the metrics of the various spark cluster nodes can be monitored on the dashboard - in addition to the SparkUI. -## Benchmarks - -We evaluate the distributed implementation of the zkSNARK setup and prover. -Below we use *instance size* to denote the number of constraints in an R1CS instance. - -### libsnark *vs* DIZK - -We measure the largest instance size (as a power of 2) that is supported by: - -- the serial implementation of PGHR’s protocol in [libsnark](https://github.com/scipr-lab/libsnark) -- the serial implementation of Groth’s protocol in [libsnark](https://github.com/scipr-lab/libsnark) -- the distributed implementation of Groth's protocol in **DIZK** - -

- -We see that using more executors allows us to support larger instance sizes, -in particular supporting billions of constraints with sufficiently many executors. -Instances of this size are much larger than what was previously possible via serial techniques. - -### Distributed zkSNARK - -We benchmark the running time of the setup and the prover on an increasing number of constraints and with an increasing number of executors. -Note that we do not need to evaluate the zkSNARK verifier as it is a simple and fast algorithm that can be run even on a smartphone. - -

- -

- -Our benchmarks of the setup and the prover show us that: - -1. For a given number of executors, running times increase nearly linearly as expected, demonstrating scalability over a wide range of instance sizes. - -2. For a given instance size, running times decrease nearly linearly as expected, demonstrating parallelization over a wide range of number of executors. - ## References [Apa17] [_Apache Spark_](http://spark.apache.org/), @@ -433,13 +391,6 @@ Robert A. van de Geijn and Jerrell Watts, Ryan Williams, *Conference on Computational Complexity*, 2016 -## Acknowledgements - -This work was supported by Intel/NSF CPS-Security grants, -the [UC Berkeley Center for Long-Term Cybersecurity](https://cltc.berkeley.edu/), -and gifts to the [RISELab](https://rise.cs.berkeley.edu/) from Amazon, Ant Financial, CapitalOne, Ericsson, GE, Google, Huawei, IBM, Intel, Microsoft, and VMware. -The authors thank Amazon for donating compute credits to RISELab, which were extensively used in this project. - ## License [MIT License](LICENSE) From 56c49246a9f8883eaa5251b2bceb56288cb4ae3b Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 18:37:59 +0000 Subject: [PATCH 93/94] Pruned troubleshooting --- troubleshooting.md | 43 +------------------------------------------ 1 file changed, 1 insertion(+), 42 deletions(-) diff --git a/troubleshooting.md b/troubleshooting.md index 5400932..b38c713 100644 --- a/troubleshooting.md +++ b/troubleshooting.md @@ -1,28 +1,5 @@ # DIZK profiling -## macOS issues - -Some users have reported issues running DIZK on macOS. A Docker container can be used to work around this issue. - -``` -cd your_dizk_project_directory - -docker build -t dizk-container . -docker run -it dizk-container bash -``` - -- Launch script requires editing `dizk/depends/spark-ec2/spark_ec2.py' because of depreciated boto version. -- Launching spark on AWS-EC2 requires specific region where spark images are deployable: - - See [AMI-LIST](https://github.com/amplab/spark-ec2/tree/branch-2.0/ami-list) -- Also the launch script expects to have credentials for using **AWS EC2 spot instances** (not just EC2). -- profiler.sh had wrong specification path (depending on your `pwd`) and line 4 was changed to - -``` -/root/spark-ec2/copy-dir /home/ec2-user/ -``` -- Line 11 in [profile.sh](https://github.com/scipr-lab/dizk/blob/e98dd9cba0a3ec99403b191133003aeef94b1e8a/src/main/java/profiler/scripts/profile.sh#L11) should be set to a string from the list described in the switch statement in [Profiler.java](https://github.com/scipr-lab/dizk/blob/e98dd9cba0a3ec99403b191133003aeef94b1e8a/src/main/java/profiler/Profiler.java#L41-L71) - - [FixMe] This should throw an error. - ## Profiler hanging - When running the distributed zkSNARK algorithms, the profiler (and your interactions from the terminal) may hang. This is often a sign that your cluster has hit its memory limit (read: you are running an instance size that is too big for the given memory resources). You should try summoning larger instances or specify more memory for each instance in order to progress from this issue. 
@@ -35,34 +12,16 @@ It could be useful to clone and recompile the project right inside the server Install Maven (from .tar) ``` -export MAVEN_VERSION=3.3.9 +export MAVEN_VERSION=3.6.0 mkdir -p /usr/share/maven curl -fsSL http://apache.osuosl.org/maven/maven-3/$MAVEN_VERSION/binaries/apache-maven-$MAVEN_VERSION-bin.tar.gz | tar -xzC /usr/share/maven --strip-components=1 ln -s /usr/share/maven/bin/mvn /usr/bin/mvn export MAVEN_HOME=/usr/share/maven ``` -## Setting `JAVA_HOME` - -Because of how `spark-ec2` instantiates the environment, an older version of Java is installed by default. We override this by introducing Java 8 (see `setup_environment.sh`). Some users have reported that the setup script fails to set the correct `JAVA_HOME`. If you run into this issue, set the following command: - -``` -export JAVA_HOME="/usr/lib/jvm/java-1.8.0" -``` - ## Compile without running or testing Compile the project without running or compiling unit tests. - To skip running and compiling `mvn -Dmaven.test.skip=true install` - To compile, but not run unit tests `mvn install -DskipTests` - -## Miscellaneous - -Copy the **jar** file to `/home/ec2-user/` (because this is where the profiler scripts are looking for it. - -Notice that the $SIZE in profile.sh must be greater than the hard-coded values for number of inputs (1023) that are provided in the R1CSConstructor class. - -[This](https://spark.apache.org/docs/1.6.2/ec2-scripts.html) is useful. - -Mostly because you can view the status of the cluster via: `http://:8080` From 2b9c70029e7641e773df0c72938d6052a474a5e7 Mon Sep 17 00:00:00 2001 From: Antoine Rondelet Date: Wed, 10 Feb 2021 18:38:19 +0000 Subject: [PATCH 94/94] Added missing chmod --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 71dedf0..251a28f 100755 --- a/README.md +++ b/README.md @@ -284,7 +284,7 @@ Once the cluster is started: 1. Configure the master node to run ganglia: ```console flintrock copy-file scripts/ganglia_setup_master.sh /home/ec2-user/ --master-only -flintrock run-command --master-only 'sudo /home/ec2-user/ganglia_setup_master.sh ' +flintrock run-command --master-only 'sudo chmod +x /home/ec2-user/ganglia_setup_master.sh && sudo /home/ec2-user/ganglia_setup_master.sh ' ``` 2. Make sure to configure the webserver appropriately by editing `/etc/httpd/conf/httpd.conf` as desired (e.g. change default port) 3. Edit `/etc/httpd/conf.d/ganglia.conf` as desired (for e.g. write the auth configuration to access the dashboard)