Add keccak transcript #27

Open

wants to merge 9 commits into base: main

Changes from 8 commits
2 changes: 2 additions & 0 deletions Cargo.toml
@@ -11,6 +11,8 @@ ark-std = "^0.4.0"
ark-crypto-primitives = { version = "^0.4.0", default-features = false, features = ["r1cs", "sponge"] }
ark-relations = { version = "^0.4.0", default-features = false }
ark-r1cs-std = { version = "^0.4.0", default-features = false }
tiny-keccak = { version = "2.0", features = ["keccak"] }
sha3 = "0.10.8"
Member:

Suggested change
sha3 = "0.10.8"
sha3 = "0.10"

Let's allow the minor version to update automatically (although IIRC Cargo does that anyway). But just in case.

Member:

Also, do we use sha3 instead of sha256 for some reason?

thiserror = "1.0"
rayon = "1.7.0"

24 changes: 16 additions & 8 deletions src/pedersen.rs
@@ -93,22 +93,20 @@ impl<C: CurveGroup> Pedersen<C> {
mod tests {
use super::*;
use crate::transcript::poseidon::{tests::poseidon_test_config, PoseidonTranscript};
use crate::transcript::keccak::{tests::keccak_test_config, KeccakTranscript};
use crate::transcript::sha3::{tests::sha3_test_config, SHA3Transcript};
use crate::transcript::Transcript;

use ark_pallas::{Fr, Projective};

#[test]
fn test_pedersen_vector() {
fn test_pedersen_vector_with<T: Transcript<Projective>>(config: T::TranscriptConfig) {
let mut transcript_p: T = Transcript::<Projective>::new(&config);
let mut transcript_v: T = Transcript::<Projective>::new(&config);
let mut rng = ark_std::test_rng();

const n: usize = 10;
// setup params
let params = Pedersen::<Projective>::new_params(&mut rng, n);
let poseidon_config = poseidon_test_config::<Fr>();

// init Prover's transcript
let mut transcript_p = PoseidonTranscript::<Projective>::new(&poseidon_config);
// init Verifier's transcript
let mut transcript_v = PoseidonTranscript::<Projective>::new(&poseidon_config);

let v: Vec<Fr> = vec![Fr::rand(&mut rng); n];
let r: Fr = Fr::rand(&mut rng);
@@ -117,4 +115,14 @@ mod tests {
let v = Pedersen::<Projective>::verify(&params, &mut transcript_v, cm, proof);
assert!(v);
}

#[test]
fn test_pedersen_vector() {
// Test for Poseidon
test_pedersen_vector_with::<PoseidonTranscript<Projective>>(poseidon_test_config::<Fr>());
// Test for Keccak
test_pedersen_vector_with::<KeccakTranscript<Projective>>(keccak_test_config::<Fr>());
// Test for SHA3
test_pedersen_vector_with::<SHA3Transcript<Projective>>(sha3_test_config::<Fr>());
}
}
92 changes: 92 additions & 0 deletions src/transcript/keccak.rs
@@ -0,0 +1,92 @@
use std::marker::PhantomData;
use tiny_keccak::{Keccak, Hasher};
use ark_ec::CurveGroup;
use ark_ff::{BigInteger, PrimeField};

use crate::transcript::Transcript;

/// KeccakTranscript implements the Transcript trait using the Keccak hash
pub struct KeccakTranscript<C: CurveGroup> {
sponge: Keccak,
phantom: PhantomData<C>,
Member:

Suggested change
phantom: PhantomData<C>,
_marker: PhantomData<C>,

This is how PhantomData is usually named inside structs in Rust.

}

#[derive(Debug)]
pub struct KeccakConfig {}

impl<C: CurveGroup> Transcript<C> for KeccakTranscript<C> {
type TranscriptConfig = KeccakConfig;
fn new(config: &Self::TranscriptConfig) -> Self {
let _ = config;
Comment on lines +19 to +20
Member:

If we're going to drop the config, why take it as a parameter in the first place?

Collaborator (@arnaucube, Oct 12, 2023):

This comes from the Transcript trait, which in cases like Poseidon needs the config. For Keccak it is not used, but it is needed in the function signature to fit the Transcript trait, so I'm happy with the most Rust-style approach.

What I see done in other similar arkworks cases is, for example:

In our case for this Keccak256, we could do something like what is done in the last link for sha256, something like:

Suggested change
fn new(config: &Self::TranscriptConfig) -> Self {
let _ = config;
fn new(_config: &Self::TranscriptConfig) -> Self {

let sponge = Keccak::v256();
Self {
sponge,
phantom: PhantomData,
Member:

Suggested change
phantom: PhantomData,
_marker: PhantomData,

}
}

fn absorb(&mut self, v: &C::ScalarField) {
self.sponge.update(&(v.into_bigint().to_bytes_le()));
}
fn absorb_vec(&mut self, v: &[C::ScalarField]) {
for _v in v {
self.sponge.update(&(_v.into_bigint().to_bytes_le()));
}
}
fn absorb_point(&mut self, p: &C) {
let mut serialized = vec![];
p.serialize_compressed(&mut serialized).unwrap();
self.sponge.update(&(serialized))
Comment on lines +38 to +39
Member:

We need to make sure that the point is encoded in [bytes_x, bytes_y].flatten() format. If we do any size optimizations, like storing only one coordinate and the sign of the other, simulating the transcript inside a SNARK (if we ever need it) wouldn't work.

Collaborator (@arnaucube, Oct 12, 2023):

Regarding simulating the transcript inside a SNARK, I currently think (I might be wrong) that we will not need it for the Keccak transcript, as mentioned in this comment, since the Keccak transcript is intended to be used only in non-circuit cases (e.g. Nova's main transcript), not in-circuit (e.g. HyperNova's SumCheck transcript, where we would use, for example, the Poseidon transcript so that we can later reproduce it in-circuit) (more details in the original comment itself).
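
For reference, a hedged sketch (not part of the PR) of what the encoding described above could look like: absorbing the full uncompressed affine encoding (x and y) instead of the compressed point, assuming the arkworks 0.4 CanonicalSerialize API (ark_serialize).

```rust
use ark_ec::CurveGroup;
use ark_serialize::CanonicalSerialize;
use tiny_keccak::{Hasher, Keccak};

// Hypothetical helper: absorb both affine coordinates so the exact byte
// stream could be reproduced in-circuit, avoiding the sign-of-y compression
// mentioned in the review comment above.
fn absorb_point_uncompressed<C: CurveGroup>(sponge: &mut Keccak, p: &C) {
    let mut bytes = Vec::new();
    p.into_affine()
        .serialize_uncompressed(&mut bytes)
        .expect("point serialization should not fail");
    sponge.update(&bytes);
}
```

Whether the extra bytes are worth it depends on whether this transcript ever needs to be reproduced in-circuit, which is exactly the question discussed in this thread.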

}
fn get_challenge(&mut self) -> C::ScalarField {
let mut output = [0u8; 32];
self.sponge.clone().finalize(&mut output);
self.sponge.update(&[output[0]]);
Author:

@han0110 Thanks for pointing that out!

You mean we don't need to call absorb() (or update()) after squeeze() here, right?

I just checked, and since finalize() on tiny-keccak/SHA3 does not update the original sponge, we don't need to call absorb() afterwards, as you mentioned.

I have pushed a fix and new tests to make sure of that. ✅ 0685d37
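
For illustration, a hedged sketch (not the actual test added in 0685d37) of the kind of check that confirms finalizing a clone leaves the original tiny-keccak sponge untouched:

```rust
use tiny_keccak::{Hasher, Keccak};

fn main() {
    let mut sponge = Keccak::v256();
    sponge.update(b"absorbed data");

    // finalize() consumes the hasher, so the transcript finalizes clones;
    // the original `sponge` is never mutated by these calls.
    let mut first = [0u8; 32];
    sponge.clone().finalize(&mut first);
    let mut second = [0u8; 32];
    sponge.clone().finalize(&mut second);

    // Two successive squeezes of the same state agree, confirming the
    // clone-and-finalize pattern is read-only with respect to `sponge`.
    assert_eq!(first, second);
}
```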

Collaborator:

Hmm, not really. I mean we should re-initialize self.sponge after we've called finalize; otherwise the next time we call get_challenge it's equal to hashing all the previous data again, which seems inefficient for an EVM verifier.

so it would look like:

         let mut output = [0u8; 32];
-        self.sponge.clone().finalize(&mut output);
+        std::mem::replace(&mut self.sponge, Keccak::v256()).finalize(&mut output);
         self.sponge.update(&[output[0]]);
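
As a standalone helper, a hedged sketch of that suggested pattern (using only the tiny-keccak and ark-ff APIs already present in this PR):

```rust
use ark_ff::PrimeField;
use tiny_keccak::{Hasher, Keccak};

// Hypothetical helper: finalize the current sponge, leave a fresh sponge in
// its place, and derive the challenge from the squeezed output (a single
// byte here, matching the PR's current behaviour).
fn squeeze_challenge<F: PrimeField>(sponge: &mut Keccak) -> F {
    let mut output = [0u8; 32];
    std::mem::replace(sponge, Keccak::v256()).finalize(&mut output);
    // Re-absorb one squeezed byte so later challenges depend on this one.
    sponge.update(&[output[0]]);
    F::from_le_bytes_mod_order(&[output[0]])
}
```

This way each call to get_challenge only hashes what was absorbed since the previous squeeze, which is the efficiency point raised above.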

C::ScalarField::from_le_bytes_mod_order(&[output[0]])
}
fn get_challenge_nbits(&mut self, nbits: usize) -> Vec<bool> {
// TODO
Member:

This should be addressed.

vec![]
}
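
One way the TODO above could be addressed (a hedged sketch, not part of the PR): squeeze a 32-byte digest the same way get_challenge does and expose the first nbits bits of it.

```rust
use tiny_keccak::{Hasher, Keccak};

// Hypothetical helper: squeeze 32 bytes from a clone of the sponge and return
// the first `nbits` bits, least-significant bit of each byte first.
fn squeeze_bits(sponge: &mut Keccak, nbits: usize) -> Vec<bool> {
    assert!(nbits <= 256, "a single 32-byte squeeze yields at most 256 bits");
    let mut output = [0u8; 32];
    sponge.clone().finalize(&mut output);
    sponge.update(&[output[0]]);

    let mut bits = Vec::with_capacity(nbits);
    for &byte in output.iter() {
        for i in 0..8 {
            if bits.len() == nbits {
                return bits;
            }
            bits.push((byte >> i) & 1 == 1);
        }
    }
    bits
}
```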
fn get_challenges(&mut self, n: usize) -> Vec<C::ScalarField> {
let mut output = [0u8; 32];
self.sponge.clone().finalize(&mut output);
self.sponge.update(&[output[0]]);

let c: Vec<C::ScalarField> = output
.iter()
.map(|c| C::ScalarField::from_le_bytes_mod_order(&[*c]))
.collect();
c[..n].to_vec()
}
}

#[cfg(test)]
pub mod tests {
use super::*;
use ark_pallas::{
// constraints::GVar,
Fr, Projective
};
use ark_std::UniformRand;

/// WARNING the method poseidon_test_config is for tests only
Member:

Suggested change
/// WARNING the method poseidon_test_config is for tests only
/// WARNING the method `keccak_test_config` is for tests only

#[cfg(test)]
pub fn keccak_test_config<F: PrimeField>() -> KeccakConfig {
KeccakConfig {}
}

#[test]
fn test_transcript_get_challenge() {
let mut rng = ark_std::test_rng();

const n: usize = 10;
let config = keccak_test_config::<Fr>();

// init transcript
let mut transcript = KeccakTranscript::<Projective>::new(&config);
let v: Vec<Fr> = vec![Fr::rand(&mut rng); n];
let challenges = transcript.get_challenges(v.len());
assert_eq!(challenges.len(), n);
}
}
2 changes: 2 additions & 0 deletions src/transcript/mod.rs
@@ -2,6 +2,8 @@ use ark_ec::CurveGroup;
use ark_std::fmt::Debug;

pub mod poseidon;
pub mod keccak;
pub mod sha3;

pub trait Transcript<C: CurveGroup> {
type TranscriptConfig: Debug;
91 changes: 91 additions & 0 deletions src/transcript/sha3.rs
@@ -0,0 +1,91 @@
use std::marker::PhantomData;
use sha3::{Shake256, digest::*};
Member:

Why use Shake256 instead of Sha3_256?

Member:

I presume it's either for the capacity difference or the security difference after Bernstein's discussions with NIST.
Can you shed some light? I just don't know why we don't use sha256 instead.
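
For context, a hedged sketch of the API difference in the sha3 crate (assuming sha3 0.10): Sha3_256 is a fixed-output hash used through the Digest trait, while Shake256 is an extendable-output function (XOF) that can be squeezed to an arbitrary length, which is what the finalize_boxed(n) calls later in this file rely on.

```rust
use sha3::{
    digest::{ExtendableOutput, Update, XofReader},
    Digest, Sha3_256, Shake256,
};

fn main() {
    // Sha3_256: fixed 32-byte output via the Digest trait.
    let fixed = Sha3_256::digest(b"transcript bytes");
    assert_eq!(fixed.len(), 32);

    // Shake256: extendable-output function; squeeze as many bytes as needed.
    let mut xof = Shake256::default();
    xof.update(b"transcript bytes");
    let mut out = [0u8; 64];
    xof.finalize_xof().read(&mut out);
}
```

Either primitive (or sha256) can produce enough bytes for a single field challenge; the choice mostly affects how variable-length squeezing is expressed.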

use ark_ec::CurveGroup;
use ark_ff::{BigInteger, PrimeField};

use crate::transcript::Transcript;

/// SHA3Transcript implements the Transcript trait using the Keccak hash
Member:

This says "using the Keccak hash", but we are using Shake256.

pub struct SHA3Transcript<C: CurveGroup> {
sponge: Shake256,
phantom: PhantomData<C>,
Member:

Suggested change
phantom: PhantomData<C>,
_marker: PhantomData<C>,

}

#[derive(Debug)]
pub struct SHA3Config {}

impl<C: CurveGroup> Transcript<C> for SHA3Transcript<C> {
type TranscriptConfig = SHA3Config;
fn new(config: &Self::TranscriptConfig) -> Self {
let _ = config;
let sponge = Shake256::default();
Self {
sponge,
phantom: PhantomData,
Copy link
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Suggested change
phantom: PhantomData,
_marker: PhantomData,

}
}

fn absorb(&mut self, v: &C::ScalarField) {
self.sponge.update(&(v.into_bigint().to_bytes_le()));
}
fn absorb_vec(&mut self, v: &[C::ScalarField]) {
for _v in v {
self.sponge.update(&(_v.into_bigint().to_bytes_le()));
}
}
fn absorb_point(&mut self, p: &C) {
let mut serialized = vec![];
p.serialize_compressed(&mut serialized).unwrap();
self.sponge.update(&(serialized))
}
Comment on lines +36 to +40
Member:

Ditto as with keccak.

fn get_challenge(&mut self) -> C::ScalarField {
let output = self.sponge.clone().finalize_boxed(200);
self.sponge.update(&[output[0]]);
C::ScalarField::from_le_bytes_mod_order(&[output[0]])
}
fn get_challenge_nbits(&mut self, nbits: usize) -> Vec<bool> {
// TODO
// should call finalize() then slice the output to n bit challenge
vec![]
}
Comment on lines +45 to +49
Member:

Ditto as with keccak.

fn get_challenges(&mut self, n: usize) -> Vec<C::ScalarField> {
let output = self.sponge.clone().finalize_boxed(n);
self.sponge.update(&[output[0]]);

let c = output
.iter()
.map(|c| C::ScalarField::from_le_bytes_mod_order(&[*c]))
.collect();
c
}
}

#[cfg(test)]
pub mod tests {
use super::*;
use ark_pallas::{
// constraints::GVar,
Fr, Projective
};
use ark_std::UniformRand;

/// WARNING the method poseidon_test_config is for tests only
Member:

Suggested change
/// WARNING the method poseidon_test_config is for tests only
/// WARNING the method `sha3_256_test_config` is for tests only

Use sha3 or whatever name is decided in the end. But this is definitely not Poseidon.

#[cfg(test)]
pub fn sha3_test_config<F: PrimeField>() -> SHA3Config {
SHA3Config {}
}

#[test]
fn test_transcript_get_challenge() {
let mut rng = ark_std::test_rng();

const n: usize = 10;
let config = sha3_test_config::<Fr>();

// init transcript
let mut transcript = SHA3Transcript::<Projective>::new(&config);
let v: Vec<Fr> = vec![Fr::rand(&mut rng); n];
let challenges = transcript.get_challenges(v.len());
assert_eq!(challenges.len(), n);
}
}