This implements the identification scheme from https://eprint.iacr.org/2019/262.pdf, §5.2 (pg 24), "Construction". It is a generic sigma protocol.
I am using the parameters you generated. In order to generate a fake signature you need to find an answer `r` where `A*r == T*c + a` without knowing `S` (the secret key). You can find such an `r` in Sage using `A.solve_right(T*c + a)`; however, its norm is bigger than the `beta` parameter, so verification will fail.
The `beta` parameter can be reduced significantly, which will make finding a fake `r` even more difficult.
Using the ~70-bit prime, the public key is ~145 kilobytes, and the transcript (excluding the public key) is ~85 kilobytes (but can possibly be reduced by half).
```python
RandomBinaryVector = lambda length: vector(G, [randint(0, 1) for _ in range(length)])
EuclideanNorm = lambda v: int(sqrt(sum([int(i)**2 for i in v])))

n = k = 128
q = 1244142437461793964053
m = 9088
beta = 9.71468927453313e18
G = GF(q)

A = MatrixSpace(G, n, m).random_element()
S = MatrixSpace(GF(2), m, k).random_element().change_ring(G)
T = A * S

pk = (A, T)
sk = (A, S)

# Commitment stage
y = RandomBinaryVector(m)
a = A * y  # send `a` to the verifier
assert len(a) == k  # n == k here

# Challenge stage
# To make this non-interactive, you'd derive `c` from a hash function that
# outputs bits, e.g. `c = H(A, T, a)`
c = RandomBinaryVector(k)  # send `c` to the prover

# Response stage, send `r` to the verifier
r = S * c + y
assert len(r) == m

# Verification stage
assert EuclideanNorm(r) < beta
assert A * r == T * c + a

# Stats
q_bits = ceil(log(q, 2))
q_bytes = (q_bits + (8 - (q_bits % 8))) / 8
transcript_size = len(r) + len(a) + len(c)
transcript_size_bytes = transcript_size * q_bytes
public_key_bytes = (T.dimensions()[0] * T.dimensions()[1]) * q_bytes
```
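As a sanity check on the forgery claim at the top of this issue, here is a hedged sketch using the variables from the snippet above (`fake_r` is just an illustrative name): `solve_right` produces a solution to the linear relation, but its entries are essentially arbitrary field elements, so its norm blows past `beta` with overwhelming probability.

```python
# Forgery attempt without knowledge of S: any x with A*x == T*c + a passes the
# linear check, but a generic solution over GF(q) has entries on the order of q,
# far above beta, so the norm check rejects it (with overwhelming probability).
fake_r = A.solve_right(T * c + a)
assert A * fake_r == T * c + a        # linear relation holds
assert EuclideanNorm(fake_r) >= beta  # norm bound fails, so the forgery is rejected
```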
From Lyubashevsky's paper Fiat-Shamir With Aborts: Applications to Lattice and Factoring-Based Signatures, §1.3 (pg 7), "Intuition on Aborting":

Notice that sending Sc without adding it to y would completely reveal S, and so the job of y is to somehow mask the exact value of Sc.
...
A way to do masking in this scheme is to pick a y in a range that is much larger than the range of Sc. So for example, if 0 ≤ Sc ≤ R, then one could pick a random y from the range [0, ..., 2^64 R]. Then, with very high probability, the value of Sc + y will be in [R, ..., 2^64 R], in which case it will be impossible to determine anything about Sc if nothing is known about y.
...
Our solution is to instead pick y from a much smaller set, something analogous to [0, . . . , 2R], but only reveal Sc + y if it falls into the range [R, . . . , 2R]. If the range is picked carefully and the function h is a homomorphism that has “small” elements in its kernel, then one can show that if the prover only reveals values in this range and aborts otherwise, the protocol will be perfectly witness-indistinguishable. The witness-indistinguishability is then used to prove security of the protocol by showing that a forger can be used for extracting collisions in h.
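As a toy illustration of that abort rule (a scalar sketch with made-up numbers, not taken from the paper): let the secret-dependent value `s = Sc` lie anywhere in `[0, R]` with `R = 10`, draw `y` uniformly from `[0, 2R]`, and reveal `r = s + y` only when `R <= r <= 2R`. For every `s`, each accepted `r` in `[R, 2R]` is reached by exactly one `y`, so the distribution of revealed values is the same no matter what `s` is:

```python
from collections import Counter

R = 10
for s in range(R + 1):  # every possible secret-dependent value s = Sc in [0, R]
    # Which revealed values r = s + y (y uniform in [0, 2R]) survive the abort rule?
    accepted = Counter(s + y for y in range(2 * R + 1) if R <= s + y <= 2 * R)
    assert sorted(accepted) == list(range(R, 2 * R + 1))  # exactly the window [R, 2R]
    assert set(accepted.values()) == {1}                  # each window value equally likely
```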
So, we sample `S*c` with random bits `c` to determine an upper bound `R = max(S*c)`, then choose values of `y` in `0..2R`, and only reveal `r` if the result falls into the range `R..2R`. This means the identification scheme needs to loop: pick a new `y`, derive `c`, check that `min(S*c + y) > R`, and only reveal `r` if it lies in the appropriate range, so that no information about `S` is leaked. Being non-interactive, all the verifier sees is a single instance of the protocol where the prover is confident they are not revealing their secret.
Where `S`, `c` and `y` are all ones, the maximum and minimum of `S*c + y` will be `k+1`; where `y` is instead a vector of `[k]*m`, the maximum and minimum will be `2*k`.
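This worst-case bound is easy to check directly in Sage (a quick sketch using `ones_matrix`; it only verifies the deterministic all-ones case described above):

```python
# All-ones S and c: every entry of S*c equals k, so adding an all-ones y gives
# entries of k + 1, and adding y = [k]*m gives entries of 2*k.
S1 = ones_matrix(G, m, k)
c1 = vector(G, [1] * k)
assert set(S1 * c1) == {G(k)}
assert set(S1 * c1 + vector(G, [1] * m)) == {G(k + 1)}
assert set(S1 * c1 + vector(G, [k] * m)) == {G(2 * k)}
```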
Deterministically, given an all-ones matrix `S` (of `m` x `k`) and an all-ones vector `c` of length `k`, the maximum of `S*c` will be `k`. The problem is sampling `y` in such a way that `min(S*c + y)` is larger than `k` with high probability. Sampling `y` from `[0..nm]` seems to work with good probability, and `[0..nmk]` works with high probability, although the choices `nm` and `nmk` increase the size of each element of `r` and reduce the 'tightness' of the `beta` parameter (increasing communication cost and transcript size). Ideally we want to reduce the range of `y` so there's a 50% or 33% chance of a successful try on the first attempt, otherwise the cost of signing/identifying increases significantly. `[0..nm]` seems to be about 50%.
e.g.
```python
while True:
    y = RandomRangeVector(m, n*m)  # vector of m elements, in range [0..nm]
    a = A * y
    c = H_k(a)  # hash `a` to a vector of `k` bits
    r = S * c + y
    if min(r) > k:
        break
```
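`RandomRangeVector` and `H_k` aren't defined anywhere in the issue; a minimal sketch of what they could look like follows (the SHAKE-based hash is an assumption on my part, any hash-to-bits construction would do):

```python
import hashlib

# Hypothetical helper: vector of `length` uniform elements of G drawn from [0, bound].
RandomRangeVector = lambda length, bound: vector(G, [randint(0, bound) for _ in range(length)])

def H_k(a):
    # Hypothetical Fiat-Shamir challenge: hash the commitment `a` down to k bits.
    digest = hashlib.shake_256(str(a).encode()).digest(k // 8)
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return vector(G, bits[:k])
```

With helpers like these, the verifier recomputes `c = H_k(a)` from the received `(a, r)`, checks `A*r == T*c + a`, and applies the same norm/range bound the prover enforced before revealing `r`.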