refactor!: identifier to ident and update docs (#441)
Please be sure to look over the pull request guidelines here:
https://github.com/spaceandtimelabs/sxt-proof-of-sql/blob/main/CONTRIBUTING.md#submit-pr.

# Please go through the following checklist
- [ ] The PR title and commit messages adhere to guidelines here:
https://github.com/spaceandtimelabs/sxt-proof-of-sql/blob/main/CONTRIBUTING.md.
In particular `!` is used if and only if at least one breaking change
has been introduced.
- [ ] I have run the ci check script with `source
scripts/run_ci_checks.sh`.

# Rationale for this change

To update docs to reflect the migration from `Identifier` to `Ident`.
<!--
Why are you proposing this change? If this is already explained clearly
in the linked issue then this section is not needed.
Explaining clearly why changes are proposed helps reviewers understand
your changes and offer better suggestions for fixes.

 Example:
 Add `NestedLoopJoinExec`.
 Closes #345.

Since we added `HashJoinExec` in #323 it has been possible to do
provable inner joins. However, performance is not satisfactory in some
cases. Hence we need to fix the problem by implementing
`NestedLoopJoinExec` and speeding up the code for `HashJoinExec`.
-->

# What changes are included in this PR?

- Updated docs in the proof-of-sql crate to use `Ident`
<!--
There is no need to duplicate the description in the ticket here but it
is sometimes worth providing a summary of the individual changes in this
PR.

Example:
- Add `NestedLoopJoinExec`.
- Speed up `HashJoinExec`.
- Route joins to `NestedLoopJoinExec` if the outer input is sufficiently
small.
-->

# Are these changes tested?
<!--
We typically require tests for all PRs in order to:
1. Prevent the code from being accidentally broken by subsequent changes
2. Serve as another way to document the expected behavior of the code

If tests are not included in your PR, please explain why (for example,
are they covered by existing tests)?

Example:
Yes.
-->
Part of #235
iajoiner authored Dec 18, 2024
2 parents 43a2a3d + cff9523 commit c83b298
Showing 19 changed files with 99 additions and 147 deletions.
2 changes: 1 addition & 1 deletion crates/proof-of-sql/README.md
Original file line number Diff line number Diff line change
@@ -90,7 +90,7 @@ Parsing Query... 1.870256ms
Generating Proof... 467.45371ms
Verifying Proof... 7.106864ms
Valid proof!
Query result: OwnedTable { table: {Identifier { name: "b" }: VarChar(["hello", "world"])} }
Query result: OwnedTable { table: {Ident { value: "b", quote_style: None }: VarChar(["hello", "world"])} }
```
For a detailed explanation of the example and its implementation, refer to the [README](https://github.com/spaceandtimelabs/sxt-proof-of-sql/blob/main/crates/proof-of-sql/examples/hello_world/README.md) and source code in [hello_world/main.rs](https://github.com/spaceandtimelabs/sxt-proof-of-sql/blob/main/crates/proof-of-sql/examples/hello_world/main.rs).
1 change: 0 additions & 1 deletion crates/proof-of-sql/benches/scaffold/random_util.rs
@@ -10,7 +10,6 @@ pub type OptionalRandBound = Option<fn(usize) -> i64>;
/// # Panics
///
/// Will panic if:
/// - The provided identifier cannot be parsed into an `Identifier` type.
/// - An unsupported `ColumnType` is encountered, triggering a panic in the `todo!()` macro.
#[allow(clippy::cast_sign_loss, clippy::cast_possible_truncation)]
pub fn generate_random_columns<'a, S: Scalar>(
2 changes: 1 addition & 1 deletion crates/proof-of-sql/examples/hello_world/README.md
@@ -30,5 +30,5 @@ Parsing Query... 1.870256ms
Generating Proof... 467.45371ms
Verifying Proof... 7.106864ms
Valid proof!
Query result: OwnedTable { table: {Identifier { name: "b" }: VarChar(["hello", "world"])} }
Query result: OwnedTable { table: {Ident { value: "b", quote_style: None }: VarChar(["hello", "world"])} }
```
9 changes: 6 additions & 3 deletions crates/proof-of-sql/examples/posql_db/main.rs
@@ -25,7 +25,8 @@ use proof_of_sql::{
},
sql::{parse::QueryExpr, proof::VerifiableQueryResult},
};
use proof_of_sql_parser::{Identifier, SelectStatement};
use proof_of_sql_parser::SelectStatement;
use sqlparser::ast::Ident;
use std::{
fs,
io::{prelude::Write, stdout},
@@ -78,7 +79,7 @@ enum Commands {
table: TableRef,
/// The comma delimited column names of the table.
#[arg(short, long, value_parser, num_args = 0.., value_delimiter = ',')]
columns: Vec<Identifier>,
columns: Vec<Ident>,
/// The comma delimited data types of the columns.
#[arg(short, long, value_parser, num_args = 0.., value_delimiter = ',')]
data_types: Vec<CsvDataType>,
@@ -175,7 +176,9 @@ fn main() {
columns
.iter()
.zip_eq(data_types.iter())
.map(|(name, data_type)| Field::new(name.as_str(), data_type.into(), false))
.map(|(name, data_type)| {
Field::new(name.value.as_str(), data_type.into(), false)
})
.collect::<Vec<_>>(),
);
let batch = RecordBatch::new_empty(Arc::new(schema));
@@ -49,10 +49,10 @@ pub enum OwnedArrowConversionError {
/// The unsupported datatype
datatype: DataType,
},
/// This error occurs when trying to convert from a record batch with duplicate identifiers (e.g. `"a"` and `"A"`).
#[snafu(display("conversion resulted in duplicate identifiers"))]
DuplicateIdentifiers,
/// This error occurs when convering from a record batch name to an identifier fails. (Which may my impossible.)
/// This error occurs when trying to convert from a record batch with duplicate idents (e.g. `"a"` and `"A"`).
#[snafu(display("conversion resulted in duplicate idents"))]
DuplicateIdents,
/// This error occurs when converting a record batch name to an ident fails. (Which may be impossible.)
#[snafu(transparent)]
FieldParseFail {
/// The underlying source error
@@ -311,7 +311,7 @@ impl<S: Scalar> TryFrom<RecordBatch> for OwnedTable<S> {
if num_columns == owned_table.num_columns() {
Ok(owned_table)
} else {
Err(OwnedArrowConversionError::DuplicateIdentifiers)
Err(OwnedArrowConversionError::DuplicateIdents)
}
}
}
4 changes: 2 additions & 2 deletions crates/proof-of-sql/src/base/arrow/record_batch_conversion.rs
@@ -57,7 +57,7 @@ impl<C: Commitment> TableCommitment<C> {
panic!("RecordBatches cannot have columns of mixed length")
}
Err(AppendTableCommitmentError::AppendColumnCommitments {
source: AppendColumnCommitmentsError::DuplicateIdentifiers { .. },
source: AppendColumnCommitmentsError::DuplicateIdents { .. },
}) => {
panic!("RecordBatches cannot have duplicate identifiers")
}
@@ -92,7 +92,7 @@ impl<C: Commitment> TableCommitment<C> {
Err(TableCommitmentFromColumnsError::MixedLengthColumns { .. }) => {
panic!("RecordBatches cannot have columns of mixed length")
}
Err(TableCommitmentFromColumnsError::DuplicateIdentifiers { .. }) => {
Err(TableCommitmentFromColumnsError::DuplicateIdents { .. }) => {
panic!("RecordBatches cannot have duplicate identifiers")
}
}
@@ -7,7 +7,7 @@ use alloc::string::{String, ToString};
use snafu::Snafu;
use sqlparser::ast::Ident;

/// Mapping of column identifiers to column metadata used to associate metadata with commitments.
/// Mapping of column idents to column metadata used to associate metadata with commitments.
pub type ColumnCommitmentMetadataMap = IndexMap<Ident, ColumnCommitmentMetadata>;

/// During commitment operation, metadata indicates that operand tables cannot be the same.
@@ -22,16 +22,14 @@ pub enum ColumnCommitmentsMismatch {
/// Commitments with different column counts cannot operate with each other.
#[snafu(display("commitments with different column counts cannot operate with each other"))]
NumColumns,
/// Columns with mismatched identifiers cannot operate with each other.
/// Columns with mismatched idents cannot operate with each other.
///
/// Strings are used here instead of Identifiers to decrease the size of this variant
#[snafu(display(
"column with identifier {id_a} cannot operate with column with identifier {id_b}"
))]
/// Strings are used here instead of Idents to decrease the size of this variant
#[snafu(display("column with ident {id_a} cannot operate with column with ident {id_b}"))]
Ident {
/// The first column identifier
/// The first column ident
id_a: String,
/// The second column identifier
/// The second column ident
id_b: String,
},
}
@@ -42,7 +40,7 @@ pub trait ColumnCommitmentMetadataMapExt {
/// the widest possible bounds for the column type.
fn from_column_fields_with_max_bounds(columns: &[ColumnField]) -> Self;

/// Construct this mapping from an iterator of column identifiers and columns.
/// Construct this mapping from an iterator of column ident and columns.
fn from_columns<'a>(
columns: impl IntoIterator<Item = (&'a Ident, &'a CommittableColumn<'a>)>,
) -> Self
41 changes: 19 additions & 22 deletions crates/proof-of-sql/src/base/commitment/column_commitments.rs
@@ -16,10 +16,10 @@ use serde::{Deserialize, Serialize};
use snafu::Snafu;
use sqlparser::ast::Ident;

/// Cannot create commitments with duplicate identifier.
/// Cannot create commitments with duplicate ident.
#[derive(Debug, Snafu)]
#[snafu(display("cannot create commitments with duplicate identifier: {id}"))]
pub struct DuplicateIdentifiers {
#[snafu(display("cannot create commitments with duplicate ident: {id}"))]
pub struct DuplicateIdents {
id: String,
}

@@ -32,11 +32,11 @@ pub enum AppendColumnCommitmentsError {
/// The underlying source error
source: ColumnCommitmentsMismatch,
},
/// New columns have duplicate identifiers.
/// New columns have duplicate idents.
#[snafu(transparent)]
DuplicateIdentifiers {
DuplicateIdents {
/// The underlying source error
source: DuplicateIdentifiers,
source: DuplicateIdents,
},
}

@@ -97,15 +97,15 @@ impl<C: Commitment> ColumnCommitments<C> {
self.column_metadata.is_empty()
}

/// Returns the commitment with the given identifier.
/// Returns the commitment with the given ident.
#[must_use]
pub fn get_commitment(&self, identifier: &Ident) -> Option<C> {
self.column_metadata
.get_index_of(identifier)
.map(|index| self.commitments[index].clone())
}

/// Returns the metadata for the commitment with the given identifier.
/// Returns the metadata for the commitment with the given ident.
#[must_use]
pub fn get_metadata(&self, identifier: &Ident) -> Option<&ColumnCommitmentMetadata> {
self.column_metadata.get(identifier)
@@ -121,19 +121,19 @@ impl<C: Commitment> ColumnCommitments<C> {
columns: impl IntoIterator<Item = (&'a Ident, COL)>,
offset: usize,
setup: &C::PublicSetup<'_>,
) -> Result<ColumnCommitments<C>, DuplicateIdentifiers>
) -> Result<ColumnCommitments<C>, DuplicateIdents>
where
COL: Into<CommittableColumn<'a>>,
{
// Check for duplicate identifiers
// Check for duplicate idents
let mut unique_identifiers = IndexSet::default();
let unique_columns = columns
.into_iter()
.map(|(identifier, column)| {
if unique_identifiers.insert(identifier) {
Ok((identifier, column))
} else {
Err(DuplicateIdentifiers {
Err(DuplicateIdents {
id: identifier.to_string(),
})
}
@@ -178,15 +178,15 @@ impl<C: Commitment> ColumnCommitments<C> {
where
COL: Into<CommittableColumn<'a>>,
{
// Check for duplicate identifiers.
// Check for duplicate idents.
let mut unique_identifiers = IndexSet::default();
let unique_columns = columns
.into_iter()
.map(|(identifier, column)| {
if unique_identifiers.insert(identifier) {
Ok((identifier, column))
} else {
Err(DuplicateIdentifiers {
Err(DuplicateIdents {
id: identifier.to_string(),
})
}
@@ -221,7 +221,7 @@ impl<C: Commitment> ColumnCommitments<C> {
columns: impl IntoIterator<Item = (&'a Ident, COL)>,
offset: usize,
setup: &C::PublicSetup<'_>,
) -> Result<(), DuplicateIdentifiers>
) -> Result<(), DuplicateIdents>
where
COL: Into<CommittableColumn<'a>>,
{
@@ -236,7 +236,7 @@ impl<C: Commitment> ColumnCommitments<C> {
.into_iter()
.map(|(identifier, column)| {
if self.column_metadata.contains_key(identifier) {
Err(DuplicateIdentifiers {
Err(DuplicateIdents {
id: identifier.to_string(),
})
} else {
@@ -498,10 +498,7 @@ mod tests {
0,
&(),
);
assert!(matches!(
from_columns_result,
Err(DuplicateIdentifiers { .. })
));
assert!(matches!(from_columns_result, Err(DuplicateIdents { .. })));

let mut existing_column_commitments =
ColumnCommitments::<NaiveCommitment>::try_from_columns_with_offset(
@@ -518,7 +515,7 @@ mod tests {
.try_extend_columns_with_offset([(&duplicate_identifier_a, &empty_column)], 0, &());
assert!(matches!(
extend_with_existing_column_result,
Err(DuplicateIdentifiers { .. })
Err(DuplicateIdents { .. })
));

let extend_with_duplicate_columns_result = existing_column_commitments
@@ -532,7 +529,7 @@ mod tests {
);
assert!(matches!(
extend_with_duplicate_columns_result,
Err(DuplicateIdentifiers { .. })
Err(DuplicateIdents { .. })
));

let append_result = existing_column_commitments.try_append_rows_with_offset(
@@ -546,7 +543,7 @@ mod tests {
);
assert!(matches!(
append_result,
Err(AppendColumnCommitmentsError::DuplicateIdentifiers { .. })
Err(AppendColumnCommitmentsError::DuplicateIdents { .. })
));
}

4 changes: 1 addition & 3 deletions crates/proof-of-sql/src/base/commitment/mod.rs
@@ -28,9 +28,7 @@ pub use column_commitment_metadata_map::{
};

mod column_commitments;
pub use column_commitments::{
AppendColumnCommitmentsError, ColumnCommitments, DuplicateIdentifiers,
};
pub use column_commitments::{AppendColumnCommitmentsError, ColumnCommitments, DuplicateIdents};

mod table_commitment;
pub use table_commitment::{