[PECO-1109] Parameterized Query: add support for inferring decimal types #228
Conversation
Signed-off-by: Jesse Whitehouse <[email protected]>
Force-pushed from 3a84db3 to 99f1364
src/databricks/sql/utils.py (Outdated)
```diff
 STRING = "STRING"
 DATE = "DATE"
 TIMESTAMP = "TIMESTAMP"
 FLOAT = "FLOAT"
-DECIMAL = "DECIMAL"
+DECIMAL = "DECIMAL(6,2)"
```
If we keep this as just `DECIMAL`, then DBR will create a decimal with zero digits after the decimal point (scale 0), because in Databricks SQL

`SELECT CAST("1234.56" AS DECIMAL)`

returns `1235`, whereas

`SELECT CAST("1234.56" AS DECIMAL(6,2))`

returns `1234.56`.
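The zero-scale rounding described above can be reproduced locally with Python's `decimal` module. This is only a rough analogy to the SQL `CAST` (SQL engines may round half-way cases differently than `quantize`'s default), but for this value the results match:

```python
from decimal import Decimal

value = Decimal("1234.56")

# Analogue of CAST("1234.56" AS DECIMAL): scale 0, so the fraction is rounded away
print(value.quantize(Decimal("1")))     # 1235

# Analogue of CAST("1234.56" AS DECIMAL(6,2)): two fractional digits survive
print(value.quantize(Decimal("0.01")))  # 1234.56
```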
As of 1565700 I've implemented dynamic casting and removed the `DECIMAL(6,2)` default.
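One way such dynamic casting can work is to derive precision and scale from the `Decimal` itself via `Decimal.as_tuple()`. This is only a sketch: `infer_cast_expression` is a hypothetical name, and the connector's actual helper may be structured differently (e.g. handling of NaN/Infinity is omitted here):

```python
from decimal import Decimal

def infer_cast_expression(value: Decimal) -> str:
    # Hypothetical helper: build a DECIMAL(precision, scale) cast string
    # from a finite Decimal value.
    sign, digits, exponent = value.as_tuple()
    scale = max(-exponent, 0)            # digits after the decimal point
    precision = max(len(digits), scale)  # total significant digits
    return f"DECIMAL({precision},{scale})"

print(infer_cast_expression(Decimal("1234.56")))  # DECIMAL(6,2)
```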
LGTM
This lets us make an easier-to-understand series of assertions about Decimal inference. Note that `test_infer_types_decimal` tests a slightly different thing than its predecessor: the predecessor would pass a Python float into the inference but force its `DbsqlParameter` type to `DECIMAL`, which causes the float value to be coerced. But this is an antipattern in Python. If the user explicitly sets a `DbsqlParameter` object to type `DECIMAL`, then the user should also pass it a Python `Decimal()` object as its value, not a float.
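The coercion antipattern mentioned above is visible with the standard-library `decimal` module alone: constructing a `Decimal` from a float drags in the float's binary representation error, whereas constructing it from a string (or passing a `Decimal` in the first place) preserves the intended value exactly:

```python
from decimal import Decimal

# A float cannot represent 1234.56 exactly, so the Decimal inherits
# the binary rounding error (1234.5599999...):
from_float = Decimal(1234.56)

# A string is converted digit-for-digit, with no error:
from_string = Decimal("1234.56")

print(from_float == from_string)  # False
```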
LGTM!
The Python 3.8 lint check is hanging. Merging without this check passing, as it passes for numerous other versions and appears to just be flaky.
Also, we need to update the changelog to discuss the new parameterisation features. But we can punt this to a subsequent PR, as there is more work to do before this is released.
Description
This connector has strong support for Python Decimal types. But the parameterized query implementation merged last week is missing the ability to infer them. This pull request implements type inference for decimals, with e2e tests for four different scenarios:

- A bare `Decimal()` object (type must be inferred)
- A `Decimal()` parameter value wrapped in `DbsqlParameter` with the `type` set to `None` (type must be inferred)
- A `Decimal()` parameter value wrapped in `DbsqlParameter` with the type set to `DbsqlType.DECIMAL` (type is not inferred, but the cast expression must be calculated)
- A `Decimal()` parameter value wrapped in `DbsqlParameter` with the type set to a custom implementation (type is not inferred, and the cast expression is accepted from the user as-is)

In the first three scenarios, pysql will inspect the passed Decimal and provide the appropriate Databricks SQL cast expression to contain that decimal (such as `DECIMAL(18,2)` or `DECIMAL(38,5)`).

I also split out the `infer_types` tests so we can make special assertions about decimals, and ran `isort` over a couple of files.

I'll open a subsequent PR with a full example of how users can work with these APIs.