Number returned in scientific notation instead of standard decimal format #460
Comments
Hello @adnan-awan, what you are mentioning is a behavior of Python itself.
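To illustrate that Python behavior (this is plain Python, not Spidermon code): `repr()` renders floats smaller than 1e-4 in scientific notation, but a fixed-point format specifier forces plain decimal output.

```python
x = 1 / 11091

# Python's default repr uses scientific notation for very small floats
print(repr(x))       # prints the value in scientific notation (ends in e-05)

# A fixed-point format specifier forces plain decimal output
print(f"{x:.7f}")    # 0.0000902
```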
Hi @VMRuiz,
Hey @adnan-awan, I see where you're coming from. It seems like what you're really looking for is a tolerance level, a margin of error within which the results are considered acceptable. So instead of strictly requiring 0.0001, maybe anything >= 0.000090 would pass. Perhaps we could add a configuration option to specify this 'tolerance' value for field coverage tests? That would give users more control over how strict the checks are.
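A minimal sketch of what such a tolerance option might look like. The function name and parameters here are hypothetical, not Spidermon's actual API:

```python
def coverage_passes(actual, expected, tolerance=0.0):
    """Hypothetical check: pass when the actual coverage is no more than
    `tolerance` below the expected coverage."""
    return actual >= expected - tolerance

# With tolerance=0.1, an actual coverage of 0.95 satisfies an expected 1.0,
# while 0.8 does not.
print(coverage_passes(0.95, 1.0, tolerance=0.1))  # True
print(coverage_passes(0.8, 1.0, tolerance=0.1))   # False
```

A tolerance expressed as an absolute margin keeps the existing strict behavior as the default (tolerance=0.0).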
When performing a division operation, the result is returned in scientific notation, even though it could be represented in standard decimal format for better readability. For example, the result of 1 / 11091 returns 9.01631953836444e-05, while the expected result is something like 0.0000901 in decimal format.

Steps to Reproduce:
1. Perform the division operation: 1 / 11091.
2. Observe that the result is in scientific notation.

Expected Behavior: The result should be displayed in a readable decimal format (e.g., 0.0000901).

Actual Behavior: The result is returned in scientific notation: 9.01631953836444e-05.

Additional Information: It would be helpful if there were an option to toggle between scientific notation and standard decimal format for small numbers.
https://github.com/scrapinghub/spidermon/blob/master/spidermon/utils/field_coverage.py#L41
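As a sketch of the requested toggle (illustrative only, not an existing Spidermon option), the standard-library decimal module or a fixed-point format spec can render the same float without an exponent:

```python
from decimal import Decimal

def as_plain_decimal(value):
    # Hypothetical helper: render a float in plain decimal notation.
    # Going through Decimal(repr(value)) preserves the float's shortest
    # repr digits while the "f" format suppresses the exponent.
    return format(Decimal(repr(value)), "f")

x = 1 / 11091
print(as_plain_decimal(x))   # full decimal expansion, no exponent
print(f"{x:.10f}")           # rounded fixed-point: 0.0000901632
```

Either approach could back a report-formatting option; the rounded format spec is simpler, while the Decimal route keeps every digit of the original value.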