**What happened:** When converting a Parquet `Statistics::FixedLenByteArray` value for a `Decimal(precision, scale)` column to an internal representation based on `f64`, a rounding error can sometimes produce a value whose integer part exceeds the allotted space (i.e. its number of digits is larger than `precision - scale`).
In turn, this results in an error such as `Parser error: parse decimal overflow (1e32)` when the stats are parsed back from the logs.
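The rounding can be reproduced in plain Rust, independent of delta-rs (a minimal sketch; the literals are illustrative, not the actual failing stats):

```rust
fn main() {
    // Largest value storable in a Decimal(32, 0): 32 nines.
    let max_decimal_32: i128 = 99_999_999_999_999_999_999_999_999_999_999;

    // f64 carries only ~15-16 significant decimal digits, so the cast
    // rounds up to the nearest representable value: exactly the f64
    // nearest to 1e32.
    let as_f64 = max_decimal_32 as f64;
    assert_eq!(as_f64, 1e32);

    // Serialized back to decimal notation, the integer part now has
    // 33 digits, which no longer fits precision 32.
    let rendered = format!("{}", as_f64);
    assert_eq!(rendered.len(), 33);
}
```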
**What you expected to happen:** The conversion should respect the Decimal's precision/scale (even if it means the result is slightly less precise than the overflowing value).
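One way to honor that expectation is to clamp the converted `f64` to the largest value whose rendering still fits in `precision - scale` integer digits. This is a hypothetical sketch, not the actual delta-rs fix; the helper name and signature are illustrative:

```rust
/// Hypothetical helper: clamp an f64 so its integer part fits in
/// `precision - scale` decimal digits.
fn clamp_to_precision(v: f64, precision: i32, scale: i32) -> f64 {
    // Exclusive bound on the magnitude: 10^(precision - scale).
    // Parsing is correctly rounded, unlike repeated multiplication.
    let bound: f64 = format!("1e{}", precision - scale).parse().unwrap();
    // Largest f64 strictly below the bound (next representable value down).
    let max_ok = f64::from_bits(bound.to_bits() - 1);
    v.clamp(-max_ok, max_ok)
}

fn main() {
    // 32 nines round up to 1e32 when cast to f64 ...
    let overflowed = 99_999_999_999_999_999_999_999_999_999_999i128 as f64;
    // ... but the clamped value renders with only 32 integer digits,
    // so it parses back into a Decimal(32, 0) without overflow.
    let safe = clamp_to_precision(overflowed, 32, 0);
    assert!(safe < 1e32);
    assert_eq!(format!("{}", safe).len(), 32);
}
```

The trade-off is exactly the one described above: the clamped stat is slightly less precise, but it round-trips through the log without a parse error.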
**How to reproduce it:** The following cases (in `test_stats_scalar_serialization`) should pass, as otherwise arrow would raise a `parse decimal overflow` error for `1e32`/`-1e32`.
**More details:** Coincidentally, this also revealed a related issue: the commit effectively succeeds, meaning the new table version is successfully promoted, but the error is thrown while running post-commit hooks, since that is when the faulty stat gets parsed.
**Environment**
Delta-rs version: 0.20.1
Binding: Rust
Environment: