hey, looks like some of the rate values might exceed what the new DECIMAL precision/scale allows, which would cause the overflow. try checking your data first, then recasting or filtering out the numbers that fall outside the allowed range
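a minimal sketch of that check, assuming a DataFrame `df` with a column `rate` and a target type of DECIMAL(10, 2) (the table name, column name, and precision are all hypothetical, swap in your own):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("rates")  # hypothetical source table

# DECIMAL(10, 2) leaves 8 digits for the integer part, so anything with
# absolute value >= 10^8 would overflow after the cast.
max_allowed = 10 ** (10 - 2)

# Count the rows that would overflow, then drop them and cast the rest.
offenders = df.filter(F.abs(F.col("rate")) >= max_allowed)
print("rows that would overflow:", offenders.count())

cleaned = (
    df.filter(F.abs(F.col("rate")) < max_allowed)
      .withColumn("rate", F.col("rate").cast("decimal(10,2)"))
)
```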
hey, i wonder if the issue might be due to underlying schema details in the target db causing misalignment in numeric precision. has anyone tried converting the values before the alteration? curious what other folks think about this debugging process
Based on experience, altering a column’s precision in Spark SQL requires careful validation of the existing data. The error is likely caused by values in the column that exceed the bounds of the new DECIMAL precision and scale you are targeting. I have run into similar issues when I changed the schema directly without first examining the original range of values in the column. It is advisable to run exploratory queries to obtain the maximum and minimum values in the column and, if necessary, perform an intermediate conversion or data sanitization step before the alteration.
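As a quick exploratory step, something along these lines can confirm whether the data actually fits before you attempt the schema change (again assuming a hypothetical `rates` table with a `rate` column and a DECIMAL(10, 2) target):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("rates")  # hypothetical table name

# Inspect the observed range before altering the column type.
df.select(
    F.min("rate").alias("min_rate"),
    F.max("rate").alias("max_rate"),
).show()

# If the range fits DECIMAL(10, 2), an intermediate cast (or writing the
# result to a staging table) can serve as the sanitization step.
sanitized = df.withColumn("rate", F.col("rate").cast("decimal(10,2)"))
```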