Multiple issues while reading data from snowflake
Description
Release Notes
Fixed an issue where Snowflake source pipelines failed when the data contained a backslash. Added a runtime argument `cdap.snowflake.source.escape` that can be set to a different escape character to work around this.
The Snowflake source plugin has the following issues:
For some decimal fields, the pipeline fails with this error:
java.lang.ArithmeticException: Rounding necessary
    at java.math.BigDecimal.commonNeedIncrement(BigDecimal.java:4653)
    at java.math.BigDecimal.needIncrement(BigDecimal.java:4709)
    at java.math.BigDecimal.divideAndRound(BigDecimal.java:4617)
    at java.math.BigDecimal.setScale(BigDecimal.java:2905)
    at java.math.BigDecimal.setScale(BigDecimal.java:2965)
    at io.cdap.plugin.snowflake.source.batch.SnowflakeMapToRecordTransformer.convertValue(SnowflakeMapToRecordTransformer.java:86)
    at io.cdap.plugin.snowflake.source.batch.SnowflakeMapToRecordTransformer.convertValue(SnowflakeMapToRecordTransformer.java:64)
Solution: Specify a rounding mode when scaling decimal fields.
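The stack trace comes from `BigDecimal.setScale`, which throws `ArithmeticException: Rounding necessary` when reducing scale would lose precision and no rounding mode is given. A minimal sketch of the fix (the helper name `toScale` and the choice of `HALF_UP` are illustrative assumptions, not the plugin's actual code):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalRounding {

    // Hypothetical helper mirroring the transformer's conversion step.
    static BigDecimal toScale(BigDecimal value, int scale) {
        // value.setScale(scale) with no rounding mode throws
        // ArithmeticException ("Rounding necessary") whenever the
        // target scale cannot represent the value exactly.
        // Passing an explicit RoundingMode avoids the failure:
        return value.setScale(scale, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        // 1.005 cannot be represented exactly at scale 2, so the
        // no-argument setScale(2) would throw; HALF_UP rounds it.
        System.out.println(toScale(new BigDecimal("1.005"), 2)); // 1.01
    }
}
```

Any `RoundingMode` other than `UNNECESSARY` would prevent the exception; which mode is appropriate depends on how the source data should be truncated.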
Customer data contains '\' characters, which causes the pipeline to fail because '\' is the default escape character in the CSV parser.
Solution: Make the CSV parser's escape character configurable through a runtime argument, so it can be set to something else when the data contains '\'.
The runtime argument will be named:
cdap.snowflake.source.escape
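A minimal sketch of resolving this runtime argument, assuming the plugin receives runtime arguments as a string map (the helper name `resolveEscape` is hypothetical; only the argument name `cdap.snowflake.source.escape` comes from the ticket):

```java
import java.util.HashMap;
import java.util.Map;

public class EscapeArg {

    static final String ESCAPE_ARG = "cdap.snowflake.source.escape";

    // Resolve the CSV escape character from runtime arguments,
    // falling back to the parser's default '\' when unset.
    static char resolveEscape(Map<String, String> runtimeArgs) {
        String value = runtimeArgs.get(ESCAPE_ARG);
        if (value == null || value.isEmpty()) {
            return '\\'; // default escape character
        }
        // Use the first character of the configured value.
        return value.charAt(0);
    }

    public static void main(String[] args) {
        Map<String, String> runtimeArgs = new HashMap<>();
        runtimeArgs.put(ESCAPE_ARG, "|");
        System.out.println(resolveEscape(runtimeArgs)); // |
    }
}
```

A pipeline whose data legitimately contains backslashes would then set `cdap.snowflake.source.escape` to an unused character (for example `|`) at runtime.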