Hi,
I am using Kettle version 5.0.1-stable.
An Insert / Update step sporadically fails with this error message:
org.pentaho.di.core.exception.KettleValueException:
BigNumber : Unable to compare with value [BigNumber(17, 2)]
At first it failed only very occasionally. The transformation is restarted by the job after 10 seconds.
After a few retries the row is written to the target.
By now it is happening with roughly every 10th record.
We are experiencing the problem in production only (Ubuntu 12.04.4 LTS).
It does not happen on my Windows development machine, nor in our Ubuntu beta environment.
Even if I copy the same source row into the beta database, it does not fail.
Because the record IS written after several retries (sometimes it takes hours, but the row eventually gets through), I assume the error is not in the Insert / Update statement itself.
Could it be something in the synchronization/preparation of the data flow, i.e. that the data type is not settled at the moment the update comparison is made?
I put a Block until step in front to wait for all lookups and calculations, but it didn't change anything.
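I do not know the step internals, but my working theory is the kind of type mix-up shown in this plain-Java sketch (made-up names, not actual Kettle code): a field whose metadata says BigNumber, while the object actually travelling in the row is something else, e.g. a Double coming out of a calculation.

import java.math.BigDecimal;

public class CompareSketch {
    // Hypothetical illustration only: the stream metadata claims "BigNumber",
    // but an upstream step delivered the value as a Double instead of a BigDecimal.
    static int compareAsBigNumber(Object streamValue, Object dbValue) {
        // The comparison has to treat both sides as BigDecimal;
        // it fails as soon as one side is not really a BigDecimal.
        return ((BigDecimal) streamValue).compareTo((BigDecimal) dbValue);
    }

    public static void main(String[] args) {
        Object dbValue   = new BigDecimal("1234.56"); // value read back from the decimal(17,2) column
        Object goodValue = new BigDecimal("1234.56"); // stream value with the expected type
        Object badValue  = Double.valueOf(1234.56);   // stream value that slipped through as a Double

        System.out.println(compareAsBigNumber(goodValue, dbValue)); // works, prints 0
        System.out.println(compareAsBigNumber(badValue, dbValue));  // blows up with a ClassCastException
    }
}

If something like that is going on, it would at least be consistent with the very same row going through on a later attempt, whenever the value happens to arrive with the expected type.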
The job ran perfectly for some weeks.
Then we added some calculated currency fields to the target table (decimal(17,2)).
Each row always had (and has) an individual (local) currency.
In a first step I convert them to USD, using a Lookup step to fetch the conversion rate.
In a second step I read the EUR, GBP, INR and JPY to USD rates in one Table Input to calculate these values.
Since we have been doing this, we have had the update problems.
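For reference, the calculation itself is nothing exotic; per row it boils down to something like this (again a plain-Java sketch with made-up amounts and rates, not the actual step configuration):

import java.math.BigDecimal;
import java.math.RoundingMode;

public class CurrencyConversionSketch {
    public static void main(String[] args) {
        // Local amount of the row and the USD rate looked up for its currency
        BigDecimal localAmount = new BigDecimal("199.99"); // e.g. an EUR amount
        BigDecimal eurToUsd    = new BigDecimal("1.3692"); // rate from the Table Input

        // Convert and round to the scale of the decimal(17,2) target columns
        BigDecimal usdAmount = localAmount.multiply(eurToUsd)
                                          .setScale(2, RoundingMode.HALF_UP);

        System.out.println(usdAmount); // 273.83
    }
}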
What can cause it to sometimes work and sometimes fail for the same input rows?
Thanks for any help.