Hi everyone,
I need to read a huge CSV file, and I need to do it across several executions. My idea is to read the first 10,000 lines in the first execution, then lines 10,001 to 20,000 in the second one, and so on.
I've been trying to use the Set Variables step, where I set the default values of two parameters (init_value = 0, end_value = 10000) that are defined in the Parameters section of the Transformation, and then update those default values to 10001 and 20000 (and so on) on each execution.
Then, using the Sample Rows step, I try to read that range of lines from the file by specifying the range as ${init_value}..${end_value}. It works the first time with those values (0, 10000), but I don't know how to update them to (init_value = 10001, end_value = 20000) and so on for the following executions.
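To make the intent concrete, here is a rough sketch of the logic I'm after, written outside PDI as plain Java (the file names, the state file, and the chunk size are just placeholders for illustration): each execution remembers where the previous one stopped and reads only the next block of rows.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Minimal sketch, not PDI-specific: persist the next starting row in a small
// state file between executions, then read only the next chunk of the CSV.
public class ChunkedCsvReader {
    private static final Path CSV_FILE   = Paths.get("huge_file.csv"); // assumed input file
    private static final Path STATE_FILE = Paths.get("last_row.txt");  // assumed state file
    private static final int  CHUNK_SIZE = 10_000;

    public static void main(String[] args) throws IOException {
        // Offset saved by the previous execution (0 on the first run).
        long start = Files.exists(STATE_FILE)
                ? Long.parseLong(Files.readString(STATE_FILE).trim())
                : 0L;

        long processed = 0;
        try (BufferedReader reader = Files.newBufferedReader(CSV_FILE)) {
            // Skip the rows already handled by earlier executions.
            for (long i = 0; i < start; i++) {
                if (reader.readLine() == null) return; // nothing left to read
            }
            // Process the next CHUNK_SIZE rows.
            String line;
            while (processed < CHUNK_SIZE && (line = reader.readLine()) != null) {
                processRow(line);
                processed++;
            }
        }

        // Remember where the next execution should start.
        Files.writeString(STATE_FILE, Long.toString(start + processed));
    }

    private static void processRow(String line) {
        // Placeholder for the real row handling.
        System.out.println(line);
    }
}
```

Basically, I want something equivalent to that "save the offset, resume from it" pattern, but done with PDI steps and parameters.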
Does anyone have an idea of how I could do this, or a different approach I could use to read the file in this way?
I'd appreciate any ideas!
Thanks!!