
How to load huge data using Synchronize After Merge or Filter Rows step

Hi All,

I am trying to load about 52 lakh (5,200,000) records from a MySQL OLTP database to a MySQL OLAP database. I was having problems with the Insert/Update step, so I changed my transformation from Insert/Update to a Synchronize After Merge step. I also tried another approach with a Filter Rows step: rows that are new are written with a Table Output step, and rows that already exist are updated with an Update step.
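
To make it concrete, the effect I need per row is basically a MySQL upsert. Below is a minimal JDBC sketch of the batched form I have in mind; the table and column names are only placeholders (not my real schema), and this is just to illustrate the idea, not how the PDI steps work internally:

Code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class UpsertSketch {
    public static void main(String[] args) throws Exception {
        // rewriteBatchedStatements=true lets MySQL Connector/J collapse the batch
        // into multi-row statements instead of one round trip per row.
        String url = "jdbc:mysql://localhost:3306/olap?rewriteBatchedStatements=true";
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            con.setAutoCommit(false);
            // Placeholder table/columns; 'id' is assumed to be the primary or unique key.
            String sql = "INSERT INTO fact_orders (id, amount, status) VALUES (?, ?, ?) "
                       + "ON DUPLICATE KEY UPDATE amount = VALUES(amount), status = VALUES(status)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (int i = 1; i <= 5000; i++) {   // one batch of 5,000 rows per commit
                    ps.setLong(1, i);
                    ps.setDouble(2, i * 10.0);
                    ps.setString(3, "NEW");
                    ps.addBatch();
                }
                ps.executeBatch();
            }
            con.commit();
        }
    }
}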

But my job is still very slow, and it was failing with StackOverflowError and Java heap space errors. Because of that, I implemented JavaScript logic that loops the job in batches of 5,000 records. With that, it finally runs successfully, but it is still far too slow.
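
To show what I mean by looping 5,000 records at a time, here is a rough sketch of the same chunking idea in plain JDBC using keyset pagination. Again, the table and column names are placeholders; my real job does this with the JavaScript logic and job loops mentioned above:

Code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChunkedCopySketch {
    public static void main(String[] args) throws Exception {
        String srcUrl = "jdbc:mysql://localhost:3306/oltp";
        try (Connection src = DriverManager.getConnection(srcUrl, "user", "password")) {
            long lastId = 0;
            long copied = 0;
            boolean more = true;
            while (more) {
                more = false;
                // Keyset pagination: fetch the next 5,000 rows after the last processed id,
                // so only one chunk is held in memory at a time.
                String q = "SELECT id, amount, status FROM orders WHERE id > ? ORDER BY id LIMIT 5000";
                try (PreparedStatement ps = src.prepareStatement(q)) {
                    ps.setLong(1, lastId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            more = true;
                            lastId = rs.getLong("id");
                            copied++;
                            // ... hand the row to the batched upsert from the previous sketch ...
                        }
                    }
                }
            }
            System.out.println("Rows processed: " + copied);
        }
    }
}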

I used the steps shown in the attached screenshots. Can someone suggest a better way to improve my job's performance?

I am using PDI CE 5.3 with 64-bit Java 1.8.

Attached Images: connections.jpg, filter step.png, Sync merge.jpg
