Channel: Pentaho Community Forums

JVM Memory Usage and Performance

I'm new to Kettle, and I've set up a test ETL environment with pdi-ce-5.0.0.209 on Windows Server 2008 R2 (Java 8u31, 4 CPUs, 64 GB RAM). I've set up a transformation that reads from a Table Input step against MS SQL Server 2008 R2 (only 4 fields, but 500 million rows) and writes to a Table Output step against Postgres 9.3.4.

In Spoon.bat I've set the Java options to -Xmx60416m -XX:MaxPermSize=1024m
Transformation setting, Nr of rows in rowset: 100000
Table Output: Nr of copies: 3, Commit size: 100000
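
For reference, in PDI 5.x these options are normally passed through the PENTAHO_DI_JAVA_OPTIONS variable near the top of Spoon.bat. A sketch of how that line might look with the settings above (the = sign after -XX:MaxPermSize is required; note also that on Java 8 PermGen was replaced by Metaspace, so the JVM simply ignores -XX:MaxPermSize and prints a warning):

```
REM Spoon.bat -- set before the java launch line
set PENTAHO_DI_JAVA_OPTIONS=-Xmx60416m -XX:MaxPermSize=1024m
```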

After everything gets "warmed up" I can get 80,000+ r/s transferred into PostgreSQL! Once the JVM consumes its full allocation of RAM (about 45 minutes), performance hits the floor and the CPU on the machine running Spoon.bat bounces between 70 and 100%. It doesn't crash or lock up, but it's really slow.
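
One way to confirm whether that slowdown is garbage collection (these are standard HotSpot flags on Java 7/8, not settings from my current configuration) would be to add GC logging to the same Java options and watch for long or back-to-back full GC pauses around the 45-minute mark:

```
-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```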

Can someone explain why this is happening? Do I have a bad setting somewhere? Java seems to really like all that memory, but I'm afraid that's all I can give it. What do I need to tweak?
