Hi, I am trying to use the R Script Executor step in Pentaho EE 7.0.0.0-25 (not the free Execute R Script plugin) in my ETL. I have already prepared a model in R, and now I just want to apply it to new data.
The model is read using this statement:

Code:
fit <- readRDS(file = qq("c:\\path\\to\\model.rds"))
The model, however, is quite sophisticated and therefore over 400 MB in size. When I press "Test script" (or launch the transformation), after a few seconds the whole of Pentaho crashes with the classic "Java(TM) Platform SE binary has stopped working" error. The problem is most probably the size of the model, because when I switch to a simpler one (about 1 MB) it works.
My first hint was not enough memory, so I tried it on another machine, but memory usage didn't go above 8.2 GB when PDI crashed - and yes, I have changed the Xmx parameter in spoon.bat to 32048 MB.
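For reference, the change amounts to raising the -Xmx value in the Java options line of spoon.bat; in my copy it looks roughly like this (the surrounding defaults may differ between installations):

Code:
REM Java options for Spoon; only the -Xmx value was changed from its default.
if "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xms1024m" "-Xmx32048m" "-XX:MaxPermSize=256m"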
Running the script directly in R works as well, so my second hint is: isn't there some other memory limit for plugins?
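For completeness, the standalone test boils down to something like this minimal sketch (same path as above; the real script then goes on to apply the model to new data):

Code:
# Standalone check outside PDI: load the model and report its in-memory
# size, which can differ from the 400 MB serialized .rds file on disk.
fit <- readRDS(file = "c:\\path\\to\\model.rds")
print(object.size(fit), units = "MB")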
And for the record, the plugin was installed via the Marketplace, so no manual copying of DLLs. The environment variables are probably set up correctly, since the script works with the small model.
Does anyone have any idea how to fix this?