Hi,
I'm currently using this step to modularize my transformations so I don't have to repeat the same designs.
The problem is that I'm using a lot of them (maybe 25 Mapping modules) and also working with nearly 200 variables. At some point I run out of memory (I've allocated up to 1 GB, but I'm only testing with a single row!), and I wanted to know whether it's down to the number of declared modules, or whether it's caused by all the parameters being passed between them, or if memory usage could be reduced somehow.
Any help would be appreciated.
Regards!!
EDIT: Execution threw a "java.lang.OutOfMemoryError: unable to create new native thread" exception. It's not my intention to configure Pentaho around this: I see I have to design the transformation in a different fashion. I didn't know the limitations of the Mapping step. It's a pity :(
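For anyone who lands here with the same error: "unable to create new native thread" usually means the OS ran out of native threads, not that the heap is full (each Mapping sub-transformation launches its own set of step threads). So, counterintuitively, shrinking the heap and the per-thread stack can leave more native memory for threads. A minimal sketch of what I tried before deciding to redesign; the `PENTAHO_DI_JAVA_OPTIONS` variable name is what my `spoon.sh` reads, yours may differ:

```shell
# Lower heap (-Xmx) and thread stack (-Xss) so more native memory
# remains available for thread creation. Values here are guesses to
# experiment with, not recommendations.
export PENTAHO_DI_JAVA_OPTIONS="-Xms256m -Xmx512m -Xss256k"

# Also check the per-user process/thread limit before launching Spoon:
ulimit -u          # show the current limit
ulimit -u 4096     # raise it for this shell, if your system allows
./spoon.sh
```

In the end this only delayed the error for me; with ~25 Mapping modules the thread count still grows too fast, which is why redesigning the transformation seems unavoidable.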