Hi there--
I'm running into Java heap space (OutOfMemoryError) issues in PDI ...
It's odd, because on Computer A there is no issue (2 GB allocated to PDI), but on Computer B there is (also 2 GB allocated to PDI). Computer B has less physical memory overall, though, so that's a potential factor.
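In case a config difference is part of it: a quick sanity check is to confirm what heap ceiling the JVM actually ends up with on each machine, since the -Xmx you set and what the JVM grants can differ. A minimal sketch (the class name is just for illustration) -- run it with the same options PDI uses:

```java
// Quick sanity check of the heap ceiling the JVM actually grants.
// Compile, then run with the same options PDI uses, e.g.:
//   java -Xmx2048m HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        long mb = 1024L * 1024L;
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap:   " + rt.maxMemory()   / mb + " MB");
        System.out.println("total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("free heap:  " + rt.freeMemory()  / mb + " MB");
    }
}
```

On the failing machine, adding -XX:+HeapDumpOnOutOfMemoryError to PDI's JVM options would also leave behind a heap dump when it dies, which you can open in a tool like Eclipse MAT to see what's actually accumulating across the loops.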
It's a job with a "loop" that probably has a small memory leak. However, it used to take 100 or 200 iterations to cause this error -- now it takes 4 ... four!
I'm guessing -- since it's a shared computer -- that external processes are coming in and sucking up memory during the job.
Is there a tool that can monitor the memory usage of PDI and other processes at the time the error occurs (around midnight)? It wouldn't have to be a live monitor -- just something that records the memory usage of the different processes over time so I can review it afterwards.
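I don't know of a PDI-specific recorder, but two ideas: on Windows, Performance Monitor's data collector sets can log per-process memory counters on a schedule, and jstat -gc <pid> can sample the PDI JVM's heap specifically. Failing that, a tiny background logger also works. Here's a minimal sketch, assuming a HotSpot/Java 8-era JVM like PDI typically ships with (the com.sun.management getters below were renamed in newer JDKs); the output file name is just a placeholder, and note it only records overall system memory, not per-process usage:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.lang.management.ManagementFactory;
import java.time.LocalDateTime;

// Hypothetical watchdog: samples system memory once a minute and appends
// it to a CSV, so you can see afterwards what happened around midnight.
public class MemoryWatchdog {
    public static void main(String[] args) throws IOException, InterruptedException {
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                        ManagementFactory.getOperatingSystemMXBean();
        long mb = 1024L * 1024L;
        // Append mode, so restarts don't wipe earlier samples.
        try (PrintWriter out = new PrintWriter(new FileWriter("memory-log.csv", true))) {
            out.println("timestamp,total_physical_mb,free_physical_mb");
            while (true) {
                out.printf("%s,%d,%d%n",
                        LocalDateTime.now(),
                        os.getTotalPhysicalMemorySize() / mb,
                        os.getFreePhysicalMemorySize() / mb);
                out.flush();
                Thread.sleep(60_000); // one sample per minute
            }
        }
    }
}
```

Start it before the midnight run (Task Scheduler or cron can do that), and a sudden drop in free physical memory right before the OutOfMemoryError would point at another process on the shared box.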
By the way -- I also added a "dummy transformation" in the middle of the loop in my job. It does nothing except have the 'Advanced' options checked: 'Clear the list of result rows before execution' and 'Clear the list of result files before execution'. I thought that would be enough to plug the memory leak, but apparently not.
Anyone have any advice here?
I'm not sure how else to scale back PDI's memory usage here, other than turning the loop into a "repeat every 2 seconds" run and killing PDI once the success condition is met -- though that's definitely an extreme workaround.
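For what it's worth, the restart approach doesn't have to be that extreme. Kitchen (PDI's command-line job runner) exits with code 0 on success, so a small wrapper can run the job in a fresh JVM each pass -- which resets the heap -- and stop once it succeeds. A rough sketch; the paths below are placeholders for your own install and .kjb file:

```java
// Rough sketch: re-launch the job in a fresh JVM each pass so the heap
// resets, and stop once Kitchen reports success (exit code 0).
// Paths are placeholders -- point them at your own install and job file.
public class KitchenLoop {
    public static void main(String[] args) throws Exception {
        int exit;
        do {
            Process p = new ProcessBuilder(
                    "C:\\pentaho\\data-integration\\Kitchen.bat",
                    "/file:C:\\jobs\\my_job.kjb")
                    .inheritIO()     // stream Kitchen's log to this console
                    .start();
            exit = p.waitFor();
            if (exit != 0) {
                Thread.sleep(2_000); // the "repeat every 2 seconds" idea
            }
        } while (exit != 0);
    }
}
```

Since each iteration gets a brand-new heap, a slow leak inside the loop stops mattering -- at the cost of paying JVM startup time on every pass.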