I'm using Red Hat Linux (kernel 2.6.32-358.el6.x86_64) running PDI 5.1.0.
I got the 'REST Client' step working (reading the request data from a file, setting the rest of the needed info as variables, and reading them back with 'Get Variables').
At low volume it works fine, but at high volume I'm getting 'failed java.net.SocketException: Too many open files'.
At this point I turned off everything past the actual request (i.e. disabled processing of the received response), but I still get the error.
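For what it's worth, here is a minimal standalone Java sketch of what I suspect is happening. It uses plain JDK HttpURLConnection against a placeholder URL, not the actual PDI step internals: responses that are never read to the end or closed keep their sockets, and therefore file descriptors, open.

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Illustration only: fire off many GET requests and never consume or
    // close the response bodies. Each leaked response pins its socket (one
    // file descriptor), so eventually the OS limit is hit and further
    // requests fail with "java.net.SocketException: Too many open files".
    public class LeakySocketsDemo {
        public static void main(String[] args) throws Exception {
            for (int i = 0; i < 10000; i++) {
                URL url = new URL("http://example.com/api");      // placeholder endpoint
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("GET");
                InputStream body = conn.getInputStream();         // socket is held from here on

                // Missing on purpose: reading body to EOF and body.close()
                // (or conn.disconnect()), so the descriptor is never released.

                if (i % 100 == 0) {
                    System.out.println("requests issued so far: " + i);
                }
            }
        }
    }

In that sketch the descriptors are released once the body is read to the end and closed (or conn.disconnect() is called), which is why I'm wondering whether responses are being left unconsumed somewhere at high volume.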
I tried varying the number of copies I launch, but past a certain point it still errors out.
I can have the Linux settings (e.g. the open file descriptor limit) raised to some extent, but not enough to be sure of handling the maximum possible number of requests, which is unknown.
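To see how close I am to that limit while the transformation runs, something like this Linux-only Java helper could be used. It is a hypothetical standalone monitor, not part of my transformation; you pass it the PDI JVM's pid and it counts the entries under /proc/<pid>/fd.

    import java.io.File;

    // Hypothetical Linux-only helper: pass the PDI JVM's process id and it
    // prints how many file descriptors that process currently holds by
    // counting the entries under /proc/<pid>/fd every few seconds.
    // A count of -1 means the directory could not be read (wrong pid, or a
    // process owned by a different user).
    public class FdCountMonitor {
        public static void main(String[] args) throws InterruptedException {
            String pid = args[0];                         // pid of the running kitchen/pan/spoon JVM
            File fdDir = new File("/proc/" + pid + "/fd");
            while (true) {
                String[] entries = fdDir.list();          // one entry per open descriptor
                int count = (entries == null) ? -1 : entries.length;
                System.out.println("open descriptors for pid " + pid + ": " + count);
                Thread.sleep(5000);                       // sample every 5 seconds
            }
        }
    }

That would at least show whether the count climbs steadily (a leak) or just scales with the number of concurrent copies.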
Is there something I am missing that is causing the transformation to accumulate open files?
Is there a Java setting someone knows of that will run garbage collection more frequently, in case this is a garbage collection problem?
Would using partitioning rather than 'copies to start' be better?
Thanks for any help.