Channel: Pentaho Community Forums

Load Files from Directory into DB

Hi Guys,


I'm trying to set up a job and transformation that load all the files in a directory into a database. It seems so simple, but I just can't get it to work.
I've created a simple job and transformation: the job reads the file names from a directory and passes each file path and name to my transformation, which loads it into my database (in essence, merging all the files into one database table).


I tried to attach screenshots - not working. So here goes...


******Job contains******
"Add filenames to result"
- Added folder = /tmp/Generate
- No wildcard

"Run transformation"
- Advanced tab
  ticked - Copy previous results to args
  ticked - Execute for every input row
- Arguments tab
  Argument - "ParsedFileName"
- Parameters tab
  ticked - Pass all parameter values down to the sub-transformation
  Parameter - "ParsedFileName"

******Transformation contains******
"Transformation properties"
- Parameters tab
  ParsedFileName

"CSV file input"
- Filename - ${ParsedFileName}
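
One thing worth noting: "Copy previous results to args" supplies command-line arguments, whereas ${ParsedFileName} reads a named parameter, so the two usually need to be wired together on the job entry's Parameters tab (mapping the result-row field to the parameter). For reference, here is a minimal sketch of the same per-file logic via the Kettle Java API; load_file.ktr is a hypothetical transformation that declares the ParsedFileName parameter:

Code:

import java.io.File;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class LoadDirectory {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // One transformation run per file, mirroring "Execute for every input row"
        for (File f : new File("/tmp/Generate").listFiles()) {
            TransMeta meta = new TransMeta("load_file.ktr"); // hypothetical .ktr path
            Trans trans = new Trans(meta);
            trans.setParameterValue("ParsedFileName", f.getAbsolutePath());
            trans.execute(null); // no command-line arguments
            trans.waitUntilFinished();
            if (trans.getErrors() > 0) {
                throw new RuntimeException("Error loading " + f);
            }
        }
    }
}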


Please help

Dividing two columns from different Sub reports

I have a main report and a few subreports. In the main report I have a total (sum) of one of the columns, 'Total A', and in one of the subreports I have a total of another column, 'Total B'.

Now I want to divide those two columns, i.e. ([Total A]/[Total B]), in one field.

Any ideas?
Thanks.

Kettle Java API in a software product

Can someone answer a question: does it cost anything to use the Kettle Java API to call ETL transformations from a commercial OLTP software product?

Thanks

Merge or normalize the row.

Hi guys, I have two streams of data coming from XML. How do I normalize them? It is a simple one, but I am not able to do it.

Code:

partyReference  tradeId
party1          123456
party2          789562

partyReference  partyName  partyId
party1          AbCD       12356
party2          GHI        562
party3          MAK        714

I want the data as columns: partyReference, partyId, tradeId, partyName.

How shall I achieve it? Also, does anyone know how to fetch XML saved in a CLOB column in the DB, parse it element by element, and then store it back into database tables? Thanks in advance.
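
Joining the two streams on partyReference (in PDI, typically a Merge join or Stream lookup step over sorted input) yields the combined row. For the CLOB question, here is a minimal JDBC + DOM sketch; the JDBC URL, table name trades, column xml_doc, and element party are all placeholders:

Code:

import java.sql.Clob;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class ReadXmlClob {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();

        try (Connection con = DriverManager.getConnection("jdbc:...", "user", "pass"); // placeholder URL
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT xml_doc FROM trades")) { // hypothetical names
            while (rs.next()) {
                // Stream the CLOB straight into the XML parser
                Clob clob = rs.getClob(1);
                DocumentBuilder builder = factory.newDocumentBuilder();
                Document doc = builder.parse(new InputSource(clob.getCharacterStream()));

                // Walk the elements of interest, e.g. every <party> node
                NodeList parties = doc.getElementsByTagName("party"); // hypothetical element
                for (int i = 0; i < parties.getLength(); i++) {
                    Element party = (Element) parties.item(i);
                    System.out.println(party.getAttribute("partyReference"));
                }
            }
        }
    }
}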

CCC Destroy Method

Is there a destroy method in CCC? If we just empty out the canvas, there are still event listeners attached. How can we get rid of them?

Saiku Analytics

Will Saiku Analytics work in Pentaho Community Edition? I am having issues installing the "Saiku Analytics" plugin from the Marketplace. Any help would be appreciated.

Error while updating marketplace in Pentaho BI 4.8.0 stable

Hello,

I get an error while trying to upgrade the Pentaho Marketplace, shown in the following image:

marketplace error.jpg


The console output shows the following error:



10:41:51,298 ERROR [MarketplaceService]
Could not load the job from the XML file [/usr/share/biserver-ce/biserver-ce/pentaho-solutions/system/marketplace/processes/download_and_install_plugin.kjb]


Error reading information from input stream
zip file closed




org.pentaho.di.core.exception.KettleXMLException:
Could not load the job from the XML file [/usr/share/biserver-ce/biserver-ce/pentaho-solutions/system/marketplace/processes/download_and_install_plugin.kjb]


Error reading information from input stream
zip file closed




at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:879)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:831)
at org.pentaho.marketplace.MarketplaceService.installPlugin(MarketplaceService.java:274)
at org.pentaho.marketplace.MarketplaceService.installPluginJson(MarketplaceService.java:330)
at org.pentaho.marketplace.MarketplaceContentGenerator.installpluginjson(MarketplaceContentGenerator.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at pt.webdetails.cpf.SimpleContentGenerator.createContent(SimpleContentGenerator.java:58)
at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:142)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.pentaho.di.core.exception.KettleXMLException:
Error reading information from input stream
zip file closed


at org.pentaho.di.core.xml.XMLHandler.loadXMLFile(XMLHandler.java:588)
at org.pentaho.di.core.xml.XMLHandler.loadXMLFile(XMLHandler.java:508)
at org.pentaho.di.core.xml.XMLHandler.loadXMLFile(XMLHandler.java:494)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:863)
... 72 more
Caused by: java.lang.IllegalStateException: zip file closed
at java.util.zip.ZipFile.ensureOpen(ZipFile.java:415)
at java.util.zip.ZipFile.getEntry(ZipFile.java:160)
at java.util.jar.JarFile.getEntry(JarFile.java:209)
at java.util.jar.JarFile.getJarEntry(JarFile.java:192)
at sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:757)
at sun.misc.URLClassPath$JarLoader.findResource(URLClassPath.java:735)
at sun.misc.URLClassPath.findResource(URLClassPath.java:146)
at java.net.URLClassLoader$2.run(URLClassLoader.java:385)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findResource(URLClassLoader.java:382)
at java.lang.ClassLoader.getResource(ClassLoader.java:1002)
at java.lang.ClassLoader.getResourceAsStream(ClassLoader.java:1192)
at org.apache.xerces.parsers.SecuritySupport$6.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.xerces.parsers.SecuritySupport.getResourceAsStream(Unknown Source)
at org.apache.xerces.parsers.ObjectFactory.findJarServiceProvider(Unknown Source)
at org.apache.xerces.parsers.ObjectFactory.createObject(Unknown Source)
at org.apache.xerces.parsers.ObjectFactory.createObject(Unknown Source)
at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source)
at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.<init>(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
at org.pentaho.di.core.xml.XMLHandler.loadXMLFile(XMLHandler.java:546)
... 75 more
10:41:51,302 ERROR [TelemetryHelper] Error while trying to read plugin version for marketplace
java.lang.IllegalStateException: zip file closed
at java.util.zip.ZipFile.ensureOpen(ZipFile.java:415)
at java.util.zip.ZipFile.getEntry(ZipFile.java:160)
at java.util.jar.JarFile.getEntry(JarFile.java:209)
at java.util.jar.JarFile.getJarEntry(JarFile.java:192)
at sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:757)
at sun.misc.URLClassPath$JarLoader.findResource(URLClassPath.java:735)
at sun.misc.URLClassPath.findResource(URLClassPath.java:146)
at java.net.URLClassLoader$2.run(URLClassLoader.java:385)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findResource(URLClassLoader.java:382)
............





Please, I need help!

Thank you very much!!!

Pentaho and CAS Issue

Hello,

I'm trying to configure Pentaho BI Server with CAS and I have some problems with it.

Versions:

Pentaho BI Server : biserver-ce-5.2.0.0-209
CAS Server: cas-server-4.0.0-release

I used the default configuration and installation process on both servers. I followed, step by step, "Implement SSO -> Switch to Central Authentication Service", found on this page:

http://infocenter.pentaho.com/help/i...h_cas_sso.html

Currently I can log in to the CAS server with the configured user (i.e. Admin/password) successfully. When I go to the URL http://localhost:8080/pentaho it redirects me to https://localhost:8443/cas-server-we...security_check, where I can log in successfully (which means the redirect is configured correctly), but it then redirects me to http://localhost:8080/pentaho/j_spri...01.example.org where I get this error:

Sorry. We really did try.

Something went wrong. Please try again

or contact your administrator.



I attached my logs and configs to this post.
I went through Google and the forums and found some tutorials, but they didn't work at all; some posts say they only work for older versions of Pentaho.

Has anyone configured Pentaho and CAS with the versions I'm using? Any ideas? From what I googled, the cause may be in applicationContext-spring-security.xml. I used the one found at http://forums.pentaho.com/showthread...tion-Server%29 and ran into the same error as the author, but I figured out how to fix that. I still have a problem logging in properly with Pentaho + CAS.

Best regards,
Emil Jaszczuk

Sorry for my language; English isn't my native language.

Login Error in Pentaho BI Server

Following on from this thread, http://forums.pentaho.com/showthread...a-on-Windows-7, where I had an issue starting Catalina:

When I try to use admin/password to log in, I get the error message "A login error occurred. Please try again."

I have checked whether I have multiple versions of Java installed on my machine; there is only 1.7.

Code:

C:\Program Files\Java>echo "%PATH%"
"C:\Python27-64\python.exe;C:\Program Files\Lenovo\Fingerprint Manager Pro\;C:\Program Files (x86)\Lenovo\Access Connections\;C:\Program Files\TortoiseHg\;C:\Program Files (x86)\GitExtensions\;C:\Program Files (x86)\GTK2-Runtime\bin;C:\Windows\system32;C:\Program Files (x86)\Java\jdk1.7.0_51\bin; c:\python27-64\;C:\Program Files (x86)\Git\bin;C:\Installs\Pentaho\biserver-ce\tomcat\bin"

What can I do to resolve this error?

Any suggestions are welcome.

Thanks,

Ron

Preview doesn't work in 5.0, 5.2

When attempting to preview query results in a 'Table input' step I get errors:
'One or more errors occurred during preview! Examine the log file to see what went wrong.'

I don't see a log file anywhere, and the steps that are shown don't give me any real information for a solution.

The transformation runs fine, but not the preview.
I read there may be a bug... I'm using Mac OS X 10.8.5.

Any other solutions? Suggestions?
Thanks.

Automatically assign parameter value on run: "error processing component (prompt72892)"

Greetings. After successfully reading/using a few excellent threads on solving a CSV parameter issue (I am using the CSVARRAY function in a second, hidden parameter), I am getting an error when I try to run the report from the BI Server:

"error processing component (prompt72892)"

BI Server 5.1.0.0.752
PRD 5.1.0.0-752

What I want is to 'auto-assign' the value to the parameter.

In other words:
1. I have a query that returns some values in CSV format.
2. I set that to be parm1.
3. parm1 is 'hidden' with 'use first value if default value formula results in NA'.
4. I set parm2 to also be hidden; it uses the CSVARRAY function.

When I run it manually from PRD, it does not prompt at all and runs OK.

I am pretty sure that checking the 'use first value if default value formula results in NA' box is NOT the right way to auto-assign the query result to a parameter.

Any help is appreciated.

Kettle latest from Git with Eclipse Luna SR 1 (4.4.1)

I've cloned Kettle from the Git repo and have Ant and Ivy installed and configured.

I can run:

Code:

ant clean-all resolve create-dot-classpath
It works fine, and `ant dist` also works, compiling everything.

However, I cannot get the project to import into Eclipse. I end up with thousands of "X cannot be resolved to a type" errors. I've tried a clean, refresh, and rebuild, and all sorts of other things in Eclipse.

Does anyone have any ideas? I've been at this for hours.

PDI 5.2 keeps crashing

I get the following error messages:

Code:

java: cairo-misc.c:380: _cairo_operator_bounded_by_source: Assertion `NOT_REACHED' failed.
./spoon.sh: line 202: 30088 Aborted                "$_PENTAHO_JAVA" $OPT -jar "$STARTUP" -lib $LIBPATH "${1+$@}"

I can reproduce it when needed, such as by trying to edit the query of a table input step, or by testing a connection to a database.
Why is this happening and how can I resolve it?

I also get the following warning repeatedly, if it's relevant:
Code:

(SWT:29909): Gtk-WARNING **: gtk_widget_size_allocate(): attempt to allocate widget with width -5 and height 18
Attempting to load ESAPI.properties via file I/O.
Attempting to load ESAPI.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/data-integration/ESAPI.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/ESAPI.properties
Not found in 'user.home' (/root) directory: /root/esapi/ESAPI.properties
Loading ESAPI.properties via file I/O failed. Exception was: java.io.FileNotFoundException
Attempting to load ESAPI.properties via the classpath.
SUCCESSFULLY LOADED ESAPI.properties via the CLASSPATH from '/ (root)' using current thread context class loader!
SecurityConfiguration for Validator.ConfigurationFile not found in ESAPI.properties. Using default: validation.properties
Attempting to load validation.properties via file I/O.
Attempting to load validation.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/data-integration/validation.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/validation.properties
Not found in 'user.home' (/root) directory: /root/esapi/validation.properties
Loading validation.properties via file I/O failed.
Attempting to load validation.properties via the classpath.
validation.properties could not be loaded by any means. fail. Exception was: java.lang.IllegalArgumentException: Failed to load ESAPI.properties as a classloader resource.
SecurityConfiguration for Logger.LogServerIP not either "true" or "false" in ESAPI.properties. Using default: true

Pentaho De-Duplicate Step

Hello, I am working on a project where we get customer data from two different databases; in most cases the same customer exists in both.

My goal is to de-duplicate these customers, create a master customer, and associate all the related (aka slave) customers with the master. A customer is identified by the name, addressline1, city, state, zip, and phone attributes, so I need to match records using all of these attributes.

The Pentaho website mentions Data Quality and Profiling capabilities with which we could standardize, validate, de-duplicate, and cleanse inconsistent or redundant data. We are currently cleansing the addresses using an API invoked from PDI.

What steps or patterns could I use for de-duplicating customers? Please share your ideas on how you would approach this use case. Thanks.
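
As one illustration of a common pattern (not Pentaho-specific), here is a match-key/blocking sketch in Java: normalize a couple of the identifying attributes into a key so candidate duplicates land in the same group, then compare the remaining attributes within each group with a fuzzy measure (PDI's Fuzzy match step, Levenshtein distance, etc.). All names and data here are hypothetical:

Code:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MatchKeyDemo {
    /** Coarse blocking key: zip plus the last four digits of the phone number. */
    static String keyOf(String zip, String phone) {
        String digits = phone == null ? "" : phone.replaceAll("\\D", "");
        String last4 = digits.length() >= 4 ? digits.substring(digits.length() - 4) : digits;
        return zip + "|" + last4;
    }

    public static void main(String[] args) {
        String[][] customers = {
            { "John Smith", "30301", "404-555-0101" },
            { "Jon Smith",  "30301", "(404) 555 0101" }, // same key despite formatting
            { "Ann Jones",  "10001", "212-555-0199" },
        };

        // Group candidate duplicates by blocking key
        Map<String, List<String[]>> blocks = new HashMap<>();
        for (String[] c : customers) {
            blocks.computeIfAbsent(keyOf(c[1], c[2]), k -> new ArrayList<>()).add(c);
        }

        // Blocks with more than one record are the candidates for a finer,
        // fuzzy comparison on name and addressline1
        blocks.forEach((key, recs) ->
            System.out.println(key + " -> " + recs.size() + " record(s)"));
    }
}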

Saiku Plugin Hang on Creation

All,

From the main dashboard, File > New > Saiku Analytics causes an endless "Just a few moments please. Operation in progress..." message with an endless "Loading" message beneath it (with the Saiku logo).

I'm at a bit of a loss in a few areas. Deploying the CE BI Server as normal resulted in a Marketplace that didn't work, so I downloaded and replaced the Marketplace. At that point I was able to install the Saiku plugin, which on the main Marketplace page says "up to date" at version 2.6, yet when clicked, the link shows 3.0.8.

tomcat/logs/pentaho.log doesn't show much apart from a few missing file references. Can anyone give me any pointers?

CentOS 6.5 x64 with the installer here: http://sourceforge.net/projects/pent...9.zip/download

How to customise a Pentaho step to use a custom compression type

Hi,


I am using Pentaho (PDI) 5.2 on Linux, and Hadoop (HDP 2.1) is also installed on the Linux OS. PDI and Hadoop are working fine. I am transferring and loading data to and from HDFS using "Hadoop file input" and "Hadoop file output".
If I want to compress the data, compression support is provided at Hadoop file input/output -> Content -> Compression (GZIP, Snappy, Hadoop-snappy, Zip). But if I want to use a compression type other than those present in the drop-down list, how do I customise the component?


Is there any other way to do it?
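
The drop-down in those steps is fixed, so short of writing a PDI plugin, one workaround is to do the compression in code (for example from a User Defined Java Class step or a small standalone utility) using Hadoop's own codec classes, and write the already-compressed file to HDFS. A sketch, assuming the Hadoop client libraries are on the classpath and using BZip2 purely as an example of a codec not offered in the list; the file paths are placeholders:

Code:

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.util.ReflectionUtils;

public class CompressToHdfs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // picks up core-site.xml from the classpath
        FileSystem fs = FileSystem.get(conf);

        CompressionCodec codec = ReflectionUtils.newInstance(BZip2Codec.class, conf);
        Path target = new Path("/data/out" + codec.getDefaultExtension()); // e.g. /data/out.bz2

        try (InputStream in = new FileInputStream("/tmp/local-input.csv");
             OutputStream out = codec.createOutputStream(fs.create(target))) {
            IOUtils.copyBytes(in, out, 4096);
        }
    }
}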


If anyone knows, kindly provide a solution.


Thanks in Advance

Java code for printing predicted output (PlainText output)

In the Weka GUI, after running any classifier on a dataset, it gives the result; by using the output predictions option we can get the output in PlainText format. For example:

Code:

=== Predictions on training set ===

inst#  actual  predicted  error  prediction
    1  1:N     1:N                1
    2  1:N     1:N                1
    3  2:Y     2:Y                1
    4  2:Y     2:Y                1
    5  1:N     1:N                1

I need to print the same format using Java code. What do I need to do to get that?
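
A minimal sketch using the Weka Java API; PlainText (weka.classifiers.evaluation.output.prediction.PlainText, the Weka 3.7-style API) is the same writer the GUI uses for this output, and training.arff and J48 are placeholders:

Code:

import weka.classifiers.evaluation.output.prediction.PlainText;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class PrintPredictions {
    public static void main(String[] args) throws Exception {
        // training.arff is a placeholder; the class attribute is assumed to be last
        Instances data = DataSource.read("training.arff");
        data.setClassIndex(data.numAttributes() - 1);

        J48 cls = new J48(); // any classifier works here
        cls.buildClassifier(data);

        // Collect the prediction lines in the same format as the GUI
        StringBuffer buffer = new StringBuffer();
        PlainText output = new PlainText();
        output.setBuffer(buffer);
        output.setHeader(data);
        output.printAll(cls, data); // header, one line per instance, footer

        System.out.println("=== Predictions on training set ===");
        System.out.println(buffer);
    }
}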
Thanks in advance.

Pentaho Cache mechanism

Hello All,

I have implemented CDC as the caching mechanism for my Mondrian server, but I now want to move from Hazelcast to GigaSpaces as the IMDG.

Is there any way to move from one caching mechanism to another?

Regards,
Nitesh

Unable to get value 'string (50) "from database resultset (ERRORCODE = -4220, SQLSTAT

I do not speak English, so excuse the writing.

I'm doing an ETL and am getting the following error: Unable to get value 'string (50)' from database resultset (ERRORCODE = -4220, SQLSTATE = null)

From what I have researched, it is a DB2 driver error. This IBM page deals with the subject, but I do not know how to apply it to Pentaho:
http://www-01.ibm.com/support/docvie...id=swg1IC83414
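
If that APAR matches your case, the workaround IBM describes is usually applied by setting a JCC driver property on the JVM that runs the ETL; with PDI that means adding it to the Java options used to launch Spoon or Kitchen. A hedged sketch (check the IBM page for your driver level):

Code:

// Equivalent to launching the JVM with -Ddb2.jcc.charsetDecoderEncoder=3
// (for example appended to the OPT variable in spoon.sh). Value 3 tells the
// DB2 JCC driver to substitute undecodable bytes instead of failing with
// ERRORCODE=-4220. It must be set before the driver is loaded.
System.setProperty("db2.jcc.charsetDecoderEncoder", "3");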

The error screen is below:
error.jpg

OutOfMemory errors after migrating from 4.4 to 5.x

Hi!
We have a job with multiple sequential SQL steps, each of which invokes a MySQL query to create a new table. The create script is similar to:

Code:

CREATE TABLE new AS
SELECT
    *
FROM
    old
;

This job had been running daily for months without any issues. However, after upgrading from PDI 4.4 to 5.0.1 (or 5.2 as well), the memory usage keeps growing and the job eventually fails with:

Code:

java.lang.OutOfMemoryError: Java heap space
Moreover, I would expect MySQL to be the one using more memory, not Kettle, since it is just an SQL step with no result set.


Things that haven't worked:
  • Increasing VM memory. The job keeps using more and more memory and eventually fails. The tables do not even contain that much data, so Spoon's default 512MB should suffice.
  • Updating the MySQL connector to the latest version. Same results.
  • Decreasing the logging level and tinkering with the KETTLE_MAX_LOG* properties.
  • We are using a database repository to store the jobs; we tried migrating both by using the "Create or Update" repository option and by creating a new empty repository and importing the jobs.


Any help will be appreciated.

Thanks! :)

