Pentaho Community Forums

PDI 6.0.0.0-353 - error when starting job from shell

The job does run, but the script hangs much longer than expected. It looks like a Karaf error. Any and all help is appreciated.


I am trying to run the following script from the command line:
@echo off
"C:\Program Files\data-integration\kitchen" /file:C:\Documents\PDI_Repo\TestEmail.kjb /level:basic > C:\Documents\Scripts\Logs\TestingEmail_log.txt



I receive the following output and error:
Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
>echo off
"C:\Program Files\data-integration\kitchen" /file:C:\Documents\PDI_Repo\TestEmai
l.kjb /level:basic
DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_51
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.7.0_51\bin\java.exe
C:\Program Files\data-integration>"C:\Program Files\Java\jdk1.7.0_51\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-6.0.0.0-353.jar -lib ..\libswt\win64 -main org.pentaho.di.kitchen.Kitchen /file:C:\Documents\PDI_Repo\TestEmail.kjb /level:basic
12:52:31,876 ERROR [KarafBoot] Error starting Karaf
java.io.IOException: Destination 'C:\Program Files\data-integration\.\karaf-copy' directory cannot be created
    at org.apache.commons.io.FileUtils.doCopyDirectory(FileUtils.java:1213)
    at org.apache.commons.io.FileUtils.copyDirectory(FileUtils.java:1186)
    at org.apache.commons.io.FileUtils.copyDirectory(FileUtils.java:1058)
    at org.apache.commons.io.FileUtils.copyDirectory(FileUtils.java:1027)
    at org.pentaho.platform.osgi.KarafBoot.startup(KarafBoot.java:88)
    at org.pentaho.di.osgi.registryExtension.OSGIPluginRegistryExtension.init(OSGIPluginRegistryExtension.java:105)
    at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:558)
    at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:101)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:89)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:83)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to org.apache.log4j.Appender
    at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:248)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
    at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
    at org.apache.log4j.xml.DOMConfigurator.parseCategory(DOMConfigurator.java:436)
    at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1004)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:755)
    at org.apache.log4j.xml.DOMConfigurator.configure(DOMConfigurator.java:896)
    at org.pentaho.di.core.logging.log4j.Log4jLogging.applyLog4jConfiguration(Log4jLogging.java:85)
    at org.pentaho.di.core.logging.log4j.Log4jLogging.init(Log4jLogging.java:71)
    at org.pentaho.di.core.KettleClientEnvironment.initLogginPlugins(KettleClientEnvironment.java:141)
    at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:104)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:89)
    at org.pentaho.di.kitchen.Kitchen$1.call(Kitchen.java:83)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassCastException: org.apache.log4j.ConsoleAppender cannot be cast to org.apache.log4j.Appender
    (second, nearly identical stack trace omitted; it differs from the one above only in passing through DOMConfigurator.parseRoot)
2015/11/11 12:52:32 - Kitchen - Logging is at level : Basic logging
2015/11/11 12:52:32 - Kitchen - Start of run.
12:52:45,775 ERROR [KarafLifecycleListener] The Kettle Karaf Lifycycle Listener failed to execute properly. Releasing lifecycle hold, but some services may be unavailable.
2015/11/11 12:52:46 - TestEmail - Start of job execution
2015/11/11 12:52:46 - TestEmail - Starting entry [TEST_EMAIL]
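
For what it's worth, the IOException says Kitchen cannot create karaf-copy inside C:\Program Files, a location ordinary (non-elevated) shells cannot write to; the long hang would then be Karaf timing out before the job continues. A minimal sketch to check that suspicion (the install path is taken from the log above):

import tempfile

# Assumption: default install path, as shown in the log output above.
pdi_home = r"C:\Program Files\data-integration"
try:
    with tempfile.TemporaryFile(dir=pdi_home):
        print("writable - Karaf should manage to create karaf-copy")
except OSError as e:
    print("not writable (%s) - run the shell elevated or move PDI to a user-writable folder" % e)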

Sometimes Error "Unexpected error during transformation metadata load" in CARTE

Hello,

I have the phenomenon that SOMETIMES (I have not found a pattern yet) a job that calls several transformations has problems loading those transformations.

Then I get the following errors:

2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Starting entry [Perform data import]
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Unexpected error during transformation metadata load
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY -
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Unable to get object information for object with id=68
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY -
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Couldn't get row from result set
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY -
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Unable to get value 'Integer(38)' from database resultset, index 2
2015/11/11 17:06:14 - SCOP_MINIMUM_ORDER_QTY - Fail to convert to internal representation


Restarting Carte solves it; the job then runs fine until, at some point, it fails again (other jobs are affected as well). It seems that Carte sometimes loses its view of the repository structure.

How can this be solved?

I am using Kettle 5.0.1 with the repository in an Oracle DB.

Thx and best regards,

Andrej

Problems setting MySQL + Pentaho 4.8.0

Hello

Hope you can help me. I'm new to Pentaho, and while configuring the Administration Console with MySQL I am hitting some errors.

I changed all the config files to use driver=com.mysql.jdbc.Driver
and also set the correct port in the URL: url=jdbc:mysql://localhost:3306/hibernate


The Administration Console runs correctly at localhost:8099, but when I try to create a database connection, the error "Connection attempted failed..." appears (see attached image error.jpg).

The MySQL service is running correctly, and so is Java.
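
For reference, a quick way to rule out MySQL itself is to connect with the same URL pieces outside Pentaho. A minimal sketch (hibuser/password are the stock Pentaho hibernate credentials; adjust if yours differ):

import mysql.connector  # pip install mysql-connector-python

# Assumption: the stock 'hibernate' database and its default credentials.
conn = mysql.connector.connect(host="localhost", port=3306,
                               user="hibuser", password="password",
                               database="hibernate")
print("connected:", conn.is_connected())
conn.close()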

Any ideas??
Thanks a lot for your help.
Leticia Simental. :D

External connection to Pentaho BI Server 6.0

Hi everyone!
I've set up a Big Data environment on a CentOS 7 server; in the end, Pentaho sits on a MySQL DB. However, I can't access Pentaho remotely at IP:8080 from another computer, although it works fine on localhost. I have spent many days trying and searching for a solution.
Does anyone know one?
Thank you so much for your time.
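
Since it works on localhost but not remotely, the usual suspects are the CentOS 7 firewall (firewalld does not open 8080 by default) or Tomcat binding only to 127.0.0.1. A minimal sketch to test plain reachability from the remote machine (the IP is a placeholder):

import socket

server_ip = "192.168.1.10"  # placeholder: the CentOS server's address
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(3)
result = sock.connect_ex((server_ip, 8080))
sock.close()
print("port open" if result == 0 else
      "unreachable (code %d): check firewalld and Tomcat's bind address" % result)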

Pentaho 6 repo password

Pentaho 6 data integration installed.

Mysql 5.5 database running on debian 8.2

Created a database connection ok. ( Test connection connects)

Created a repo ok (I think).

When I try to connect to the repo using admin/password I get 'incorrect username or password' messages.

I can see the R_USER table in the MySQL database, with the admin user and an encrypted password.

Any idea what I have done wrong? I can connect to MySQL fine from the command line and inspect the Pentaho repo database tables; I just cannot connect to the repo from Spoon.

Without connecting to the repo, Spoon runs successfully and looks functional; I can create transformations and jobs.

Java version installed: 1.7.0_85

Any advice/clues?

Thanks,

Josh

Carte queueing

Hello,

one question:

Is it possible to configure Carte so that it executes one job/transformation after another rather than in parallel (when multiple jobs/transformations are addressed to Carte at more or less the same time)?

Thx and best regards,

Andrej

Not able to process 4300000 records

Hi ,

I am trying to load 4,300,000 records (with 60 columns) using PDI CE 6.0, Java 1.7 and MySQL 5.6, but I cannot retrieve them all: I only get 600,000 records, because my extraction query cannot process the 4,300,000 records in the MySQL temp table.

Do I need to increase the Java heap size values in spoon.bat?
PENTAHO_DI_JAVA_OPTIONS= "-Xms1024m" "-Xmx4096m" "-XX:MaxPermSize=2048m"

Here is my process: Table input -> Text file output, then Text file input -> Table output.

Below are my SOURCE database details; the server is in the Amazon cloud (see attached image oltp.jpg). Do I need to increase the RAM/CPU there? I need to run jobs for millions of records. Please advise.

Do I need to load the data chunk-wise using a script? Is that the best approach?
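
A minimal sketch of what I mean by chunk-wise (host, credentials, table and key column are all placeholders; it assumes an indexed numeric id so every chunk seeks instead of re-scanning):

import mysql.connector  # pip install mysql-connector-python

# Placeholders: point this at the Amazon-hosted source database.
conn = mysql.connector.connect(host="aws-host", user="etl",
                               password="secret", database="oltp")
cur = conn.cursor()
chunk, last_id = 100000, 0
while True:
    cur.execute("SELECT * FROM big_table WHERE id > %s ORDER BY id LIMIT %s",
                (last_id, chunk))
    rows = cur.fetchall()
    if not rows:
        break
    # ... hand this chunk to the text file / target table here ...
    last_id = rows[-1][0]
conn.close()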

spoon - receiving error connecting to mssql database

Hi, newbie here.

I am receiving the following error when attempting a JDBC connection to an MSSQL database:

Error connecting to database: (using class com.microsoft.sqlserver.jdbc.SQLServerDriver)
The property integratedSecurity does not contain a valid boolean value. Only true or false can be used.
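
If I read the message right, the driver accepts only the literal strings true or false for this property, so the value set on the connection's Options tab in Spoon (or appended to the URL) has to be exactly that, e.g. jdbc:sqlserver://myhost:1433;databaseName=mydb;integratedSecurity=true (host and database are placeholders). Anything else, such as yes, 1 or an empty value, triggers exactly this error.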

Any ideas?

Thanks.

J0anne

Charts in email without logging in?

I have an email report with a bunch of charts; the problem is that they don't display unless you are logged into the BI server. Is there a way to set permissions so that the charts are visible to anyone without logging in?

Delete or replace line breaks in Excel

Hi everyone!

I'm using the 6.0 community edition, working on a transformation that takes data from several Excel files to create one file with the combined data.
So far I have found how to do what I need, but I've been hitting a rock with this:

Some of the cells in the input files contain line breaks or newline characters. I'm trying to find a way to automatically delete those characters or replace them with a space, but no luck so far.
I tried Replace in string and String operations, and even though I don't mind manually typing more than a thousand Search/Replace strings, the problem is that when I type ALT+010 or ALT+013, nothing happens in the Search field.

Is there any other way to find and replace those characters in an Excel file, or a way to enter them into the Search field?
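
As a plain script, the replacement I am after looks like the sketch below; the same [\r\n]+ pattern should also work in any regex-aware search/replace (e.g. Replace in string with its RegEx option turned on) instead of typing literal ALT+010 characters:

import re

# Collapse any run of CR/LF characters inside a cell value into one space.
cell = "line one\r\nline two\nline three"
clean = re.sub(r"[\r\n]+", " ", cell)
print(clean)  # -> "line one line two line three"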

Thanks!!

kettle - changing column names between input and output tables

Hi - kettle/spoon

My input table has a field (physical/virtual).

My output table (Postgres) does not like slashes, so it has a corresponding (physicalVirtual) field.

How do I use Spoon to map the old column name to the new column name?
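
If I understand the docs correctly, the Select values step (Select & Alter tab, its "Rename to" column) placed just before the Table output does exactly this. Conceptually it applies a rename map, as in this sketch (the field names are from my tables, the row itself is made up):

# Sketch only: map the old field name to the Postgres-friendly one.
rename = {"physical/virtual": "physicalVirtual"}

row = {"id": 1, "physical/virtual": "physical"}
out = {rename.get(k, k): v for k, v in row.items()}
print(out)  # {'id': 1, 'physicalVirtual': 'physical'}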

Thx.

data integration - converting mssql datetimestamp to postgresql timestamp

Hi, me again...

Is there a step that can convert an MSSQL character-type datetimestamp into a PostgreSQL timestamp?
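
What I effectively need is the conversion sketched below (the input format is a guess at my data; in PDI terms I believe the Select values step's Meta-data tab can do the same String-to-Date change with a conversion mask):

from datetime import datetime

# Assumption: the MSSQL character timestamps look like this example.
raw = "2015-11-12 17:06:14.775"
ts = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S.%f")
print(ts.isoformat(sep=" "))  # a form PostgreSQL's timestamp type accepts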

Thx.

Joanne

BI Exception Reporting

I was wondering if anyone has implemented a solution for exception reporting from the BA server.

E.g. a query/report scheduled to run daily that usually returns no results (it checks for some rare anomaly). Rather than requiring end users to remember to check an empty report every day, they would only get the report when there are actual results to look at.
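
Outside of any BA-server feature, the behaviour I mean is simply this (sqlite3 stands in for the real warehouse and every name/host is a placeholder; in PDI terms I imagine a job with an "Evaluate rows number in a table" entry guarding a Mail entry would achieve the same):

import smtplib
import sqlite3  # stand-in for the real warehouse connection
from email.message import EmailMessage

conn = sqlite3.connect("warehouse.db")  # placeholder database
rows = conn.execute(
    "SELECT * FROM anomalies WHERE created >= date('now', '-1 day')"
).fetchall()

if rows:  # only send anything when there is something to look at
    msg = EmailMessage()
    msg["Subject"] = "Anomaly report: %d row(s)" % len(rows)
    msg["From"], msg["To"] = "etl@example.com", "team@example.com"
    msg.set_content("\n".join(str(r) for r in rows))
    with smtplib.SMTP("mail.example.com") as s:  # placeholder SMTP host
        s.send_message(msg)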

Yellowfin has a similar feature:
http://wiki.yellowfin.com.au/display...mail+Broadcast

Run Python code from Kettle PDI

I'm looking for a Kettle sample/example that will do the following...

Step 1 - Kettle passes a large file (csv/txt) to a Python module
Step 2 - The Python module reads the (whole/complete) large file, adds several more columns, and writes out a revised file
Step 3 - Kettle reads the revised file (line by line) and outputs to an Oracle DB (each line is one row)
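
A minimal sketch of the step-2 module (the column logic and file names are placeholders; Kettle would invoke it through a Shell job entry, e.g. python revise.py in.csv out.csv):

import csv
import sys

src, dst = sys.argv[1], sys.argv[2]  # e.g. in.csv out.csv
with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
    reader, writer = csv.reader(fin), csv.writer(fout)
    header = next(reader)
    writer.writerow(header + ["total", "flag"])  # the added columns
    for row in reader:
        total = sum(float(v) for v in row[1:] if v)  # placeholder derivation
        writer.writerow(row + [total, "HIGH" if total > 100 else "OK"])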

Any help appreciated.

Pentaho 6.0 CE - DB Error with PDI & CSV File via User Console

Hi,

I'm trying to get the BI Server to work on my localhost.
After weeks of pain and tears, Tomcat finally boots without any exceptions and the User Console shows up in my browser.
I can look at the sample data, dashboards and everything.

The next step for me is to get some of my own sample data into the DB.
I loaded a very simple file into the wizard for creating a data source.
When it finished, it showed me this error:

2015/11/12 23:22:35 - output.0 - Connected to database [Hibernate] (commit=1000)
23:22:37,843 ERROR [CsvTransformGenerator] Error Start: Pentaho Pentaho Open Source BA Server 6.0.0.0-353
23:22:37,848 ERROR [CsvTransformGenerator] ::: Error executing DDL
org.pentaho.di.core.exception.KettleDatabaseException:
Couldn't execute SQL: CREATE TABLE "201511122322"
(
"Name" VARCHAR(10)
, "NachName" VARCHAR(9)
, "Geburtsdatum" VARCHAR()
, "Gewicht" DOUBLE PRECISION
)


unexpected token: ) : line: 5

How can it be that Pentaho creates SQL that the Database does not understand?!
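
My own guess: "unexpected token: ) : line: 5" points at the "Geburtsdatum" VARCHAR() line, which means the wizard derived no length for that column (possibly a date-detection problem with my German locale), so the generated DDL contains an empty VARCHAR() that no database will parse. Hand-editing that line to something like "Geburtsdatum" VARCHAR(10) makes the statement valid, but the real question is why the wizard emits it in the first place.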

My system:
Win10 Notebook / gerDE Locale
Java jdk1.7.0_80
Pentaho biserver-ce-6.0.0.0-353
Database postgresql-9.4.5-1-windows-x64
Browser Chrome Version 46.0.2490.80 m


Pentaho 6 - ERROR [Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem

I am trying to install Pentaho 6 CE with MySQL, and when I try to run it I get the following errors and the service will not start:
1 - Cannot use configuration org.pentaho.requirejs for [org.osgi.service.cm.ManagedService
2 - ERROR [Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error while trying to execute the startup sequence for org.pentaho.platform.engine.services.connection.datasource.dbcp.DynamicallyPooledDatasourceSystemListener
I cannot see where the error is. The catalina.out file is attached.

Unable to publish my report from Report Designer

Hello,

I have just installed the Pentaho BI Server and Report Designer.

I can reach the BI Server and I can create reports in Report Designer, but I can't publish a report to the server from Report Designer.

I get this error: 'Unable to publish your file'

This is the localhost log:

Nov. 12, 2015 11:40:57 PM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: "Servlet.service()" for servlet default threw an exception
java.lang.IllegalArgumentException: username cannot be null or empty
at org.springframework.util.Assert.hasLength(Assert.java:136)
at org.pentaho.platform.engine.security.userroledao.hibernate.HibernateUserRoleDao.getUser(HibernateUserRoleDao.java:121)
at org.pentaho.platform.engine.security.userroledao.userdetailsservice.UserRoleDaoUserDetailsService.loadUserByUsername(UserRoleDaoUserDetailsService.java:62)
at org.springframework.security.providers.dao.DaoAuthenticationProvider.retrieveUser(DaoAuthenticationProvider.java:83)
at org.springframework.security.providers.dao.AbstractUserDetailsAuthenticationProvider.authenticate(AbstractUserDetailsAuthenticationProvider.java:121)
at org.springframework.security.providers.ProviderManager.doAuthentication(ProviderManager.java:188)
at org.springframework.security.AbstractAuthenticationManager.authenticate(AbstractAuthenticationManager.java:46)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:139)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:745)
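
One thing I notice in the trace: the publish request reaches the server with an empty username, so the credentials from the publish dialog apparently never arrive. If this is a pre-5.0 server (the HibernateUserRoleDao / old Spring Security stack suggests so), the publish password must also be configured on the server side in pentaho-solutions/system/publisher_config.xml (its <publisher-password> element) and then entered in the publish dialog; that is a guess on my part, not a confirmed fix.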

Do you have an idea?

Thanks!!!

Double Byte issue with CE 6.0 Repository

Issue reported:
PDI version: CE 6.0.0.0 - 353
Repository DB: MySQL - AWS RDS

Fill in double-byte characters (like "帕金森") in a Filter rows step, save to the repository, close Spoon and reload the same transformation from the repository: the double-byte characters have become "???", so the Filter rows step no longer works.

Investigation made:
1. The double-byte value stored in field VALUE_STR of repository table R_VALUE becomes "???" right after the transformation is saved.
2. Manually updating the VALUE_STR field to the proper double-byte Chinese loads correctly into Spoon THE FIRST TIME, but once the transformation is saved to the repository again, one more row is added to the R_VALUE table with VALUE_STR filled with "???".
Please refer to the attached image (CE6DoubleByteIssue.jpg) for how to reproduce the issue.

I suspect the repository-update code has an issue writing double-byte data back to the database. Can we have a quick fix for this?
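
"???" on write is the classic symptom of a non-UTF-8 connection or column charset rather than of the data itself, so the repository connection is worth checking first. A minimal sketch (host and credentials are placeholders); if the server side looks fine, adding characterEncoding=utf8 and useUnicode=true to the repository connection's options in Spoon would be my next experiment:

import mysql.connector  # pip install mysql-connector-python

# Placeholders: point this at the AWS RDS repository database.
conn = mysql.connector.connect(host="rds-host", user="pdi", password="secret",
                               database="pdi_repo")
cur = conn.cursor()
cur.execute("SHOW VARIABLES LIKE 'character_set%'")
for name, value in cur:
    print(name, value)  # latin1 anywhere here would explain the '???'
conn.close()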


Visibility of a published cube

Hi,
Is it possible for a schema published with Schema Workbench to be visible only to a specific user?
When I publish using the URL http://localhost:8080/pentaho, the cube is visible to all users.
Do I have to change the URL, or should I do something else, such as configure user roles?

Thank you in advance, and sorry for my English :o