I have a Text file input step which reads from several files under the same directory and produces a huge dataset. I need to produce an Excel file output for each case in a Switch/Case step and name each Excel file based on the value of a known field in the dataset. Is this possible with Spoon ?
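As a hedged sketch of the split being asked about, outside of Spoon: group the rows by the routing field and write one file per distinct value. The field name `case_type` and the output file names are invented for illustration, and CSV is used instead of Excel only to keep the sketch dependency-free; inside PDI each group would correspond to an Excel Output step on its own Switch/Case branch.

```python
# Illustration only: one output file per distinct value of a field
# (hypothetical field name "case_type"); not PDI's implementation.
import csv
from collections import defaultdict

rows = [
    {"case_type": "A", "value": 1},
    {"case_type": "B", "value": 2},
    {"case_type": "A", "value": 3},
]

# Group rows by the routing field, as Switch/Case would.
groups = defaultdict(list)
for row in rows:
    groups[row["case_type"]].append(row)

# Write each group to a file named after the field value.
for case_value, group in groups.items():
    with open(f"output_{case_value}.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["case_type", "value"])
        writer.writeheader()
        writer.writerows(group)
```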
↧
Produce an excel output file step for each case in a Switch/Case step
↧
log4j Could not parse url [file:/data/pentaho/data-integration/./system/osgi/log4j.xml]
Hi, I run a basic transformation that worked on Pentaho 5.4. Since I switched to v6 I get an error like this :
I use Debian and PostgreSQL.
I use the command ./pan.sh -file : matransformation.ktr
Can someone help me?
Code:
log4j:ERROR Could not parse url [file:/data/pentaho/data-integration/./system/osgi/log4j.xml].
java.io.FileNotFoundException: /data/pentaho/data-integration/./system/osgi/log4j.xml (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:146)
at java.io.FileInputStream.<init>(FileInputStream.java:101)
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
at org.apache.log4j.xml.DOMConfigurator$2.parse(DOMConfigurator.java:765)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:871)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:778)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.log4j.Logger.getLogger(Logger.java:104)
at org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:262)
at org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:108)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.commons.logging.impl.LogFactoryImpl.createLogFromClass(LogFactoryImpl.java:1025)
at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:844)
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
at org.springframework.osgi.extender.internal.activator.ContextLoaderListener.<clinit>(ContextLoaderListener.java:253)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at org.apache.felix.framework.Felix.createBundleActivator(Felix.java:4336)
at org.apache.felix.framework.Felix.activateBundle(Felix.java:2141)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2064)
at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1291)
at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:304)
at java.lang.Thread.run(Thread.java:745)
log4j:WARN No appenders could be found for logger (org.springframework.osgi.extender.internal.activator.ContextLoaderListener).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
janv. 29, 2016 12:14:38 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFOS: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/core
janv. 29, 2016 12:14:38 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFOS: Registered blueprint namespace handler for http://cxf.apache.org/configuration/beans
janv. 29, 2016 12:14:38 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFOS: Registered blueprint namespace handler for http://cxf.apache.org/configuration/parameterized-types
janv. 29, 2016 12:14:38 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
↧
↧
Postgres Bulk Loader hangs after few records - No error or output shown
I am running the 5.4.0.1 version of PDI CE.
I have 5 parallel transformations that extract data from a Postgres database and write it to another Postgres DB using Table Input -> Postgres Bulk Loader in the transformation.
However, the Postgres bulk loader insertion hangs after a few records (about 100-200 records).
There is no error thrown and the transformation just hangs for minutes or even hours.
After checking on the Postgres server side, there seem to be no pending requests.
I created alternative transformations to insert the data using Table Output, and these succeed without any problems.
Does anyone have any leads on what the root cause of this issue could be?
It is really difficult to report this issue on JIRA.
↧
Cannot connect to databases - gives hsqldb error
We are configuring a new workstation (CentOS) with pdi-ce-6.0.0.0-353 and encountered an error connecting to any of the database connections in Spoon. The error message (shown below) mentions hsqldb even though the connection we are testing is to a Microsoft Dynamics GP instance on SQL Server. We receive an equivalent message for Oracle and MySQL connections. All of these work just fine on a different CentOS workstation, and we copied over all the applications, including pdi-ce-6.0.0.0-353, from that installation.
The one glaring difference between the functional CentOS machine and the failing one is that the failing instance had Java 8 installed using yum. Although Spoon is clearly being executed with Java JDK 1.7.0_80-b15, it is possible that the path is somehow picking up the 1.8 Java components.
Does anyone recognize this error or have any insight?
Error connecting to database [greatplains] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class net.sourceforge.jtds.jdbc.Driver)
org.hsqldb.DatabaseURL.parseURL(Ljava/lang/String;ZZ)Lorg/hsqldb/persist/HsqlProperties;
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class net.sourceforge.jtds.jdbc.Driver)
org.hsqldb.DatabaseURL.parseURL(Ljava/lang/String;ZZ)Lorg/hsqldb/persist/HsqlProperties;
at org.pentaho.di.core.database.Database.normalConnect(Database.java:459)
at org.pentaho.di.core.database.Database.connect(Database.java:357)
at org.pentaho.di.core.database.Database.connect(Database.java:328)
at org.pentaho.di.core.database.Database.connect(Database.java:318)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2719)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:588)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:136)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:389)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:318)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.editConnection(SpoonDBDelegate.java:89)
at org.pentaho.di.ui.spoon.Spoon.editConnection(Spoon.java:2671)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1339)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7939)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9214)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:653)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class net.sourceforge.jtds.jdbc.Driver)
org.hsqldb.DatabaseURL.parseURL(Ljava/lang/String;ZZ)Lorg/hsqldb/persist/HsqlProperties;
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:574)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:443)
... 54 more
Caused by: java.lang.NoSuchMethodError: org.hsqldb.DatabaseURL.parseURL(Ljava/lang/String;ZZ)Lorg/hsqldb/persist/HsqlProperties;
at org.hsqldb.jdbc.JDBCDriver.getConnection(Unknown Source)
at org.hsqldb.jdbc.JDBCDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:554)
↧
POST a csv file using Http Post step
Hi,
I'm finding it difficult to post a csv file to a service using the Http Post step. I tried Content-Type text/csv in the request header params, but nothing works for me. Can anyone please provide a sample?
Thanks,
Murali
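For reference, the shape of the request the step needs to send can be sketched outside PDI with plain Python. The URL is hypothetical and this is not the HTTP Post step's internal implementation; the point is only that the CSV content goes in the request body with a `Content-Type: text/csv` header.

```python
# Sketch of a raw CSV POST (hypothetical endpoint; illustration only,
# not what the PDI Http Post step does internally).
import urllib.request

csv_body = "id,name\n1,alpha\n2,beta\n"

req = urllib.request.Request(
    "http://example.com/upload",           # hypothetical service URL
    data=csv_body.encode("utf-8"),         # the CSV is the request body
    headers={"Content-Type": "text/csv"},  # same header tried in the step
    method="POST",
)
# urllib.request.urlopen(req) would actually send the request.
```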
↧
↧
Storing Output Value Table Input running once per row
Hi,
I have a transformation that is executed for every input row. This transformation has only one step: Table Input. My idea is to execute the MySQL query once per row. The Table Input step always returns only one value, and I would like to build a stream with all the values that result from that step, but I don't know exactly how to do it. Could someone give me an idea?
All the files are attached.
Thanks and I'm sorry for the bad English! =(
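The pattern being asked about (run the query once per incoming row and merge each single-value result into one output stream) can be sketched in plain Python, with an in-memory SQLite table standing in for the MySQL source. Table and column names here are invented for illustration.

```python
# Sketch: one query execution per input row, each single-value result
# appended to one combined stream (stand-in for the PDI behaviour).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE totals (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO totals VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

input_rows = [1, 2]          # one query execution per incoming row
output_stream = []
for customer_id in input_rows:
    (value,) = conn.execute(
        "SELECT SUM(amount) FROM totals WHERE customer_id = ?",
        (customer_id,),
    ).fetchone()
    output_stream.append(value)  # each single result joins the stream
```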
↧
Pentaho Kettle/PDI fails on second request
I have the latest version of Kettle/PDI (6.0.1.0-386). Carte is running locally on Windows with the following configuration:
And in .kettle/repositories.xml:
You'll notice these are pretty much the default with some specific configuration for the repository database. Through Spoon, I've created a simple transformation that runs a select on a database table, performs some simple calculations on the columns, and returns some JSON.
If I tell the transformation to run on master1, it works and spits out the JSON.
If I run exactly the same command again, it errors:
I don't understand why the connection to the repository database fails after the first request. Carte continues to run, despite this error, but will throw errors like this when accessed via URL:
I dug through the code for that stack trace, and it indicates that the Repository object is null. So, for some reason, Carte can connect to the PDI repository once, but after that first success something errors, the connection is dropped, and it can no longer find the transformations.
I've attached the .ktr file for the transformation I'm using: fact_rollup.ktr
Code:
<slave_config>
<slaveserver>
<name>master1</name>
<hostname>localhost</hostname>
<port>8081</port>
<master>Y</master>
</slaveserver>
<repository>
<name>PDI Repo</name>
<username>username</username>
<password>password</password>
</repository>
</slave_config>
Code:
<repositories>
<connection>
<name>PDI Repo</name>
<server>127.0.0.1</server>
<type>MYSQL</type>
<access>Native</access>
<database>pdi</database>
<port>3306</port>
<username>username</username>
<password>Encrypted password</password>
<servername/>
<data_tablespace/>
<index_tablespace/>
<attributes>
<attribute><code>EXTRA_OPTION_MYSQL.defaultFetchSize</code><attribute>500</attribute></attribute>
<attribute><code>EXTRA_OPTION_MYSQL.useCursorFetch</code><attribute>true</attribute></attribute>
<attribute><code>FORCE_IDENTIFIERS_TO_LOWERCASE</code><attribute>N</attribute></attribute>
<attribute><code>FORCE_IDENTIFIERS_TO_UPPERCASE</code><attribute>N</attribute></attribute>
<attribute><code>IS_CLUSTERED</code><attribute>N</attribute></attribute>
<attribute><code>PORT_NUMBER</code><attribute>3306</attribute></attribute>
<attribute><code>PRESERVE_RESERVED_WORD_CASE</code><attribute>N</attribute></attribute>
<attribute><code>QUOTE_ALL_FIELDS</code><attribute>N</attribute></attribute>
<attribute><code>STREAM_RESULTS</code><attribute>Y</attribute></attribute>
<attribute><code>SUPPORTS_BOOLEAN_DATA_TYPE</code><attribute>Y</attribute></attribute>
<attribute><code>SUPPORTS_TIMESTAMP_DATA_TYPE</code><attribute>Y</attribute></attribute>
<attribute><code>USE_POOLING</code><attribute>N</attribute></attribute>
</attributes>
</connection>
<repository>
<id>KettleDatabaseRepository</id>
<name>PDI Repo</name>
<description>PDI Repo</description>
<connection>PDI Repo</connection>
</repository>
</repositories>
Code:
2016/01/29 10:05:53 - PDI Repo - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Error disconnecting from database :
2016/01/29 10:05:53 - PDI Repo - Unable to commit repository connection
2016/01/29 10:05:53 - PDI Repo -
2016/01/29 10:05:53 - PDI Repo - Error comitting connection
2016/01/29 10:05:53 - PDI Repo - at java.lang.Thread.run (Thread.java:745)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.trans.step.RunThread.run (RunThread.java:121)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.trans.step.BaseStep.markStop (BaseStep.java:2992)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.trans.Trans$1.stepFinished (Trans.java:1233)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.trans.Trans.fireTransFinishedListeners (Trans.java:1478)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.www.BaseJobServlet$3.transFinished (BaseJobServlet.java:170)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.repository.kdr.KettleDatabaseRepository.disconnect (KettleDatabaseRepository.java:1655)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryConnectionDelegate.disconnect(KettleDatabaseRepositoryConnectionDelegate.java:257)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryConnectionDelegate.commit(KettleDatabaseRepositoryConnectionDelegate.java:283)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.core.database.Database.commit (Database.java:738)
2016/01/29 10:05:53 - PDI Repo - at org.pentaho.di.core.database.Database.commit (Database.java:757)
Code:
<webresult>
<result>ERROR</result>
<message>Unexpected error executing the transformation:
java.lang.NullPointerException
at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:128)
at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:106)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2716)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2684)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2661)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2641)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2606)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2569)
at org.pentaho.di.www.ExecuteTransServlet.loadTransformation(ExecuteTransServlet.java:316)
at org.pentaho.di.www.ExecuteTransServlet.doGet(ExecuteTransServlet.java:232)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:770)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:503)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:229)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:429)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:522)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:644)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)</message>
<id />
</webresult>
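The "same command" being repeated is a call to Carte's executeTrans servlet (visible in the second stack trace). A sketch of constructing that URL, with the parameter names following the servlet and the repository name and credentials as placeholders:

```python
# Sketch of the Carte executeTrans URL (parameter names rep/user/pass/
# trans follow the ExecuteTransServlet in the trace; values are
# placeholders, not working credentials).
from urllib.parse import urlencode

params = {
    "rep": "PDI Repo",        # repository name from repositories.xml
    "user": "username",
    "pass": "password",
    "trans": "fact_rollup",   # transformation to execute
}
url = "http://localhost:8081/kettle/executeTrans/?" + urlencode(params)
```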
↧
"Could not load dashboard" Error
I am using Pentaho CE BA 5.3.
We keep hitting the error "Could not load dashboard: exception while getting tree rooted at path "/public/cde/components" Reference number: a5076a77-e543-4ba6-b88c-1422f1e9e5cc" very often. The referenced class is different most of the time.
Per a suggestion from another thread, we deleted the /public/cde folder a few times. However, we have also noticed that the issue sometimes goes away on its own, which makes me think it could be a resourcing issue.
Any pointers on what might be causing this? Resource issues, contention, or an actual missing reference? When the error pops up, it pops up on most of the reports.
TIA
Ben
↧
Failed to register AAAR and Saiku plugins
Good day.
After setting up and configuring a Pentaho CE 4.8 server, I installed the Alfresco (AAAR) and Saiku plugins: I copied the files to \biserver-ce\pentaho-solutions\system\ and the libs to \biserver-ce\tomcat\webapps\pentaho\WEB-INF\lib\. After rebooting the Pentaho server I got the following log; the Pentaho server can't register the components:
2016-01-30 15:36:57,137 WARN [org.pentaho.hadoop.shim.HadoopConfigurationLocator] Unable to load Hadoop Configuration from "file:///C:/Pentaho/biserver-ce/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/hadoop-configurations/mapr". For more information enable debug logging.
2016-01-30 15:36:58,132 WARN [org.pentaho.reporting.libraries.base.boot.PackageManager] Unresolved dependency for package: org.pentaho.reporting.engine.classic.extensions.datasources.cda.CdaModule
2016-01-30 15:36:58,152 WARN [org.pentaho.reporting.libraries.base.boot.PackageSorter] A dependent module was not found in the list of known modules.
2016-01-30 15:37:00,995 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
After setting up and configuring a pentaho-ce-4.8 server, I installed the Alfresco (AAAR) and Saiku plugins, copied the files to \biserver-ce\pentaho-solutions\system\ and the libraries to \biserver-ce\tomcat\webapps\pentaho\WEB-INF\lib\. After rebooting the Pentaho server I got the log below; the server can't register the components:
2016-01-30 15:36:57,137 WARN [org.pentaho.hadoop.shim.HadoopConfigurationLocator] Unable to load Hadoop Configuration from "file:///C:/Pentaho/biserver-ce/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/hadoop-configurations/mapr". For more information enable debug logging.
2016-01-30 15:36:58,132 WARN [org.pentaho.reporting.libraries.base.boot.PackageManager] Unresolved dependency for package: org.pentaho.reporting.engine.classic.extensions.datasources.cda.CdaModule
2016-01-30 15:36:58,152 WARN [org.pentaho.reporting.libraries.base.boot.PackageSorter] A dependent module was not found in the list of known modules.
2016-01-30 15:37:00,995 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2016-01-30 15:37:00,996 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0011 - Failed to register plugin AAAR
org.pentaho.platform.api.engine.PlatformPluginRegistrationException: PluginManager.ERROR_0017 - Could not load lifecycle listener [pt.webdetails.cpk.CpkLifecycleListener] for plugin AAAR
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.bootStrapPlugin(DefaultPluginManager.java:163)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.registerPlugin(DefaultPluginManager.java:197)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.reload(DefaultPluginManager.java:128)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:42)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:342)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:324)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:291)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:208)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:137)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4135)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4630)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:637)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:563)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:498)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:445)
at org.apache.catalina.core.StandardService.start(StandardService.java:519)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:581)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.NoClassDefFoundError: org/pentaho/platform/api/engine/IPlatformReadyListener
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2733)
at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1124)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1612)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1491)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2733)
at org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1124)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1612)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.pentaho.platform.plugin.services.pluginmgr.PluginClassLoader.loadClass(PluginClassLoader.java:214)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.bootStrapPlugin(DefaultPluginManager.java:160)
... 32 more
Caused by: java.lang.ClassNotFoundException: org.pentaho.platform.api.engine.IPlatformReadyListener
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1645)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1491)
... 49 more
2016-01-30 15:37:01,000 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2016-01-30 15:37:01,202 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2016-01-30 15:37:01,202 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0011 - Failed to register plugin saiku
org.pentaho.platform.api.engine.PlatformPluginRegistrationException: PluginManager.ERROR_0013 - Failed to set a solution file meta provider for file type saiku
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.registerContentTypes(DefaultPluginManager.java:268)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.registerPlugin(DefaultPluginManager.java:201)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.reload(DefaultPluginManager.java:128)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:42)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:342)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:324)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:291)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:208)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:137)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4135)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4630)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:637)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:563)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:498)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:445)
at org.apache.catalina.core.StandardService.start(StandardService.java:519)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:581)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.pentaho.platform.api.engine.ObjectFactoryException: java.lang.IllegalAccessException: Class org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator can not access a member of class org.saiku.plugin.SaikuContentTypeMetaProvider with modifiers "private"
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator.createObject(StandaloneObjectFactory.java:134)
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator.getLocalInstance(StandaloneObjectFactory.java:180)
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator.getInstance(StandaloneObjectFactory.java:116)
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory.retreiveObject(StandaloneObjectFactory.java:86)
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory.get(StandaloneObjectFactory.java:44)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.registerContentTypes(DefaultPluginManager.java:266)
... 32 more
Caused by: java.lang.IllegalAccessException: Class org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator can not access a member of class org.saiku.plugin.SaikuContentTypeMetaProvider with modifiers "private"
at sun.reflect.Reflection.ensureMemberAccess(Unknown Source)
at java.lang.Class.newInstance(Unknown Source)
at org.pentaho.platform.engine.core.system.objfac.StandaloneObjectFactory$ObjectCreator.createObject(StandaloneObjectFactory.java:129)
... 37 more
2016-01-30 15:37:01,204 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2016-01-30 15:37:01,322 WARN [org.apache.commons.httpclient.HttpMethodBase] Going to buffer response body of large or unknown size. Using getResponseBodyAsStream instead is recommended.
2016-01-30 15:37:01,735 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]
↧
↧
java.lang.exception error when using CVParameterSelection
Hi, guys. I need your help with CVParameterSelection. I tried to find some optimized parameters for K-nearest neighbour (IBk), but I got the error message "java.lang.Exception while updating CVParameters: CVParameter 1 java.lang.String: Four or Five components expected!" when I tried to change [0 java.lang.String -> 1 java.lang.String]. How can I do this? Thanks.
capture.jpg
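For reference, the message is about the CVParameter string itself: each entry must be a single string of four or five space-separated components — option name, lower bound, upper bound, number of steps, plus an optional trailing flag. The `0 java.lang.String` / `1 java.lang.String` labels are just the array editor's index and element type, not something to edit. A plausible entry for IBk's neighbour count (its -K option) would be:

```
K 1 11 11
```

From the command line the rough equivalent would be `weka.classifiers.meta.CVParameterSelection -P "K 1 11 11" -X 10 -W weka.classifiers.lazy.IBk`; the bounds here are made-up example values, not a recommendation.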
↧
MS Sql connection problem
Hi,
I was playing a lot with version 6.0.0.0.353 of bi-server on Mac OS without any problems. Today I had to install it on Windows 7 (JDK 1.7_80). It ran smoothly and I was able to log in (no concrete errors in the logs, only some Marketplace error). I wanted to start by adding a connection to an MS SQL server, but when I click Test, nothing happens: no error, no feedback, nothing. When I try to save the connection, the dialog disappears but the connection is not added, again with no feedback anywhere. On Mac OS it works.
Any hint? I also tried the newest version, 6.0.1.0.386.
↧
Transformation that pulls specific Row/Columns from Excel
Hello,
Can anyone provide direction on how I can access specific rows/columns from an Excel file? For instance, is there a way to say: give me A2, E6, F6, G6, A3, E7, F7, G7?
Thanks
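For orientation only, the cherry-picking logic the question describes can be sketched outside PDI; `sheet` below is a hypothetical dict standing in for a parsed worksheet, and the grouping into two output rows is an assumption based on the cell list in the question (inside PDI itself, the closest fit is a Microsoft Excel input step reading a range, followed by steps that filter and denormalise the wanted cells):

```python
# Sketch: pull an arbitrary list of cells out of a worksheet and group them
# into output rows. `sheet` is a stand-in dict keyed by cell reference;
# a real job would fill it with a spreadsheet library.
sheet = {
    "A2": "north", "E6": 10, "F6": 20, "G6": 30,
    "A3": "south", "E7": 11, "F7": 21, "G7": 31,
}

# Each output row is described by the cells it should contain (an assumed
# grouping: A2,E6,F6,G6 then A3,E7,F7,G7).
row_specs = [["A2", "E6", "F6", "G6"], ["A3", "E7", "F7", "G7"]]

rows = [[sheet[cell] for cell in spec] for spec in row_specs]
print(rows)  # [['north', 10, 20, 30], ['south', 11, 21, 31]]
```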
↧
ETL results are different on BI server (from Spoon)
I have an ETL transformation that includes building a 90 column fact table. There are 32 foreign key lookup steps. There's a problem with one of the lookup steps. It always works from Spoon, but not from the BI server, where the ETL job completes, rows are written to the tables, but the one foreign key in question is always populated with the default value in the lookup step. Again, that same foreign key is correctly populated from Spoon. There are no errors in the log.
↧
↧
Is there a way to use an Insert/Update step properly with decimal numbers?
Hello,
I have a transformation with an Insert/Update step. Some columns are decimal numbers, and I use a Select values step to set the format (length and precision). My database is MySQL, so I have tried float and decimal column types with the same length and precision, but I can't get two numbers to be detected as equal.
Is there any way to make it work?
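The usual culprit here is binary floating point: a MySQL FLOAT (or a PDI Number) cannot represent most decimal fractions exactly, so two values that print the same compare as different, and the step then sees a change on every run. A minimal sketch of the effect in plain Python, only to illustrate the comparison problem (inside PDI the equivalent fix would be rounding both the stream field and the looked-up value to the column's scale, or using BigNumber/decimal types throughout):

```python
from decimal import Decimal

# Binary floats rarely match a value stored with fixed decimal precision:
stored = 12.30            # value as it comes back from the column
incoming = 12.1 + 0.2     # value as computed in the stream
print(incoming == stored)           # False: incoming is 12.299999999999999...

# Rounding both sides to the column's scale makes the comparison stable:
scale = Decimal("0.01")
a = Decimal(str(incoming)).quantize(scale)
b = Decimal(str(stored)).quantize(scale)
print(a == b)                       # True
```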
↧
How to capture the Table output error
Hi All,
I am using the Table output step. If one row is wrong, I want the job to go on, but I also want to capture which rows caused the error.
test.jpg
I ran the test shown above. The job finishes, but we cannot capture any record that errors.
If we use the Insert/Update step instead, it works, but as you know that step has poor performance.
test_2.jpg
How can I do this? Thanks for your help.
↧
couldn't convert string to a date using format
Hi,
I have a CSV file with a column called DATE that is filled with time-series data; the date format is dd/MM/yyyy HH:mm:ss. The file is imported with a CSV Input step. When I try to use a Calculator step to calculate the Day, Month and Year values from this column, it returns the following error:
Code:
couldn't convert string [18/10/2015 00:00:00] to a date using format [dd/MM/yyyy HH:mm:ss]
From 01/01/2015 00:00:00 to 17/10/2015 23:45:00 it works fine, but when it gets to 18/10/2015 00:00:00 it breaks. Also, trying with the 2012 data, it breaks when it gets to 21/10/2015 00:00:00.
PS: Trying with the Text input step it worked fine.
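A possible explanation, offered as a guess rather than a confirmed diagnosis: 18 October 2015 is the day daylight-saving time started in Brazil (America/Sao_Paulo), when the wall clock jumped from 00:00 straight to 01:00 (the corresponding 2012 changeover was 21 October). A strict date parser running in such a timezone rejects midnight on those days because that local time never existed. A sketch of the missing hour, assuming the America/Sao_Paulo zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumption: the parsing JVM/OS runs in a zone such as America/Sao_Paulo,
# where DST began at midnight on 18 Oct 2015 (00:00 -> 01:00).
tz = ZoneInfo("America/Sao_Paulo")
local = datetime(2015, 10, 18, 0, 0, 0, tzinfo=tz)

# Round-tripping through UTC shows that 00:00:00 never existed that day:
roundtrip = local.astimezone(timezone.utc).astimezone(tz)
print(roundtrip.time())  # 01:00:00
```

If this is the cause, parsing with a lenient date mask, or running the JVM in a fixed-offset zone (e.g. the standard JVM flag -Duser.timezone=UTC, not a PDI-specific setting), should make the error disappear.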
↧
exception when using xml as datasource in java sdk
Hi,
when I open in my Java program a Report which contains an XML-datasource, I get an exception
exception in thread "main" org.pentaho.reporting.libraries.resourceloader.ResourceCreationException
: Unable to parse
the document: ResourceKey{schema=org.pentaho.reporting.libraries.docbundle.bundleloader.ZipResourceBundleLoader, identifier=content.xml, factoryParameters={org.pentaho.reporting.libraries.resourceloader.FactoryParameterKey{name=repository}=org.pentaho.reporting.libraries.repository.zipreader.ZipReadRepository@3672276e, org.pentaho.reporting.libraries.resourceloader.FactoryParameterKey{name=repository-loader}=org.pentaho.reporting.libraries.docbundle.bundleloader.ZipResourceBundleLoader@4248b963}, parent=ResourceKey{schema=org.pentaho.reporting.libraries.resourceloader.loader.URLResourceLoader, identifier=file:E:/pentaho/XMLBeispiel.prpt, factoryParameters={}, parent=null}}
at org.pentaho.reporting.libraries.xmlns.parser.AbstractXmlResourceFactory.create(AbstractXmlResourceFactory.java:214)
Can you help me?
Thanks
Norbert
↧
↧
CE 6.0 with PostgreSQL as repository - 404 error
I'm trying to run biserver-ce-6.0.1.0-386 with repository on PostgreSQL. I've done all changes like in documentation https://help.pentaho.com/Documentation/6.0/0F0/0P0 but my Pentaho doesn't work properly.
System: Windows Server 2012 64bit
When I run it without changing repo to PostgreSQL then it starts and works.
With repo on PostgreSQL I have errors:
catalina.2016-01-31.log
pentaho.log
↧
Using cassandra Output
Hi,
I have some problems using the Cassandra output step with Spoon pdi-ce-6.0.1.0-386.
I'm using Cassandra 3.0.2.
When I try to get the column families, there are two outcomes.
Using Thrift IO I get the error
InvalidRequestException(why:CQL2 has been removed in Cassandra 3.0. Please use CQL3 instead)
When I use CQL 3.0, the error is:
InvalidRequestException(why:unconfigured table schema_columnfamilies)
The table schema_columnfamilies was replaced some time ago.
Is the Cassandra plugin outdated?
Many thanks,
Uli
Update: it seems to run with Cassandra 1.2.19, which is rather old.
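For reference (based on the Cassandra 3.0 schema changes, not on anything the plugin documents): Cassandra 3.0 moved schema metadata out of the old system keyspace tables, so a driver that still queries them fails exactly like this. Roughly, with `mykeyspace` as a placeholder:

```
-- pre-3.0 table the step appears to query:
SELECT columnfamily_name FROM system.schema_columnfamilies WHERE keyspace_name = 'mykeyspace';

-- Cassandra 3.0+ replacement:
SELECT table_name FROM system_schema.tables WHERE keyspace_name = 'mykeyspace';
```

Until the plugin is updated to read system_schema, staying on a pre-3.0 Cassandra (as the update above notes) is the practical workaround.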
↧
Reports with differents layout on windows and Linux
Hi,
I created a .prpt file (on Linux) and shared it with a colleague working on Windows so he could adapt some labels, font types and margins, and perform some cosmetic changes.
When my colleague sent it back, the report didn't look the same: the labels and the text fields overlap.
As an additional test, I opened the report on another Windows machine and it is OK; it appears as it should.
It seems there is some misunderstanding at the format level between the Windows and Linux .prpt versions.
Any idea what is happening?
PS: we are using the same Report Designer version, 6.0.0.0-353, on all computers.
↧