Channel: Pentaho Community Forums

maximum open cursors exceeded

I am getting the error below during an Execute SQL Statements step. Our open-cursor limit is set to 300, and the DBA is reluctant to increase it. Is there a way I can manage the open cursors?

ORA-00604: error occurred at recursive SQL level 1
ORA-01000: maximum open cursors exceeded

Thanks!
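Before asking the DBA for a higher limit, it can help to see which sessions are actually holding the cursors. A sketch against Oracle's standard data-dictionary views (requires SELECT privilege on the v$ views; not specific to Pentaho):

```sql
-- Sketch: rank sessions by the number of cursors they currently hold (Oracle).
SELECT s.sid, s.username, COUNT(*) AS open_cursors
FROM   v$open_cursor oc
JOIN   v$session     s ON s.sid = oc.sid
GROUP  BY s.sid, s.username
ORDER  BY open_cursors DESC;
```

If the top session is the Kettle connection, the usual suspects are statements opened once per row and never closed, so reviewing the step's commit size and statement handling is worth a look before raising OPEN_CURSORS.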

repository synchronizer very slow

I'm trying to use the Repository Synchronizer to upload files into the JackRabbit repository.
It seems rather slow to detect differences and present them to the user.
Is there any other tool to quickly upload files to the JackRabbit repository?
Thanks for any answer.

Not able to connect to Hive with Pentaho Report Designer 5.0.1

Hi, I have installed PRD 5.0.1 on Mac OS X. I am running Cloudera CDH4.2, and my Hadoop, HBase and HiveServer instances are up and running. In the PRD report designer wizard I am trying to create a connection to Hive, with hostname localhost, database default and port 10000. After about 2-3 minutes it comes back with a connection-timeout error. In plugin.properties the Hadoop version is set to cdh42, and I have all the config files (core-site.xml, hdfs-site.xml, etc.) in the hadoop-configurations directory. I am running MRv2 (ResourceManager, NodeManager).

Error connecting to database [hive] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
Could not establish connection to jdbc:hive2://localhost:10000/default: java.net.ConnectException: Operation timed out


org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
Could not establish connection to jdbc:hive2://localhost:10000/default: java.net.ConnectException: Operation timed out


at org.pentaho.di.core.database.Database.normalConnect(Database.java:415)
at org.pentaho.di.core.database.Database.connect(Database.java:353)
at org.pentaho.di.core.database.Database.connect(Database.java:306)
at org.pentaho.di.core.database.Database.connect(Database.java:294)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:84)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2459)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:541)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.swing.tags.SwingButton$OnClickRunnable.run(SwingButton.java:58)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:251)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:727)
at java.awt.EventQueue.access$200(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:688)
at java.awt.EventQueue$3.run(EventQueue.java:686)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:697)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:154)
at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:182)
at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:221)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:219)
at java.awt.Dialog.show(Dialog.java:1082)
at java.awt.Component.show(Component.java:1651)
at java.awt.Component.setVisible(Component.java:1603)
at java.awt.Window.setVisible(Window.java:1014)
at java.awt.Dialog.setVisible(Dialog.java:1005)
at org.pentaho.ui.xul.swing.tags.SwingDialog.show(SwingDialog.java:238)
at org.pentaho.reporting.ui.datasources.jdbc.ui.XulDatabaseDialog.open(XulDatabaseDialog.java:254)
at org.pentaho.reporting.ui.datasources.jdbc.ui.ConnectionPanel$AddDataSourceAction.actionPerformed(ConnectionPanel.java:252)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2018)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2341)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:289)
at java.awt.AWTEventMulticaster.mouseReleased(AWTEventMulticaster.java:289)
at java.awt.Component.processMouseEvent(Component.java:6505)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3321)
at java.awt.Component.processEvent(Component.java:6270)
at java.awt.Container.processEvent(Container.java:2229)
at java.awt.Component.dispatchEventImpl(Component.java:4861)
at java.awt.Container.dispatchEventImpl(Container.java:2287)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4832)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4492)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4422)
at java.awt.Container.dispatchEventImpl(Container.java:2273)
at java.awt.Window.dispatchEventImpl(Window.java:2719)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:729)
at java.awt.EventQueue.access$200(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:688)
at java.awt.EventQueue$3.run(EventQueue.java:686)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:87)
at java.awt.EventQueue$4.run(EventQueue.java:702)
at java.awt.EventQueue$4.run(EventQueue.java:700)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:699)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:154)
at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:182)
at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:221)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:219)
at java.awt.Dialog.show(Dialog.java:1082)
at java.awt.Component.show(Component.java:1651)
at java.awt.Component.setVisible(Component.java:1603)
at java.awt.Window.setVisible(Window.java:1014)
at java.awt.Dialog.setVisible(Dialog.java:1005)
at org.pentaho.reporting.libraries.designtime.swing.CommonDialog.setVisible(CommonDialog.java:281)
at org.pentaho.reporting.libraries.designtime.swing.CommonDialog.performEdit(CommonDialog.java:193)
at org.pentaho.reporting.ui.datasources.jdbc.ui.JdbcDataSourceDialog.performConfiguration(JdbcDataSourceDialog.java:788)
at org.pentaho.reporting.ui.datasources.jdbc.JdbcDataSourcePlugin.performEdit(JdbcDataSourcePlugin.java:71)
at org.pentaho.reporting.engine.classic.wizard.ui.xul.steps.DataSourceAndQueryStep.editOrCreateDataFactory(DataSourceAndQueryStep.java:426)
at org.pentaho.reporting.engine.classic.wizard.ui.xul.steps.DataSourceAndQueryStep.createDataFactory(DataSourceAndQueryStep.java:405)
at org.pentaho.reporting.engine.classic.wizard.ui.xul.steps.DataSourceAndQueryStep$DatasourceAndQueryStepHandler.doCreateDataFactory(DataSourceAndQueryStep.java:165)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.swing.tags.SwingButton$OnClickRunnable.run(SwingButton.java:58)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:251)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:727)
at java.awt.EventQueue.access$200(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:688)
at java.awt.EventQueue$3.run(EventQueue.java:686)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:697)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:154)
at java.awt.WaitDispatchSupport$2.run(WaitDispatchSupport.java:182)
at java.awt.WaitDispatchSupport$4.run(WaitDispatchSupport.java:221)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.WaitDispatchSupport.enter(WaitDispatchSupport.java:219)
at java.awt.Dialog.show(Dialog.java:1082)
at java.awt.Component.show(Component.java:1651)
at java.awt.Component.setVisible(Component.java:1603)
at java.awt.Window.setVisible(Window.java:1014)
at java.awt.Dialog.setVisible(Dialog.java:1005)
at org.pentaho.ui.xul.swing.tags.SwingDialog.show(SwingDialog.java:238)
at org.pentaho.reporting.engine.classic.wizard.ui.xul.EmbeddedWizard.run(EmbeddedWizard.java:152)
at org.pentaho.reporting.designer.extensions.wizard.NewWizardReportAction.actionPerformed(NewWizardReportAction.java:81)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2018)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2341)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.Component.processMouseEvent(Component.java:6505)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3321)
at java.awt.Component.processEvent(Component.java:6270)
at java.awt.Container.processEvent(Container.java:2229)
at java.awt.Component.dispatchEventImpl(Component.java:4861)
at java.awt.Container.dispatchEventImpl(Container.java:2287)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4832)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4492)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4422)
at java.awt.Container.dispatchEventImpl(Container.java:2273)
at java.awt.Window.dispatchEventImpl(Window.java:2719)
at java.awt.Component.dispatchEvent(Component.java:4687)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:729)
at java.awt.EventQueue.access$200(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:688)
at java.awt.EventQueue$3.run(EventQueue.java:686)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:87)
at java.awt.EventQueue$4.run(EventQueue.java:702)
at java.awt.EventQueue$4.run(EventQueue.java:700)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:699)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
Could not establish connection to jdbc:hive2://localhost:10000/default: java.net.ConnectException: Operation timed out

at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:540)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:399)
... 151 more
Caused by: java.sql.SQLException: Could not establish connection to jdbc:hive2://localhost:10000/default: java.net.ConnectException: Operation timed out
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:161)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:98)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:108)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.pentaho.hadoop.shim.common.DriverProxyInvocationChain$DriverInvocationHandler.invoke(DriverProxyInvocationChain.java:135)
at com.sun.proxy.$Proxy13.connect(Unknown Source)
at org.apache.hive.jdbc.HiveDriver$1.call(HiveDriver.java:135)
at org.apache.hive.jdbc.HiveDriver$1.call(HiveDriver.java:132)
at org.pentaho.hadoop.hive.jdbc.JDBCDriverCallable.callWithDriver(JDBCDriverCallable.java:57)
at org.apache.hive.jdbc.HiveDriver.callWithActiveDriver(HiveDriver.java:121)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:132)
at java.sql.DriverManager.getConnection(DriverManager.java:579)
at java.sql.DriverManager.getConnection(DriverManager.java:243)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:528)
... 152 more

Hostname : localhost
Port : 10000
Database name : default

Kettle Logging capability

We have a requirement to read data from a JSON file and load it into an Oracle stage database. How do we get a count of the records read from the JSON file and write it to a LOG table in Oracle? I am using Kettle/Spoon version 5.0.2.
While prodding for a solution I found that "Aggregate Rows" is deprecated. I could use it to get the record count, and may then have to call a DB procedure (custom written by us) to write the count to the log table. Is there any other way to achieve this within a Pentaho transformation?

Thanks
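One alternative worth noting: Kettle's built-in transformation logging (Transformation settings, Logging tab) can write row counts such as LINES_READ and LINES_WRITTEN to a log table for you, with no extra steps in the stream. A sketch of reading it back, assuming the default log-table layout; TRANS_LOG is a placeholder for whatever table name you configure:

```sql
-- Sketch: reading row counts from Kettle's transformation log table.
-- TRANS_LOG is a placeholder name; the columns below belong to the
-- default transformation log-table layout.
SELECT TRANSNAME, STATUS, LINES_READ, LINES_WRITTEN, LOGDATE
FROM   TRANS_LOG
ORDER  BY LOGDATE DESC;
```

Otherwise, a Group By step with a "Number of rows" aggregation is the usual non-deprecated replacement for Aggregate Rows.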

Getting un-sampled Google Analytics data? Setting Precision?

Hi All,

I've been using PDI to pull Google Analytics data into a MySQL database. I'm requesting records for each transaction (transactionId), with dimensions such as source, medium, referral path, etc.

My problem is that I get an incomplete set of records. My understanding is that Google Analytics defaults to using a sample of the full dataset, but that this can be changed (or at least improved) by setting the sampling level to higher precision (I'm basing this on this page: https://developers.google.com/analyt...#samplingLevel)

Right now I'm trying to overcome this by running multiple requests over small, partially overlapping time periods and using an insert/update step to avoid row duplication, but this is incredibly time-consuming and doesn't guarantee that I'll get all of the records (some are certain to escape!).

Setting the sampling level to HIGHER_PRECISION would help, but I can't find a way to pass the sampling-level parameter using the Google Analytics step in PDI.

Can anyone offer advice?

Thanks!

Scott
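For context, the underlying Core Reporting API (v3) does accept a samplingLevel query parameter, so one workaround, if the PDI step won't pass it, is to call the API directly (e.g. via an HTTP Client step or curl) and parse the JSON response yourself. A sketch, with the profile id, dates and token as placeholders:

```shell
# Sketch: raw Core Reporting API v3 request asking for HIGHER_PRECISION sampling.
# ga:XXXXXXXX, the date range and YOUR_TOKEN are placeholders for your own values.
curl "https://www.googleapis.com/analytics/v3/data/ga?ids=ga:XXXXXXXX&start-date=2014-01-01&end-date=2014-01-31&metrics=ga:transactions&dimensions=ga:transactionId&samplingLevel=HIGHER_PRECISION&access_token=YOUR_TOKEN"
```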

How to call procedure with parameters coming from table input

HI,

I have a MySQL query that returns 40 rows; I put this query into a Table Input step.

Using this result, I now want to execute a procedure and pass each row's value as a parameter. How do I do that?

The procedure should run 40 times, once for each value from the query.

How do I solve this?

Thanks in Advance !!!
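One common pattern, in case it helps frame the answer: the "Execute SQL script" step can run once per incoming row when "Execute for each row?" is checked, with ? placeholders bound to the fields listed as parameters. A sketch, where my_procedure stands in for your own procedure name:

```sql
-- Sketch: statement for an "Execute SQL script" step with
-- "Execute for each row?" enabled and the input field listed
-- as the step's parameter. my_procedure is a placeholder.
CALL my_procedure(?);
```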

Calling a Java Class From a Java Package (.jar)

Hi

I have developed a Java class, exported into a .jar library, to be called from a Pentaho 'Modified Java Script' step. The .jar is compiled in Eclipse with JDK compliance level 1.7.

When I try to use this class inside a 'Modified Java Script' step, I get the error: ReferenceError: "xeroCallPackage" is not defined. I have tried lots of things without much luck so far.

My file xeroCallPackage.jar is in the path with the other *.jar files in Pentaho (..\data-integration\lib\).

For info:

1. The stripped-down (for simplicity) Java library code is here:

package xeroCallPackage;

import java.io.IOException;
import java.net.URISyntaxException;
import java.util.Collections;
import java.util.Map;
// the OAuthException import was stripped out along with the OAuth code

public class xeroURLCall {

    public String getResponse(String CONSUMER_KEY, String CONSUMER_PRIVATE_KEY, String URL) throws IOException, OAuthException, URISyntaxException {

        // stripped out code here

        return response.readBodyAsString();
    }
}


2. The stripped-down Pentaho 'Modified Java Script' is here:

var CONSUMER_KEY = "ffffff";
var CONSUMER_PRIVATE_KEY = "aaaaa";
var URL = "https://api.xero.com/api.xro/2.0/Organisation";

var ResponseAsString;

ResponseAsString = new xeroCallPackage.xeroURLCall.getResponse(CONSUMER_KEY,CONSUMER_PRIVATE_KEY,URL);
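For reference, Kettle's Modified Java Script step runs on Rhino, where Java classes from jars in lib/ are usually reached through the Packages root, and getResponse(...) is an instance method, so the class needs to be instantiated before the call. A hedged sketch of what the corrected call might look like, assuming the jar and its OAuth dependencies are on the classpath:

```
// Sketch only: Rhino resolves Java classes under the Packages root,
// and getResponse(...) is an instance method, so instantiate first.
var caller = new Packages.xeroCallPackage.xeroURLCall();
var ResponseAsString = caller.getResponse(CONSUMER_KEY, CONSUMER_PRIVATE_KEY, URL);
```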

How to use table column as parameter for barchart component

Hi All,
How do I use a table column as a parameter for a bar-chart component? When I click on a table row, the bar chart should be displayed with that column's data. What is the click-action function for this?




Thanks & Regards,
Sowjanya.

Pass Merge rows diff keys and values as parameterized.

Hi,

I have a requirement where I need to pass the Merge Rows (diff) keys and values as parameters instead of specifying them manually. Can anyone please help with how I should do this? ETL Metadata Injection doesn't support it either.

Thanks,
Priyanka

Use created cube in MSSQL

Hi guys

I have a cube in MSSQL and I want to use it as a data source in a Pentaho Analysis Report, like 'SampleData: Quadrant Analysis'.
First, can I do it? And if yes, how?

Thanks in advance



how to get yesterday's date from Get System info step

HI All,

I need to set a variable holding yesterday's date in YYYY-MM-DD format.
How do I achieve this using the Get System Info step?

Thanks,
Malibu
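For what it's worth, the Get System Info step offers date types such as "Yesterday 00:00:00", which a Select Values step can then format as yyyy-MM-dd before a Set Variables step. The same arithmetic in plain JavaScript, as a sketch for a Modified Java Script step (the function name is my own, not a Kettle built-in):

```javascript
// Sketch: compute yesterday's date as a yyyy-MM-dd string.
function yesterdayAsString(now) {
  var d = new Date(now.getTime());
  d.setDate(d.getDate() - 1); // rolls back one day, handling month/year boundaries
  var pad = function (n) { return (n < 10 ? "0" : "") + n; };
  return d.getFullYear() + "-" + pad(d.getMonth() + 1) + "-" + pad(d.getDate());
}

var yesterday = yesterdayAsString(new Date());
```

The Date copy avoids mutating the incoming value, and setDate handles the month and year roll-over for you.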

Kettle transformation stops after start

Hi, I have used Kettle for 2 years and I have a rare issue.

I have a transformation, and when I click Start the log shows:
2014/03/04 10:17:07 - Spoon - Transformación abierta.
2014/03/04 10:17:07 - Spoon - Ejecutando transformación [MIGRACION_RDQ_HCO]...
2014/03/04 10:17:07 - Spoon - Se ha iniciado la ejecución de la transformación.
2014/03/04 10:17:07 - MIGRACION_RDQ_HCO - Iniciado despacho de la transformación [MIGRACION_RDQ_HCO]

And the Start arrow is enabled again. I press F11 to validate the transformation and I don't get any error or warning line.

What can I do?

I use Windows 7 64-bit and Kettle 4.4 stable,
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

Transformation in Kettle

Hi guys,

I am new to Pentaho DI (Kettle). I am stuck on one issue in a transformation; please see the details below.

I am using Stream Lookup to match two CSV files, but in the output the field obtained from the stream lookup is null instead of the actual value it should return.

The Kettle version I am using is 5.0.1.

Does anybody know why this is?

Thanks,
Arun

Trying to find a multiple classifier for known and unknown category outcomes

Hi,

I'm a biologist trying to develop a capture-mark-recapture survey method using animal vocalisations, and I've only just started using WEKA. I formerly used Statistica to run artificial neural networks (MLPs) to show that I can classify vocalisations to individuals with up to 100% accuracy. But if new individuals are present in the validation dataset that weren't in the training dataset, the network just classifies them as the nearest individual in the training dataset. I need it to assign them to an "unknown" category, and have read that Probabilistic Neural Networks can do this.

So I have the variables, the correct identity and the knowledge that I can classify them correctly with up to 100% accuracy. What I want is a dual-layer outcome:

1) Identity: known or unknown
and then if identity is known
2) Identity = individual A (confidence level = X)

Can anyone let me know how this would work in WEKA? As I say I'm a novice so as much detail as possible would be wonderful!

Thanks!

Action successful

Is it possible to remove the “Action successful” message which appears after firing an xaction? The xaction runs a Kettle transformation. The xaction has no output, so something like

<outputs>
<image-tag type="string">
<destinations>
<response>content</response>
</destinations>
</image-tag>
</outputs>

is not working.

AAAR Plugin - Error Executing AAAR_Extract.sh

Hi all,
I just installed the "A.A.A.R. – Alfresco Audit Analysis and Reporting" plugin from the pentaho Marketplace (as described here http://fcorti.com/alfresco-audit-analysis-reporting/) and I'm facing the following error.

My Alfresco installation use a postgres DB (9.0.x) while kettle and datamart use a Mysql DB (5.1).

Thank you in advance for help.

Tommaso

2014/03/04 12:30:39 - Cmis Input documents before last update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - Unable to get queryfields for SQL:
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - select
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:objectId,
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:name,
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:contentStreamLength,
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:lastModificationDate
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - from
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:document
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - where
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:lastModificationDate < TIMESTAMP '2001-01-01T00:00:00.000+00:00'
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - and (
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:contentStreamLength > 0
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - or (
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:contentStreamLength = 0
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - and cmis:name >= ''
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - ))
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - order by
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:contentStreamLength asc,
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - cmis:name asc
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - Internal Server Error
2014/03/04 12:30:39 - Cmis Input documents before last update.0 -
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at it.francescocorti.kettle.cmis_input.CmisInputMeta.getFields(Unknown Source)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at it.francescocorti.kettle.cmis_input.CmisInput.processRow(Unknown Source)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at java.lang.Thread.run(Thread.java:744)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - Caused by: org.apache.chemistry.opencmis.commons.exceptions.CmisRuntimeException: Internal Server Error
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.bindings.spi.atompub.AbstractAtomPubService.convertStatusCode(AbstractAtomPubService.java:487)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.bindings.spi.atompub.AbstractAtomPubService.post(AbstractAtomPubService.java:629)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.bindings.spi.atompub.DiscoveryServiceImpl.query(DiscoveryServiceImpl.java:145)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.runtime.SessionImpl$3.fetchPage(SessionImpl.java:600)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.runtime.util.AbstractIterator.getCurrentPage(AbstractIterator.java:132)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.runtime.util.AbstractIterator.getTotalNumItems(AbstractIterator.java:70)
2014/03/04 12:30:39 - Cmis Input documents before last update.0 - at org.apache.chemistry.opencmis.client.runtime.util.AbstractIterable.getTotalNumItems(AbstractIterable.java:94)

Displaying Stream Value

I have included a simple transformation where I provide a stream value (id) using a Generate Rows step. I use this stream value to perform a query. This works fine.

The part I am trying to figure out is how to display the value of the stream field (id) in the log. The Table Input step does not pass the metadata through, so it is not available.

I could set a variable in a preceding transformation, but that results in more transformations.

Is there a cleaner way to capture the stream value (id) in the logging step?

Many thanks
Ray

Security Tables-Users,Authorities,Granted_Authorities in Pentaho 5

Hi ,


In Pentaho 4.1, for the external security implementation, we use the tables Users, Authorities and Granted_Authorities from the hibernate database. In Pentaho 5 we are not able to locate these tables in the hibernate, jackrabbit and quartz databases.
Which tables hold this data in Pentaho 5?

Thanks

Problem reading excel files

Hi,

Maybe this is a stupid question, but I have a problem. I built my transformation using an Excel Input step. I tested it on a few rows (exactly 10) and it works. Now I tried to use the same transformation with the whole file (500 rows), and only 10 rows are being read. I don't know why this is happening. I changed all the data types to String (maybe there is strange data in the fields), but that did not work. I only see 10 rows in the input and in the output. Does anybody have an idea how I can resolve this problem?
I really need some help here; I need my transformation to work for the whole file, not only part of it.

Thank you in advance for any help.

multiple row for row axis in crosstab report

Hello all,
I am facing a problem with a crosstab report designed in PRD 5.0. In my crosstab, for each item name I want to show the income from that item in each month, but for every month I am getting a separate row, which is not desired. E.g. for the item 'Registration of ALC' I am getting 3 different rows, each with the value for one month and 0 for the other months.

Itemtype- Income
CLIENT NAME- DPS
Head Name- Student Registration
                         January-2014                  February-2014                 March-2014
ITEMNAME                 ITEMCOUNT ITEM_RATE  AMOUNT   ITEMCOUNT ITEM_RATE  AMOUNT   ITEMCOUNT ITEM_RATE  AMOUNT
Registration of Student  11.00     10.00      110.00   0.00      0.00       0.00     0.00      0.00       0.00
Registration of Student  0.00      0.00       0.00     6.00      10.00      60.00    0.00      0.00       0.00
Registration of Student  0.00      0.00       0.00     0.00      0.00       0.00     0.00      0.00       0.00
And this is my query for the report:

SELECT
IT.ITEMTYPENAME AS ITEMTYPE,
C.CLIENTNAME,
H.HEADNAME,I.ITEMNAME,
DATE_FORMAT(FROMDATE,'%M-%Y') AS DATELABEL,
ITR.ITEMCOUNT ITEMCOUNT,
ITR.PERITEMRATE AS ITEM_RATE,
ITR.AMOUNT, ITR.FROMDATE
FROM ITEMTRANS ITR,ITEM I,HEAD H,
`CLIENT` C,
PROGRAM P,
ITEMTYPE IT
WHERE ITR.ITEMID = I.ID
AND I.HEADID = H.ID
AND H.CLIENTID = C.ID
AND C.PROGRAMID = P.ID
AND I.ITEMTYPEID=IT.ID
AND P.ID=1
GROUP BY I.ITEMTYPEID,C.CLIENTNAME, H.HEADNAME, I.ID,DATELABEL

Is there any problem with my query or with the report I have designed?
Please reply as soon as possible.
Thanks in advance.

