Channel: Pentaho Community Forums

Dialog fails for Cassandra Input/Output

When I just drag in a Cassandra Output step and try to open its properties dialog by double-clicking it, nothing happens. When I do the same thing with the Cassandra Input step I get the following error:
java.lang.ClassNotFoundException: org.pentaho.di.ui.trans.steps.cassandrainput.CassandraInputDialog
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at org.pentaho.di.core.plugins.KettleURLClassLoader.loadClassFromParent(KettleURLClassLoader.java:87)
at org.pentaho.di.core.plugins.KettleURLClassLoader.loadClass(KettleURLClassLoader.java:106)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.getStepDialog(SpoonStepsDelegate.java:231)
at org.pentaho.di.ui.spoon.Spoon.getStepEntryDialog(Spoon.java:8637)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:120)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8797)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3027)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:744)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1316)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:654)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at apple.launcher.LaunchRunner.run(LaunchRunner.java:116)
at apple.launcher.LaunchRunner.callMain(LaunchRunner.java:51)
at apple.launcher.JavaApplicationLauncher.launch(JavaApplicationLauncher.java:52)

Javascript issue (missing method in Context) when launching a transformation from Java

When running from my Java application (using JDK 1.7), the following error is raised, but only if the transformation enters a JavaScript step - with no JavaScript step there is no issue. Does anyone know why we get this missing method? Is there any other jar I need to attach? A big thanks in advance to all.

Release used:
5.2.0.0 (Build: Sept 30, 2014 07:48:28)

OS / Java:
Win7 64-bit / jdk1.7.0_75

Attached libraries (taken from folder \Pentaho\data-integration\lib):
commons-logging-1.1.1.jar
commons-vfs-20100924-pentaho.jar
js-1.7R3.jar
jxl-2.6.12.jar
kettle-core-5.2.0.0-209.jar
kettle-engine-5.2.0.0-209.jar
org-apache-commons-lang.jar

Error Message:
java.lang.NoSuchMethodError: org.mozilla.javascript.Context.enter(Lorg/mozilla/javascript/Context;Lorg/mozilla/javascript/ContextFactory;)Lorg/mozilla/javascript/Context;
at org.mozilla.javascript.ContextFactory.enterContext(ContextFactory.java:616)
at org.mozilla.javascript.ContextFactory.enterContext(ContextFactory.java:579)
at org.pentaho.di.trans.steps.scriptvalues_mod.ScriptValuesMod.addValues(ScriptValuesMod.java:181)
at org.pentaho.di.trans.steps.scriptvalues_mod.ScriptValuesMod.processRow(ScriptValuesMod.java:715)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
at java.lang.Thread.run(Thread.java:745)
2015/04/30 08:42:26 - Modified Java Script Value.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected error
2015/04/30 08:42:26 - Modified Java Script Value.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : java.lang.NoSuchMethodError: org.mozilla.javascript.Context.enter(Lorg/mozilla/javascript/Context;Lorg/mozilla/javascript/ContextFactory;)Lorg/mozilla/javascript/Context;
2015/04/30 08:42:26 - Modified Java Script Value.0 - at org.mozilla.javascript.ContextFactory.enterContext(ContextFactory.java:616)
2015/04/30 08:42:26 - Modified Java Script Value.0 - at org.mozilla.javascript.ContextFactory.enterContext(ContextFactory.java:579)
2015/04/30 08:42:26 - Modified Java Script Value.0 - at org.pentaho.di.trans.steps.scriptvalues_mod.ScriptValuesMod.addValues(ScriptValuesMod.java:181)
2015/04/30 08:42:26 - Modified Java Script Value.0 - at org.pentaho.di.trans.steps.scriptvalues_mod.ScriptValuesMod.processRow(ScriptValuesMod.java:715)
2015/04/30 08:42:26 - Modified Java Script Value.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/04/30 08:42:26 - Modified Java Script Value.0 - at java.lang.Thread.run(Thread.java:745)
child index = 1, logging object : org.pentaho.di.core.logging.LoggingObject@38cbd867 parent=7b4978a0-43a3-4556-a781-6ca4a892f954
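
For reference, a minimal sketch of how a transformation is usually launched from an embedding Java application (the .ktr path below is a placeholder). A NoSuchMethodError between ContextFactory and Context typically means two different Rhino builds are being mixed on the classpath, so it is worth checking that js-1.7R3.jar from \Pentaho\data-integration\lib is the only org.mozilla.javascript jar your application sees.

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        // Initialise the Kettle environment once per JVM (registers plugins, steps, etc.)
        KettleEnvironment.init();

        // "my_transformation.ktr" is a placeholder path
        TransMeta transMeta = new TransMeta("my_transformation.ktr");
        Trans trans = new Trans(transMeta);

        trans.execute(null);       // no extra command-line arguments
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}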


Mini Tutorial on Looping and Parameters

Whilst I am just a beginner with Kettle, I have many decades of experience in other tools and have been falling foul of terminology differences
(for example, variables aren't variable, and there are two completely different 'Set Variables' steps).


With help from others, notably Marabu, I finally made some sense of loops in Jobs and how to pass parameters without getting confused with Variables.
I decided to write it down for my own team and then decided to post it here as well, in case others are facing similar issues.


This is offered in the spirit of sharing - anybody who can correct or enhance it is of course encouraged to do so.


Thanks
JC

New mapping (.CSV to table).

Hello,

We've introduced a new column in all .CSV files and corresponding tables. Is there an easy/automated way to introduce/reflect this mapping in transformation steps of all .KTR files?

Thanks.

How to Reload CDA and Mondrian cache in Pentaho CE 4.8?

Hi All,


I'm currently stuck on a performance issue with my dashboard.

I've created a dashboard in Pentaho Community Edition 4.8; the charts use SQL and MDX (Mondrian) queries.


My problem is that the first time I open my dashboard after clearing the CDA and Mondrian caches it takes 50 seconds to load, but the next time it takes less than 10 seconds.


I already know how to clear the CDA and Mondrian caches automatically.


How can I reload the CDA and Mondrian schema caches from the back-end (without opening the dashboard)?


Please advise; I'm really stuck on this point.


Cheers Guys,
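
A common workaround, offered only as a sketch: after the scheduled cache clear, warm the caches again by firing the same CDA queries the dashboard uses from a small script or cron job, so the first real visitor no longer pays the 50-second penalty. The URL below is an assumption for illustration only; the exact CDA endpoint and parameter names differ between versions, so copy a real doQuery URL from your browser's network tab while the dashboard loads and use that instead.

Code:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.xml.bind.DatatypeConverter;

public class WarmCdaCache {
    public static void main(String[] args) throws Exception {
        // Hypothetical doQuery URL - replace with one captured from your own dashboard.
        String query = "http://localhost:8080/pentaho/content/cda/doQuery"
                + "?path=/mydashboard/queries.cda&dataAccessId=salesByRegion";

        HttpURLConnection con = (HttpURLConnection) new URL(query).openConnection();
        // Placeholder credentials for basic authentication.
        String auth = DatatypeConverter.printBase64Binary("joe:password".getBytes("UTF-8"));
        con.setRequestProperty("Authorization", "Basic " + auth);

        try (InputStream in = con.getInputStream()) {
            while (in.read() != -1) { /* discard the body; we only want the cache filled */ }
        }
        System.out.println("HTTP " + con.getResponseCode() + " for " + query);
    }
}

Running one such request per chart query right after the cache clear should leave both the CDA cache and (for the MDX queries) the Mondrian cache populated before anyone opens the dashboard.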

Can't populate data to table from Excel Sheet

Hi Forum,

I'm trying to load data from an Excel sheet into a database table. I can preview the data in the "Excel input" step, I hook it up to a "Table output" step, and I can generate the DDL, but when I execute the DDL the table is not created and I get the error below.


Sample data from the Excel preview is shown below; the metadata is of String type for all fields. From the log I understand that something goes wrong when it reaches the 1960 field, but I'm unable to find out why.

Country Name Country Code Indicator Name Indicator Code 1960 1961 1962 1963 1964 1965 1966 1967 1968 1969 1970 1971 1972 1973 1974 1975 1976 1977 1978 1979 1980 1981 1982 1983 1984 1985 1986 1987 1988 1989 1990 1991 1992 1993 1994 1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013 2014
Aruba ABW Literacy rate, adult total (% of people ages 15 and above) SE.ADT.LITR.ZS 97.2912521 96.8226395
Andorra AND Literacy rate, adult total (% of people ages 15 and above) SE.ADT.LITR.ZS
Afghanistan AFG Literacy rate, adult total (% of people ages 15 and above) SE.ADT.LITR.ZS 18.1576805 31.7411175


Error:

ERROR: syntax error at or near "1960"
Position: 148

at org.pentaho.di.core.database.Database.execStatement(Database.java:1485)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1433)
at org.pentaho.di.ui.core.database.dialog.SQLEditor.exec(SQLEditor.java:398)
at org.pentaho.di.ui.core.database.dialog.SQLEditor.access$200(SQLEditor.java:81)
at org.pentaho.di.ui.core.database.dialog.SQLEditor$7.handleEvent(SQLEditor.java:242)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.tableoutput.TableOutputDialog.open(TableOutputDialog.java:884)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:124)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8797)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3027)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:744)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1316)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:654)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.postgresql.util.PSQLException: ERROR: syntax error at or near "1960"
Position: 148
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:561)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:405)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:397)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1459)
... 27 more
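
The failure is almost certainly coming from the generated DDL rather than the data: the Excel header row turns into column names such as 1960, 1961, ..., and PostgreSQL will not accept an identifier that starts with a digit unless it is double-quoted. The sketch below shows the idea in plain JDBC with a made-up table name; in practice you can either edit the generated SQL to quote the year columns, or rename the fields to something like y1960 (for example with a Select values step) before Table output generates the DDL.

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateLiteracyTable {
    public static void main(String[] args) throws Exception {
        // Connection details are placeholders.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Statement st = con.createStatement()) {

            // PostgreSQL rejects   1960 numeric   but accepts   "1960" numeric
            st.execute("CREATE TABLE literacy ("
                    + "\"Country Name\" text, \"Country Code\" text, "
                    + "\"Indicator Name\" text, \"Indicator Code\" text, "
                    + "\"1960\" numeric, \"1961\" numeric)"); // ...and so on for the remaining years
        }
    }
}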

Line chart - Third series to be dashed

I have a line chart with three series, and I want the third series to have a dashed line instead of a solid one. I know how to apply dashed lines to all series using the line_strokeDasharray extension point, but does anyone know how to specify it for one specific series only?

Modified Javascript

Hi,
I want to output the character string equivalent of a number.

Ex: 1 as a, 2 as b, ... 27 as aa, 28 as ab

Help me
Ramya
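
The mapping described above (1 -> a, 2 -> b, ..., 26 -> z, 27 -> aa, 28 -> ab) is the scheme spreadsheets use for column letters, sometimes called bijective base-26. A sketch of the conversion logic in Java is below; the same few lines can be rewritten almost as-is inside a Modified Java Script Value step, reading the input field and writing the result to a new output field.

Code:

public class NumberToLetters {
    // 1 -> "a", 2 -> "b", ..., 26 -> "z", 27 -> "aa", 28 -> "ab", ...
    static String toLetters(long num) {
        StringBuilder out = new StringBuilder();
        while (num > 0) {
            num--;                                   // shift to 0-based for this digit
            out.insert(0, (char) ('a' + num % 26));  // prepend the least significant letter
            num /= 26;
        }
        return out.toString();
    }

    public static void main(String[] args) {
        for (long n : new long[] {1, 2, 26, 27, 28, 703}) {
            System.out.println(n + " -> " + toLetters(n));  // a, b, z, aa, ab, aaa
        }
    }
}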

Problem in defining table name as variable in "sqljdbc" query

Hello,

for my Dashboard I have to define the destination table name as a variable in my sql query.
Currently I can change the parameter that defines the table name with a button click.

My query:
Code:

Select * From ${param_tablename}
Error in Catalina log:
javax.ws.rs.WebApplicationException: pt.webdetails.cda.dataaccess.QueryException: ERROR: syntax error at or near "$1" Position: 16

Does anybody have or know a solution for my mentioned problem?

Thanks for your help in advance!
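
The "syntax error at or near $1" is a direct consequence of how the query is executed: a sql over sqlJdbc data source hands the query to the JDBC driver as a prepared statement, and a bind parameter ($1 in PostgreSQL) can only stand in for a value, never for an identifier such as a table name. A minimal sketch of the difference in plain JDBC (table and method names here are just for illustration):

Code:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class TableNameParameter {
    // This is effectively what binding ${param_tablename} asks the driver to do,
    // and it fails with: syntax error at or near "$1"
    static ResultSet broken(Connection con, String table) throws Exception {
        PreparedStatement ps = con.prepareStatement("SELECT * FROM ?"); // identifiers cannot be bound
        ps.setString(1, table);
        return ps.executeQuery();
    }

    // The table name has to become part of the SQL text instead - which is also why it
    // should be validated against a whitelist before being spliced in.
    static ResultSet working(Connection con, String table) throws Exception {
        if (!table.matches("[A-Za-z_][A-Za-z0-9_]*")) {
            throw new IllegalArgumentException("unexpected table name: " + table);
        }
        Statement st = con.createStatement();
        return st.executeQuery("SELECT * FROM " + table);
    }
}

So the parameter needs to end up substituted into the query text rather than bound; whether and how CDA can do that depends on the data source type and version in use, so treat the code above as an explanation of the error rather than a CDA recipe.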

Error in SAP Input step

Hi everybody, I'm trying to use the SAP Input step in release 5.3, but the step only works in preview mode. If I try to execute the transformation, I get the following error:


2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Error initializing step [RFC ZQUAD_GIAC_CATALYST]
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : java.lang.NullPointerException
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at java.util.Hashtable.put(Unknown Source)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at java.util.Properties.setProperty(Unknown Source)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.open(SAPConnectionImpl.java:76)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.open(SAPConnectionImpl.java:65)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at org.pentaho.di.trans.steps.sapinput.SapInput.init(SapInput.java:180)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
2015/04/30 14:56:55 - RFC ZQUAD_GIAC_CATALYST.0 - at java.lang.Thread.run(Unknown Source)
2015/04/30 14:56:55 - 30042015 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Step [RFC ZQUAD_GIAC_CATALYST.0] failed to initialize!
2015/04/30 14:56:55 - Spoon - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : 30042015: preparing transformation execution failed
2015/04/30 14:56:55 - Spoon - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/30 14:56:55 - Spoon - We failed to initialize at least one step. Execution can not begin!
2015/04/30 14:56:55 - Spoon -
2015/04/30 14:56:55 - Spoon -
2015/04/30 14:56:55 - Spoon - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1149)
2015/04/30 14:56:55 - Spoon - at org.pentaho.di.ui.spoon.trans.TransGraph$27.run(TransGraph.java:3989)
2015/04/30 14:56:55 - Spoon - at java.lang.Thread.run(Unknown Source)

Can anybody help me?
Thank you.

Tommaso

Is it possible to use commons math 3.5 library?

Hello,

I need to use the Wilcoxon test, which was included in version 3.0. Is it possible to update the library? Should I just place the jar file in the lib folder?

Thanks.
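
If the end goal is to call the test from your own Java code, the signed-rank test is available in the commons-math 3.x line as org.apache.commons.math3.stat.inference.WilcoxonSignedRankTest, and the unpaired rank-sum variant is MannWhitneyUTest in the same package. A small sketch follows (the sample arrays are made up). The math3 classes live under the org.apache.commons.math3 package, so they will not clash with an older commons-math 2.x jar, but it is still worth making sure only one math3 jar ends up in the lib folder.

Code:

import org.apache.commons.math3.stat.inference.MannWhitneyUTest;
import org.apache.commons.math3.stat.inference.WilcoxonSignedRankTest;

public class WilcoxonDemo {
    public static void main(String[] args) {
        // Made-up paired samples, just to show the calls.
        double[] before = {1.83, 0.50, 1.62, 2.48, 1.68, 1.88, 1.55, 3.06, 1.30};
        double[] after  = {0.88, 0.65, 0.60, 2.05, 1.06, 1.29, 1.06, 3.14, 1.29};

        WilcoxonSignedRankTest signedRank = new WilcoxonSignedRankTest();
        System.out.println("W = " + signedRank.wilcoxonSignedRank(before, after));
        System.out.println("p = " + signedRank.wilcoxonSignedRankTest(before, after, true));

        // Unpaired (Mann-Whitney / rank-sum) variant, also in commons-math3.
        MannWhitneyUTest rankSum = new MannWhitneyUTest();
        System.out.println("U = " + rankSum.mannWhitneyU(before, after));
    }
}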

n00b question: how to insert static column in each row

Hi everyone, I apologize in advance for probably asking a dumb question. I have tried to do my due diligence, but for the life of me I can't figure this out.

I have a job that reads data from an LDAP input, and spits it out into an AROutput (proprietary BMC extension ... for the purposes of this question, it's not important, it's basically the same thing as a database insert/update).

I need to insert a unique identifier, which will allow me to identify records touched by specific job runs (so on Monday I can say "hey database, show me how many runs we executed last week, and the records for each of them").

"No Problem", I thought. I'll just use "Generate random value" to create a UUID at the top of the job, and insert that into each row.
Being a n00b, however, I can't figure out how to do that.

I've tried hooking it up like this:

[generate random value] -> [LDAP Input] -> [AROutput]

When I do this, the field containing the random value output is consistently NULL at the AROutput step.

I've also tried hooking it up like this:

[LDAP Input] -> [generate random value] -> [AROutput]

This works; however, it calls Generate random value for EACH ROW of the LDAP input, so I get a new UUID for each row. Not what I'm after.

I have also tried looking at "Stream lookup" and "Merge Join", but these seem to want to work on SETS of data. I don't need to join two data SETS, I just need to insert the same value into each row.

Again, sorry ... I'm sure this is a real basic question.
I've looked around, but if the answer is already out there, my google-fu isn't strong enough to find it.

any help would be greatly appreciated.
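
One pattern worth trying, offered as a sketch rather than the definitive answer: keep the single-row stream from Generate random value separate and combine it with the LDAP rows using the "Join Rows (cartesian product)" step. With one stream of exactly one row (the UUID) and one stream of N rows, the cartesian product is simply the N LDAP rows, each carrying the same UUID field:

Code:

[Generate random value] --\
                           +--> [Join Rows (cartesian product)] --> [AROutput]
[LDAP Input] -------------/

An alternative is to set the UUID as a variable in a previous transformation of the job and pick it up with a Get Variables step, but the cartesian join keeps everything in one transformation.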

Tableau vs Pentaho

I am interested in knowing the difference between the Tableau and Pentaho big data visualization tools.

Execute SQL Script challenge (quote strings vs. not quote strings?)

Hey, I'm playing around with Execute SQL Script as opposed to Table Output/Update, because it seems to be faster (6 seconds for 1000 rows vs. 9 seconds).

Probably not super important long term.

I'm taking a JSON document, parsing it, and sending it to the SQL database.

In Table Output this works just fine. The JSON provides values like 1, 2, , dog - notice that the third value is empty.

When an empty value is inserted using the Table Output step, it's simply converted to NULL in the database. How exactly this happens, I'm not sure.


Now let's say I'm doing the same thing in Execute SQL Script, using insert into table (a,b,c) values (?,?,?) with the fields as parameters.

Suddenly there is a problem.

That's because the fields are 1, 2, , dog, as well as things like the following:
{3,faler??ca,,} aka random garbage with a lot of commas and other SQL-breaking syntax.

The clear solution? Quote strings!

The {3,faler??ca,,} garbage becomes '{3,faler??ca,,}' and no longer breaks SQL - it gets inserted as the string it is.

HOWEVER, now the empty values become ''. What's the problem with that? Well, it turns out an unquoted empty value becomes NULL, but a quoted empty string is an "invalid integer" - it can't be converted for an integer column.

I'm guessing the only way around this is to quote everything except "empties" between the JSON parsing and the SQL script run. That's a bit of a pain though; maybe I'll just accept the 50% process overhead instead.
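
For what it's worth, the reason Table output handles the empty value cleanly is that it binds every field as a JDBC prepared-statement parameter: a null field is sent as a real SQL NULL and string values never need quoting or escaping by hand. Execute SQL script instead builds the statement as text (which is consistent with what you are seeing), so quoting becomes your problem. A sketch of the prepared-statement idea in plain JDBC (table and column names are made up):

Code:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.Types;

public class SafeInsert {
    // A null (or empty, if you choose to treat empty as null) value becomes a real SQL NULL,
    // and strings like "{3,faler??ca,,}" are passed through without breaking the statement.
    static void insertRow(Connection con, Integer a, Integer b, String c) throws Exception {
        try (PreparedStatement ps =
                     con.prepareStatement("INSERT INTO mytable (a, b, c) VALUES (?, ?, ?)")) {
            if (a == null) ps.setNull(1, Types.INTEGER); else ps.setInt(1, a);
            if (b == null) ps.setNull(2, Types.INTEGER); else ps.setInt(2, b);
            if (c == null || c.isEmpty()) ps.setNull(3, Types.VARCHAR); else ps.setString(3, c);
            ps.executeUpdate();
        }
    }
}

If you stay with Execute SQL script, the usual workaround is to turn empty strings into real nulls before the step (for example with a "Null if" step or a small JavaScript/Formula step) rather than quoting everything.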

arff sparse format with huge missing values

I'm using Weka to mine association rules on supermarket data. The data contains 400 different attributes (items) and 100,000 instances, where each instance is a shopping record listing the items bought together, for example: 1,5,200,305,399.

I processed the data into ARFF format and assigned a single value '1' to each attribute, for example:

@attribute 1 {1}
@attribute 2 {1}
@attribute 3 {1}
...

For each shopping record I used '?' to denote a missing value, for example:

1,?,?,?,1,1,?,?.....

As the shopping records are very large I want to turn the file into sparse format, but the sparse format produced by Weka removes the '1' values and keeps/indexes the missing values '?'. Is there a way to keep and index the '1' values in the sparse format?
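
As far as I understand Weka's sparse ARFF handling, whatever value sits at index 0 of a nominal attribute's declaration is the value that gets omitted from sparse instances, and omitted entries are read back as that first value, not as missing. With @attribute 1 {1} the only legal value is at index 0, so it is exactly what the sparse writer drops. A workaround commonly used for market-basket data (an assumption to verify on a small sample first) is to declare an explicit "absent" value first, list only the purchased items in each sparse instance, and tell Apriori to treat those zeros as missing via its treatZeroAsMissing option:

Code:

@relation basket

@attribute item1 {0,1}
@attribute item2 {0,1}
@attribute item3 {0,1}
@attribute item4 {0,1}

@data
% items 1 and 3 bought; omitted attributes are read back as 0, not as ?
{0 1, 2 1}
% items 2, 3 and 4 bought
{1 1, 2 1, 3 1}

This keeps the file small (only the 1s are written) while the 1s themselves stay explicit and indexed.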

Row Normaliser - split fieldname into rows & columns

Hi,

Using the Row Normaliser example shown in the Pentaho wiki (http://wiki.pentaho.com/display/EAI/Row+Normaliser), is it possible to automatically split the field names of the source table and use them as row headers and column names?

So for a table like this:

DATE PR1_NR PR1_SL PR2_NR PR2_SL PR3_NR PR3_SL
20030101 5 100 10 250 4 150
... ... ... ... ... ... ...

I want to convert it to something like what's shown below, with the new field names (SL, NR) automatically extracted from the source field names, and the row headings (PR1, PR2, PR3) also automatically extracted from the source field names.

Date      Type  SL   NR
20030101  PR1   100  5
20030101  PR2   250  10
20030101  PR3   150  4


Thanks for your assistance!
Paul
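
For reference, a sketch of how the Row Normaliser from the wiki example could be configured for the table above. As far as I know the step does not parse the field names itself, so the PRx/NR/SL split has to be typed into the grid (the layout below roughly follows the step dialog):

Code:

Type field: Type

Fieldname   Type   New field
PR1_NR      PR1    NR
PR1_SL      PR1    SL
PR2_NR      PR2    NR
PR2_SL      PR2    SL
PR3_NR      PR3    NR
PR3_SL      PR3    SL

If the set of prefixes is not fixed, another route is to normalise everything into generic key/value rows first, use Split Fields on the underscore to separate PRx from NR/SL, and then a Row Denormaliser to pivot NR and SL back into columns.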

Google Spreadsheet Input - How to write the Spreadsheet KEY

Dear all,

is there an example of how to write the Spreadsheet key on the second tab of the step?

It's really difficult to understand the syntax to apply...

My sheet, for instance, has this link:

https://docs.google.com/spreadsheets...FwM/edit#gid=0

How do I use it in the Spreadsheet Key field?

The tutorial is completely unclear to me.
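
For the newer docs.google.com/spreadsheets/d/... style of link, the spreadsheet key is the long identifier between /d/ and /edit, so it is a matter of copying just that segment into the field. With a made-up URL as an illustration:

Code:

URL:             https://docs.google.com/spreadsheets/d/1AbCdEfGhIjKlMnOpQrStUvWxYz1234567890abcd/edit#gid=0
Spreadsheet key: 1AbCdEfGhIjKlMnOpQrStUvWxYz1234567890abcd

(Whether the plugin wants this new-style key or the older key= value from pre-2014 spreadsheet URLs depends on the plugin version, so if one form is rejected it is worth trying the other.)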

Error Analysis with Weka

I'm trying to do an error analysis on my misclassified instances in Weka. The goal is to include the source file names so I can examine them specifically and look for patterns. My updated flow goes as follows:



  1. TextDirectoryLoader reads the samples from two sub-directories that map to the classes. The outputFilename attribute is set to true.
  2. StringToWordVector parses the text attribute into new tokens/features.
  3. StringToNominal converts the filename attribute (generated by TextDirectoryLoader) from a string type to a nominal type otherwise the classifier will choke on it.
  4. ClassAssigner uses the @@classname@@ attribute as the class distinguishing value.
  5. CrossValidationFoldMaker splits the data into 3 folds.
  6. A classifier (Naive Bayes and SVM in turn) is trained on the data.


My problem is that the filename attribute skews my results. But I'd like to keep it and use it to identify the source files for misclassified instances, as mentioned above. What step(s) do I need to introduce, or configurations do I need to change, in order to have the classifier ignore the filename attribute but still output it with the predictions?
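
One way to get this behaviour, offered as a sketch: wrap the classifier in a FilteredClassifier whose filter removes the filename attribute. The attribute is then dropped only inside the wrapped classifier, so it is ignored for training but still present on every instance when predictions are written out. In the KnowledgeFlow this means choosing FilteredClassifier as the classifier component and configuring its filter; the equivalent with the Java API looks roughly like this (the attribute index "1" for filename is an assumption - check your data's actual attribute order):

Code:

import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.meta.FilteredClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.unsupervised.attribute.Remove;

public class IgnoreFilenameAttribute {
    public static void main(String[] args) throws Exception {
        // "corpus.arff" is a placeholder for the dataset produced by the flow above.
        Instances data = DataSource.read("corpus.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Drop the filename attribute for the classifier only (assumed to be attribute 1).
        Remove remove = new Remove();
        remove.setAttributeIndices("1");

        FilteredClassifier fc = new FilteredClassifier();
        fc.setFilter(remove);
        fc.setClassifier(new NaiveBayes());
        fc.buildClassifier(data);  // cross-validation via the Evaluation class works the same way
    }
}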

PDI - Spoon Unable to create database connection

Hi All, I am unable to create a database connection to my local MySQL and DB2 databases in Spoon. I've copied the driver jar files into the data-integration/lib folder. I am able to connect to the databases from DbVisualizer but not in Spoon. Kindly suggest. Thanks