Channel: Pentaho Community Forums
Viewing all 16689 articles

Couldn't find field [league_id] in row!

In my transformation I read values from a JSON Input step and pass them to a Unique Rows step.

When I run the transformation in Spoon, it works fine.

When I run it from my Java project, however, I get a "Couldn't find field in row" exception.
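For context, this is the usual shape of embedding a PDI 7.x transformation in Java; a minimal sketch, assuming the standard kettle-engine API is on the classpath (the file name my_transform.ktr is a placeholder):

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle environment so step plugins get registered;
        // skipping this is a common cause of "field not found" / missing-plugin errors
        KettleEnvironment.init();

        // Load the transformation definition (path is a placeholder)
        TransMeta transMeta = new TransMeta("my_transform.ktr");

        // Execute and wait for completion
        Trans trans = new Trans(transMeta);
        trans.execute(null);
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}
```

If the JSON Input step's metadata is not registered in the embedded environment, its output fields never enter the row layout, which can surface downstream as exactly this kind of missing-field exception.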

Error parsing parameter information on open prpt report with chart

Hi everyone!

I'm using Pentaho Report Designer 7.0.0.0-25
Pentaho User Console 5.0.1 stable

I've created a report in PRD; it looks fine and I can preview it.
Then I upload it to the Pentaho User Console server, open it in a browser, and everything is still fine.
Then I add a Bar Chart to the report. I can preview it in PRD and it looks fine.
But when I upload it to the Pentaho User Console and open the report, I get the fatal error "Error parsing parameter information".

The problem is the Bar Chart itself: even if the report is empty and contains a Bar Chart not connected to any data source, the error appears.

If I remove the Bar Chart, the report works fine.

Any ideas where to look for the problem? Maybe there are some settings I should use?

I'll be grateful for any participation.

Thanks!

Excel encoding wrong

I am loading data from Excel into MySQL.

I use UTF-8 encoding because the data contains both English and Thai, and I created all the MySQL tables with utf8.
But when I run PDI, the data in the database is wrong: the English text is correct, but the Thai text shows up as ????????
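As an aside, runs of "?" are the classic symptom of text passing through a connection or charset that cannot represent it. A quick stdlib-only illustration (the Thai string is just an example):

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        String thai = "\u0e44\u0e17\u0e22"; // the Thai word for "Thai"

        // Encoding through a charset that cannot represent Thai replaces
        // every character with '?', which is what ends up stored.
        byte[] ascii = thai.getBytes(StandardCharsets.US_ASCII);
        String stored = new String(ascii, StandardCharsets.US_ASCII);

        System.out.println(stored); // prints "???"
    }
}
```

In practice the fix is usually to set the JDBC connection's character set to UTF-8 (for MySQL, the characterEncoding=utf8 connection option, settable in the PDI connection's Options tab), not only the table definitions.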


Reading and writing xlsb in PDI

Hi,

I would like to know if there is any way to read from and write to .xlsb files (Excel Binary Workbook files).

thanks everyone

WebSpoon - Can't read anything from "Filter Rows" step

Hello,

I just tried WebSpoon in my local environment, and when I open the Filter Rows step nothing is shown: I can't read the existing conditions and I can't create new ones either.
I'm using it via Firefox 44.0.2.
Is there anything that can be done to fix this?

Best Regards.

Does "Get files rows count" cache its result?

Hi all,

In a main stream I need to know the number of rows contained in a txt file. I used a "Get files rows count" step and a "Join rows (cartesian product)" step to obtain the row count, which I then use in the "getIndex" step.
The attached figure (JoinRows.jpg) shows the two steps in a red square.

I would like to ask for two hints:
  1. Is there another way to get this information (the row count of a file) only once, before working with the main stream?
  2. If it is not possible to read it only once, does "Get files rows count" cache its result, or does it re-read the txt file for each row of the main stream?

Thank you.
Gianpiero

Unable to run the JSON Input Transformation from the java application

I am facing an issue where I cannot run the transformation because a plugin is missing.

These are the sbt dependencies I am using:

val pentahoVersion = "7.0.0.1-35"

"pentaho-kettle" % "kettle-core" % pentahoVersion,
"pentaho-kettle" % "kettle-dbdialog" % pentahoVersion,
"pentaho-kettle" % "kettle-engine" % pentahoVersion,
"pentaho-kettle" % "kettle-ui-swt" % pentahoVersion exclude("org.eclipse", "swt"),
"pentaho-kettle" % "kettle5-log4j-plugin" % pentahoVersion,
"pentaho-kettle" % "kettle-json-plugin" % pentahoVersion,
"pentaho" % "pentaho-vfs-browser" % pentahoVersion,
"org.codehaus.enunciate" % "enunciate-jersey-rt" % "1.27",
"javax.ws.rs" % "jsr311-api" % "1.1.1",
"org.eclipse" % "swt" % "3.3.0-v3346",
"net.sf.jt400" % "jt400" % "6.1",
"com.sun.jersey" % "jersey-client" % "1.19.1",
"com.sun.jersey" % "jersey-bundle" % "1.19.1",
"com.sun.jersey.contribs" % "jersey-apache-client" % "1.19.1",
"com.sun.jersey" % "jersey-core" % "1.19.1",
"com.sun.jersey.contribs" % "jersey-multipart" % "1.19.1",
"jsonpath" % "jsonpath" % "1.0",
"com.googlecode.json-simple" % "json-simple" % "1.1",
"rhino" % "js" % "1.7R3",
"com.jayway.jsonpath" % "json-path" % "2.1.0",
"net.minidev" % "json-smart" % "2.2",
"org.apache.commons" % "commons-vfs2" % "2.1",
"commons-vfs" % "commons-vfs" % "20100924-pentaho",
"pentaho" % "pentaho-vfs-browser" % "7.0.0.0-25",
"org.safehaus.jug" % "jug-lgpl" % "2.0.0",
"net.sourceforge.jexcelapi" % "jxl" % "2.6.12",
"com.jcraft" % "jsch" % "0.1.46",
"com.itextpdf" % "itextpdf" % "5.5.6",
"org.apache.commons" % "commons-io" % "1.3.2",


Transformation Multiple Step Failure

I am having a nagging problem with PDI transformations when I connect more than two steps together: I get the following error when I connect a third step in a transformation.

This only occurs when I use PDI on a Windows 10 workstation.

I do not have this problem when I construct a job, only a transformation.

2016/12/27 20:31:17 - Spoon - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : An unexpected error occurred in Spoon:
2016/12/27 20:31:17 - Spoon - null
2016/12/27 20:31:17 - Spoon - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : java.lang.NullPointerException
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.trans.TransGraph.addHop(TransGraph.java:1610)
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.trans.TransGraph$8.widgetSelected(TransGraph.java:1563)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1359)
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7990)
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9290)
2016/12/27 20:31:17 - Spoon - at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:685)
2016/12/27 20:31:17 - Spoon - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2016/12/27 20:31:17 - Spoon - at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
2016/12/27 20:31:17 - Spoon - at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2016/12/27 20:31:17 - Spoon - at java.lang.reflect.Method.invoke(Unknown Source)
2016/12/27 20:31:17 - Spoon - at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)

I would appreciate any insight into how to correct this issue, as it is really getting on my nerves.

Thank you

Ray

HTML XLS input

Hi all,

I've got a problem importing an Oracle output file. The file shows inventory per x dimensions. Oracle outputs it as an HTML file with an .xls extension; when opening it, Excel also says it is not in Excel format. When I try to load it into Spoon I get the error below. Is there a way to have Pentaho convert it to real .xls before using it? Or is there another way to avoid having to manually re-save the file before inputting it into Spoon?


2016/12/29 09:55:30 - Microsoft Excel Input.0 - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : Error processing row from Excel file [C:\Users\ddejong\Downloads\LBTSOHQRPT_140392924_1.xls] : org.pentaho.di.core.exception.KettleException:
2016/12/29 09:55:30 - Microsoft Excel Input.0 - jxl.read.biff.BiffException: Unable to recognize OLE stream
2016/12/29 09:55:30 - Microsoft Excel Input.0 - Unable to recognize OLE stream
2016/12/29 09:55:30 - Microsoft Excel Input.0 - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/12/29 09:55:30 - Microsoft Excel Input.0 - jxl.read.biff.BiffException: Unable to recognize OLE stream
2016/12/29 09:55:30 - Microsoft Excel Input.0 - Unable to recognize OLE stream

Shared connections in other machines

Hello everyone,


My name is João Correia and I am trying to resolve an issue.


I am developing in Spoon (the graphical interface) on one machine. I created a connection and shared it so that it can be used in every job and transformation.

I version all the created and modified files with git so I can use them on other machines.
The problem is that when something in the connection changes, I have to change it in all the jobs and transformations. This happens because the other machines only run Kitchen and do not have the file /home/"username"/.kettle/shared.xml.

Is there a way to fix this in one single place?
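One common pattern (a sketch of one option, not the only one) is to parameterize the shared connection with Kettle variables and define their values per machine in kettle.properties, which Kitchen reads from $KETTLE_HOME/.kettle at startup; the property names below are illustrative:

```properties
# ~/.kettle/kettle.properties (names and values are examples)
DB_HOST=db.example.com
DB_PORT=5432
DB_NAME=warehouse
DB_USER=etl
DB_PASSWORD=secret
```

In the connection dialog, use ${DB_HOST}, ${DB_PORT}, etc. instead of literal values; then a connection change only requires editing each machine's kettle.properties, and the versioned jobs and transformations stay untouched.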


Thank you all.


Kind regards,
João Correia.

PDI 7 - Blank page for Repository create / connect

I have installed the CE edition of PDI release 7, and when I run Spoon and try to connect to or create a repository, it just opens a blank screen.
I have release 6.1 running on the same server and it opens the repository connections without any issue.

This is a Redhat Linux server release 6.8
The libwebkitgtk is installed.

Is there some way to get more debug info on what is failing during the connect, or what else do I need installed that wasn't needed for release 6?

Usage of setvariable step in mapping transformation

Existing approach:
I have some common logic that I need to call at the beginning of each job. The common logic connects to the database, calls a stored procedure, and returns a number; a Set Variables step then sets this value into a Pentaho variable that I use further on.
Suppose I have four jobs J1, J2, J3, J4, and I implemented the same common logic in all four jobs using four different transformations t1, t2, t3, t4:

j1 calls transformation t1
  t1 calls the stored procedure using x from job j1
  the stored procedure returns 1
  the Pentaho variable is set to PV=1 -> j1 uses PV further

j2 calls transformation t2
  t2 calls the stored procedure using y from job j2
  the stored procedure returns 2
  the Pentaho variable is set to PV=2 -> j2 uses PV further

j3 calls transformation t3
  t3 calls the stored procedure using z from job j3
  the stored procedure returns 3
  the Pentaho variable is set to PV=3 -> j3 uses PV further


New approach:
My intention is to implement the common logic once in a mapping transformation and call that mapping transformation from all the jobs.
So with the same four jobs J1, J2, J3, J4, I would implement the common logic in transformation t1 and reuse t1 in all the jobs.

The one thing I am not clear on: in the mapping transformation I use a Set Variables step to set the value returned from the stored procedure into a Pentaho variable. Could this cause any mixing of return values when they are passed back to the calling jobs?

As explained above, j1 should receive the value 1 and j2 should receive the value 2; there should not be any mix-up such as j1 receiving 2 and j2 receiving 1.

Thanks,
Sriram

[KnowledgeFlow] Initializing KF... com.pentaho.commons.dsc.f: license missing, inval

Hello.

I have a fresh install of pdi-ee-6.1.0-196 for Linux.

Another resource did the install.

Kitchen.sh works great, but this benign stack trace appears.

What is the cause? Or rather, what is the fix?

Incidentally, I am using Oracle; the file pdi-ee-6.1.0-196/data-integration/lib/ojdbc7.jar exists, and I am establishing connections to Oracle successfully.
Is the solution to disable the Weka editors in kitchen.sh?


Thanks.

em

INFO: ---Registering Weka Editors---
Trying to add database driver (JDBC): RmiJdbc.RJDriver - Warning, not in CLASSPATH?
Trying to add database driver (JDBC): jdbc.idbDriver - Warning, not in CLASSPATH?
Trying to add database driver (JDBC): org.gjt.mm.mysql.Driver - Warning, not in CLASSPATH?
Trying to add database driver (JDBC): com.mckoi.JDBCDriver - Warning, not in CLASSPATH?
[KnowledgeFlow] Loading properties and plugins...
[KnowledgeFlow] Initializing KF...
com.pentaho.commons.dsc.f: license missing, invalid, or expired
at com.pentaho.commons.dsc.j.a(SourceFile:92)
at a.a.a.a.a.a.<init>(SourceFile:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)

Pentaho Json Output step with dynamic fields

Hi, I am new to PDI. I have to read 200+ tables from a database through a Table Input step that runs "SELECT *" on each table. I want to generate JSON for each table's data, but the JSON Output step requires a fixed Field name and Element name. How can I make the fields dynamic so that I can generate JSON for all 200+ tables?
Your earliest response will be highly appreciated.
Thanks

Pentaho delete records with HBase

Hi ,

I have created a transformation in which I want to delete records from an HBase table, using a User Defined Java Class step.

When I run the ETL, this Java step never finishes: in the code I wrote for the step (below), the process does not get past the line HTable table = new HTable(HBconf, "my_table");.

Does anyone know how I can delete records in HBase, or any other way to achieve this?

Thanks in Advance.

Code :-

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public String LinkField;

public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException
{
    Object[] r = getRow();

    if (r == null) {
        setOutputDone();
        return false;
    }
    if (first) {
        first = false;
    }

    LinkField = get(Fields.In, "DOC_VPK_DOCUMENT_NBR").getString(r);
    logBasic("Logging input field " + LinkField);

    // Build the HBase configuration (reads hbase-site.xml from the classpath)
    Configuration HBconf = HBaseConfiguration.create();
    logBasic("Stage1");

    String tableName = getInputRowMeta().getString(r, getParameter("TABLEFIELD"), null);
    logBasic("Stage table " + tableName);

    try {
        HTable table = new HTable(HBconf, "my_table");
        Scan s = new Scan();
        logBasic("Stage2");

        // Instantiating the HBaseAdmin class
        HBaseAdmin admin = new HBaseAdmin(HBconf);
        logBasic("Stage3");

        // Getting the list of all tables using the HBaseAdmin object
        HTableDescriptor[] tableDescriptor = admin.listTables();
        logBasic("Stage4");

        // Printing all the table names
        for (int i = 0; i < tableDescriptor.length; i++) {
            logBasic(tableDescriptor[i].getNameAsString());
        }

        // Scan the whole table and delete every row whose key matches the input field
        ResultScanner rs = table.getScanner(s);
        for (org.apache.hadoop.hbase.client.Result r1 = rs.next(); r1 != null; r1 = rs.next()) {
            String key = Bytes.toString(r1.getRow());

            if (key.equals(LinkField)) {
                Delete delete = new Delete(Bytes.toBytes(key));
                table.delete(delete);
                logBasic("Deleted.......................");
            } else {
                logBasic("Why are you in else............?");
            }
        }
        logBasic("Logging input field3 " + LinkField);

        rs.close();
        table.close();
        admin.close();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
        // throw new KettleException(e);
    }

    return true;
}

Nested if ... then , else if ...

Hi,

I have to implement 5 nested conditions, something like:

If field1 > 1 then target_field = "value 1"
else if field2 > 1 then target_field = "value 2"
else if field3 > 5 then target_field = "value 3"
else if field4 < 0 then target_field = "value 4"
else target_field = "value 5"

I know I can do this in JavaScript or the UDJE step. My question is whether there is a way to do this without having to write code, i.e. using any of the built-in steps.
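For reference, the cascade above is equivalent to this plain Java sketch (field names and thresholds taken from the post; the method name is illustrative):

```java
public class NestedConditions {
    // Returns the target value for one row; the first matching branch wins,
    // so the ordering of the conditions matters.
    static String targetField(int field1, int field2, int field3, int field4) {
        if (field1 > 1) {
            return "value 1";
        } else if (field2 > 1) {
            return "value 2";
        } else if (field3 > 5) {
            return "value 3";
        } else if (field4 < 0) {
            return "value 4";
        } else {
            return "value 5";
        }
    }

    public static void main(String[] args) {
        System.out.println(targetField(2, 0, 0, 0));  // prints "value 1"
        System.out.println(targetField(0, 0, 0, -1)); // prints "value 4"
        System.out.println(targetField(0, 0, 0, 0));  // prints "value 5"
    }
}
```

Whatever step is chosen, it must preserve this first-match-wins ordering, since a row can satisfy several of the conditions at once.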

thanks

PDI 7, Amazon AMI, and webkit

Hi,

I am currently in the process of upgrading to Pentaho 7 on an Amazon instance, and the installation guide says that Linux users need to install libwebkitgtk-1.0-0. I am having trouble finding a way to install this on the instance: I have tried yum, but the package does not exist, and Google is not helping much. How can I resolve this?

Thank you

report-designer.sh

Hello, and happy new year.
I have a problem with starting Report Designer.
When I run report-designer.sh I get this error:

java.lang.reflect.InvocationTargetException
at java.awt.EventQueue.invokeAndWait(EventQueue.java:1288)
at java.awt.EventQueue.invokeAndWait(EventQueue.java:1263)
at javax.swing.SwingUtilities.invokeAndWait(SwingUtilities.java:1347)
at org.pentaho.reporting.designer.core.ReportDesigner.main(ReportDesigner.java:327)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)
Caused by: java.lang.ExceptionInInitializerError
at org.pentaho.reporting.designer.core.editor.palette.PaletteButton.<init>(PaletteButton.java:79)
at org.pentaho.reporting.designer.core.ReportDesignerFrame.createPaletteToolBar(ReportDesignerFrame.java:1567)
at org.pentaho.reporting.designer.core.ReportDesignerFrame.<init>(ReportDesignerFrame.java:1053)
at org.pentaho.reporting.designer.core.ReportDesigner$CreateReportDesignerFrame.run(ReportDesigner.java:152)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:302)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:745)
at java.awt.EventQueue.access$300(EventQueue.java:103)
at java.awt.EventQueue$3.run(EventQueue.java:706)
at java.awt.EventQueue$3.run(EventQueue.java:704)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:715)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)
Caused by: java.lang.RuntimeException: failed to load system cursor: DnD.Cursor.CopyDrop : cannot load system cursor: CopyDrop.32x32
at java.awt.dnd.DragSource.load(DragSource.java:135)
at java.awt.dnd.DragSource.<clinit>(DragSource.java:147)
... 18 more
Can anyone help me? Thank you.

How to avoid error in case of not existing member

Hi,
I'm facing this error
Code:

mondrian.olap.MondrianException: Mondrian Error:MDX object '<membername>' not found in cube '<MyCube>'
It is actually a dimension member that does not exist, but due to some application logic I need to give the user a chance to query the cube for potential members without hitting errors.
I am trying this in Schema Workbench (as a dev environment) and set the Mondrian properties in the relevant file:

Code:

mondrian.rolap.ignoreInvalidMembers=true
mondrian.rolap.ignoreInvalidMembersDuringQuery=true


but I had no luck. Any hints?

Thanks in advance for your replies.
Regards

how to move view data in etl

1) I am moving data from Oracle to PostgreSQL, but the problem is that the Oracle database contains views. How can I transfer the view data from Oracle to PostgreSQL?

2) I want to move data from one database to another in a single transformation, not table by table.