Channel: Pentaho Community Forums

Extend XML generator steps

I would like to extend the functionality of the "XML Output" and "Add XML" steps. Where can I find the source code for these steps? I need to add a default attribute to all elements in each generated file and wrap the data in <![CDATA[]]>.
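For reference, the source for both steps lives in the pentaho-kettle repository on GitHub, under org.pentaho.di.trans.steps.xmloutput and org.pentaho.di.trans.steps.addxml. As a rough illustration of the CDATA part only, here is a minimal sketch with the standard Java DOM API (the steps use Kettle's own XML writing, and the attribute name below is purely hypothetical):

Code:

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class CdataSketch {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
        Element row = doc.createElement("Row");
        row.setAttribute("schemaVersion", "1.0");         // hypothetical default attribute
        row.appendChild(doc.createCDATASection("a & b")); // serialized as <![CDATA[a & b]]>
        doc.appendChild(row);
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(new DOMSource(doc), new StreamResult(System.out));
    }
}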

Thanks.

Anonymous Role does not exist, how can I create it?

I am trying to bypass Pentaho's standard security and allow anonymous access. In PUC there is an "Anonymous" role in the "System Roles" section, but there is no "Anonymous" role in the "Manage Roles" tab; I only have "Power User", "Administrator", "Business Analyst", and "Report Author".

I have tried using the "+" button to add an Anonymous role in the "Manage Roles" tab, but it does not work.

Is there some other way I can create the Anonymous role?

Pentaho 5.2

Error - LIBPATH in spoon.sh

Recently, I downloaded the R Script Executor plugin for PDI 5.3 and followed the setup instructions at http://wiki.pentaho.com/display/EAI/R+script+executor. Still, I got this error:
Loading JRI library from: /opt/libswt/linux/x86_64
JRI not found in java.library.path


Correction:
Analyzing the PDI startup script spoon.sh, we find the following line (it varies by operating system):

Original line:
LIBPATH=$BASEDIR/../libswt/linux/x86_64/

Changed to:
LIBPATH=$BASEDIR/libswt/linux/x86_64/

After this change the plugin worked properly:
Loading JRI library from: /opt/data-integration/libswt/linux/x86_64
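As a quick sanity check, a tiny Java program can print where the JVM actually looks for native libraries such as JRI (run it with the same java.library.path that Spoon sets up):

Code:

public class LibPathCheck {
    public static void main(String[] args) {
        // JRI's native library must sit in one of the directories printed here.
        System.out.println(System.getProperty("java.library.path"));
    }
}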

Thank you, bye.

Lookup on Staging table

Hi Forum,

I'm building a staging table; below is my first insert, from the 1st CSV file (1 row comes from each CSV file).
For this I did CSV ----> Table Output, and the 1st row of data was inserted.
[Attached image: table.jpg]

The next time I run the same transformation with the same file, that row should not be inserted into the table again. Currently it is.
Also, if I run the transformation against another CSV with the same metadata to insert a 2nd row, it should look up whether the row is already in the table. If it is, the row should be rejected and dumped to a file; if not, it should be inserted.

How do I do this? I'm currently trying the Stream Lookup step, but it does not give the expected result for the above scenario (the routing logic I'm after is sketched below).
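For clarity, here is the routing logic I need, sketched in plain Java (the keys are made up; in PDI this would be a lookup feeding a filter that routes rows either to the table or to a reject file):

Code:

import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class StagingRouter {
    public static void main(String[] args) {
        Set<String> staged = new HashSet<String>(Arrays.asList("row-1")); // keys already in the table
        List<String> incoming = Arrays.asList("row-1", "row-2");          // keys arriving from the CSV

        for (String key : incoming) {
            if (staged.add(key)) {   // add() returns false when the key is already present
                System.out.println("insert into staging table: " + key);
            } else {
                System.out.println("reject to dump file: " + key);
            }
        }
    }
}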

Thank you in Advance :-)

- Sadakar

SQL Server connection issue

Hi,

There are two SQL Server connection types in the database connection dialog, "MS SQL Server" and "MS SQL Server (Native)". Does anyone know the difference?

After I added the JDBC jar file to PDI, the connection worked fine as "MS SQL Server (Native)", but it failed when I switched to "MS SQL Server".
Error log:
Error connecting to database [My SQLServer] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database

Error connecting to database: (using class net.sourceforge.jtds.jdbc.Driver)
Unable to get information from SQL Server: SSTCP16564-7.

org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Error connecting to database: (using class net.sourceforge.jtds.jdbc.Driver)
Unable to get information from SQL Server: SSTCP16564-7.
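For what it's worth, a standalone jTDS connection test (outside PDI) can help separate driver problems from connection-type configuration; host, database, and credentials below are placeholders:

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class JtdsSmokeTest {
    public static void main(String[] args) throws Exception {
        // jTDS URL format: jdbc:jtds:sqlserver://<host>[:port][/database][;instance=<name>]
        String url = "jdbc:jtds:sqlserver://myhost:1433/mydb";
        try (Connection c = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected to: " + c.getMetaData().getDatabaseProductVersion());
        }
    }
}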

Init a dashboard

Hi,

I made my first CDE dashboard and I like it.

The dashboard has 1 chart and 3 selectors (month, province, region).

I would like to set a default chart when the dashboard starts.

I made an init function like this:

Code:

function load() {
    // Fire the default selections so the chart renders with them on startup.
    Dashboards.fireChange('mzr_geb_id', '7510');         // region
    Dashboards.fireChange('maand', '1');                 // month
    Dashboards.fireChange('provincie', 'Noord-Holland'); // province
}

This works fine for the chart: it displays the defined default values at start.

But the 3 selectors do not reflect these values.

How can I make the selectors initialize with these values as well?

month is a checkbox list
province is a radio list
region is a table

Hans

Not able to write to Cassandra using SSTable Output

I am trying to write into a Cassandra table through the SSTable Output step. Initial questions:
-- It does not give an option to specify database details, so where will it write the data?
-- I gave the details assuming it would use the local database connection, and I get the following error:

2015/04/23 16:53:20 - SSTable Output.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Failed to process row
2015/04/23 16:53:20 - SSTable Output.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/23 16:53:20 - SSTable Output.0 - Failed to create SSTableSimpleUnsortedWriter
2015/04/23 16:53:20 - SSTable Output.0 - Could not initialize class org.apache.cassandra.config.CFMetaData
2015/04/23 16:53:20 - SSTable Output.0 -
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.steps.cassandrasstableoutput.SSTableWriter.init(SSTableWriter.java:136)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.steps.cassandrasstableoutput.SSTableOutput.initialize(SSTableOutput.java:134)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.steps.cassandrasstableoutput.SSTableOutput.processRow(SSTableOutput.java:153)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/04/23 16:53:20 - SSTable Output.0 - at java.lang.Thread.run(Unknown Source)
2015/04/23 16:53:20 - SSTable Output.0 - Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.cassandra.config.CFMetaData
2015/04/23 16:53:20 - SSTable Output.0 - at org.apache.cassandra.io.sstable.SSTableSimpleUnsortedWriter.<init>(SSTableSimpleUnsortedWriter.java:80)
2015/04/23 16:53:20 - SSTable Output.0 - at org.apache.cassandra.io.sstable.SSTableSimpleUnsortedWriter.<init>(SSTableSimpleUnsortedWriter.java:91)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.steps.cassandrasstableoutput.SSTableWriter.init(SSTableWriter.java:133)
2015/04/23 16:53:20 - SSTable Output.0 - ... 4 more
2015/04/23 16:53:20 - SSTable Output.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Unexpected error
2015/04/23 16:53:20 - SSTable Output.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : java.lang.NullPointerException
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.step.BaseStep.putError(BaseStep.java:1618)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.steps.cassandrasstableoutput.SSTableOutput.processRow(SSTableOutput.java:170)
2015/04/23 16:53:20 - SSTable Output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/04/23 16:53:20 - SSTable Output.0 - at java.lang.Thread.run(Unknown Source)
2015/04/23 16:53:20 - SSTable Output.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2015/04/23 16:53:20 - testtrans - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Errors detected!
2015/04/23 16:53:20 - Spoon - The transformation has finished!!
2015/04/23 16:53:20 - testtrans - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Errors detected!
2015/04/23 16:53:20 - testtrans - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Errors detected!


Please help, as we need to recommend Pentaho Enterprise Edition to our client.

Thanks,
Saurabh.

CDE and Saiku Widgets

Hi all,
Please can you confirm that it is NOT possible to insert Saiku widgets into dashboards created through CDE when Saiku CE is used?

I was trying to add a Saiku analysis created with Saiku Analytics CE 3.1 to a CDE dashboard, and I got these errors:

12:29:11,403 ERROR [JsMinifiedDependency] Error getting input stream for dependency saikuWidget/../../ui/js/saiku/render/SaikuRenderer.js. Skipping..
12:29:11,407 ERROR [JsMinifiedDependency] Error getting input stream for dependency saikuWidget/../../ui/js/saiku/render/SaikuTableRenderer.js. Skipping..
12:29:11,408 ERROR [JsMinifiedDependency] Error getting input stream for dependency saikuWidget/../../ui/js/saiku/render/SaikuChartRenderer.js. Skipping..
12:29:11,410 ERROR [JsMinifiedDependency] Error getting input stream for dependency saikuWidget/../../ui/js/saiku/embed/SaikuEmbed.js. Skipping..

Indeed, the above components are not in the folders where they are expected to be.

Can you give me some feedback, please?

thanks
alessandro

AS400 Bulk Load

I am loading data into an AS400 DB2i system using the bulk load option, but it gives me the error below, whereas the data gets loaded if I disable the batch update option:

2015/04/23 18:22:18 - BADTEL_TGT.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Because of an error, this step can't continue:
2015/04/23 18:22:18 - BADTEL_TGT.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/23 18:22:18 - BADTEL_TGT.0 - Error batch inserting rows into table [BADTEL_T].
2015/04/23 18:22:18 - BADTEL_TGT.0 - Errors encountered (first 10):
2015/04/23 18:22:18 - BADTEL_TGT.0 -
2015/04/23 18:22:18 - BADTEL_TGT.0 -
2015/04/23 18:22:18 - BADTEL_TGT.0 - Error updating batch
2015/04/23 18:22:18 - BADTEL_TGT.0 - [IBM][System i Access ODBC Driver]Column 7: Numeric value out of range.
2015/04/23 18:22:18 - BADTEL_TGT.0 -
2015/04/23 18:22:18 - BADTEL_TGT.0 -
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:342)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:118)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at java.lang.Thread.run(Unknown Source)
2015/04/23 18:22:18 - BADTEL_TGT.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseBatchException:
2015/04/23 18:22:18 - BADTEL_TGT.0 - Error updating batch
2015/04/23 18:22:18 - BADTEL_TGT.0 - [IBM][System i Access ODBC Driver]Column 7: Numeric value out of range.
2015/04/23 18:22:18 - BADTEL_TGT.0 -
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.core.database.Database.createKettleDatabaseBatchException(Database.java:1351)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:289)
2015/04/23 18:22:18 - BADTEL_TGT.0 - ... 3 more
2015/04/23 18:22:18 - BADTEL_TGT.0 - Caused by: sun.jdbc.odbc.JdbcOdbcBatchUpdateException: [IBM][System i Access ODBC Driver]Column 7: Numeric value out of range.
2015/04/23 18:22:18 - BADTEL_TGT.0 - at sun.jdbc.odbc.JdbcOdbcPreparedStatement.emulateExecuteBatch(Unknown Source)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at sun.jdbc.odbc.JdbcOdbcPreparedStatement.executeBatchUpdate(Unknown Source)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at sun.jdbc.odbc.JdbcOdbcStatement.executeBatch(Unknown Source)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:294)
2015/04/23 18:22:18 - BADTEL_TGT.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:285)
2015/04/23 18:22:18 - BADTEL_TGT.0 - ... 3 more

Field Clob Input - Field Clob Output

Hi everyone, I hope you're fine.

I have a table input with one field (CLOB data type) and another table output with the same field (CLOB data type). Both tables are on the same server and schema (Teradata 14).

I created a transformation using the Table Input and Table Output steps in PDI to get the data from my first table and put it into my second table.

What do I expect? I need to send the same data from one side to the other side. Easy. But:

My input table has a CLOB field, and I can see its content through Teradata SQL Assistant (the Teradata client). Yet when I run the transformation, my output table shows only:

com.teradata.jdbc.jdk6.JDK6_SQL_Clob@2134b246

What did I do wrong? It's a simple transformation.

I have PDI 5.2.
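That output looks like the Clob wrapper's default toString(), not the field's contents. For illustration, in plain JDBC the characters have to be read out of the wrapper explicitly; table and column names below are hypothetical:

Code:

import java.sql.Clob;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class ClobRead {
    static void printClobs(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT doc FROM src_table")) {
            while (rs.next()) {
                Clob clob = rs.getClob(1); // a wrapper; its toString() prints "...SQL_Clob@..."
                String text = clob.getSubString(1, (int) clob.length()); // the actual characters
                System.out.println(text);
            }
        }
    }
}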

Please and thanks in advance.

Keeping Group Header records together

I'm working through a report grouping issue. I derive my group header from the first character of the shop field in a BSF expression.

Code:

import org.pentaho.reporting.engine.classic.core.DataRow;

// Derive the group name from the first character of the SHOP field.
String shop = (String) dataRow.get("SHOP");
String shopGrp = null;

switch (shop.substring(0, 1)) {
    case "B":
        shopGrp = "BASE";
        break;
    case "L":
        shopGrp = "LOST";
        break;
    case "U":
        shopGrp = "UMBRELLA";
        break;
    default:
        shopGrp = "OTHER";
        break;
}

return shopGrp;

I'm having an issue with records under the "OTHER" heading: I want them all to display together (see below; a sketch of why this happens follows the sample output).

Code:


OTHER
--------
detail a
detail b

OTHER
--------
detail a
detail b
   
--- should be ----
   
OTHER
--------
detail a
detail b
detail c
detail d
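For context, a sketch of why this happens: the engine starts a new group header whenever the group value changes between consecutive rows, so rows that are not pre-sorted by the derived key produce repeated headers. A simplified illustration in plain Java:

Code:

import java.util.Arrays;
import java.util.List;

public class GroupDemo {
    public static void main(String[] args) {
        // Group keys in the order they arrive from the query (unsorted).
        List<String> groups = Arrays.asList("OTHER", "OTHER", "BASE", "OTHER");
        String prev = null;
        for (String g : groups) {
            if (!g.equals(prev)) {                     // key changed -> new header
                System.out.println("\n" + g + "\n--------");
            }
            System.out.println("detail row");
            prev = g;
        }
        // "OTHER" prints twice here; sorting the rows by the derived
        // group key first keeps each group's details together.
    }
}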

Read JSON file from S3

Do we have a step in Kettle to read a JSON file from Amazon S3?

Your help is really appreciated.

Thanks,
Herwin Rayen

Spoon "Open" dialog uses Linux Mint bookmarks instead of last directory

Kettle - Spoon General Availability Release - 5.3.0.0-213
Linux Mint Cinnamon 17.1

Spoon "Open" dialog always uses the "bookmarks" rather than the last folder opened. This is an unpleasant change from 5.01 on Windows7. While I have added the most common folders I use to the bookmarks this still works poorly.


  1. Is this a linux specific issue? If so, is there a way around it?
  2. Was this a change from 5.01 and if so, do I need to create a support ticket to request changes to the dialog?

File to stage error handling process

Hi,

I'm currently working on a file-to-stage process in Pentaho, on a Windows operating system.
The source raw files are on a remote server, which we connect to using mapped drives.
We also have to change the format of the files: if I get files in '.xlsx' format, I have to convert them to '|'-delimited format and place them in a destination folder. This folder resides on the same server as PDI, and again I connect to it using mapped drives.

I need help with approaches for error handling:
1.) I don't want to process the file if the data [columns] in the .xlsx file differ from what I'm expecting in the target file.
2.) How to handle this: if I have 5 bad records among 1000 records, I want to process the good records but send the bad records to a reject table.
3.) How to complete the load with a 0-byte file if I don't get the file for the given day.
4.) I'm getting null values in required fields and want to send those records to the reject table.
5.) Out-of-bound values: say I'm expecting a date in mm/dd/yyyy format, such as 12/15/2014, but the file gives it as dd/mm/yyyy (15/12/2014). Since there is no 15th month, we have to flag the record as bad.
6.) In a file of 1000 records, we have an error in the 11th record. Is there a way to generate a row-id so Pentaho can give row-level details of bad records in the output log?

Please add all possible error handling techniques for the above file-to-stage scenario and give your valuable suggestions.


Thank you!!


Regards,
Sunil

Set Table Input data to Polling Folder - Pentaho Data Integration

I have a requirement where we get a list of file names from SQL and need to pass these file names as a variable to a step that can poll a folder for those files as text files. Please advise how to set the SQL output of file names as an array variable and pass it to the polling-folder step.

Upload files to FTPS step

Hello, I have a problem connecting to the FTP server when using the "Upload files to FTPS" step:
I have an SSL explicit connection type.
I put in the server name: ftpes://xxxxxxxxxx.com
Port: 21
Username: yyyyy@xxxxxxxxxx.com
Connection type: TLS
When I click on "test connection" the first time, the error message shows: Error Trying to connect to host! Exception: Error connecting to host [ftpes://xxxxxxxxxx.com:21]! FtpIOException--> Return Value: 530 Description : Error connection to : ftpes://xxxxxxxxxx.com:21

When I click the test button a second time, it shows connection OK.
But when I execute my job, the same error message shows up.
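For comparison, a minimal standalone explicit-TLS test with Apache Commons Net can help rule out the server or the credentials (the password is a placeholder, and note that it takes the bare hostname, without the ftpes:// prefix):

Code:

import org.apache.commons.net.ftp.FTPSClient;

public class FtpsSmokeTest {
    public static void main(String[] args) throws Exception {
        FTPSClient ftps = new FTPSClient("TLS", false); // false = explicit mode
        ftps.connect("xxxxxxxxxx.com", 21);             // bare hostname, no scheme
        System.out.println("Connect reply: " + ftps.getReplyString());
        if (ftps.login("yyyyy@xxxxxxxxxx.com", "password")) {
            System.out.println("Login OK");
            ftps.logout();
        } else {
            System.out.println("Login failed: " + ftps.getReplyString()); // 530 = not logged in
        }
        ftps.disconnect();
    }
}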

Any ideas, please?

Thanks in advance.

Error handling in file to stage job

Please give suggestions for error handling in a file-to-stage job.

Copy file to Cassandra File System

Is there any way to write a file to CFS (Cassandra File System)?

Creating a dynamic report

Hi,

I am new to Pentaho and still in the learning stages. I have created an interactive report, but the requirement is to make it dynamic: instead of showing all the values it reports, the user should be able to filter and view only the values he wants. You can add filters while editing the report, but I found that you have to specify the filter values right there, in the form of a SQL query.

I was wondering if there is any way to offer filters like those in Excel sheets, to sort and narrow the data.

I hope my requirement is clear. Any suggestions or solutions would be highly appreciated.

Thanks,

Java Version

Which Java version is recommended with the latest PDI 5.3 version: JRE 7 or JRE 8?