Channel: Pentaho Community Forums

ElasticSearch Bulk Insert

I'm trying to connect to Elasticsearch (localhost:9200 or 9300). Any idea why I'm getting this error?

An unexpected error occurred in Spoon:
Could not initialize class org.elasticsearch.threadpool.ThreadPool
java.lang.NoClassDefFoundError: Could not initialize class org.elasticsearch.threadpool.ThreadPool
at org.elasticsearch.client.transport.TransportClient$Builder.build(TransportClient.java:133)
at org.pentaho.di.ui.trans.steps.elasticsearchbulk.ElasticSearchBulkDialog.test(ElasticSearchBulkDialog.java:923)
at org.pentaho.di.ui.trans.steps.elasticsearchbulk.ElasticSearchBulkDialog.access$300(ElasticSearchBulkDialog.java:86)
at org.pentaho.di.ui.trans.steps.elasticsearchbulk.ElasticSearchBulkDialog$7.handleEvent(ElasticSearchBulkDialog.java:358)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1375)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:8104)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9466)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
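
For what it's worth, a NoClassDefFoundError here usually points to a missing or mismatched Elasticsearch client jar (or one of its dependencies such as Guava or Lucene) on Spoon's classpath rather than a connectivity problem. Below is a minimal standalone sketch of roughly the same TransportClient construction the step's Test button performs (Elasticsearch 2.x API, as the stack trace suggests); the cluster name, host and port are assumptions, and note that 9300 is the Java transport port while 9200 is HTTP. Running it outside Spoon with the same jars can confirm whether they work together.

import java.net.InetAddress;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class EsTransportClientCheck {
    public static void main(String[] args) throws Exception {
        // Assumed cluster name; must match the "cluster.name" of the target node.
        Settings settings = Settings.settingsBuilder()
                .put("cluster.name", "elasticsearch")
                .build();
        // Same builder call that fails in the stack trace above.
        TransportClient client = TransportClient.builder().settings(settings).build();
        client.addTransportAddress(
                new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
        System.out.println("Connected nodes: " + client.connectedNodes());
        client.close();
    }
}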

Can Kettle decrypt data with the AES/CBC/PKCS5Padding algorithm?

Can someone help with how to decrypt a field with the AES/CBC/PKCS5Padding algorithm in PDI?
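
For context, AES/CBC/PKCS5Padding is supported by the standard javax.crypto API, so one option is a User Defined Java Class step. Here is a minimal sketch, assuming the ciphertext arrives Base64-encoded and the key and IV are known; the class and method names are illustrative only, not a PDI API.

import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class AesCbcDecrypt {
    // Decrypts a Base64-encoded ciphertext with AES/CBC/PKCS5Padding.
    // keyBytes must be 16, 24 or 32 bytes long; ivBytes must be 16 bytes.
    public static String decrypt(String base64CipherText, byte[] keyBytes, byte[] ivBytes) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE,
                new SecretKeySpec(keyBytes, "AES"),
                new IvParameterSpec(ivBytes));
        byte[] plain = cipher.doFinal(Base64.getDecoder().decode(base64CipherText));
        return new String(plain, "UTF-8");
    }
}

The same call could be wrapped in a User Defined Java Class step, with the key and IV passed in as parameters rather than hard-coded.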

Thanks,
Mano

8.1: Calling a Mapping using parameters in the transformation path

When trying to upgrade from PDI 8.0 to 8.1, I encountered a problem.
In our repository, some Mappings are called dynamically using parameters (e.g. "${location}/${type}Mapping").
With PDI 8.1 I get the error "No valid mapping was specified!".

So I logged out from the repo and built a small demo (see attachments):
I added some parameters and tried to call one Mapping in four different ways, but only the "hardcoded" one could be found.
As you can see, the call with two parameters gets resolved in the tooltip. Am I doing something wrong, or is this a bug?



Another odd thing: when connected to the repo and using a single parameter, PDI adds a leading "/" to the field after closing the dialog, resulting in an error when opening and closing it afterwards.

Decrypt Data

Hi,

Can someone help with how to decrypt a field with the AES/CBC/PKCS5Padding algorithm in PDI/Spoon?

8.1 CE - Problems with Text File Output step, reverted to 8.0

Just wanted to mention a problem I've had with 8.1 and the Text File Output step.

I have a large database table consisting of about 8.5 million records. For subsequent processing I have a transformation that outputs this table to multiple CSV files, each consisting of no more than 500,000 records per the transformation's "Split every ... rows" option under the content tab. With PDI up through version 8.0 this transformation works fine and I end up with 16 separate files, the first 15 about the same size and the 16th with the remaining rows.

However, in PDI 8.1 only the first 9 CSV files produced were "split" correctly. The 10th file created (i.e., <filename_9>.csv) contained the bulk of the remaining rows in the table and the remaining six files were very small in size (unfortunately I didn't save any of these six files to see what they may have contained... it may have only been the header row).

After discovering the problem through subsequent processing steps failing, I reverted to PDI 8.0 and everything worked properly as before. So, for me at least, something isn't working properly with the Text File Output step in PDI 8.1.

Ruby theme on Pentaho Version 7.1

Hi,
I am new to Pentaho BA and the user console.

I was wondering if the Ruby theme from version 8 of the Pentaho User Console can be retrofitted to version 7.1.

If so, can anyone guide me on how?

thanks
Shawn

Not able to execute ETL job from cmd prompt on Windows 10

Hi All,

I have designed an ETL job which is to be executed using a batch file.
But when I try, it fails. I tried executing the job from the cmd prompt and get an "access denied" message.
I am using a Windows 10 machine, Pentaho 7.1 CE and JRE 1.8.0_171.
When I execute the ETL job from the spoon.bat UI it works well, but not from the cmd prompt.
The logged-in user on the Windows 10 machine is an admin user.

Please help!


Thanks
Ajinkya

How to manage PRD file version?

Hi all,

Can somebody please help?

Is there a way, by configuration in PRD, to open a report built with version 7.1 in version 6.0?

Please!!

Sorry for my English.

Regards,

Java casting error after upgrading from PDI 6.1 to 7.1

Hi, we're trying to upgrade from PDI 6.1.0.1-196 CE to PDI 7.1.0.0-12 CE using java 1.8.0_179.

A job that ran fine under 6.1 now aborts on 7.1 in a Calculator step with this error:

error: java.util.Date cannot be cast to java.sql.Timestamp

This happens when removing the time from an Oracle-sourced timestamp column with trunc(sysdate) and assigning it to a Date column, which works fine in version 6.
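
Not a fix for the Calculator step itself, but as a workaround sketch: the error means the incoming value is a plain java.util.Date while something downstream expects a java.sql.Timestamp, and an explicit conversion (for example in a User Defined Java Class step) sidesteps the cast. The class and method names here are illustrative assumptions.

import java.sql.Timestamp;
import java.util.Date;

public class DateToTimestamp {
    // Wraps a java.util.Date in a java.sql.Timestamp so that code expecting
    // a Timestamp does not hit the ClassCastException above.
    public static Timestamp toTimestamp(Date d) {
        return d == null ? null : new Timestamp(d.getTime());
    }
}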

Any suggestion or thoughts?

Thank you,
Adam

Opportunity for an ETL developer (Kettle) in SP

Hello, we have openings for ETL developers using Kettle to work in SP. If interested, please send your CV to fabiogibon@gmail.com

ETL job opportunity - SP (Brazil)

Handling tab-separated and comma-separated values in Text File Input

Hi,

I am using the Text File Input step to read a .txt file. This file can be either tab-separated or comma-separated.
Both files have the same fields. How can I handle this in the ETL?
We do not know whether the user will pass a tab-separated or a comma-separated file to the transformation.
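
One approach is to sniff the delimiter from the first line of the file before reading it. A rough Java sketch of the detection follows; the class name, the file path argument and the fallback to a comma are assumptions.

import java.io.BufferedReader;
import java.io.FileReader;

public class DelimiterSniffer {
    // Returns "\t" or "," depending on which character occurs more often
    // in the first line of the file.
    public static String detect(String path) throws Exception {
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String firstLine = reader.readLine();
            if (firstLine == null) {
                return ",";  // empty file: fall back to comma
            }
            long tabs = firstLine.chars().filter(c -> c == '\t').count();
            long commas = firstLine.chars().filter(c -> c == ',').count();
            return tabs > commas ? "\t" : ",";
        }
    }
}

The detected value could then be stored in a variable with a Set Variables step and referenced in the Text File Input step's Separator field, since that field accepts variables.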


Thanks
Ajinkya

How to read the last filename to notify it via email?

Hello,
I'm using PDI to run a script that backs up my SQL Server DBs. I'd like to receive an email after the BAK files are moved to my NAS, and I'd like the e-mail to contain the filename of the file just moved.

The structure is
J1 uses T1 to know which DBs to back up.
T1 passes N rows to J2, which is set to "Execute every input row" (every row is the name of a DB).
J2 executes a batch script using %1 as the argument (so %1 is the name of the DB) and the backup starts.
When the backup finishes, J2 moves the file from the local folder to the folder on the NAS, and the Move Files step has the "Add files to result files name" flag set.
When the move process is finished, J2 sends an email, and here I'd like to include the filename of the moved file, or AT LEAST the value of %1.

But I can't read either of them :(

Can you help me?

Best Linux + install instructions

Which Linux distribution is best for installing Pentaho (community, everything = BI Server + PDI)? And where can I find clear and complete install instructions for the Community Edition on Linux?

Dummy step reporting failure

I have a job with a simple evaluation. If the variable it is evaluating is 1, it reports success and runs a transformation. If it is zero, the failure path runs a Dummy step. The problem is that the job metrics report a failure for that job, and I want it to report success even though the evaluation "failed". There are no errors in the log, but the job reports "false" under "finished job entry". Basically the variable is set in a config file and allows the user to decide which transformations to turn on and off.

Occurrences

Hi,

How can I count the number of delimiter occurrences without including those inside enclosures? If a delimiter occurs within an enclosure, it should not be counted.

Input
====
A,B,C
"A,D",E,F,"G,H"
"MN",O,P,R,S,"T,U,V"

Output
=====
1st line ==> 2
2nd line ==> 2
3rd line ==> 5
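
A minimal Java sketch of one way to do this: a small state machine that toggles an "inside enclosure" flag on each enclosure character and only counts delimiters seen outside it. The delimiter and enclosure characters are taken from the example above; escaped enclosures (e.g. doubled quotes) are not handled.

public class DelimiterCounter {
    // Counts occurrences of the delimiter that are not inside the enclosure.
    public static int countOutsideEnclosure(String line, char delimiter, char enclosure) {
        int count = 0;
        boolean insideEnclosure = false;
        for (char c : line.toCharArray()) {
            if (c == enclosure) {
                insideEnclosure = !insideEnclosure;  // toggle quoted state
            } else if (c == delimiter && !insideEnclosure) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countOutsideEnclosure("A,B,C", ',', '"'));
        System.out.println(countOutsideEnclosure("\"A,D\",E,F,\"G,H\"", ',', '"'));
        System.out.println(countOutsideEnclosure("\"MN\",O,P,R,S,\"T,U,V\"", ',', '"'));
    }
}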

Couldn't get field info from [SELECT * FROM products LIMIT 0]

Hi, I have Pentaho CE 8.1 with MySQL. I'm doing the tutorial from "Pentaho Data Integration - Beginner's Guide", chapter 8, inserting new products or updating existing ones.


In the Insert/Update step, while configuring the step and setting the fields to map, when clicking on "Edit mapping" I get the following error:
Couldn't get field info from [SELECT * FROM products LIMIT 0]
Error determining value metadata from SQL resultset metadata
Unknown system variable 'OPTION'

Asking for details, I get the following log:
==============================

org.pentaho.di.core.exception.KettleException:
Unable to determine the required fields.
Couldn't get field info from [SELECT * FROM products LIMIT 0]
Error determining value metadata from SQL resultset metadata
Unknown system variable 'OPTION'

at org.pentaho.di.trans.steps.insertupdate.InsertUpdateMeta.getRequiredFields(InsertUpdateMeta.java:881)
at org.pentaho.di.ui.trans.steps.insertupdate.InsertUpdateDialog.generateMappings(InsertUpdateDialog.java:558)
at org.pentaho.di.ui.trans.steps.insertupdate.InsertUpdateDialog.access$200(InsertUpdateDialog.java:82)
at org.pentaho.di.ui.trans.steps.insertupdate.InsertUpdateDialog$4.handleEvent(InsertUpdateDialog.java:400)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.insertupdate.InsertUpdateDialog.open(InsertUpdateDialog.java:507)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:120)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8949)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3291)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:785)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1375)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:8104)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9466)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn't get field info from [SELECT * FROM products LIMIT 0]
Error determining value metadata from SQL resultset metadata
Unknown system variable 'OPTION'
at org.pentaho.di.core.database.Database.getQueryFieldsFallback(Database.java:2354)
at org.pentaho.di.core.database.Database.getQueryFields(Database.java:2193)
at org.pentaho.di.core.database.Database.getQueryFields(Database.java:1847)
at org.pentaho.di.core.database.Database.getTableFields(Database.java:1843)
at org.pentaho.di.trans.steps.insertupdate.InsertUpdateMeta.getRequiredFields(InsertUpdateMeta.java:872)
... 28 more
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error determining value metadata from SQL resultset metadata
Unknown system variable 'OPTION'
at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromSQLType(ValueMetaBase.java:4881)
at org.pentaho.di.core.database.Database.getValueFromSQLType(Database.java:2439)
at org.pentaho.di.core.database.Database.getRowInfo(Database.java:2400)
at org.pentaho.di.core.database.Database.getQueryFieldsFallback(Database.java:2345)
... 32 more
Caused by: java.sql.SQLException: Unknown system variable 'OPTION'
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1606)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1503)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
at com.mysql.jdbc.ResultSetMetaData.getColumnDisplaySize(ResultSetMetaData.java:213)
at org.pentaho.di.core.row.value.ValueMetaBase.getValueFromSQLType(ValueMetaBase.java:4657)
... 35 more


==============================

I finished building the transformation and ran it, with the same result (an error).

Any idea how to solve it? I have not found this same problem reported elsewhere.

Thanks

Loop Advice Request

Hi

Would greatly appreciate any assistance or advice with this question.

Step #1: I make an http call to a simple web page (field name = "base_url") which contains static data. I am converting the response (HTML) to a single text stream with all CR/LF's removed.

Step #2: I then extract values from that text stream in the form of anywhere between zero and thirty numbers, relating to employee IDs (field name = "employee_id").

Step #3: I then join the base_url field and the employee_id field to create a URL which is the relevant web page for that employee.

Now I need to repeat Steps 1 through to 3 for each of those employees.

As you will have already guessed, I'm starting with a department head and trying to iterate my way down through employees to scrape/create the management/reporting structure.

How can I achieve this loop, please?
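
The usual PDI pattern for this is a job that runs the transformation in a loop: write the newly built URLs to result rows ("Copy rows to result"), have the job feed them back in, and repeat while rows remain. Purely to illustrate the traversal the three steps describe, here is a hedged standalone Java sketch of the crawl; the URL shape and the regex for extracting employee IDs are assumptions.

import java.net.URL;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Scanner;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class OrgCrawler {
    // Hypothetical pattern for employee IDs embedded in the page HTML.
    private static final Pattern ID_PATTERN = Pattern.compile("employee_id=(\\d+)");

    public static void main(String[] args) throws Exception {
        String baseUrl = "http://intranet.example.com/employee?id=";  // assumed URL shape
        Deque<String> toVisit = new ArrayDeque<>();
        Set<String> seen = new HashSet<>();
        toVisit.add("1");  // the department head's employee ID

        while (!toVisit.isEmpty()) {
            String id = toVisit.poll();
            if (!seen.add(id)) {
                continue;  // already processed this employee
            }
            // Step 1: fetch the page and flatten it to a single text stream.
            String html;
            try (Scanner scanner = new Scanner(new URL(baseUrl + id).openStream(), "UTF-8")) {
                scanner.useDelimiter("\\A");
                html = scanner.hasNext() ? scanner.next().replaceAll("[\\r\\n]", "") : "";
            }
            // Step 2: extract the employee IDs of direct reports.
            Matcher m = ID_PATTERN.matcher(html);
            while (m.find()) {
                toVisit.add(m.group(1));  // Step 3: queue their pages for the next pass
            }
            System.out.println("Processed employee " + id);
        }
    }
}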

Thanks!

Step fields and their origin

Both the Pentaho documentation and the book "Learning Pentaho Data Integration v 8.0" say that by pressing the spacebar while moving/placing/hovering the mouse over a row in the Calculator step's edit page (would you call that a definition?), the dialog called 'Step Fields and their Origin' will appear.

This does not occur on the Mac running High Sierra (10.13.5).

I have tried all variations of keystrokes, as well as all interpretations of the instructions. The keystroke combinations are mostly overloaded by Mac features such as Spotlight.

I've also tried all the variations listed such as bringing up the context menu (the reputed 'show output fields' option does not appear), and 'right-clicking' the line also produces no 'step fields' dialog.

But someone working on the team that develops the community version must use a Mac, right? Maybe just one person?

Thanks,
Kimball

Where does ${Internal.Entry.Current.Directory} actually point on a Mac?

In Maria Roldan's recent book on PDI v8 she says...

We will create a folder in the same place where our Job is saved. For doing that, in the textbox next to the Folder name option, type ${Internal.Entry.Current.Directory}/SAMPLEFILES and click on OK.

However, on a Mac, the job fails with these messages:
Could not create Folder [/Training/samplefiles]
FileSystemException: Could not create folder "file:///Training".

Without this variable, the folder is created in the PDI folder in my HOME directory.

To me this indicates that the variable is being interpreted as a root folder in the file system.

So what is the correct syntax?

Thanks,
Kimball