Channel: Pentaho Community Forums

Problem with Excel Writer appending lines

Hi

I need to merge several Excel files into one, but with the Excel Writer step I can only append lines when I create an .xls file. Can you please help me append to an .xlsx file?

I have the same problem with the Excel Output step.

I have over 70,000 rows to import, and the .xls format does not support more than 65,536 rows.
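
For reference, the .xlsx format allows up to 1,048,576 rows, so the target format itself is not the problem. Below is a minimal sketch of writing that many rows to .xlsx with Apache POI; this is my own illustration of the scale involved, not what the step actually does internally:

Code:

import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class XlsxSketch {
    public static void main(String[] args) throws Exception {
        // SXSSF streams rows to disk, so 70,000+ rows is no problem in .xlsx
        SXSSFWorkbook wb = new SXSSFWorkbook();
        Sheet sheet = wb.createSheet("merged");
        for (int r = 0; r < 70_000; r++) {
            Row row = sheet.createRow(r);
            row.createCell(0).setCellValue("row " + r);
        }
        FileOutputStream out = new FileOutputStream("merged.xlsx");
        wb.write(out);
        out.close();
        wb.dispose(); // remove the temporary files SXSSF streamed to
    }
}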

Thanks
Thomas

Errors while importing data into Salesforce using the SalesForce Insert step

Hey all,

I have a task where I am supposed to use the Salesforce Insert step to load data into Salesforce.

https://www.dropbox.com/s/hyv66cfq6o...tdata.txt?dl=0 is the sample file of data I am loading.

https://www.dropbox.com/s/kx7f8w5l8k...to_SF.ktr?dl=0 is the transformation I am using to load the data.

When I test the connection, it looks good.

When I preview the Salesforce Insert step, I get an unexpected error.

The detailed logs are located here:

https://www.dropbox.com/s/maz3n4bbvi..._logs.txt?dl=0

Can anyone shed some light on this transformation of mine, in case I am missing something?

Thanks,

Ron

Unable to insert or update SQL Server DateTime column from Dimension Update step

Hi everyone,

My environment is as follows:
PDI Kettle v 5.3.0.0-513
MS SQL Server 2008

My transformation process includes a Dimension Lookup / Update step configured for SCD Type II updates.
Below are the 2 scenarios I am facing, neither of which allows me to implement SCD Type II successfully.

If more details are needed, please let me know. Hopefully someone has the answer :)

Scenario I

Enable the cache: TRUE
Type of dimension update: INSERT
Outcome: the Dimension Lookup/Update step fails for SQL Server 2008 DateTime comparisons when these are listed in the Lookup/Update fields:

2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Unexpected error
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : java.lang.RuntimeException: Error serializing row to byte array
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.core.row.RowMeta.extractData(RowMeta.java:981)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.addToCache(DimensionLookup.java:1507)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.lookupValues(DimensionLookup.java:742)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.processRow(DimensionLookup.java:220)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at java.lang.Thread.run(Unknown Source)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - Caused by: java.lang.RuntimeException: MemorySampleTime Date : There was a data type error: the data type of java.lang.String object [Jan 22 2015 2:59PM] does not correspond to value meta [Date]
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.core.row.value.ValueMetaBase.writeData(ValueMetaBase.java:2420)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.core.row.RowMeta.writeData(RowMeta.java:579)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - at org.pentaho.di.core.row.RowMeta.extractData(RowMeta.java:976)
2015/08/31 09:49:24 - Dimension lookup/update Core.DEVICE.0 - ... 5 more

Removing all DateTime field comparisons solves the issue, but this is not an ideal solution...
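
The log suggests the incoming value is still a String ("Jan 22 2015 2:59PM") rather than a real Date, so I suspect converting it to a Date before the step would avoid the serialization error. A minimal sketch of the parse I have in mind (my own guess, assuming the format shown in the log):

Code:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class ParseDateTimeString {
    public static void main(String[] args) throws Exception {
        // Matches the value from the log: "Jan 22 2015 2:59PM"
        SimpleDateFormat fmt = new SimpleDateFormat("MMM dd yyyy h:mma", Locale.US);
        Date d = fmt.parse("Jan 22 2015 2:59PM");
        System.out.println(d);
    }
}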

Scenario II

Enable the cache: FALSE
Type of dimension update: INSERT
Outcome: the Dimension Lookup/Update step completes for SQL Server 2008 DateTime comparisons when these are listed in the Lookup/Update fields, BUT a new version of the record is inserted into the DB even when there is NO CHANGE to the data set.

PDI with Windows 10. Unable to connect steps

The problem that I seem to be having is that I cannot connect steps together in a transformation.

I am using the same process that works fine for me on 8.1, but on Windows 10 I cannot get the steps to connect.

I loaded a fresh copy of Windows 10 without any problems.

I have the JRE and JDK installed.

I would really appreciate some insight.

Thanks
Ray

Slow performance with calculated members

Please help me build java_server

Dear everybody,

When I run "ant war" I get an error; see the attached screenshot (build java.jpg).

Please help me fix this error. I am building java_server from the Pentaho-report-OpenERP-master package.

Thanks very much.

recommended browsers

Which browsers and versions are recommended for Pentaho 5.3?
The list in the Infocenter seems to be outdated.

In particular, I'm having some issues when displaying reports (PRD) in HTML and PDF formats.

Ideally, the client needs to land on Chrome.

Thank you,

How can I prime the Pentaho Report cache using PDI?

Hello,

I'm not sure if I should ask this here or in another sub-forum.

I want to prime the Pentaho Report cache. I have tried using an HTTP Client step in PDI with the viewer URL (http://localhost:8080/pentaho/api/re...ba.prpt/viewer). I get a 200 status code and an HTML page as the result, but the report is not executed and the cache isn't primed.
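
For reference, this is essentially the request I'm making, sketched in plain Java (the report path and credentials below are placeholders, not my real ones):

Code:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class PrimeReportCache {
    public static void main(String[] args) throws Exception {
        // Placeholder path; the real URL points at my .prpt in the repository
        URL url = new URL("http://localhost:8080/pentaho/api/repos/:public:example.prpt/viewer");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder()
                .encodeToString("admin:password".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        System.out.println("HTTP " + conn.getResponseCode());
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) { /* drain the HTML response */ }
        }
    }
}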

How can I do this?

Thanks.

Nulls, empty strings, table output and database lookup

Hi - I have the situation that I want to design an address table. I want to ensure each unique address only exists in the table once. As I process an address, if it already exists in the table, I want to update a 'last seen' field. If it doesn't exist then I want to insert it. I have several fields, for example street number, street name, street direction. Not every street has a direction, so many times this field is blank. Pentaho inserts NULL instead of blank during the insert. Because of this, when this address is seen again, the database lookup doesn't detect that it is the same as the previous address (because null != null).

I have set KETTLE_EMPTY_STRING_DIFFERS_FROM_NULL to Y.

Going into Table Output, the street direction is an empty string according to Preview Data: it appears empty and does not say <null>. The database, however, receives it as NULL because, in its infinite wisdom, Oracle treats an empty string and a NULL as the same thing.

However, when that address comes up again, I do a Database Lookup on all the address fields, including street direction. Since it is NULL in Oracle, it isn't considered equal. There is no mechanism in Database Lookup to use NVL(FIELD,'-') or similar to let me match against NULLs. I can't return values from an Execute SQL Script step, and the table is too large to use a Table Input step and do the conversion/matching in Kettle.

So how can I do this? I guess I could substitute a key character for the empty string on both the insert and the subsequent match? Is there a quick way, like the 'If field value is null' step, or do I need to check each field with JavaScript? I have to think someone has encountered this before with Oracle and Table Output/Database Lookup.
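
For illustration, this is the substitution I have in mind, sketched in plain Java (in the transformation it would be the 'If field value is null' step or a small scripting step; the '-' sentinel is arbitrary):

Code:

public class SentinelForNull {
    // Oracle stores '' as NULL, so map empty/null to a sentinel on BOTH the
    // insert path and the lookup path; then NULL never takes part in the
    // equality comparison.
    static String toSentinel(String value) {
        return (value == null || value.isEmpty()) ? "-" : value;
    }

    public static void main(String[] args) {
        System.out.println(toSentinel(null)); // "-"
        System.out.println(toSentinel(""));   // "-"
        System.out.println(toSentinel("NW")); // "NW"
    }
}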

Thanks!

-Aaron

group concat using kettle

Hi All,

Is there any way to achieve MySQL's GROUP_CONCAT functionality using Kettle alone?
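
For clarity, this is the behaviour I'm after, sketched in Java (group rows by a key and join the values with commas):

Code:

import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupConcatSketch {
    public static void main(String[] args) {
        // Rows of (dept, name), like SELECT dept, GROUP_CONCAT(name) ... GROUP BY dept
        List<String[]> rows = Arrays.asList(
                new String[] { "sales", "alice" },
                new String[] { "sales", "bob" },
                new String[] { "hr", "carol" });

        Map<String, String> grouped = rows.stream().collect(
                Collectors.groupingBy(r -> r[0],
                        Collectors.mapping(r -> r[1], Collectors.joining(","))));

        System.out.println(grouped); // e.g. {hr=carol, sales=alice,bob}
    }
}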

Regards,
Lourdhu.A

How to get the most of Pentaho Forums and help from community Members

When posting a question on a forum, please keep in mind that the more details you give in the question, the better your chances of receiving a valid answer.

If possible, document the problem with logs, screenshots, videos, sample reports, sample configurations, etc.

Keep in mind there are many versions of the software out there, so make sure to specify the version you are working with.

Stay away from short answers (unless they are yes/no answers). If the suggestions given to you did not work, reply and be specific about what you did and what did not work; please, please, please do not simply reply saying "it didn't work".

Remember: the more you give, the more you receive.

MultiButton component: how to catch the selected value

Hello, I am currently trying to use the MultiButton component in a dashboard. It displays correctly, but I don't know how to catch the value selected by the user. I tried to catch this in the "post change" function, but I don't know the correct function/syntax to read the selected value. I tried the same syntax as the chart components, "Scene.atoms.category.value()", but that doesn't work. Can someone explain the correct way to do this?

Pentaho CDE release: 5.1

Thank you in advance for your answer. I apologize for my bad English.

Getting Transformation object from job in java

Hi,

I want to get the TransMeta or Trans object from a JobMeta or Job in java.
Below is my code:
Code:

JobMeta jobMeta1 = new JobMeta(filename, repo);
Job job = new Job(repo, jobMeta1);
job.setInteractive(true);
// Queried here, before start(), when no job entries are running yet
Map<JobEntryCopy, JobEntryTrans> transFromJob = job.getActiveJobEntryTransformations();
job.start();

But the transFromJob map is empty. Can anyone please help?
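
I suspect getActiveJobEntryTransformations() is only populated while the job is actually running, which would explain the empty map. The alternative I'm considering is to read the transformation metadata straight from the JobMeta instead of the running Job; a rough sketch, assuming the Kettle 5.x API (untested):

Code:

import org.pentaho.di.job.entries.trans.JobEntryTrans;
import org.pentaho.di.job.entry.JobEntryCopy;
import org.pentaho.di.trans.TransMeta;

// ... after loading jobMeta1 as above ...
for (JobEntryCopy copy : jobMeta1.getJobCopies()) {
    if (copy.getEntry() instanceof JobEntryTrans) {
        JobEntryTrans entryTrans = (JobEntryTrans) copy.getEntry();
        // getTransMeta(Repository, VariableSpace) in 5.x, if I read the API right
        TransMeta transMeta = entryTrans.getTransMeta(repo, jobMeta1);
        System.out.println("Found transformation: " + transMeta.getName());
    }
}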

JNDI AS400 iSeries error "Attribute value not valid"

Hi there,

I've been fighting this problem for a long time now, and I guess you are my last chance :)

I want to use JNDI (/opt/data-integration/simple-jndi/jdbc.properties) on my CentOS 6.7 machine to connect to an AS/400 database.

I am using the following configuration in jdbc.properties:
con_as400/type=javax.sql.DataSource
con_as400/driver=com.ibm.as400.access.AS400JDBCDriver
con_as400/url=jdbc:as400://10.41.0.30//DBNAME;translate binary=true;
con_as400/user=USER
con_as400/password=PASSWORD

For the transformation I use a JNDI connection with the Generic database type.

What works:
- if I click "Test" in the database connection dialog, I get OK
- if I click Preview in the Table Input step, I get 1000 rows displayed correctly
- if I create a connection within the transformation with the same properties, it works

What doesn't work:
- when I actually run the transformation with the JNDI connection, I always get the following error: Attribute value not valid. Since then I have been trying to figure out what I've done wrong in the JNDI setup to cause this error.
- when I choose AS400 instead of Generic, I get the same error

Does anyone have an idea what the cause could be?

When I run it, I get the following error:
2015/09/01 14:35:24 - Table input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Unexpected error
2015/09/01 14:35:24 - Table input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2015/09/01 14:35:24 - Table input.0 - An error occurred executing SQL:
2015/09/01 14:35:24 - Table input.0 - SELECT * FROM schema1.table1
2015/09/01 14:35:24 - Table input.0 - Attribute value not valid.
2015/09/01 14:35:24 - Table input.0 -
2015/09/01 14:35:24 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1722)
2015/09/01 14:35:24 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.doQuery(TableInput.java:224)
2015/09/01 14:35:24 - Table input.0 - at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:138)
2015/09/01 14:35:24 - Table input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/09/01 14:35:24 - Table input.0 - at java.lang.Thread.run(Thread.java:722)
2015/09/01 14:35:24 - Table input.0 - Caused by: java.sql.SQLException: Attribute value not valid.
2015/09/01 14:35:24 - Table input.0 - at com.ibm.as400.access.JDError.throwSQLException(JDError.java:389)
2015/09/01 14:35:24 - Table input.0 - at com.ibm.as400.access.JDError.throwSQLException(JDError.java:366)
2015/09/01 14:35:24 - Table input.0 - at com.ibm.as400.access.AS400JDBCStatement.setFetchSize(AS400JDBCStatement.java:3487)
2015/09/01 14:35:24 - Table input.0 - at org.pentaho.di.core.database.Database.openQuery(Database.java:1700)
2015/09/01 14:35:24 - Table input.0 - ... 4 more
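
From the trace, the failure happens when PDI calls setFetchSize() on the AS/400 statement (AS400JDBCStatement.setFetchSize). As an experiment, something like this should show whether the driver rejects the fetch size outside of PDI as well; a sketch only, with placeholder connection details:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeRepro {
    public static void main(String[] args) throws Exception {
        Class.forName("com.ibm.as400.access.AS400JDBCDriver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:as400://10.41.0.30/DBNAME;translate binary=true;",
                "USER", "PASSWORD");
             Statement stmt = con.createStatement()) {
            // PDI sets a fetch size before executing the query; my guess is
            // the driver considers that value out of range and throws
            // "Attribute value not valid"
            stmt.setFetchSize(Integer.MAX_VALUE);
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM schema1.table1")) {
                while (rs.next()) { /* ... */ }
            }
        }
    }
}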

Run multiple SSH commands

Hello,

I need to establish a connection to a router, run several commands, and export their output to Excel.

The "Run SSH commands" entry only lets me execute a single command. Is there a separator to run two or more commands at once? For example, these three commands (see the sketch below):

show clock
show version
show interfaces
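
If it helps to illustrate, this is what I would do programmatically with JSch, one exec channel per command; a sketch only, with placeholder host and credentials:

Code:

import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import java.io.ByteArrayOutputStream;

public class MultiSshCommands {
    public static void main(String[] args) throws Exception {
        Session session = new JSch().getSession("admin", "10.0.0.1", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // lab use only
        session.connect();
        for (String cmd : new String[] { "show clock", "show version", "show interfaces" }) {
            ChannelExec ch = (ChannelExec) session.openChannel("exec");
            ch.setCommand(cmd);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            ch.setOutputStream(out);
            ch.connect();
            while (!ch.isClosed()) Thread.sleep(100); // wait for the command to finish
            ch.disconnect();
            System.out.println("== " + cmd + " ==\n" + out);
        }
        session.disconnect();
    }
}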

regards

Enrique

Automatic Login to CDE via JWT token

Hi,

I read your post showing how to enable userid=username&password=password authentication in Pentaho.

/pentaho/api/repos/:public:index.wcdf/generatedContent?userid=username&password=3786368

We have enabled this, and if I use the above URL it does indeed sign me in and take me straight to the CDE dashboard.

We have, however, implemented JWT-token-based authentication of the form:

/pentaho/api/repos/:public:index.wcdf/generatedContent?jwt=esksnksknsknKV1QiLCJhbGciOiJIU...

This works for any Pentaho User Console report, but the approach does not work for CDE: the page repeatedly asks for the username and password.

Can anybody point me in the right direction to get our token-based authentication to authenticate directly into CDE pages?

Thanks!

Serialize / De-serialize step

Is the format proprietary to Pentaho, and can it only be interpreted by Kettle?

Where is the PDI Splunk connector plugin?

Hi.
We are using PDI (Pentaho Data Integration) CE 5.4.
The web documentation says PDI Community Edition includes a Splunk connector, but I can't find Splunk Input or Splunk Output in my PDI's Big Data section.

How can I use the Splunk connector in the PDI CE version?
Please let me know how.

Thank you.



(From the website, I found the following.)
PDI Community Edition includes the following new feature:
...
Several new Transformation Steps and Job Entries including: Splunk input/output, Table compare, ZIP file, OpenERP input/output, Telnet, Nagios traps…

How to upload an analysis schema into the BA Server from the command line

I am using Pentaho EE 5.3. I want to upload a Mondrian schema from the command line, and I also want to pass parameters related to the DynamicSchemaProcessor.

I am using the import/export utility below, but it is not working.

./import-export.sh --import --url=http://pentaho.algoworks.com:8080/pentaho \
  --username=admin --password=Space2001! \
  --file-path=/home/ashok/work/tract-data/trunk/pentaho-solutions/resources/tract_dwh/TRACTOrders.xml \
  --resource-type=DATASOURCE --datasource-type=ANALYSIS \
  --analysis-datasource=tract_report_dw \
  --analysis-parameters="DynamicSchemaProcessor=com.transverse.bleep.mondrian.security.TractTenantedDynamicSchemaProcessor" \
  --analysis-parameters="UseContentChecksum=true" --overwrite=true

If I remove the part below, it works, but I need to pass the parameters.

--analysis-parameters="DynamicSchemaProcessor=com.transverse.bleep.mondrian.security.TractTenantedDynamicSchemaProcessor" --analysis-parameters="UseContentChecksum=true"

Could anybody please help me? How do I pass parameters for the analysis?

Regards,
Pankaj

Scheduler for the Saiku Reports

Hello,

I didn't find any schedule option for Saiku reports under File Actions.
How can we schedule Saiku reports to run and send email in Pentaho BI Server CE 5.0.1?

Thanks in advance.


Mukesh