Channel: Pentaho Community Forums

Error with blank values when reading from an Excel Input step

Hi,

I have an Excel file in which some fields may contain either a Date value or a blank. When I read the file, I define the field as "Date" type. I could define it as "String"; however, I need this field to be defined as "Date". When I launch the transformation, I always get an error on the row with the first blank value.

Does anyone know how to deal with blank values when reading data from an Excel Input step?
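A sketch of one possible workaround, rather than anything built into the Excel Input step itself: read the column as String and convert blanks to a null Date in a scripting step such as User Defined Java Class (the date pattern below is an assumption):
Code:

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class BlankSafeDate {
    // Blank or null cells become a null Date instead of failing the row
    static Date parseOrNull(String cell) throws ParseException {
        if (cell == null || cell.trim().isEmpty()) {
            return null;
        }
        // "yyyy-MM-dd" is an assumed format; use whatever the sheet actually holds
        return new SimpleDateFormat("yyyy-MM-dd").parse(cell.trim());
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(parseOrNull("2016-12-22")); // a Date
        System.out.println(parseOrNull("   "));        // null
    }
}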

S3 File Output not working in Linux

Hi all,

I am having trouble with the "S3 File Output" step in Pentaho 5.3 when running on Linux/CentOS.

I have a 2-step PoC transformation that pushes a bunch of fields into a Gz-compressed text file in my S3 bucket. This works fine when running on Windows 7, but when running on Linux I get the following:
Code:

2016/12/22 17:03:34 - TestArea S3 PoC - ERROR (version 5.3-SNAPSHOT, build 1 from 2015-04-23 13.43.15 by mgt) : org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1205)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:648)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:716)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:598)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:69)
2016/12/22 17:03:34 - TestArea S3 PoC -    at java.lang.Thread.run(Thread.java:745)
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2835)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2671)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2623)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2600)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1154)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 8 more
2016/12/22 17:03:34 - TestArea S3 PoC - ERROR (version 5.3-SNAPSHOT, build 1 from 2015-04-23 13.43.15 by mgt) : org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error occurred while launching entry [CREATE GZ FILE ON S3.0]
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:862)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:598)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:69)
2016/12/22 17:03:34 - TestArea S3 PoC -    at java.lang.Thread.run(Thread.java:745)
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1205)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:648)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:716)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 5 more
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2835)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2671)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2623)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2600)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1154)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 8 more
2016/12/22 17:03:34 - TestArea S3 PoC - ERROR (version 5.3-SNAPSHOT, build 1 from 2015-04-23 13.43.15 by mgt) : org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error occurred while launching entry [Get Messages & Property Detail.0]
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error occurred while launching entry [CREATE GZ FILE ON S3.0]
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:862)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:598)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:69)
2016/12/22 17:03:34 - TestArea S3 PoC -    at java.lang.Thread.run(Thread.java:745)
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error occurred while launching entry [CREATE GZ FILE ON S3.0]
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:862)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 4 more
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleException:
2016/12/22 17:03:34 - TestArea S3 PoC - Unexpected error during transformation metadata load
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1205)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:648)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:716)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.Job.execute(Job.java:859)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 5 more
2016/12/22 17:03:34 - TestArea S3 PoC - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2016/12/22 17:03:34 - TestArea S3 PoC - Missing plugins found while loading a transformation
2016/12/22 17:03:34 - TestArea S3 PoC -
2016/12/22 17:03:34 - TestArea S3 PoC - Step : S3FileOutputPlugin
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2835)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2671)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2623)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2600)
2016/12/22 17:03:34 - TestArea S3 PoC -    at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1154)
2016/12/22 17:03:34 - TestArea S3 PoC -    ... 8 more

It sounds like this is related to the following issue:
http://jira.pentaho.com/browse/PDI-13575
(despite it concerning a different plugin)

In the above-mentioned JIRA, the suggested workaround is:
"As a workaround - the one you already found out - is to call the kitchen/pan from inside the folder the scripts are located."
In the documentation subtask on that page, there is also a sample shell script for Spoon named "spoon_5.3.sh".

Looking at this file, the only difference I can see between it and my spoon.sh (which I don't have permission to modify) is that my spoon.sh has an extra case statement under "setPentahoEnv" in the "Init BASEDIR" section, as follows:
Code:

# **************************************************
# ** Init BASEDIR                                **
# **************************************************


BASEDIR=`dirname $0`
cd $BASEDIR
DIR=`pwd`
cd -


. "$DIR/set-pentaho-env.sh"


setPentahoEnv


# Spoon really needs to be in the "right" directory
case "$1" in
    # These are non-spoon calls
    -main)
    ;;


    # This is actually spoon
    *)
        cd $DIR
        BASEDIR=.
    ;;
esac


The kitchen.sh script I am using to access this looks like this:
Code:

#!/bin/sh


BASEDIR="`dirname $0`"
cd "$BASEDIR"
DIR="`pwd`"
cd - > /dev/null


if [ "$1" = "-x" ]; then
  set LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$BASEDIR/lib
  export LD_LIBRARY_PATH
  export OPT="-Xruntracer $OPT"
  shift
fi


export IS_KITCHEN="true"


"$DIR/spoon.sh" -main org.pentaho.di.kitchen.Kitchen "$@"

I have interpreted the workaround such that the guts of my shell script (/home/pentaho/testarea/run_s3test.sh) look like this:
Code:

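# Launch the job via kitchen.sh from the test area;
# KETTLE_HOME tells Kettle where to look for its .kettle configuration folder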
cd /home/pentaho/testarea
KETTLE_HOME=/home/pentaho/testarea \
PENTAHO_MEM=15G \
/home/pentaho/kettle/kitchen.sh \
  /file:///home/pentaho/testarea/s3_poc.kjb \
  > logs/s3_poc.`date +%Y%m%d`.log 2>&1

Questions:
1) Is this indeed the same issue as assumed above, or is it a separate one?

2) If it is the same issue, can someone show me how to modify my run_s3test.sh shell script to make it work? :)
(If it is the same issue, then my interpretation of the fix must be wrong, because it isn't working.)

Thanks!!!


Regards,

Stanbridge

Flag 1 if numeric, 0 if not (Regex/Replace in String?)

Hi,

What's the best way to do this? I have a string field, and I'd like to evaluate each value in this field to see whether it could be cast as a number (double).

I can pick up a number followed by one decimal place using a regex in a Replace in String step, but can you tell me how I should configure it so that I end up with an output field containing only 1 or 0 flags?
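For what it's worth, the flagging logic itself is small enough to sketch in plain Java, e.g. as a starting point for a User Defined Java Class step (the pattern is an assumption; widen it if you also need exponents or thousands separators):
Code:

import java.util.regex.Pattern;

public class NumericFlag {
    // Optional sign, digits, optional decimal part
    private static final Pattern NUMERIC = Pattern.compile("[+-]?\\d+(\\.\\d+)?");

    // 1 if the value would cast cleanly to a double, 0 otherwise
    static int flag(String value) {
        return (value != null && NUMERIC.matcher(value.trim()).matches()) ? 1 : 0;
    }

    public static void main(String[] args) {
        System.out.println(flag("12.5")); // 1
        System.out.println(flag("-3"));   // 1
        System.out.println(flag("abc"));  // 0
    }
}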


Much obliged,

Andy

Report takes a long time to load after server restart

Hi,
I have integrated a CDE dashboard into my web application, but whenever the Pentaho server is restarted it takes a very long time to load the first report (and only the first).
Is this the expected behaviour, or are there configuration changes to be made?

Thanks,
Padma Priya N.

Limit the number of pages to be read using the REST Client step

Hi,

I am able to read data using the REST Client step available in Pentaho Data Integration; however, by default I am getting only the first page of the data. Can you please help me get all pages of data using the REST Client step?

I assume I should pass some kind of filter to the API; any suggestions?
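Many paged APIs expose a page or offset query parameter and return an empty result once you read past the last page. A rough sketch of that loop in plain Java; the endpoint and the "page" parameter are hypothetical, so check your API's documentation for the real names:
Code:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PagedFetch {
    public static void main(String[] args) throws Exception {
        for (int page = 1; page <= 1000; page++) { // hard upper bound as a safety net
            URL url = new URL("https://api.example.com/items?page=" + page);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            StringBuilder body = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line);
                }
            }
            if (body.length() == 0 || body.toString().equals("[]")) {
                break; // an empty page means we are done
            }
            System.out.println(body);
        }
    }
}

In PDI terms, the same idea is usually built as a loop: a job that increments a page variable and re-runs the transformation containing the REST Client step until it returns no rows.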


Regards
Kullayappa.

Pentaho Localization

How can I localize my platform?
Which configuration file should I change to set the 'locale' variable to, for example, 'fr'?

OpenERP plugin and Odoo 10

Hello,

I'm trying to make the OpenERP plugin in PDI/Spoon v7 work with Odoo 10.

The connection test is OK

Screen Shot 12-26-16 at 11.04 PM.jpg

but when I click on "get field" I get this:

Screen Shot 12-26-16 at 11.06 PM.jpg

Has anybody experienced this?
Or maybe the OpenERP plugin is not compatible with the latest version (10) of Odoo?

Screen Shot 12-26-16 at 11.07 PM.jpg

thank you

jndi mysql

hi,

I successfully parameterized a JDBC connection, but so far I have tried unsuccessfully to set up a JNDI connection.
My PDI installation directory: D:\pentaho\bin\data-integration
It looks like the file D:\pentaho\bin\data-integration\simple-jndi\jdbc.properties is not used; I have changed its settings without any effect.
How can I check which JNDI file is being used?
How do I make it use the jdbc.properties file by default?
Or any hints?
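For reference, entries in simple-jndi/jdbc.properties take this shape, keyed by the JNDI name (the name, URL, and credentials below are placeholders):
Code:

mydb/type=javax.sql.DataSource
mydb/driver=com.mysql.jdbc.Driver
mydb/url=jdbc:mysql://localhost:3306/mydb
mydb/user=db_user
mydb/password=db_password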

Thank you in advance for your time
best regards

How to implement a real-time ETL with Kettle 7.0

Hi All,

I am starting to use Kettle v7.0 and, as a requirement, I'll have to create a real-time ETL. Does anyone have any URLs or documentation that could help me implement this?


Thanks in advance

Regards,

Dynamic fill

Hi guys

I need your help!!!

I have this input table:

item month quantity
54000001 1 100
54000005 1 175
54000009 1 195
54000013 1 215
54000002 2 300
54000006 2 180
54000010 2 1000
54000014 2 220
54000003 3 50
54000007 3 185
54000011 3 205
54000015 3 225
54000004 4 200
54000008 4 190
54000012 4 210

I want to assign the correct facility to each row, depending on its capacity to produce,

knowing that:
Facility 1 has priority to produce over Facility 2, and Facility 2 over Facility 3

Facility 1 can produce 300 in a month
Facility 2 can produce 1000 in a month
Facility 3 can produce an unlimited amount

The pseudo-code should be something like:

If (Select sum(quantity) where facility='facility 1') < 300 then 'facility 1'
Else If (Select sum(quantity) where facility='facility 2') < 1000 then 'facility 2'
Else 'facility 3'
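A rough Java sketch of that running-total logic, e.g. as a starting point for a User Defined Java Class step. It assumes rows arrive sorted by month, keeps one running total per facility per month, and uses the same strict less-than test as the pseudo-code (names and capacities as stated above):
Code:

import java.util.HashMap;
import java.util.Map;

public class FacilityAssigner {
    // Facilities in priority order; Long.MAX_VALUE stands in for "infinite"
    static final String[] FACILITY = {"facility 1", "facility 2", "facility 3"};
    static final long[] CAPACITY = {300L, 1000L, Long.MAX_VALUE};

    // month -> quantity already assigned to each facility
    static final Map<Integer, long[]> totals = new HashMap<>();

    // Pick the first facility whose current sum is still below its capacity,
    // then add this row's quantity to that facility's running total
    static String assign(int month, long quantity) {
        long[] t = totals.computeIfAbsent(month, m -> new long[FACILITY.length]);
        for (int f = 0; f < FACILITY.length; f++) {
            if (t[f] < CAPACITY[f]) {
                t[f] += quantity;
                return FACILITY[f];
            }
        }
        return FACILITY[FACILITY.length - 1]; // unreachable: last capacity is infinite
    }

    public static void main(String[] args) {
        long[][] rows = {{1, 100}, {1, 175}, {1, 195}, {1, 215}};
        for (long[] r : rows) {
            System.out.println("month " + r[0] + ", qty " + r[1]
                    + " -> " + assign((int) r[0], r[1]));
        }
        // prints: facility 1, facility 1, facility 1, facility 2
    }
}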

The execution should work this way (assuming everything is perfectly defined):


Start process…

Read 1:
item month quantity facility
54000001 1 100 ?

Result 1: sum(quantity)=0 where facility='facility 1'
item month quantity facility
54000001 1 100 facility 1

Read 2:
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 ?

Result 2: sum(quantity)=100 where facility='facility 1'
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1

Read 3:
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1
54000009 1 195 ?

Result 3: sum(quantity)=275 where facility='facility 1'
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1
54000009 1 195 facility 1
Read 4:
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1
54000009 1 195 facility 1
54000013 1 215 ?
Result 4: sum(quantity)=470 where facility='facility 1'
Then... sum(quantity)=0 where facility='facility 2'
item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1
54000009 1 195 facility 1
54000013 1 215 facility 2
...

Finally, my result table is:

item month quantity facility
54000001 1 100 facility 1
54000005 1 175 facility 1
54000009 1 195 facility 1
54000013 1 215 facility 2
54000002 2 300 facility 1
54000006 2 180 facility 2
54000010 2 1000 facility 2
54000014 2 220 facility 3
54000003 3 50 facility 1
54000007 3 185 facility 1
54000011 3 205 facility 1
54000015 3 225 facility 2
54000004 4 200 facility 1
54000008 4 190 facility 1
54000012 4 210 facility 2

I hope you understand

How can I solve this in Kettle?

Thanks in advance.

Carlos.

Pentaho - Hadoop issue when internet connection is lost

Hi,

I am using Pentaho version 5.0.1 and Hadoop version 2.
When I am connected to the internet, everything works fine.

When I lose the internet connection and then reconnect, it stops working;
to make it work again, I need to restart the Pentaho server.
I have no issue with MySQL or Oracle, only with Hadoop.

Could you give me any suggestions, please?

Is XLSX Input for S3 available?

Hi,

I have a requirement to process .xlsx files stored on S3. It seems there is no step available out of the box in Spoon to do that; I can only see S3 CSV input/output steps. Is there any workaround?

Thanks

Rows with different layout

Hi guys,
I have a problem with my transformation. The input of the transformation is a table read with this SQL:
Code:

SELECT
 nome
FROM casa_editrice

and another table read with this SQL:
Code:

SELECT DISTINCT
 casa_editrice
FROM swap_libri
WHERE casa_editrice IS NOT NULL

After a Sort rows step I go into a Merge Rows (Diff) step, and there I receive this error:
Code:

The data type of field #1 is not the same as the first row received:  you're mixing rows with different layout. Field [nome String(255)] does  not have the same data type as field [casa_editrice String(255)].
I have also tried inserting a Select values step to rename casa_editrice to nome, but in that case I receive a null pointer exception.
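A sketch of the same rename done on the SQL side rather than with a Select values step, so that both streams carry an identical field name before they reach Merge Rows (Diff):
Code:

SELECT DISTINCT
 casa_editrice AS nome
FROM swap_libri
WHERE casa_editrice IS NOT NULL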

PDI - monitoring jobs and transformations (best practices)

Hello,

I have a question regarding the monitoring of PDI transformations and jobs. In all my jobs and transformations I have configured Pentaho to write all available information and metrics to logging tables. Now I am looking for the best way to monitor the executions of all jobs and transformations (to see which parts of the ETL consume the most time, which tasks are getting slower day by day, etc.).

Querying the tables directly can be time-consuming and impractical, so I am wondering whether there is some kind of ready-made dashboard within the Pentaho solutions for monitoring these kinds of things. If not, is there detailed documentation available for the logging tables, with detailed explanations of the field contents, the relationships between the tables, and an available relationship model?
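For what it's worth, assuming the default transformation log table layout, a query along these lines surfaces the slowest runs (MySQL syntax; the table name trans_log is a placeholder for whatever is configured in the transformation's logging settings):
Code:

SELECT TRANSNAME,
       STARTDATE,
       ENDDATE,
       TIMESTAMPDIFF(SECOND, STARTDATE, ENDDATE) AS duration_sec
FROM trans_log
WHERE STATUS = 'end'
ORDER BY duration_sec DESC
LIMIT 20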

What are best practices for monitoring jobs and transformations?

Thanks!

Multiway merge join help me

File A has columns: id, no, year, cbc, ua, xray
File B has columns: id, no, year, result_cbc, result_ua, result_xray

I want to join them into a new file: id, no, year, cbc, ua, xray, result_cbc, result_ua, result_xray

I ran the Multiway merge join step, but the result is wrong: why does the new Excel file have 7 rows?
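For clarity, here is the intended join expressed as SQL, assuming id, no, and year together form the join key (file_a and file_b are placeholder table names). Note that duplicate key values on either side multiply the output rows, which is a common cause of an unexpected row count:
Code:

SELECT a.id, a.no, a.year,
       a.cbc, a.ua, a.xray,
       b.result_cbc, b.result_ua, b.result_xray
FROM file_a a
JOIN file_b b
  ON a.id = b.id
 AND a.no = b.no
 AND a.year = b.year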


(Screenshots attached: 1.jpg, 2.jpg)

Kettle 7.0 and RegExp Wildcard

I have a zip step in a Kettle job which zips the files in a folder. In the zip step I have an "Include Wildcard (RegExp)" of:
Code:

.*(${TO_ZIP}|META-INF).*
and the option "Include sub-folders" is checked (META-INF is the name of the sub-folder).
This wildcard works perfectly in Kettle 5.4 with Java 7, but does not in Kettle 7.0 with Java 8.
In Kettle 7.0 the zip step selects only files starting with the variable ${TO_ZIP} and does not select the ones in the META-INF sub-folder. Can anybody help me with the proper wildcard?

Problem calling a job from Java

I created a job that reads from an XML file and saves the data to a database. If I run it with Pentaho it works, but if I invoke it from Java it does not read my XML file.
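A minimal sketch of launching a .kjb from Java with the Kettle API (the path below is a placeholder). One common pitfall is that relative paths resolve against the Java process's working directory, which usually differs from Spoon's, so an XML file that is found in Spoon may not be found from Java:
Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJob {
    public static void main(String[] args) throws Exception {
        // Initialise the Kettle environment (plugins, kettle.properties, VFS, ...)
        KettleEnvironment.init();

        // Load the job from an absolute path; null = no repository
        JobMeta jobMeta = new JobMeta("/absolute/path/to/my_job.kjb", null);

        Job job = new Job(null, jobMeta);
        job.start();
        job.waitUntilFinished();

        if (job.getErrors() > 0) {
            System.err.println("Job finished with errors");
        }
    }
}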

PDI 7 - kitchen initialDir

I'm using kitchen.sh to start my jobs from inside zip files, like this:
Code:

./kitchen.sh -file:"zip:file:////home/user/pentaho/pdi-ee/my_package/linked_executable_job_and_transform.zip\!Hourly_Stats_Job_Unix.kjb"

When I recently migrated from PDI 6.1 to 7.0, my scripts stopped working and generated error messages like:

"file <my_current_directory>zip:file:////home/user/pentaho/... does not exist"

...which is absolutely correct, since this is not a valid path with <my_current_directory> prepended.

The root cause seems to be a newly added parameter in kitchen.sh, "initialDir", that is passed down from kitchen to spoon and wasn't present in 6.1. After removing the initialDir parameter, my scripts seem to work again.

My questions, though, are:
- is there a better solution for calling jobs from zip files with kitchen than tampering with the kitchen.sh file?
- what is the initialDir parameter used for? Am I breaking something else by removing it?

Thanks a lot!

Java error launching Spoon

Hi, can anyone help me with this error when trying to set up PDI v7 with Java 8 on Windows?

Code:

C:\PDI\data-integration>"C:\Program Files (x86)\Java\jre1.8.0_111\\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win32" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-7.0.0.0-25.jar -lib ..\libswt\win32 /level:Debug
Error occurred during initialization of VM
Could not reserve enough space for 2097152KB object heap
Java HotSpot(TM) Client VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0

one to many relationship between fact and dimension table

Is it possible to maintain a one-to-many relationship between a fact and a dimension table in Pentaho without using a bridge table?