Channel: Pentaho Community Forums

How does Mondrian generate the SQL queries?

For the MDX query:
Code:

DEBUG [mondrian.mdx] 1218: with set [~FILTER] as '{[Time].[2004]}'
  set [~COLUMNS] as '{[Markets].[Territory].Members}'
  set [~ROWS] as '{[Product].[Line].Members}'
select NON EMPTY Crossjoin([~COLUMNS], {[Measures].[Quantity]}) ON COLUMNS,
  NON EMPTY [~ROWS] ON ROWS
from [SteelWheelsSales]
where [~FILTER]

Mondrian 3.11.1.0-386 generates:
Code:

DEBUG [mondrian.sql] 9: Segment.load: executing sql [select "DIM_TIME"."YEAR_ID" as "c0", sum("ORDERFACT"."QUANTITYORDERED") as "m0" from "DIM_TIME" as "DIM_TIME", "ORDERFACT" as "ORDERFACT" where "ORDERFACT"."TIME_ID" = "DIM_TIME"."TIME_ID" and "DIM_TIME"."YEAR_ID" = 2004 group by "DIM_TIME"."YEAR_ID"]
DEBUG [mondrian.sql] 10: Segment.load: executing sql [select "CUSTOMER_W_TER"."TERRITORY" as "c0", "DIM_TIME"."YEAR_ID" as "c1", sum("ORDERFACT"."QUANTITYORDERED") as "m0" from "CUSTOMER_W_TER" as "CUSTOMER_W_TER", "ORDERFACT" as "ORDERFACT", "DIM_TIME" as "DIM_TIME" where "ORDERFACT"."CUSTOMERNUMBER" = "CUSTOMER_W_TER"."CUSTOMERNUMBER" and "ORDERFACT"."TIME_ID" = "DIM_TIME"."TIME_ID" and "DIM_TIME"."YEAR_ID" = 2004 group by "CUSTOMER_W_TER"."TERRITORY", "DIM_TIME"."YEAR_ID"]
DEBUG [mondrian.sql] 13: Segment.load: executing sql [select "CUSTOMER_W_TER"."TERRITORY" as "c0", "PRODUCTS"."PRODUCTLINE" as "c1", "DIM_TIME"."YEAR_ID" as "c2", sum("ORDERFACT"."QUANTITYORDERED") as "m0" from "CUSTOMER_W_TER" as "CUSTOMER_W_TER", "ORDERFACT" as "ORDERFACT", "PRODUCTS" as "PRODUCTS", "DIM_TIME" as "DIM_TIME" where "ORDERFACT"."CUSTOMERNUMBER" = "CUSTOMER_W_TER"."CUSTOMERNUMBER" and "ORDERFACT"."PRODUCTCODE" = "PRODUCTS"."PRODUCTCODE" and "ORDERFACT"."TIME_ID" = "DIM_TIME"."TIME_ID" and "DIM_TIME"."YEAR_ID" = 2004 group by "CUSTOMER_W_TER"."TERRITORY", "PRODUCTS"."PRODUCTLINE", "DIM_TIME"."YEAR_ID"]

Query #13 I understand.
But why are queries #9 and #10 executed?
On big tables they are not so quick.

Multiple input tables to single output

Hello guys,

I am totally new to Pentaho and currently evaluating the product.
I have succeeded in reading from multiple input tables (XBase files) with the same structure and writing the output to a single SQL table.

What I want to do now is to add another field to the output table that identifies the input file, so that I can apply proper filtering later on.

Can you help me with that?

Thanks in advance,
Demetris

Modified Java Script Value > execute a function only once, not once per row

Hi all,

I'm wondering if it's possible to execute a JavaScript function (a snippet of code in a Modified Java Script Value step) one time, before the first row is processed, rather than once per row. In the Modified Java Script Value step I initialize a cryptographic key and a MAC (message authentication code) provider, and I'd like to reuse this provider instance rather than instantiate it for each row.
Is there a way to achieve this goal?

Or maybe there are other strategies: perhaps I could use some type of step to load the same data into a buffer, or write a Java class using the singleton pattern. Any hint would be appreciated.

Thank you.
Gianpiero
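One way to get once-per-transformation behavior is the step's Start Script tab, which runs before the first row; the equivalent guard in plain JavaScript is a lazy null check. A minimal sketch, where the hypothetical makeMacProvider() stands in for the real key/Mac setup (not shown in the post):

```javascript
// Sketch of the lazy-init pattern: the provider is created on the first
// row only, then reused. makeMacProvider() is a placeholder for the real
// cryptographic key + Mac initialization.
var macProvider = null;   // shared across all rows

function makeMacProvider() {
  // hypothetical expensive initialization
  return {
    calls: 0,
    sign: function (msg) { this.calls++; return "mac(" + msg + ")"; }
  };
}

function processRow(msg) {
  if (macProvider === null) {   // true only for the first row
    macProvider = makeMacProvider();
  }
  return macProvider.sign(msg);
}
```

Variables declared at the top level (or initialized in the Start Script tab) survive across rows, so the provider is instantiated once per run rather than once per row.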

Singular Value Decomposition in Weka

Hello,
Where is the SVD (Singular Value Decomposition) in Weka? I know the SVD class is weka.core.matrix.SingularValueDecomposition, but I don't know where to find it in the Weka software.
Thank you

Function Js

Hi all
I want to extract my field (ZONE), which has 3 values (ZONE1, ZONE2, ZONE3), into:

NameZone : Zone*
NumberZone : *

and then run my SQL insert: insert into values (NameZone, NumberZone)
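A minimal sketch of this kind of split, assuming the values are always a letter prefix followed by digits (splitZone is an illustrative name, not a PDI function):

```javascript
// Sketch: split a value like "ZONE2" into a name part ("ZONE") and a
// number part ("2"), e.g. inside a Modified Java Script Value step.
function splitZone(zone) {
  var m = /^([A-Za-z]+)(\d+)$/.exec(zone);
  if (m === null) {
    return null;   // value does not match the expected prefix+digits shape
  }
  return { NameZone: m[1], NumberZone: m[2] };
}
```

The two resulting fields can then feed a Table Output or an Insert/Update step for the SQL insert.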

Problem connecting to Salesforce URL

I get a timeout error when connecting to the Salesforce URL.
How can I connect to Salesforce through proxy settings?

Weka scoring plugin traceback: Problem loading model file

I'm on 64-bit Windows 7 running Pentaho 6.0.1.0-386, which already has the Weka jar in the deployment, named pdi-wekascoring-plugin-6.0.1.0-386. I'm using Weka 3.6.13 to create the model. I saved my model from the Weka Explorer and opened it up in Pentaho, then attached the CSV reader. The attached log shows the jump and then the Java traceback; I'm getting an index out of bounds.


I'm able to get the fields and preview them just fine in the CSV node, but when I run the flow I get the traceback.

Since Weka requires ARFF files, might this somehow cause an issue when attempting to read CSV files?

Here's the row-level log:

2016/04/04 10:41:37 - General - Logging plugin type found with ID: CheckpointLogTable
2016/04/04 10:41:38 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/04 10:41:44 - DBCache - Loading database cache from file: [C:\Users\jmetcalf\.kettle\db.cache-6.0.1.0-386]
2016/04/04 10:41:44 - DBCache - We read 0 cached rows from the database cache!
2016/04/04 10:41:45 - General - Starting agile-bi
2016/04/04 10:41:45 - Spoon - Trying to open the last file used.
2016/04/04 10:41:46 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/04 10:41:46 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/04 10:41:46 - class org.pentaho.agilebi.platform.JettyServer - WebServer.Log.CreateListener localhost:10000
2016/04/04 10:44:22 - Spoon - org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate@138f0661
2016/04/04 10:46:44 - Spoon - Save to file or repository...
2016/04/04 10:46:44 - Spoon - File written to [C:\Users\jmetcalf\Downloads\WekaScoring\WekaScoring\docs\data\WekaScoring.ktr]
2016/04/04 10:46:44 - DBCache - We wrote 0 cached rows to the database cache!
2016/04/04 10:46:44 - WekaScoring - Transformation is pre-loaded.
2016/04/04 10:46:44 - WekaScoring - nr of steps to run : 2 , nr of hops : 0
2016/04/04 10:46:44 - Spoon - Transformation opened.
2016/04/04 10:46:44 - Spoon - Launching transformation [WekaScoring]...
2016/04/04 10:46:44 - Spoon - Started the transformation execution.
2016/04/04 10:46:44 - WekaScoring - Dispatching started for transformation [WekaScoring]
2016/04/04 10:46:44 - WekaScoring - Nr of arguments detected:0
2016/04/04 10:46:44 - WekaScoring - This is not a replay transformation
2016/04/04 10:46:44 - WekaScoring - I found 2 different steps to launch.
2016/04/04 10:46:44 - WekaScoring - Allocating rowsets...
2016/04/04 10:46:44 - WekaScoring - Allocating rowsets for step 0 --> CSV file input
2016/04/04 10:46:44 - WekaScoring - Allocated 0 rowsets for step 0 --> CSV file input
2016/04/04 10:46:44 - WekaScoring - Allocating rowsets for step 1 --> Weka Scoring
2016/04/04 10:46:44 - WekaScoring - Allocated 0 rowsets for step 1 --> Weka Scoring
2016/04/04 10:46:44 - WekaScoring - Allocating Steps & StepData...
2016/04/04 10:46:44 - WekaScoring - Transformation is about to allocate step [CSV file input] of type [CsvInput]
2016/04/04 10:46:44 - WekaScoring - Step has nrcopies=1
2016/04/04 10:46:44 - CSV file input.0 - distribution activated
2016/04/04 10:46:44 - CSV file input.0 - Starting allocation of buffers & new threads...
2016/04/04 10:46:44 - CSV file input.0 - Step info: nrinput=0 nroutput=0
2016/04/04 10:46:44 - CSV file input.0 - Finished dispatching
2016/04/04 10:46:44 - WekaScoring - Transformation has allocated a new step: [CSV file input].0
2016/04/04 10:46:44 - WekaScoring - Transformation is about to allocate step [Weka Scoring] of type [WekaScoring]
2016/04/04 10:46:44 - WekaScoring - Step has nrcopies=1
2016/04/04 10:46:44 - Weka Scoring.0 - distribution activated
2016/04/04 10:46:44 - Weka Scoring.0 - Starting allocation of buffers & new threads...
2016/04/04 10:46:44 - Weka Scoring.0 - Step info: nrinput=0 nroutput=0
2016/04/04 10:46:44 - Weka Scoring.0 - Finished dispatching
2016/04/04 10:46:44 - WekaScoring - Transformation has allocated a new step: [Weka Scoring].0
2016/04/04 10:46:44 - WekaScoring - This transformation can be replayed with replay date: 2016/04/04 10:46:44
2016/04/04 10:46:44 - WekaScoring - Initialising 2 steps...
2016/04/04 10:46:44 - CSV file input.0 - Released server socket on port 0
2016/04/04 10:46:44 - Weka Scoring.0 - Released server socket on port 0
2016/04/04 10:46:44 - WekaScoring - Step [CSV file input.0] initialized flawlessly.
2016/04/04 10:46:44 - WekaScoring - Step [Weka Scoring.0] initialized flawlessly.
2016/04/04 10:46:44 - CSV file input.0 - Starting to run...
2016/04/04 10:46:44 - Weka Scoring.0 - Starting to run...
2016/04/04 10:46:44 - WekaScoring - Transformation has allocated 2 threads and 0 rowsets.
2016/04/04 10:46:44 - Weka Scoring.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Unexpected error
2016/04/04 10:46:44 - Weka Scoring.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : java.lang.NullPointerException
2016/04/04 10:46:44 - Weka Scoring.0 - at org.pentaho.di.scoring.WekaScoring.processRow(WekaScoring.java:206)
2016/04/04 10:46:44 - Weka Scoring.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/04/04 10:46:44 - Weka Scoring.0 - at java.lang.Thread.run(Unknown Source)
2016/04/04 10:46:44 - Weka Scoring.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2016/04/04 10:46:44 - WekaScoring - Transformation detected one or more steps with errors.
2016/04/04 10:46:44 - WekaScoring - Transformation is killing the other steps!
2016/04/04 10:46:44 - WekaScoring - Looking at step: CSV file input
2016/04/04 10:46:44 - CSV file input.0 - Header row skipped in file 'C:\Users\jmetcalf\Dropbox\SAS\MachineLearning\Santander\TrainOutliersRemovedCooksDSLIMMED.csv'
2016/04/04 10:46:44 - CSV file input.0 - Stopped while putting a row on the buffer
2016/04/04 10:46:44 - CSV file input.0 - Finished processing (I=2, O=0, R=0, W=0, U=0, E=0)
2016/04/04 10:46:44 - WekaScoring - Looking at step: Weka Scoring
2016/04/04 10:46:44 - WekaScoring - searching for annotations
2016/04/04 10:46:44 - WekaScoring - no annotations found
2016/04/04 10:46:44 - WekaScoring - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Errors detected!
2016/04/04 10:46:44 - Spoon - The transformation has finished!!
2016/04/04 10:46:44 - WekaScoring - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Errors detected!
2016/04/04 10:46:44 - WekaScoring - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Errors detected!
2016/04/04 10:49:51 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/04 10:49:51 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Problem loading model file
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : java.lang.IndexOutOfBoundsException: Index: 173, Size: 0
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at java.util.ArrayList.rangeCheck(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at java.util.ArrayList.get(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at weka.core.Instances.attribute(Instances.java:350)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at weka.core.Instances.classAttribute(Instances.java:439)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.scoring.WekaScoringDialog.checkAbilityToProduceProbabilities(WekaScoringDialog.java:1103)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.scoring.WekaScoringDialog.loadModel(WekaScoringDialog.java:856)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.scoring.WekaScoringDialog.access$700(WekaScoringDialog.java:71)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.scoring.WekaScoringDialog$11.widgetSelected(WekaScoringDialog.java:759)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.scoring.WekaScoringDialog.open(WekaScoringDialog.java:821)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:125)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8728)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3032)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:2090)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDown(TransGraph.java:864)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1339)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7939)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9214)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:653)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at java.lang.reflect.Method.invoke(Unknown Source)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
2016/04/04 10:49:51 - org.pentaho.di.scoring.WekaScoringMeta@f1be5ac5 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Problem loading model file

Least of Timestamps in Columns

I have data where each row has 3 timestamps, kind of like:

Key1,Timestamp1,Timestamp2,Timestamp3
Key2,Timestamp1,Timestamp2,Timestamp3
Key3,Timestamp1,Timestamp2,Timestamp3

I'm looking for the minimum of the three timestamps for each row, let's say:

Key1,Timestamp2
Key2,Timestamp3
Key3,Timestamp3

This is easy in PostgreSQL (for those of you familiar with it) using the LEAST(arg1, arg2, ...) function.

How can I implement this using Pentaho Data Integration? I don't believe I can use the Calculator or Formula step, so I assume I must use the Script step to write a custom script. However, I just want to make sure I'm not missing anything.
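For reference, a per-row LEAST() is only a few lines in a Modified Java Script Value step. A minimal sketch, assuming the three timestamps arrive as JavaScript Date objects (leastTimestamp is an illustrative name):

```javascript
// Sketch: per-row LEAST() over three timestamp fields. Date objects
// compare chronologically via their numeric valueOf(), so "<" works.
function leastTimestamp(t1, t2, t3) {
  var min = t1;
  if (t2 < min) { min = t2; }
  if (t3 < min) { min = t3; }
  return min;
}
```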

Thanks!

Changing position of chart value

I would like to know how to change the position of a bar chart's value label. As the picture below shows, it stays in the middle of the bar; I would like to place it on top, outside the plot area.
I already went to the advanced properties and was able to move the value to the top of the bar, but still not outside the plot area.
(attached image: forum.PNG)

Parallel processing of the same job

Hi all,

Have a question about parallelism.

I need to import several CSV files into a DB. I have the job running sequentially.

The idea is to have a job that can be called once for each entry in a list.
So I get my list of files in one job, use Copy rows to result, and in the next job entry I mark it to execute for every input row, and in parallel.

The problem is that "run in parallel" is ignored: it runs once for each record, but sequentially.

Does anyone have experience running the same job several times in parallel?

Any example I can use?

Thanks all!

Daniel

Modified Javascript

Hi
I have this error:

TypeError: Cannot find function getString in object test value test value test value test value test value test value test value test value test value test value. (script#3)

(attached image: Modified javascript.jpg)

Possible bug in PDI pdi-ce-6.0.1.0-386: error code is always 0

PDI version 5.4 had a similar problem, and the bug was fixed in PDI 6.0.

http://forums.pentaho.com/showthread...ce-5-4-0-1-130
http://forums.pentaho.com/showthread...des-in-PDI-5-4

The version I am using is PDI 6.0.1, where this bug should already be fixed.

I created a simple job which executes a non-existent shell script.
I ran the job with kitchen.sh, and the log tells me it failed.
But the exit code is still 0.

I modified the last lines of spoon.sh as follows:
--------------------------
# ***************
# ** Run... **
# ***************
OS=`uname -s | tr '[:upper:]' '[:lower:]'`
if [ $OS = "linux" ]; then
"$_PENTAHO_JAVA" $OPT -jar "$STARTUP" -lib $LIBPATH "${1+$@}" 2>&1 | grep -viE "Gtk-WARNING|GLib-GObject|GLib-CRITICAL|^$"
else
"$_PENTAHO_JAVA" $OPT -jar "$STARTUP" -lib $LIBPATH "${1+$@}"
fi
EXIT_CODE=$?
echo "==========================="$_PENTAHO_JAVA
echo "==========================="$EXIT_CODE
# return to the catalog from which spoon.sh has been started
cd $INITIALDIR

exit $EXIT_CODE
--------------------------

I execute this job with kitchen.sh, and the log is as follows:

2016/04/05 17:43:19 - test_not_exist_shell - Starting job execution
2016/04/05 17:43:19 - test_not_exist_shell - exec(0, 0, START.0)
2016/04/05 17:43:19 - START - Starting job entry
2016/04/05 17:43:19 - test_not_exist_shell - Starting entry [Shell]
2016/04/05 17:43:19 - test_not_exist_shell - exec(1, 0, Shell.0)
2016/04/05 17:43:19 - Shell - Starting job entry
2016/04/05 17:43:19 - Shell - Found 0 previous result rows
2016/04/05 17:43:19 - Shell - Running on platform : Linux
2016/04/05 17:43:19 - Shell - Executing command : /home/hduser/aa.sh
2016/04/05 17:43:19 - Shell - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Error running shell [/home/hduser/aa.sh] : java.io.IOException: Cannot run program "/home/hduser/aa.sh": error=2, no such file or directory
2016/04/05 17:43:19 - Shell - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : java.io.IOException: Cannot run program "/home/hduser/aa.sh": error=2, no such file or directory
2016/04/05 17:43:19 - Shell - at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.entries.shell.JobEntryShell.executeShell(JobEntryShell.java:578)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.entries.shell.JobEntryShell.execute(JobEntryShell.java:418)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.Job.execute(Job.java:730)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.Job.execute(Job.java:873)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.Job.execute(Job.java:546)
2016/04/05 17:43:19 - Shell - at org.pentaho.di.job.Job.run(Job.java:435)
2016/04/05 17:43:19 - Shell - Caused by: java.io.IOException: error=2, no such file or directory
2016/04/05 17:43:19 - Shell - at java.lang.UNIXProcess.forkAndExec(Native Method)
2016/04/05 17:43:19 - Shell - at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
2016/04/05 17:43:19 - Shell - at java.lang.ProcessImpl.start(ProcessImpl.java:134)
2016/04/05 17:43:19 - Shell - at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
2016/04/05 17:43:19 - Shell - ... 6 more
2016/04/05 17:43:19 - test_not_exist_shell - Finished job entry [Shell] (result=[false])
2016/04/05 17:43:19 - test_not_exist_shell - Job execution finished
2016/04/05 17:43:19 - Kitchen - Finished!
2016/04/05 17:43:19 - Kitchen - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Finished with errors
2016/04/05 17:43:19 - Kitchen - Start=2016/04/05 17:43:03.882, Stop=2016/04/05 17:43:19.746
2016/04/05 17:43:19 - Kitchen - Processing ended after 15 seconds.
===========================/usr/java/jdk1.8.0_73/bin/java
===========================0

Because the shell script doesn't exist, the job fails. But the return value is still 0.

Has anyone had the same problem?
What should I do?

:(

Set configuration variables in parent job and run subjob independently

I was looking at how to configure a parent job and a subjob so that the subjob can be run independently from its parent. In the subjob I tried to use parameters with default values, but it does not seem to work.

Expectations:
- run parent job -> configuration is propagated from parent to subjob (pass-parameter option enabled)
- run subjob -> configuration is taken from the default values

In actual execution, the configuration is propagated correctly only when no parameters are defined in the subjob, which means the subjob cannot be run separately. Can you suggest any other way to design the jobs?

Attached is a simple scenario of both jobs.
(attached: 000447.jpg and the job files)

HTTP Post to a SOAP web service

Hi,
I would like a step-by-step solution for using the HTTP Post step to call a SOAP web service that requires headers and a SOAPAction. I already tried it, but I get an error:

"
2016/04/05 11:43:45 - HTTP Post.0 - Can not result from [https://interfaces.sigmacare.com/Sig...ice.asmx?wsdl]
2016/04/05 11:43:45 - HTTP Post.0 - Unbuffered entity enclosing request can not be repeated"

Web service URL:
https://interfaces.sigmacare.com/SigmaCareWebServices/ResidentNotificationService.asmx
HTTP headers: SC_USERNAME, SC_ACCOUNTID, SC_PSWD.
HTTP login: username, password.

Search only matches the first element when dimension nameColumn uses the same description

I have the following level declaration in a Payers dimension hierarchy:

<Level name="payer" visible="true" column="payor_id" nameColumn="payor_name" ordinalColumn="payor_id" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" caption="Payer">
</Level>


Here is the problem. In some cases we are transferring data from the master data to the warehouse, where we have records with a unique column (the key element) but the same description in the nameColumn, as you can see in the following sample:

column= 1234 nameColumn=John Smith
column= 1235 nameColumn=John Smith
column= 1236 nameColumn=John Smith
column= 1237 nameColumn=John Smith

When performing a search, the MDX looks like this:

select NON EMPTY {[Measures].[total_payments]} ON COLUMNS,
NonEmptyCrossJoin([d_organization.h_organization].[business_entity].Members, {[d_security.h_security].[1]}) ON ROWS
from [payments_cube]
where {[d_payers.h_payer_group_type].[GROUP1].[GROUP_TYPE_1].[John Smith],
[d_payers.h_payer_group_type].[GROUP1].[GROUP_TYPE_1].[John Smith],
[d_payers.h_payer_group_type].[GROUP1].[GROUP_TYPE_1].[John Smith],
[d_payers.h_payer_group_type].[GROUP1].[GROUP_TYPE_1].[John Smith]}

The payers (the last level in the hierarchy) belong to the same GROUP1 and GROUP_TYPE, but they differ at the payer level, by column id. Mondrian will only match the first record's total and not the rest. Is there a way to configure the schema to use the column identifier to properly aggregate those records in the result set?

Report output as "text file"

Hi All,

I have tried several ways to generate a report whose default output type is TEXT FILE (.txt format) through Pentaho 5.2 Community Edition, but didn't succeed. Does anyone know how to generate a text file as output?

There is a TEXT option among the default output types, but we can't download it as a TEXT FILE to our machine.

Your help would be really appreciated.

Regards,
Naveen

Error Processing component due to parameter value

Hello,

I'm creating a dashboard that has two filters and several charts, all tied together. If I change the values on filter1, filter2 will change its options, and so on. In filter2 I have some options that start with "[]", for example "[TEST] Option". When I select those values in the filter, the charts show "error processing component".
If I choose an option that does not have "[]", the dashboard works perfectly.
I believe it is reading the "[]" as something other than a plain string value. Should I change something in my cube or directly in Pentaho?
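If the dashboard splices the filter value into an MDX member name, a value containing "]" can break the generated set expression, because MDX escapes "]" inside a bracketed identifier by doubling it ("[" needs no escape). A minimal sketch of that escaping, assuming the value really is spliced into MDX (toMdxMemberSegment is an illustrative name, not a CDF/Mondrian API):

```javascript
// Sketch: wrap a raw value as a bracketed MDX identifier segment,
// doubling any "]" inside it so the generated MDX stays parseable.
function toMdxMemberSegment(value) {
  return "[" + String(value).replace(/]/g, "]]") + "]";
}
```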

Text file input field repeat not working

Hi, in PDI 4.4 the option to repeat a field's value when it is empty on the next row works fine. After migrating to PDI 6.0 stable, this option fails to repeat. I tried PDI 5.4 with the same issue.
Is this a known bug?

Thanks,
David

PRD freezes when I open a new tab

Hi All,

When I open a new tab in PRD version 3.9.1, the whole program freezes.

It works fine if I'm working on one report, but once I open a new tab while the existing report is open, PRD freezes.

I increased the memory size in the report-designer.bat file:


Code:

start "Pentaho Report Designer" "%_PENTAHO_JAVA%" -XX:MaxPermSize=512m -Xmx1024M -jar "%~dp0launcher.jar" %*


Here is the report-designer.sh file:

Code:

"$_PENTAHO_JAVA" -XX:MaxPermSize=512m -jar "$DIR/launcher.jar" $@


I've also tried downloading it again, but that didn't work.

Any help would be greatly appreciated.

Thanks in advance!

Best,
Vic

Partitioning problem

Hi there

I don't know what I am doing wrong. I want to create a partition. This is my ETL configuration:

(attached image: Immagine.jpg)

But I cannot see the name of the partition schema I have created.

