Channel: Pentaho Community Forums

onClick on a Bootstrap button

I have one bar chart component (trial_start_graph) in CDE and two datasources (trialStart_day / trialStart_mon). I want the chart to switch between the daily and monthly data when one of the Bootstrap buttons is clicked. Any suggestions?

This is what my Bootstrap panel looks like:

<div class="panel panel-default">
  <div class="panel-heading clearfix">
    <h6 class="panel-title pull-left">TRIAL START GRAPH</h6>
    <div class="btn-group btn-group-xs pull-right" id="perioddSelect">
      <a href="" class="btn btn-default btn-sm">Day</a>
      <a href="" class="btn btn-default btn-sm">Month</a>
    </div>
  </div>
</div>
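
A minimal sketch of one way to wire this up, assuming the panel sits in a CDE Text/Freeform component: each button sets a dashboard parameter (here called trialPeriod, an assumed name) and the chart component lists that parameter in its Listeners. Only Dashboards.fireChange() and Dashboards.getParameterValue() are standard CDF calls; the parameter name is a placeholder.

Code:

// Sketch only: bind the Day/Month buttons to a dashboard parameter.
// "trialPeriod" is a placeholder parameter name; #perioddSelect matches
// the id used in the panel markup above.
$('#perioddSelect a').click(function (e) {
    e.preventDefault();                            // keep href="" from reloading the page
    var period = ($(this).text() === 'Day') ? 'day' : 'mon';
    Dashboards.fireChange('trialPeriod', period);  // every listener of trialPeriod re-renders
});

The chart component could then pick its datasource in preExecution based on Dashboards.getParameterValue('trialPeriod'), choosing trialStart_day or trialStart_mon; the exact property to change depends on the component type, so treat that part as a sketch too.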

Passing the result set of a stored procedure's SQL to a Table Output step

Hi,

I need to know whether there is any step or technique available to get the result set from a stored procedure's SELECT and pass it to a Table Output step.

My business scenario is as follows:

I have multiple SQL linked servers, and my stored procedure retrieves data from all those linked servers and stores it as a temp table.
I need to pass the temp table data from the stored procedure to my destination table, which resides on another SQL Server instance (not one of the linked servers).

I have tried the 'Call DB procedure' step, but I don't know how to get the result set from the stored procedure and pass it to the next step.

I am using Pentaho Kettle 5.1.0 EE.
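
As a sketch of one workaround (all object names below are placeholders): if the procedure ends with a plain SELECT, a Table Input step can usually execute it directly on SQL Server and stream the resulting rows to a Table Output step on the destination connection; the 'Call DB procedure' step is aimed at procedures that return values through parameters rather than whole result sets.

Code:

-- Sketch only (SQL Server syntax; procedure, server, table and column
-- names are placeholders). The procedure gathers rows from the linked
-- servers into a temp table and SELECTs them at the end, so that SELECT
-- becomes the result set the Table Input step reads.
ALTER PROCEDURE dbo.collect_linked_data
AS
BEGIN
    SET NOCOUNT ON;   -- suppress row-count messages that can hide the result set from JDBC

    SELECT col1, col2
    INTO   #staging
    FROM   LinkedServer1.SourceDb.dbo.source_table
    UNION ALL
    SELECT col1, col2
    FROM   LinkedServer2.SourceDb.dbo.source_table;

    SELECT col1, col2 FROM #staging;   -- final result set
END;
GO

-- SQL typed into the Table Input step:
EXEC dbo.collect_linked_data;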

Regards,
Lourdhu

How to delete orphan records using Spoon or Kettle

Hi All,

I am new to Kettle and learning PDI. I want to know how to delete orphan records from a PostgreSQL database using Kettle. Please let me know how to do this with a transformation.
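
A minimal sketch of the SQL involved, assuming a simple parent/child foreign key (table and column names are placeholders); it could run from an "Execute SQL script" step, or the SELECT part could feed a Table Input step whose rows go to a "Delete" step:

Code:

-- Sketch only: remove child rows whose parent no longer exists
-- (PostgreSQL syntax; names are placeholders).
DELETE FROM child_table c
WHERE NOT EXISTS (
    SELECT 1
    FROM   parent_table p
    WHERE  p.id = c.parent_id
);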

Thanks in advance

Karthik R

Dynamic database connection

Hi All,

I am new to PDI. How do I create a dynamic database connection in Kettle?
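
A minimal sketch of the usual variable-based approach: put the connection settings into variables (for example in kettle.properties, shown below with placeholder names and values) and type the corresponding ${...} variables into the Database Connection dialog's Host Name, Database Name, Port, User Name and Password fields. The same connection definition then points at different databases depending on the variable values, which can also be set per run with a "Set Variables" step or as job/transformation parameters.

Code:

# kettle.properties (sketch; variable names and values are placeholders)
DB_HOSTNAME=dbserver01
DB_PORT=5432
DB_NAME=sales_dw
DB_USER=etl_user
DB_PASSWORD=secret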

Thanks in advance

Karthik R

Restart mondrian internal server

Hi.
Sometimes I need to restart the internal Mondrian server. Is there any way to do this?

Not able to schedule a job from the PDI tool on a local machine

Hello,

I have a DI repository with jobs and transformations in it. When I run a job directly from PDI it works, but when I schedule the job from the PDI tool itself (from my local machine) it fails. It does not show any error in the schedule perspective, but when I checked the job's record in the job log table, the Error column is '1'. Below is the data from the log field.


2015/02/23 15:50:29 - RepositoriesMeta - Reading repositories XML file: FromInputStream
2015/02/23 15:50:29 - General - Creating repository meta store interface
2015/02/23 15:50:29 - General - Connected to the enterprise repository
2015/02/23 15:57:47 - Carte - Cleaned up transformation Form 555 Top Company - SQL - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company" ORDER BY DATA_YEAR, SAC with id 50393e2a-2176-4af6-8e41-6631005e3d94 from Mon Feb 23 11:57:46 EST 2015, diff=240
2015/02/23 15:57:47 - Carte - Cleaned up transformation Form 555 Top Company - Service - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company" ORDER BY DATA_YEAR, SAC with id e99417a0-3f02-471a-974e-e07e235b69c0 from Mon Feb 23 11:57:46 EST 2015, diff=240
2015/02/23 15:58:07 - Carte - Cleaned up transformation Form555FilingService-Prod - Service - SELECT STATE , SAC , SAC_NAME , SPIN , SPIN_NAME , HOLDING_COMPANY , DATA_YEAR , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , SECTION_2_A , SECTION_2_B , SECTION_2_C , SECTION_2_D , SECTION_2_E , SECTION_2_F , SECTION_2_G , SECTION_2_H , SECTION_2_I , SECTION_2_J , SECTION_2_K , SECTION_2_L , SECTION_3_M , SECTION_3_N , SECTION_3_O , ELIGIBILITY_SOURCE , SECTION_2_C_CERT FROM "Form555FilingService-Prod" ORDER BY DATA_YEAR, SAC with id 5db2cba8-e9bd-4f0f-b431-72f03e8983e1 from Mon Feb 23 11:57:58 EST 2015, diff=240
2015/02/23 15:58:07 - Carte - Cleaned up transformation Form555FilingService-Prod - SQL - SELECT STATE , SAC , SAC_NAME , SPIN , SPIN_NAME , HOLDING_COMPANY , DATA_YEAR , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , SECTION_2_A , SECTION_2_B , SECTION_2_C , SECTION_2_D , SECTION_2_E , SECTION_2_F , SECTION_2_G , SECTION_2_H , SECTION_2_I , SECTION_2_J , SECTION_2_K , SECTION_2_L , SECTION_3_M , SECTION_3_N , SECTION_3_O , ELIGIBILITY_SOURCE , SECTION_2_C_CERT FROM "Form555FilingService-Prod" ORDER BY DATA_YEAR, SAC with id be3ce3bc-6f00-4322-aee7-ad6241e8bbf5 from Mon Feb 23 11:57:58 EST 2015, diff=240
2015/02/23 15:59:27 - Carte - Cleaned up transformation Form555FilingService-Prod - Service - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_d , ETC_PREPAID , DE_ENROLL_JANUARY , DE_ENROLL_FEBRUARY , DE_ENROLL_MARCH , DE_ENROLL_APRIL , DE_ENROLL_MAY , DE_ENROLL_JUNE , DE_ENROLL_JULY , DE_ENROLL_AUGUST , DE_ENROLL_SEPTEMBER , DE_ENROLL_OCTOBER , DE_ENROLL_NOVEMBER , DE_ENROLL_DECEMBER FROM "Form555FilingService-Prod" ORDER BY DATA_YEAR, SAC with id 18c60b86-305f-4de4-ba00-f9a7a97d996f from Mon Feb 23 11:59:14 EST 2015, diff=240
2015/02/23 15:59:27 - Carte - Cleaned up transformation Form555FilingService-Prod - SQL - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_d , ETC_PREPAID , DE_ENROLL_JANUARY , DE_ENROLL_FEBRUARY , DE_ENROLL_MARCH , DE_ENROLL_APRIL , DE_ENROLL_MAY , DE_ENROLL_JUNE , DE_ENROLL_JULY , DE_ENROLL_AUGUST , DE_ENROLL_SEPTEMBER , DE_ENROLL_OCTOBER , DE_ENROLL_NOVEMBER , DE_ENROLL_DECEMBER FROM "Form555FilingService-Prod" ORDER BY DATA_YEAR, SAC with id 7c67d343-6f15-4177-8b0b-b5b692f70f9a from Mon Feb 23 11:59:14 EST 2015, diff=240
2015/02/23 16:00:27 - Carte - Cleaned up transformation Form 555 Top Company UAT - SQL - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company UAT" ORDER BY DATA_YEAR, SAC with id 096ed575-d666-425b-b461-08e79c2ae4b5 from Mon Feb 23 12:00:21 EST 2015, diff=240
2015/02/23 16:00:27 - Carte - Cleaned up transformation Form 555 Top Company UAT - Service - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company UAT" ORDER BY DATA_YEAR, SAC with id b32e0a61-ebaa-4697-a56e-7976e73317a1 from Mon Feb 23 12:00:21 EST 2015, diff=240
2015/02/23 16:00:47 - Carte - Cleaned up transformation FilingsService - Service - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_d , ETC_PREPAID , DE_ENROLL_JANUARY , DE_ENROLL_FEBRUARY , DE_ENROLL_MARCH , DE_ENROLL_APRIL , DE_ENROLL_MAY , DE_ENROLL_JUNE , DE_ENROLL_JULY , DE_ENROLL_AUGUST , DE_ENROLL_SEPTEMBER , DE_ENROLL_OCTOBER , DE_ENROLL_NOVEMBER , DE_ENROLL_DECEMBER FROM FilingsService ORDER BY DATA_YEAR, SAC with id d2cd1a8b-6286-40f1-97eb-b8e616ba008f from Mon Feb 23 12:00:33 EST 2015, diff=240
2015/02/23 16:00:47 - Carte - Cleaned up transformation FilingsService - SQL - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_d , ETC_PREPAID , DE_ENROLL_JANUARY , DE_ENROLL_FEBRUARY , DE_ENROLL_MARCH , DE_ENROLL_APRIL , DE_ENROLL_MAY , DE_ENROLL_JUNE , DE_ENROLL_JULY , DE_ENROLL_AUGUST , DE_ENROLL_SEPTEMBER , DE_ENROLL_OCTOBER , DE_ENROLL_NOVEMBER , DE_ENROLL_DECEMBER FROM FilingsService ORDER BY DATA_YEAR, SAC with id 43cf3cb7-0be9-4a43-95cc-6dd076b767ce from Mon Feb 23 12:00:33 EST 2015, diff=240
2015/02/23 16:07:07 - Carte - Cleaned up transformation FilingsService - SQL - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_b , section_2_c , section_2_d , section_2_e , section_2_f , section_2_g , section_2_h , section_2_i , section_2_j , section_2_k , section_2_l , section_3_m , section_3_n , section_3_o , eligibility_source , SECTION_2_C_CERT FROM FilingsService ORDER BY DATA_YEAR, SAC with id 960deb4c-6ce6-405e-be56-c5b423b3dac9 from Mon Feb 23 12:06:49 EST 2015, diff=240
2015/02/23 16:07:07 - Carte - Cleaned up transformation Form 555 Top Company UAT - SQL - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company UAT" ORDER BY DATA_YEAR, SAC with id c41f8884-ae62-406d-919a-ce837b058c94 from Mon Feb 23 12:06:48 EST 2015, diff=240
2015/02/23 16:07:07 - Carte - Cleaned up transformation FilingsService - Service - SELECT DATA_YEAR , SPIN , SAC , STATE , SAC_NAME , SPIN_NAME , HOLDING_COMPANY , FORM_STATUS , DATE_UPDATED , FILING_METHOD , UNIQUE_SAC_COUNT , section_2_a , section_2_b , section_2_c , section_2_d , section_2_e , section_2_f , section_2_g , section_2_h , section_2_i , section_2_j , section_2_k , section_2_l , section_3_m , section_3_n , section_3_o , eligibility_source , SECTION_2_C_CERT FROM FilingsService ORDER BY DATA_YEAR, SAC with id 866e2134-2f12-4193-a541-274cce4dce4a from Mon Feb 23 12:06:49 EST 2015, diff=240
2015/02/23 16:07:07 - Carte - Cleaned up transformation Form 555 Top Company UAT - Service - SELECT DATA_YEAR , SPIN , SAC , COMPANY_NAME , SAC_NAME , WIRELESS_INDICATOR , STATE , TOTAL_DISBURSEMENT , RANK FROM "Form 555 Top Company UAT" ORDER BY DATA_YEAR, SAC with id 3331786d-5f1a-4857-81ae-c4e1c8746640 from Mon Feb 23 12:06:48 EST 2015, diff=240
2015/02/23 16:08:31 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : The transformation specification for data service 'TestCouchDB' could not be found
2015/02/23 16:08:35 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service Form555FilingService-Prod, transformation: null
2015/02/23 16:08:35 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service Form 555 Top Company, transformation: null
2015/02/23 16:08:35 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service OracleTestDaniel, transformation: null
2015/02/23 16:08:35 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service PDITest, transformation: null
2015/02/23 16:08:35 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service PentahoDev, transformation: null
2015/02/23 16:08:36 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service testOraclePDIService, transformation: null
2015/02/23 16:08:36 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service testSQLServerPDI, transformation: null
2015/02/23 16:08:44 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : The transformation specification for data service 'TestCouchDB' could not be found
2015/02/23 16:08:48 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service Form555FilingService-Prod, transformation: null
2015/02/23 16:08:48 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service Form 555 Top Company, transformation: null
2015/02/23 16:08:48 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service OracleTestDaniel, transformation: null
2015/02/23 16:08:48 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service PDITest, transformation: null
2015/02/23 16:08:48 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service PentahoDev, transformation: null
2015/02/23 16:08:49 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service testOraclePDIService, transformation: null
2015/02/23 16:08:49 - Servlet - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unable to get fields for service testSQLServerPDI, transformation: null
2015/0



Thanks,
Padmakar

Get Email. Error Connecting To Host

Hi,

I have a Windows Live Mail account configured for sending/receiving my emails. I am trying to use "Get mail" with the settings specified in Windows Live Mail, but I cannot get through. It throws an error like "Cannot connect to host [xx.xxx.co.in]". I am entering the following:

Source host = xx.xxx.co.in (as specified in Windows Live Mail Account -> Properties -> Servers -> Incoming Mail (POP3))
User name and password = as specified in Windows Live Mail Account -> Properties -> Servers -> Incoming Mail Server -> User Name & Password
SSL = True
Port = 995
Protocol = POP3

The email account is actually an official email account assigned to me. When I click Test Connection, it fails. What could be the possible reasons for this failure?

Issue in Kettle: exit code 1 even after the job finishes successfully

Hi,

I have created a Kettle master job. When I execute the job it starts with "Kitchen - Start of run.", the transformation works perfectly fine, and at the end of the log I can see "Job execution finished", but my nohup bash command ends up with exit code 1.

I have used a "Success" entry at the end of my master job.

The only error I can see in the log is the 'Abort job' line below, and I don't understand what exactly it means.
Abort job - ERROR (version 4.3.0-stable, build 16786 from 2012-04-24 14.11.32 by buildguy) : Aborting job.

Thanks
Pankaj Gupta.

Making a varying threshold

Hi,

Assume I have a fact table about sales with a date of order and a date of delivery. I want to track the number of sales where "(date of delivery - date of order) > 10 days"; it's easy to do in the schema with a measure expression:

Code:

<Measure name="Too late" datatype="Integer" aggregator="sum" visible="true">
  <MeasureExpression>
    <SQL dialect="generic">
      <![CDATA[CASE WHEN (date_delivery - date_order) > 10 THEN 1 ELSE 0 END]]>
    </SQL>
  </MeasureExpression>
</Measure>

But what can I do if I want to make the threshold (10 days) variable? I have thought about dynamic schema processing, but what if only some users are allowed to vary it? Trying to be clearer:

- Alice displays the cube with a threshold of 10.

- Concurrently, Bob would like to simulate a threshold of 5 days; that can be done with DSP, but what about Alice? Will she still see a threshold of 10?
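
On the DSP idea, a minimal sketch: a dynamic schema processor rewrites the schema text per connection, so Alice and Bob can see different thresholds as long as they end up with distinct Mondrian connections and schema-cache entries (for example a threshold property in the connect string plus UseContentChecksum=true). The THRESHOLD token in the schema and the "threshold" connect-string property below are assumptions; FilterDynamicSchemaProcessor and Util.PropertyList come from Mondrian, but verify the exact signature against your Mondrian version.

Code:

// Sketch only: substitute a per-connection threshold into the schema,
// where the measure expression reads "... > THRESHOLD THEN 1 ELSE 0 END".
// Signature per mondrian.spi.impl.FilterDynamicSchemaProcessor; check it
// against the Mondrian version in use.
import java.io.InputStream;

import mondrian.olap.Util;
import mondrian.spi.impl.FilterDynamicSchemaProcessor;

public class ThresholdSchemaProcessor extends FilterDynamicSchemaProcessor {

    public String filter(String schemaUrl,
                         Util.PropertyList connectInfo,
                         InputStream stream) throws Exception {
        // Load the schema text as usual, then replace the placeholder token.
        String schema = super.filter(schemaUrl, connectInfo, stream);
        String threshold = connectInfo.get("threshold");
        if (threshold == null) {
            threshold = "10";   // default threshold in days
        }
        return schema.replace("THRESHOLD", threshold);
    }
}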

Thank you

data type error: the data type does not correspond to value meta [Timestamp]

I'm using the Sakila database and running a transformation trying to update one record as part of a slowly changing dimension exercise.

The error message:

Dimension lookup/update.0 - Caused by: java.lang.RuntimeException: create_date Timestamp : There was a data type error: the data type of java.util.Date object [Tue Feb 14 22:04:36 CST 2006] does not correspond to value meta [Timestamp]

The create_date columns in both the origin and destination tables are of type "datetime" in MySQL. The field is formatted in my transformation as a "Date" field with the format "yyyy-MM-dd HH:mm:ss".

The transformation worked to populate the dim_customer table without any problems, but trying to update one record in the same table throws the above error.

I can see the one record trying to update prior to the final step of Dimension Lookup / Update, and it looks as it should.

Using MySQL 5.6, Pentaho 5.0.1-stable. Complete Log:

2015/02/25 09:44:09 - dim_customers - Dispatching started for transformation [dim_customers]
2015/02/25 09:44:09 - sub_address - Dispatching started for transformation [sub_address]
2015/02/25 09:44:09 - get max date.0 - Finished reading query, closing connection.
2015/02/25 09:44:09 - get max date.0 - Finished processing (I=1, O=0, R=0, W=1, U=0, E=0)
2015/02/25 09:44:09 - Table input customer.0 - Finished reading query, closing connection.
2015/02/25 09:44:09 - Table input customer.0 - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Select values.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Mapping input specification.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Database lookup adress.0 - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Database lookup city.0 - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Database lookup country.0 - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Mapping output specification.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - address mapping.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Value Mapper - active.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Select values - order.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2015/02/25 09:44:09 - Dimension lookup/update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error
2015/02/25 09:44:09 - Dimension lookup/update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : java.lang.RuntimeException: Error serializing row to byte array
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.core.row.RowMeta.extractData(RowMeta.java:921)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.addToCache(DimensionLookup.java:1487)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.lookupValues(DimensionLookup.java:680)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.processRow(DimensionLookup.java:235)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at java.lang.Thread.run(Unknown Source)
2015/02/25 09:44:09 - Dimension lookup/update.0 - Caused by: java.lang.RuntimeException: create_date Timestamp : There was a data type error: the data type of java.util.Date object [Tue Feb 14 22:04:36 CST 2006] does not correspond to value meta [Timestamp]
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.core.row.value.ValueMetaTimestamp.writeData(ValueMetaTimestamp.java:494)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.core.row.RowMeta.writeData(RowMeta.java:532)
2015/02/25 09:44:09 - Dimension lookup/update.0 - at org.pentaho.di.core.row.RowMeta.extractData(RowMeta.java:914)
2015/02/25 09:44:09 - Dimension lookup/update.0 - ... 5 more
2015/02/25 09:44:09 - Dimension lookup/update.0 - Finished processing (I=1, O=0, R=1, W=0, U=1, E=1)
2015/02/25 09:44:09 - dim_customers - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2015/02/25 09:44:09 - Spoon - The transformation has finished!!
2015/02/25 09:44:09 - dim_customers - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2015/02/25 09:44:09 - dim_customers - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2015/02/25 09:44:09 - dim_customers - dim_customers
2015/02/25 09:44:09 - dim_customers - dim_customers

Thanks in advance.

Transformation slow on startup

Hi,
I have a long transformation with 119 steps. When I run the transformation, it takes a very long time before starting: as you can see in the log, the transformation is opened, but it takes more than six minutes before it starts (from 17:35 to 17:41).

Code:

2015/02/25 17:35:02 - Spoon - Trasformazione aperta.
2015/02/25 17:35:02 - Spoon - Esecuzione trasformazione [czndb]...
2015/02/25 17:35:02 - Spoon - Esecuzione trasformazione avviata.
2015/02/25 17:35:02 - czndb - Spedizione iniziata per la trasformazione [czndb]
2015/02/25 17:35:03 - Table output.0 - Connected to database [cznstat] (commit=1000)
2015/02/25 17:41:29 - DataArgomento.0 - Elaborazione terminata (I=0, O=0, R=1, W=1, U=0, E=0)

Once started, the transformation is very fast and ends in one minute:

Code:

2015/02/25 17:42:37 - Spoon - La trasformazione è terminata!
This is very annoying when testing and debugging. I modified the query to limit the number of rows, but there was no difference in time.
Why is it so slow on startup? What can I do to speed up the start?

Thank you, Francesco

Postgres Array Fields

I'm having issues inserting values, especially NULL, into a Postgres 9.3.4 array field. Example:

myfield TEXT[]

INSERT INTO mytable(myfield) VALUES (NULL)

The above statement works perfectly fine in psql, but running a transformation reveals this:

ERROR: column "myfield" is of type text[] but expression is of type character varying
HINT: You will need to rewrite or cast the expression.
STATEMENT: INSERT INTO mytable(myfield) VALUES ( $1)

I've tried setting stringtype=unspecified in Database Connection options, but still get the same errors.
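
A sketch of the casting workaround, using the table and column from the post: since the step binds the value as a plain varchar parameter, making the cast explicit in the SQL lets Postgres convert the value (or the NULL) server-side. How the value gets bound depends on the step used; a Table Output step builds its own INSERT, so this would mean issuing the statement from an "Execute SQL script" (or similar) step instead.

Code:

-- Sketch only: explicit cast so a varchar/NULL parameter becomes text[].
-- The ? is a prepared-statement placeholder, not psql syntax.
INSERT INTO mytable (myfield)
VALUES (CAST(? AS text[]));

-- Literal forms that psql already accepts:
INSERT INTO mytable (myfield) VALUES (NULL);
INSERT INTO mytable (myfield) VALUES ('{a,b,c}'::text[]);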

Date parameters in Measures

Hi.

I tried searching for this, but I found nothing.
I have an MDX query in which the columns axis is as follows:
Code:

NON EMPTY Hierarchize(Union(CrossJoin({[Time].[2013]}, {[Measures].[Sales]}), CrossJoin({[Time].[2014]}, {[Measures].[Sales]}))) ON COLUMNS
I need parameters to define the years used in the query (pYear1 and pYear2). Question: how do I use parameters in this case?
I tried functions like StrToSet and StrToMember, but obviously they did not work, because a measure is involved. Not even Parameter() worked.
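
A sketch of what a parameterised column axis might look like with Mondrian's member-typed Parameter() function (member and measure names follow the query above; how pYear1 and pYear2 actually receive their values depends on the client tool, and some front ends simply substitute parameter text into the MDX before it runs):

Code:

NON EMPTY Hierarchize(Union(
  CrossJoin({Parameter("pYear1", [Time], [Time].[2013], "first year")},  {[Measures].[Sales]}),
  CrossJoin({Parameter("pYear2", [Time], [Time].[2014], "second year")}, {[Measures].[Sales]})
)) ON COLUMNS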

SAP Input fails to be initialized (while it runs in preview)!

Hi,

I am using Pentaho version 5.0.1.
I created a simple transformation (see attachment) based on the SAP Input step.

While running a preview it works fine, but when I run the transformation I systematically get the following error:

Erro_run_transformation.jpg

Then, when I do the preview, it works perfectly fine; I retrieve data just as if I were running the transformation.

The connection to the SAP system is also OK (when I click the "Test" button).

Can anyone help me solve this?

Thanks a lot!
Julien

JSON Input for 1000 records running slowly

Hey there. I'm trying to figure out a way to speed up the slowest step in this big chain: parsing a JSON input into rows.

I'm grabbing 22 elements for about 1,000 records per JSON page, and it's taking 90-120 seconds per page. There are an unknown number of pages, but I can approximate maybe 100-200.

Most of the elements are a sensible size; some may be slightly long, but nothing like paragraphs.

How can I speed this up in Pentaho? I can't simply run multiple JSON Input steps grabbing the same JSON page; that doesn't work out.




The only option I can think of is to run the JSON Input on different pages at a time. The problem is that fetching the next page requires a cursory (fast) parse of the current page to get the end timestamp.
I guess I could do the fetch/quick-parse/fetch/quick-parse cycle for the 100-200 pages first, then begin the actual heavy parsing of each page, but that would certainly overcomplicate the Pentaho setup greatly in the name of saving time.

The other option I can think of is to keep the looping structure (fetch next page, parse the JSON page completely, fetch next page) but somehow break a page into 3-4 documents to be parsed in parallel before proceeding to fetch the next page. However, I'm not sure of the best way to accomplish this. Perhaps there's no simple answer.

Mail sent but no attachment

Dear all,

I have several Kettle transformations that I have put together to get an Excel file dynamically and email it to myself every day.

When the job runs, it runs successfully but without any attachments.

Below are the logs produced by some of the jobs and transformations.

I have put the transformation on Dropbox; the link is below:

https://www.dropbox.com/sh/mec6nxvsm...0N6IOJj9a?dl=0

Can someone please explain to me what I am not doing right?



Thanks,

Ron

Odoo active_id

I want to pass a parameter to my report so that it prints only the row selected in the Odoo UI. Is there any way to do this?
Many thanks

Pentaho CE SSL

Hello!

I recently implemented HTTPS instead of HTTP for access to my Pentaho BI Server Community Edition (version 5.0.1).
Since then, on a fresh access to the BI server I get a re-authentication prompt (probably issued by Pentaho,
since the message contains "Pentaho Realm") after logging in. It seems that when a new JSESSIONID is issued I cannot
authenticate properly. I can access the server once I cancel the authentication prompt and log in again.
I have Apache proxying to Tomcat via the AJP protocol. Do I need to change any setting on the Pentaho side?

Please help me. Any information will be of great help.

Thank you!

Using External Java in 'User Defined Java Class' Step

Hi,

Essentially, all I want to be able to do is use the Charset Java type in the 'User Defined Java Class' step.

Currently what I have done is:
- Created a libext folder in my Kettle directory.
- Copied charset.jar to libext.
- Modified the launcher.properties file to read from libext as well as the defaults.
- Attempted to import java.nio.charset.Charset, java.nio.*, or java.nio.charset.* into my class.
- When I hit 'Test class' I get this error: Cannot determine simple type name "charset"

I have copied the entire 'Class' below.
Any pointers would be appreciated as I haven't attempted to use External Java before.

import java.nio.charset.Charset;

String textIn;

public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException
{
    Object[] r = getRow();

    if (r == null) {
        setOutputDone();
        return false;
    }

    if (first) {
        textIn = getParameter("TEXT_PART");
        first = false;
    }

    Object[] outputRow = createOutputRow(r, data.outputRowMeta.size());

    // Re-interpret the field's characters as Cp866. Note the capital C in
    // "Charset": the lowercase "charset" in the original code is what causes
    // "Cannot determine simple type name". Assumption: the text was read as
    // ISO-8859-1 (one byte per character), so getBytes() recovers the raw bytes.
    String text = get(Fields.In, "TEXT_PART").getString(r);
    byte[] bytes = text.getBytes(Charset.forName("ISO-8859-1"));

    Charset ch = Charset.forName("Cp866");
    String value = new String(bytes, ch);

    get(Fields.Out, "TEXT_PART").setValue(outputRow, value);

    putRow(data.outputRowMeta, outputRow);

    return true;
}

How to generate a Pentaho Report Designer .prpt file programmatically, without the GUI

My question is: how can I generate Pentaho Report Designer's saved .prpt file programmatically, without using Report Designer itself? Are there any libraries for that?
I need to generate those files programmatically and later open them in Report Designer to fix some values.
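
One possible route, sketched: the Pentaho Reporting engine (the library Report Designer itself is built on) can create a MasterReport in code and write it out as a .prpt bundle. The class and method names below are from the reporting engine's SDK as commonly used, but treat the exact packages and the BundleWriter call as assumptions to verify against the engine version you target.

Code:

// Sketch only: build an (empty) report definition and save it as a .prpt file.
// Verify class and package names against the Pentaho Reporting SDK in use.
import java.io.File;

import org.pentaho.reporting.engine.classic.core.ClassicEngineBoot;
import org.pentaho.reporting.engine.classic.core.MasterReport;
import org.pentaho.reporting.engine.classic.core.modules.parser.bundle.writer.BundleWriter;

public class PrptGenerator {
    public static void main(String[] args) throws Exception {
        ClassicEngineBoot.getInstance().start();   // boot the reporting engine once per JVM

        MasterReport report = new MasterReport();
        report.setName("Generated report");        // bands, elements and queries would be added here

        BundleWriter.writeReportToZipFile(report, new File("generated.prpt"));
    }
}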