Channel: Pentaho Community Forums

Scheduling in PUC

I wanted to know whether it is possible to schedule a report whose prompts each have multiple values, i.e. one schedule that automatically crosses all the parameter values, without having to create a separate schedule for every combination.

example:
field1: year( 2005, 2006)
field2: c_agente( A, B, C)
field3: cd_articles (012, 065, 073)

report 1 for 2005 -A - 012
report 2 for 2005 - B - 012
report 3 for 2005 - C - 012
report 4 for 2006 - A - 012 etc....

The scheduler should cross all the parameter values and give me a separate output for each combination, because if I have to create one schedule per combination, with a hundred values per field I will never finish.

thanks
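For illustration only (PUC has no cross-product scheduling option that the post identifies): one workaround is to schedule a single shell script that itself loops over every parameter value and triggers the report once per combination. `run_report` here is a hypothetical placeholder for the real call (e.g. a Kitchen/Pan invocation or a REST request to the server):

```shell
# Hypothetical driver: cross all parameter values and trigger the
# report once per combination (2 years x 3 agents x 3 articles = 18).
run_report() {
  echo "report for $1 - $2 - $3"   # placeholder for the real call
}

count=0
for year in 2005 2006; do
  for agente in A B C; do
    for article in 012 065 073; do
      run_report "$year" "$agente" "$article"
      count=$((count + 1))
    done
  done
done
echo "generated $count reports"
```

One scheduled entry then covers every combination, and adding a value to any field only changes one loop list.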

Clob to xmltype casting

Hi,

I'm very new to PDI. I need some help with XMLType casting: does Kettle support the XMLType data type?
Thanks in advance.

Passing rows from A->B->C

I know how to pass rows from transformation A to B in a job using Copy rows to result and Get rows from result, and I have been using it without major problems.

Today I was faced with a job with three transformations: A reads the rows, B processes them, and C processes the same rows in a different way. I thought it would all run smoothly, but I found that transformation C is called n times (as many times as there are rows after A), every time with the same values: only the last row is processed, n times!

I tried checking "Clear list of result rows before execution" on B and adding a Copy rows to result step there. Same behaviour.
I tried leaving "Clear list ..." unchecked on B while keeping the "Copy rows to result" step. Now I get n x n rows (as expected), all of them with the last row's values.

Any idea what I'm doing wrong?

Thanks

JOB
Start -> TA -> TB (execute for every row) -> TC (execute for every row)

TA: data grid (3 rows, values 1 2 & 3), log (3 rows displayed), copy rows to result
TB: get rows from result, log (3 rows displayed, values 1 2 & 3)
TC: get rows from result, log (3 rows displayed, values 3, 3 & 3)

How to set default logging level for Kettle transformations run on BI server?

Hi!

I'm trying to figure out why a Kettle transformation doesn't return the desired result when used as a data source in a CDA file. To see in detail what the transformation does, I would like to increase the logging level for Kettle. Right now it only logs INFO lines to pentaho.log on the BI server. How do I change the default logging level for Kettle transformations/jobs on the Tomcat BI server?
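Not something the thread confirms, but on a 5.x BI server the embedded Kettle engine logs through log4j, so one candidate knob is the log4j configuration (commonly tomcat/webapps/pentaho/WEB-INF/classes/log4j.xml; the path is an assumption about a stock install). Adding a category for the Kettle packages and restarting Tomcat would look roughly like:

```xml
<!-- Sketch, assuming the stock log4j.xml of a 5.x BI server install.
     DEBUG makes the embedded Kettle engine write step-level detail
     to pentaho.log instead of INFO lines only. -->
<category name="org.pentaho.di">
  <priority value="DEBUG"/>
</category>
```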

PostgreSQL Bulk Loader - Path to PSQL client

Hi,

I have a transformation loading data into a PostgreSQL database, and I use the PostgreSQL Bulk Loader step to get a faster load.

The transformation's PostgreSQL Bulk Loader step is set to run in clustered mode.

My clustered environment has 2 slave servers (1 Windows and 1 Ubuntu Linux), and they have the PostgreSQL software installed in different locations.

So I'm stuck: how can I set the 'Path to PSQL client' property to a single value that works on both slave servers?

Can someone suggest an option here?

Thanks.
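One pattern worth trying (a sketch under assumptions, not a tested recipe): put a Kettle variable such as ${PSQL_PATH} in the step's 'Path to PSQL client' field, and give it a machine-specific value in each slave's kettle.properties. The install paths below are hypothetical examples:

```properties
# On the Windows slave (%USERPROFILE%\.kettle\kettle.properties)
# -- hypothetical install location:
PSQL_PATH=C:/Program Files/PostgreSQL/9.4/bin/psql.exe

# On the Ubuntu slave (~/.kettle/kettle.properties)
# -- hypothetical install location:
PSQL_PATH=/usr/bin/psql
```

Each Carte slave reads its own kettle.properties at startup, so the same transformation should resolve ${PSQL_PATH} differently on each machine.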

report into BI CE

Hi
Is it possible to install the report engine and Report Designer 5.4.0 into the bi-server 5.4.0.1 Community Edition?

Thanks

Make report from multiple databases

Hi,

Thank you for this forum on Pentaho.
I'm sorry for my English; I'm a French speaker.

I'm using Pentaho Report Designer and I can build a report using a single database.
I would like to know how to build a report using multiple databases.

For example, when I tried to connect 2 databases, I see the 2 connections under Data Sets, but only one is active (see attached file).

I just want to know how to use both. I'm using MySQL over JDBC.

I'm open to other solutions if this is not possible with Pentaho Reporting.

Regards,
Attached Images

Execute job in windows cmd

Hello.

When I run this command directly in the Windows CMD:
F:/[PATH]/Kitchen.bat /file:F:/[PATH]/jobs/job1.kjb
it succeeds.

But if I try to execute my jobs from a .bat file with the same command:
F:/[PATH]/Kitchen.bat /file:F:/[PATH]/jobs/job1.kjb

I get this error message:

C:\[PATH]\Job 2>F:/V/Kitchen.bat /file:F:/[PATH]/jobs/job1.kjb
WARNING: Using java from path
DEBUG: _PENTAHO_JAVA_HOME=
DEBUG: _PENTAHO_JAVA=java.exe
C:\Users\solvian\.jenkins\workspace\Job 2
The system cannot find the path specified.
The system cannot find the path specified.

C:\Users\solvian\.jenkins\workspace\Job 2>"java.exe" "-Xmx512m" "-XX:MaxPermSize=256m" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-5.4.0.1-130.jar -lib ..\libswt\win64 -main org.pentaho.di.kitchen.Kitchen /file:F:/[PATH]/job1.kjb
Error: Unable to access jarfile launcher\pentaho-application-launcher-5.4.0.1-130.jar

Can anyone help me?
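For what it's worth, the "Unable to access jarfile launcher\pentaho-application-launcher-..." line suggests Kitchen.bat resolves the launcher jar relative to the current working directory, which under Jenkins is C:\Users\solvian\.jenkins\workspace\Job 2 rather than the Kettle directory. An untested sketch of a wrapper .bat that changes into the Kettle directory first (the [PATH] placeholders are kept from the post):

```bat
@echo off
rem Switch drive and directory to the Kettle installation first,
rem so Kitchen.bat can find launcher\*.jar via its relative path.
cd /d "F:\[PATH]"
call Kitchen.bat /file:F:/[PATH]/jobs/job1.kjb
rem Propagate Kitchen's exit code back to Jenkins.
exit /b %ERRORLEVEL%
```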




Upgrade path to 15.06.30 CTools release (on 5.3)

Hi -

If we're still on 5.3 CE/15.04.16 CTools, what is the recommended upgrade path to get the latest CTools?

I found this document:
http://redmine.webdetails.org/projec...wiki/RequireJS

which describes some backwards-incompatible api-level changes that seem to have been made in the latest version. Is this a complete list, or are there other compatibility or porting issues we should know about before attempting to upgrade to 15.06.30 from 15.04.16 version? Does anyone else have any experience in doing this porting?

thanks!

Performance issue with Mondrian / Olap4J.

I am experiencing a performance issue with Mondrian / Olap4J. The dataset is small, the tables are an optimized star schema. Database query has a very low cost, and according to the MDX logs the execution is < 1 second. However the data set is not returned for up to 10 seconds. The bottleneck appears to be when Olap4J gets the response (cell set) from the XMLA request and populates the cell set axis and cells. Has anyone seen this before? Any thoughts on how to increase performance?

Thanks,

Chris

DI Help

How can I find out how many users have the same transformations or jobs open in the Pentaho DI repository?

Suppose I have 10 users and 3 of them have opened the same transformation or job; how would I know which users have my transformation/job open?

Any help is appreciated..

DI Error

Hi,

When I import jobs and transformations using the command-line utility, I see the issues below. How can I overcome this?

The command I ran:

sh import.sh -rep=xxxxx -user=yyyyyy -pass=xxxxx -dir=/ -file=/tmp/CPR.xml -norules -comment="Job Import"


- Unable to save repository element [172.21.35.888]
2015/08/01 09:56:46 - exception while moving file with id "5a3be3cd-369f-49b7-8e94-b49d3ad5ac93" to destination path "/etc/pdi/slaveServers/172.21.35.888.ksl"
2015/08/01 09:56:46 -
2015/08/01 09:56:46 - Reference number: 1e30554e-805a-461f-8b3f-3074ea73c9bf

Problem with execute xaction in CDE 5.4.0.1

Hi,
I can't execute an xaction in CDE 5.4.0.1.
The problem seems to be caused by an incorrect path to the repository.
For example:
I want to execute a.xjpivot which is in folder Public/pivots

So
Solution : Public
Path: pivots
Action: a.xjpivot

But in catalina.out I get
Code:

ERROR [ActionEngine] java.lang.NullPointerException
I managed to execute xactions in older versions of CDE (like 5.2), but I'm unable to do it with 5.4.0.1.
What am I doing wrong?

Thanks

carles

Connection has been shutdown -- Salesforce Upsert

Hi,

I am trying to perform an upsert to Salesforce using the Salesforce Upsert step. The transformation works fine when the number of records is around 2,000, but when I try to load around 5,000 records I get the following error in the generated failure file:


Error while doing upsert operation, error message was: javax.net.ssl.SSLException: Connection has been shutdown: javax.net.ssl.SSLException: java.net.SocketException: Connection reset by peer: socket write error

None of the records get upserted, and the generated error files are huge: the input file is 450 KB and the corresponding error file is 500 MB!

Please let me know how to resolve this issue.

Thanks in advance.

Update table input for errors while writing to table output

Hi,

I have a source table A with an error flag (IS_ERROR). I want to update IS_ERROR to 'Y' when loading data from table A to target table B fails. At the same time, I want to insert dummy records into table B instead of stopping the transformation and rolling back.

Here is the flow of events:

1) Successful scenario: all records are copied from table A to table B (working).

2) Error scenario:
a) update table A, setting IS_ERROR='Y' only for the failed record
b) insert a dummy record into table B for the failed record.

2015-08-03 14_42_36-Spoon - [Repo] STG_To_ITF (changed).jpg

How can I achieve both actions for event (2)?

Japanese character cannot display in PDF download

Dear forum,

I have a dashboard created with CTools, and a graph created with the Report Designer tool.
On the dashboard, I display the Report Designer graph using the "PRPT Component" of CTools.
I also use that graph with the "Execute PRPT Component" of CTools, and I made the graph available for download in PDF format.

My problem is this:
my graph title is in Japanese characters, and it displays fine on the dashboard graph and in the Execute PRPT Component.

Fig 1: On the dashboard with the "PRPT Component".

Fig 2: In the "Execute PRPT Component".

But the Japanese characters are not displayed when the graph is downloaded as PDF.

Fig 3: In the PDF download.

I have tried to solve the problem as follows:
1) Set UTF-8 in File >> Configuration >> output-pageable-pdf >> ~.Encoding (in the Report Designer tool)
2) Set the following two settings in /server/webapps/ods/WEB-INF/classes/classic-engine.properties:
- org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.Encoding=UTF-8
- org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.EmbedFonts=true

But it still does not work.

Could you please help me solve this problem?

(Sorry that I cannot upload my images; there is some problem.)

Count of Columns in text file

Hello All,

I want to count the columns in an input text file. Is there a way to access the input file's metadata? If not, having just the column count would be enough.
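Outside of PDI, a quick sketch of the idea for a delimited file is to split the first line on the delimiter and count the fields (assumes a comma delimiter and no delimiters embedded inside quoted values; the sample file is made up for the demo):

```shell
# Create a throwaway sample file just for the demonstration.
printf 'id,name,amount\nrow1,foo,10\n' > /tmp/sample.csv

# Count the columns: split the header line on ',' and print the
# number of fields (NF). Change -F to '\t', ';', etc. as needed.
cols=$(head -n 1 /tmp/sample.csv | awk -F',' '{print NF}')
echo "columns: $cols"
```

Inside a transformation, a similar trick is to read each line as a single field and count delimiter occurrences in a scripting step, but the one-liner is usually enough for a quick check.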

CDE Pivot Component

Hi forum !

I have some questions about the Pivot Component in Pentaho CDE. I'm actually totally lost: how does it work, exactly?
What is an xaction? Do I need to install anything else to get it working?

Thank you in advance for your help !

Does Metadata Editor understand the PK/FK relationships?

Since we have PK/FK relationships defined in the database (Postgres), when we drag tables onto the Metadata Editor canvas we expect it to pick up the relationship information from the database, so we don't have to define it again manually in the tool. Being new to Pentaho, are we missing something in the functionality - some check box we need to enable, etc.?

Thank you,
-- Alex

google analytics

Hi,

It seems that a few PDI developers are working with Google Analytics. Is it free of charge for users? Could you please guide me?

I went through Google and the web pages but haven't found exact information regarding Google Analytics; please share your input.

Thank you