Channel: Pentaho Community Forums

How to add image column in Table component

Hello Experts,
I am new to CDE in Pentaho. How do I add an image column to my Table component, as shown in the attached screenshot? Please advise; thanks in advance.
[Attached screenshot: addin.PNG]

Pentaho Reporting Output - Streaming HTML error

Hello!

We want to upgrade our Pentaho environment from version 5.1 to version 8.3.
While testing we encountered a problem with the Pentaho Reporting Output step in DI.

Put simply:
We use a Pentaho report to generate an HTML file that we place in the mail body when sending mails to our clients.
Because the HTML is sent in a mail step, we need the Streaming HTML output type so that no separate CSS file is produced.

Our job (which runs fine in version 5.1) stopped working in version 8.3, so I built a small transformation and a small report to reproduce the problem:

Report: no parameters, just a label in the report header that says "Hello!"
Transformation: only the input and output paths are set. Input --> "C:\Temp\test.prpt", Output --> "C:\Temp\Output\test.html"

This results in the following error:
2019/08/05 11:08:54 - Pentaho reporting output.0 - ERROR (version 8.3.0.0-371, build 8.3.0.0-371 from 2019-06-11 11.09.08 by buildguy) : Unexpected error
2019/08/05 11:08:54 - Pentaho reporting output.0 - ERROR (version 8.3.0.0-371, build 8.3.0.0-371 from 2019-06-11 11.09.08 by buildguy) : org.pentaho.di.core.exception.KettleException:
2019/08/05 11:08:54 - Pentaho reporting output.0 - There was an unexpected error processing report 'C:\Temp\test.prpt' to produce file 'C:\Temp\Output\test.html' with processor: Streaming HTML.
2019/08/05 11:08:54 - Pentaho reporting output.0 - Failed to process the report
2019/08/05 11:08:54 - Pentaho reporting output.0 -
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:418)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:143)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at java.lang.Thread.run(Thread.java:748)
2019/08/05 11:08:54 - Pentaho reporting output.0 - Caused by: org.pentaho.reporting.engine.classic.core.ReportProcessingException: Failed to process the report
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1479)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput$5.execute(PentahoReportingOutput.java:323)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.ReportExportTask.run(ReportExportTask.java:111)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:399)
2019/08/05 11:08:54 - Pentaho reporting output.0 - ... 3 more
2019/08/05 11:08:54 - Pentaho reporting output.0 - Caused by: org.pentaho.reporting.engine.classic.core.InvalidReportStateException: Other failure
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.html.FastHtmlExportTemplate.write(FastHtmlExportTemplate.java:57)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.FastExportOutputFunction.reportStarted(FastExportOutputFunction.java:51)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.fireReportStartedEvent(ExpressionEventHelper.java:368)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.fireReportEvent(ExpressionEventHelper.java:57)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportEvent(InitialLayoutProcess.java:175)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.SubLayoutProcess.fireReportEvent(SubLayoutProcess.java:172)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.fireReportEvent(ProcessState.java:1120)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.process.ReportHeaderHandler.advance(ReportHeaderHandler.java:43)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.advance(ProcessState.java:948)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processPage(AbstractReportProcessor.java:1126)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1452)
2019/08/05 11:08:54 - Pentaho reporting output.0 - ... 6 more
2019/08/05 11:08:54 - Pentaho reporting output.0 - Caused by: java.lang.IllegalArgumentException: The name given is not valid.
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.di.trans.steps.pentahoreporting.urlrepository.FileObjectContentLocation.createItem(FileObjectContentLocation.java:152)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.html.FastHtmlPrinter.print(FastHtmlPrinter.java:130)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.html.FastHtmlFormattedDataBuilder.compute(FastHtmlFormattedDataBuilder.java:54)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.html.FastHtmlContentProducerTemplate.writeContent(FastHtmlContentProducerTemplate.java:52)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.template.AbstractContentProducerTemplate.write(AbstractContentProducerTemplate.java:61)
2019/08/05 11:08:54 - Pentaho reporting output.0 - at org.pentaho.reporting.engine.classic.core.modules.output.fast.html.FastHtmlExportTemplate.write(FastHtmlExportTemplate.java:53)
2019/08/05 11:08:54 - Pentaho reporting output.0 - ... 16 more
2019/08/05 11:08:54 - Pentaho reporting output.0 - child index = 1, logging object : org.pentaho.di.core.logging.LoggingObject@70f21b6a parent=25f0b469-9269-41fa-891e-efdc1b63d3c4
2019/08/05 11:08:54 - Pentaho reporting output.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2019/08/05 11:08:54 - TestReport - searching for annotations
2019/08/05 11:08:54 - TestReport - no annotations found
2019/08/05 11:08:54 - TestReport - ERROR (version 8.3.0.0-371, build 8.3.0.0-371 from 2019-06-11 11.09.08 by buildguy) : Errors detected!
2019/08/05 11:08:54 - Spoon - The transformation has finished!!
2019/08/05 11:08:54 - TestReport - ERROR (version 8.3.0.0-371, build 8.3.0.0-371 from 2019-06-11 11.09.08 by buildguy) : Errors detected!
2019/08/05 11:08:54 - TestReport - ERROR (version 8.3.0.0-371, build 8.3.0.0-371 from 2019-06-11 11.09.08 by buildguy) : Errors detected!
2019/08/05 11:08:54 - TestReport - Transformation detected one or more steps with errors.
2019/08/05 11:08:54 - TestReport - Transformation is killing the other steps!
2019/08/05 11:08:54 - TestReport - Looking at step: Table input
2019/08/05 11:08:54 - TestReport - Looking at step: Add constants
2019/08/05 11:08:54 - TestReport - Looking at step: Pentaho reporting output


I can't seem to find any information on this error. How can I resolve it?

Thanks in advance!

Best way to perform an equivalent of UNION in MySQL in PDI

Hi,

I am trying to work out the best way to perform the equivalent of a UNION in PDI. I have three Table Input steps with three different queries, each returning two columns, Date and Tickets. I would like to combine the results of the streams and add up the values.

I am currently funnelling all three streams into a Dummy step and then using a Group By step to add up the values. I am not sure whether there is a better way to do this than the Dummy step, and I would appreciate any input.

I also tried the Multiway Merge Join step with a FULL OUTER JOIN, but that gives me the columns Date_1, Tickets_1, Date_2, Tickets_2, Date_3, Tickets_3, so I am not sure I am doing it right.
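
To make the intended result concrete, here is a minimal Python/pandas sketch of the UNION-plus-aggregate I am trying to reproduce with PDI steps. The sample frames simply stand in for the three Table Input streams, and summing Tickets per Date is my reading of "add the values":
Code:

# Minimal sketch: append the three streams (UNION ALL), then sum Tickets per Date.
# The frame names and sample values are hypothetical placeholders.
import pandas as pd

stream_1 = pd.DataFrame({"Date": ["2019-08-01", "2019-08-02"], "Tickets": [3, 5]})
stream_2 = pd.DataFrame({"Date": ["2019-08-01"], "Tickets": [2]})
stream_3 = pd.DataFrame({"Date": ["2019-08-02"], "Tickets": [4]})

# UNION ALL of the three streams, then aggregate.
combined = pd.concat([stream_1, stream_2, stream_3], ignore_index=True)
result = combined.groupby("Date", as_index=False)["Tickets"].sum()
print(result)   # 2019-08-01 -> 5, 2019-08-02 -> 9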

PDI SPOON VERSION: 7.0.0

Thank you,
Malavika

Extract log from Gitlab

Hi mates,
Has anybody tried to extract the log data from GitLab and load it into a warehouse?
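
Something along these lines is what I have in mind, assuming the project event history is the log in question. This is a rough Python sketch against the GitLab REST API; the instance URL, project ID and token are placeholders, and the warehouse load itself is left out:
Code:

# Rough sketch: pull project event history from the GitLab REST API so it can
# be staged and then loaded into a warehouse. URL, project ID and token are placeholders.
import requests

GITLAB_URL = "https://gitlab.example.com"   # placeholder instance URL
PROJECT_ID = 123                            # placeholder project id
TOKEN = "your-private-token"                # placeholder access token

def fetch_project_events(page=1, per_page=100):
    """Return one page of project events (pushes, merges, comments, ...)."""
    resp = requests.get(
        f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/events",
        headers={"PRIVATE-TOKEN": TOKEN},
        params={"page": page, "per_page": per_page},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

for ev in fetch_project_events():
    # Each event carries an author, an action and a timestamp; flatten these
    # into rows before loading them into a warehouse staging table.
    print(ev.get("created_at"), ev.get("author_username"), ev.get("action_name"))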

Thanks in advance.

Kitchen issue

Hi all,

I have an issue using Kitchen.
When I run a job from the Spoon GUI it works like a charm, but when I run the same job with Kitchen I get an error.
I suspect a JAR file is missing somewhere, but I don't know where...

The error is the following:
Code:

Unexpected error during transformation metadata load
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS - Error reading object from XML file
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS - Error reading object from XML file
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS - Unable to load database connection info from XML node
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS  - Unable to create new database interface
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS - database type with plugin id [IMPALASIMBA] couldn't be found!
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1356)
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:690)
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.Job.execute(Job.java:732)
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.Job.execute(Job.java:873)
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.Job.execute(Job.java:547)
2019/08/06 17:20:52 - SOD_XXXXX_ORDERS -      at org.pentaho.di.job.Job.run(Job.java:436)

Some information about the environment:
- OS Windows Server 2016 Standard
- Java Version 1.8.0_144
- PDI Version 8.0
- No repository; we run jobs and transformations from files (Kitchen.bat /file: ...)

Of course, all the jobs that are run through a batch file and that don't read data from Hadoop work fine.


Any help would be greatly appreciated

Best regards

Alessio

Searching inside a Pentaho repository

Good afternoon,

I would like to know whether anyone knows how to search inside a repository of PDI transformations and jobs in version 8.

Today I have a database repository with Kettle 7, and since it sits on a PostgreSQL database I can connect to it and run queries and searches against the r_job, r_transformation, r_step_attribute tables, etc.

With the new "Pentaho repository" I don't know how to do anything similar.


Many thanks!

Pentaho repository - Search inside?

Hi all,

I have been working with database repositories in several versions of PDI/Kettle (4, 5 and 7). They were stored in both MySQL and PostgreSQL engines and I could query the tables that stored all the data (r_job, r_transformation, r_step, r_step_attribute, etc.)

This was extremely useful for locating transformations and also for making changes to dozens of jobs at the same time without editing them in PDI.


I don't know how to do something equivalent in the new Pentaho Repository that you connect to with PDI 8. We have migrated all the content of a database repository (v7) to a new Pentaho Repository. Loading the tree of directories and jobs/transformations is much faster now, but being able to search inside the repository is critical for us.

Do you know any way we can solve this?
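
One idea would be to go through the Pentaho Server REST API instead of the repository database. Below is a rough Python sketch of that approach; the /api/repo/files/tree endpoint, its parameters and the JSON response shape are assumptions based on my reading of the server REST documentation, and the host and credentials are placeholders, so please treat it as an untested sketch. Note that it only matches object names, not the job/transformation XML itself, which is the part I really miss:
Code:

# Rough sketch: list the Pentaho Repository tree over the server REST API and
# filter entries by name, as a stand-in for the SQL searches we used to run
# against the old r_job / r_transformation tables. Host and credentials are placeholders.
import requests

BASE_URL = "http://localhost:8080/pentaho"   # placeholder server URL
AUTH = ("admin", "password")                 # placeholder credentials

def list_repo_tree(depth=-1):
    """Fetch the repository tree as JSON (depth=-1 is assumed to mean 'unlimited')."""
    resp = requests.get(
        f"{BASE_URL}/api/repo/files/tree",
        params={"depth": depth, "showHidden": "false"},
        headers={"Accept": "application/json"},
        auth=AUTH,
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

def walk(node, results, needle):
    """Recursively collect repository paths whose name contains `needle`."""
    f = node.get("file", {})
    if needle.lower() in f.get("name", "").lower():
        results.append(f.get("path"))
    for child in node.get("children", []):
        walk(child, results, needle)

matches = []
walk(list_repo_tree(), matches, "customer")   # e.g. find objects named *customer*
print("\n".join(str(m) for m in matches))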


Thanks!

Merging delimited text files with maximum value found

Hello,

I am trying to merge multiple text files by keeping the maximum value found at each row and column position.

All the text files have exactly the same number of rows and columns, as shown in the attached image and sample data.
The twa column always runs from 0 to 180 in increments of 5 degrees.

[Attached image: polar values.jpg]

Say for example:
  • the values of all the tws columns are initially zero
  • the first text file has values as shown in the attached image
  • if a subsequent text file has a value at a particular row/column that is greater than the previous value, update it.
    So say a subsequent text file has a value of 5 in the tws14 column, twa=40 row, then update it (yellow highlight would become 5)
  • if a subsequent text file had a lower value (say 3) in this position, then nothing would change


I have tried writing the data to a database and using various lookups/filters, database update queries, etc., without success.
The end goal is to write a text file in the same format as the originals, which I am happy to do once I have the data in a suitable form (say a database table).
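
To make the intent concrete, this is a rough Python sketch of the element-wise maximum merge I am after (the file name pattern and the tab delimiter are assumptions based on my sample data):
Code:

# Rough sketch: element-wise maximum across several identically shaped
# delimited text files. File name pattern and tab delimiter are assumptions.
import csv
import glob

files = sorted(glob.glob("polar_*.txt"))    # hypothetical input file names

header = None
merged = None
for path in files:
    with open(path, newline="") as fh:
        rows = list(csv.reader(fh, delimiter="\t"))
    if merged is None:
        header, merged = rows[0], rows[1:]
        continue
    for r, row in enumerate(rows[1:]):
        # Column 0 is twa (0-180 in 5-degree steps); keep it, and take the
        # maximum of each tws column seen so far.
        merged[r] = [row[0]] + [
            format(max(float(a), float(b)), "g")
            for a, b in zip(merged[r][1:], row[1:])
        ]

if merged is not None:
    with open("merged.txt", "w", newline="") as out:
        writer = csv.writer(out, delimiter="\t")
        writer.writerow(header)
        writer.writerows(merged)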

I suspect there is a simpler way to achieve this than my efforts so far, so if someone can point me in the right direction I would appreciate it.

thank you in advance

Andrew