Channel: Pentaho Community Forums
Viewing all 16689 articles

Loading hierarchy of objects to DB knowing patterns to match

I have a set of information that's going into a DB table.
Now I need to build up information about their hierarchy (parents) using some constraints and a database sequence column.

(Attached image: case length substring hierarchy.jpg)

id_parent is the column to be created, based on a lookup within the same stream that matches a substring of info_lokalizacja (shown as rectangles in different colors in the attached image to illustrate the lookup).

Based on the length of the string in the info_lokalizacja column, I need to select id_db_seq from the row in the same stream whose entire info_lokalizacja value matches a substring of the current row's value.
This is a typical child-parent implementation.
I could probably duplicate the stream and do all of this in JavaScript, but I'm not sure that's the best approach. I consider JavaScript coding the last resort in Kettle, for when no built-in steps can achieve the task.

I basically need to implement something equivalent to the SQL below.

Code:

INSERT INTO table(...)
  SELECT
    t.id,
    CASE
      WHEN length(t.info_lokalizacja) = 4  THEN NULL
      WHEN length(t.info_lokalizacja) = 7  THEN (SELECT p.id FROM table2 p WHERE p.info_lokalizacja = left(t.info_lokalizacja, 4))
      WHEN length(t.info_lokalizacja) = 11 THEN (SELECT p.id FROM table2 p WHERE p.info_lokalizacja = left(t.info_lokalizacja, 7))
    END AS id_parent
  FROM
    table2 t;

I can think of some solutions, but I suspect they would involve too much overhead for the task: I have more patterns like this, and depending on the id_typ column I have to apply a different "CASE" hierarchy-building logic, so creating new columns for all possible substrings of each row is not really an option.
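For what it's worth, the core length-based lookup can be sketched in a few lines (e.g. inside a Modified Java Script Value step, assuming the stream has first been cached into a value-to-id map; the function and variable names here are illustrative, not existing step fields):

```javascript
// Map: length of info_lokalizacja -> length of the parent's key prefix.
// null marks the top level; extend this table per id_typ as needed.
var PARENT_PREFIX_LEN = { 4: null, 7: 4, 11: 7 };

// idBySubstring: object mapping an info_lokalizacja value to its id_db_seq
function findParentId(infoLokalizacja, idBySubstring) {
  var prefixLen = PARENT_PREFIX_LEN[infoLokalizacja.length];
  if (prefixLen == null) {
    return null; // top of the hierarchy, or an unknown length
  }
  var parentKey = infoLokalizacja.substring(0, prefixLen);
  var id = idBySubstring[parentKey];
  return id == null ? null : id;
}
```

The same effect might also be reachable without scripting, e.g. by computing the prefix in a Calculator step and feeding it into a Stream lookup against a copy of the stream.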

I'd be thankful even for a list of steps I could use to accomplish the task. I am NOT asking for a complete solution.

DOS format issue

Hi Sir,

MySQL, pdi-ce-6.0 with a file repository, Windows, Java 1.7

Could you please suggest how I can resolve this issue: "DOS format was specified but only a single line feed character was found, not 2"?

If I use Table input --> Text file output, the text gets split into multiple rows. If I use replace(term_des, '\n', ' ') then I get a single row.

select term_des from prod -- getting the issue with this query
select replace(term_des, '\n', ' ') from prod -- getting the correct result with this query

sample text:


update entity_enttlmnt set term_description='Turbines, turbine shaft, bearings, internalvariable vane assembly, and turbocharger housing. Does not include: wiringharnesses, wastegates; oil, fuel, or coolant lines; external fittings, clamps, bolts,or fasteners, charge air cooler and duct work, injector seals, cups or tubes, EGRvalves and associated components, linkages, connectors, V Pod, actuators, seals &gaskets, vacuum controls or electrical components off(course))))$#'

How can I resolve this issue without the replace function? Please suggest whether I need to change the separator and enclosure settings in the Text file output step.
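As an alternative to doing the replace in SQL, the same normalization could be done in the stream before the Text file output step, e.g. in a Modified Java Script Value step. A minimal sketch (the function name is made up; apply it to your term_des field):

```javascript
// Collapse embedded line breaks in a field value into single spaces,
// so a multi-line value no longer splits into multiple output rows.
function stripLineBreaks(s) {
  if (s == null) return null;
  // handle Windows (\r\n), old Mac (\r) and Unix (\n) line endings
  return s.replace(/\r\n|\r|\n/g, " ");
}
```

This keeps the SQL untouched and makes the cleanup visible in the transformation itself.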

(Attached image: text file output.png)

problem looping on a compound field

Hello people,

I have a key-value field in my stream: the key part has a fixed length of 5 bytes and the value part has a variable length, like this:

00350227003521350101125150000000000000000000000000000000000000000000000

I have to transform it into this:

00350=227;00352=13501;01125=15;

I have a DB table in which I have stored, for each key, the length of its value, like this:

00350;3
00352;5
01125;2

I have to loop the attached transformation and reuse the output parameter until all fields have been read.
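Assuming the key-to-length lookups have been loaded into a map first (e.g. via a Database lookup), the unpacking loop itself could look roughly like this (a sketch; names are illustrative):

```javascript
// Unpack a packed key-value string: each key is 5 bytes, each value's
// length comes from valueLen (the lookup table loaded from the DB).
function unpackCompound(s, valueLen) {
  var out = "";
  var i = 0;
  while (i + 5 <= s.length) {
    var key = s.substring(i, i + 5);
    var len = valueLen[key];
    if (len == null) break; // trailing padding or unknown key: stop
    out += key + "=" + s.substring(i + 5, i + 5 + len) + ";";
    i += 5 + len;
  }
  return out;
}
```

Doing it in one scripting step like this avoids having to loop a sub-transformation per field, though the looping-job approach from the attachment should work too.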

I need an help pleeeease.:confused:

Thanks a lot.

Bye;)

What version of Mondrian is installed with BI Server 6.0 CE?

Does BI Server 6.0 CE ship with Mondrian 3 or Mondrian 4? I am planning to build Mondrian 4 schema definitions.

Does PDI support a UNION feature like SQL's UNION?

I am trying to merge two datasets that have the same columns, for example:
table input 1:
ID Name
1 Jack
2 Rose

table input 2:
ID Name
3 Dan

after union:

ID Name
1 Jack
2 Rose
3 Dan

I haven't found anything in PDI to do this. There is a Merge Join step, but that performs a join, not a union, and Merge Rows is more of a comparison step. I need some help here.
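Conceptually, SQL's UNION ALL over row streams is just appending one stream after the other, which is what happens in PDI when two steps with identical row layouts feed the same target step. A rough sketch of the two flavors:

```javascript
// UNION ALL: simply append stream B after stream A (layouts must match).
function unionAll(rowsA, rowsB) {
  return rowsA.concat(rowsB);
}

// UNION (distinct): the same, followed by de-duplication. In PDI terms this
// would correspond to adding a Sort rows step with "Only pass unique rows".
function unionDistinct(rowsA, rowsB) {
  var seen = {};
  return unionAll(rowsA, rowsB).filter(function (row) {
    var key = row.join("\u0001"); // separator assumed absent from the data
    if (seen[key]) return false;
    seen[key] = true;
    return true;
  });
}
```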

Open Referenced Object .. Transformation not working

Good afternoon,

We are trying to migrate our PDI jobs from Version 5.1 to 6.0.1. In our current environment we are using Windows Server 2008, Java 7 and storing all jobs and transformations on the file system.

In the new environment we want to use CentOS 7, OpenJDK 8, and go back to storing the job and transformation definitions in a MySQL database repository. We had previously used an Oracle database as our repository about 4 years ago, which worked great with PDI 4.1 and 4.2.

I moved a couple of jobs to the new server and database repository and was able to run them successfully. However, when I try to open a transformation by right-clicking on it in the job and selecting Open Referenced Object > Transformation, nothing happens. I opened Spoon in debug mode and I can see the line:

Code:

Loading transformation from repository [DataWarehouseAlertProcess] in directory [/Alerts]
in SpoonDebug.txt, but there are no error messages.

However, if I move the transformation to the root directory of the repository and point the transformation step at this new location, then Open Referenced Object > Transformation works successfully. Obviously, I don't want all of my jobs and transformations saved at the root level.

Has anyone else experienced this?

Thanks,
Greg

Connecting to InfluxDB

Hey guys,

I'm starting to work with InfluxDB. However, the InfluxDB option is not available in the Connection Type list. I would like to know if there is a connector or some other way to work with InfluxDB in Spoon.

Thanks! Sorry for the bad English.

Marketplace Metadata; Communication About Pull Request


Problem with the installation of Pentaho: Pentaho Initialization Exception

Hello,
I have tried several versions of Pentaho BI Community Edition (5.3.0.0-213 and 6.0.1.0.386) with both JDK 7u79 and JDK 8u20, but I still get the same error and I am fed up with trying to fix it.
This is the error I get when I launch it:
The following errors were detected
[fr_49] One or more system listeners failed. These are set in the systemListeners.xml.
org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - [fr_71] Error while trying to execute startup sequence for org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener

Please see the server console for more details on each error detected.

Here is my pentaho.log
2016-02-24 18:16:49,862 ERROR [org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener]
org.pentaho.platform.api.engine.security.userroledao.AlreadyExistsException:
at org.pentaho.platform.security.userroledao.jackrabbit.JcrUserRoleDao.createRole(JcrUserRoleDao.java:123)
at org.pentaho.platform.repository2.mt.RepositoryTenantManager.createTenant(RepositoryTenantManager.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy28.createTenant(Unknown Source)
at org.pentaho.platform.repository2.unified.lifecycle.DefaultBackingRepositoryLifecycleManager.startup(DefaultBackingRepositoryLifecycleManager.java:164)
at org.pentaho.platform.repository2.unified.lifecycle.DelegatingBackingRepositoryLifecycleManager.startup(DelegatingBackingRepositoryLifecycleManager.java:87)
at org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener.startup(BackingRepositoryLifecycleManagerSystemListener.java:59)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
2016-02-24 18:16:52,385 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2016-02-24 18:16:52,385 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - [fr_71] Error while trying to execute startup sequence for org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener
org.pentaho.platform.api.engine.PentahoSystemException: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - [fr_71] Error while trying to execute startup sequence for org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:348)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - [fr_71] Error while trying to execute startup sequence for org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:430)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
... 27 more
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - [fr_71] Error while trying to execute startup sequence for org.pentaho.platform.repository2.unified.BackingRepositoryLifecycleManagerSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:422)
... 35 more
2016-02-24 18:16:52,385 ERROR [org.pentaho.platform.util.logging.Logger] Error end:


I will be very grateful if someone could help me fix this!

Data with 40K+ rows

Hi all PDI experts,

I am new to PDI... hope you will be able to assist me.

What I am trying to do is to load the ServiceNow data into a PostgreSQL DB.

Steps:
  1. Generate Rows
  2. REST Client
  3. Get data from XML
  4. Select values
  5. Table output

There are over 40K incidents, but when I tried loading the data, only 10,000 records were returned.
How can I load every single record?

My plan is to get all the records the first time at least, then insert/update records on a weekly basis (I'm still finding a way to do that). Or is there a better way to do it?
The XML link is a URL, which does not allow me to set any time period on it. :confused:
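If the endpoint supports pagination, one common workaround for a fixed server-side cap is to generate one URL per page window and feed those rows into the REST Client step. A hedged sketch of building such URLs (the sysparm_limit/sysparm_offset parameter names are from ServiceNow's REST Table API and are an assumption here; verify what your instance's XML endpoint actually accepts):

```javascript
// Generate paged request URLs so each call fetches one window of records.
// baseUrl, total and pageSize are placeholders for your own values.
function pagedUrls(baseUrl, total, pageSize) {
  var urls = [];
  for (var offset = 0; offset < total; offset += pageSize) {
    urls.push(baseUrl + "?sysparm_limit=" + pageSize + "&sysparm_offset=" + offset);
  }
  return urls;
}
```

Each resulting row (one URL per page) then drives one REST Client call instead of a single oversized request.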

Thank you everyone!!!

Select with "All" option

Hi.
I need to include an "All" option in the dropdown selector, for loading information for all providers into a chart.
How can I do that?

Thanks.

Changes to DB repository not committed

Hello all,

I have found a weird behavior when using a DB repository:

If I use the repository explorer and make some changes like e.g. moving a job to a folder or renaming a transformation, those changes are not committed to the repository UNLESS I disconnect the repository on purpose by using Tools -> Repository -> Disconnect Repository. They are not committed even if I quit Spoon.

I have confirmed this behavior using both mySQL and Postgres based repositories, so to me it doesn't look like an issue with one specific DB flavor.

My environment: PDI 6.0.1 with mySQL/Postgres repos running in an Ubuntu virtual box.

Can you confirm this issue and whether it will be resolved?

Thanks!

--
Julio.

PMR failing with missing libraries

Hi all,

I'm trying to get Pentaho MapReduce working in our corporate cluster (CDH 5.4, with Kettle 6.01 and the appropriate shim). The libraries get copied up to HDFS no problem, and then in my user home I get a .staging directory with a job_xyz subfolder containing job.jar, job.xml, job.split and job.splitmetainfo. There are no subdirectories (e.g. lib) in this folder. The application fails with the message:

Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

It seems that somehow the required libraries are not accessible. Does this sound familiar to anyone out there?

Thanks.

Russ

CDA Datasource in CDE

I'm trying to build a dynamic dashboard that can deal with several different datasources, but I'd like to set those up in a CDA file only, a kind of plug-CDA-and-play system. Also, I want each dashboard to see only the datasources associated with it, instead of all the ones I have set up in Pentaho.

As such, with the approach I'm taking, I need to be able to check which CDA files are defined on the dashboard's datasources, to then parse said files for the datasource connections that I should present to my users.

Can I access the CDA data source path property somehow at runtime?
How about modifying the CDA data source's data access ID, also at runtime?

Also, could you point me in the right direction as to how to fetch schema/cube data directly?

Pentaho Reporting Output in Debian: Cannot generate reports in PDF

Hello


I am pretty new to Pentaho and I have designed a job with Spoon that generates PDF reports (designed with Report Designer) and sends them by email with the PDF attached. I use the Pentaho Reporting Output step (PDF output processor) and SendMail. It works fine on Windows 7.


Now I have to do exactly the same on a Debian server. I have installed the same version of Pentaho Data Integration (6.0.1), linked with the DB repository. I launch the job with kitchen.sh:

./kitchen.sh -rep:"Etl_DataCRM" -dir:"Principal" -job:"JobEnvioCorreo" -user:admin -pass:admin


and I get this error:


ERROR [AbstractReportProcessor] 1363067431: Report processing failed.
org.pentaho.reporting.engine.classic.core.InvalidReportStateException: Failed to fire report event for sub-layout-process
at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportEvent(InitialLayoutProcess.java:181)
at org.pentaho.reporting.engine.classic.core.states.SubLayoutProcess.fireReportEvent(SubLayoutProcess.java:173)
at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.firePageStartedEvent(ProcessState.java:1083)
at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.reportInitialized(DefaultOutputFunction.java:130)
at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.fireReportInitializedEvent(ExpressionEventHelper.java:501)
at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.fireReportEvent(ExpressionEventHelper.java:52)
at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportEvent(InitialLayoutProcess.java:176)
at org.pentaho.reporting.engine.classic.core.states.SubLayoutProcess.fireReportEvent(SubLayoutProcess.java:173)
at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.fireReportEvent(ProcessState.java:1105)
at org.pentaho.reporting.engine.classic.core.states.process.BeginReportHandler.advance(BeginReportHandler.java:50)
at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.advance(ProcessState.java:933)
at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processPaginationLevel(AbstractReportProcessor.java:651)
at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:478)
at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1424)
at org.pentaho.reporting.engine.classic.core.modules.gui.pdf.PdfExportTask.run(PdfExportTask.java:115)
at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:273)
at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:118)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.InternalError: Can't connect to X11 window server using ':0.0' as the value of the DISPLAY variable.
at sun.awt.X11GraphicsEnvironment.initDisplay(Native Method)
at sun.awt.X11GraphicsEnvironment.access$200(X11GraphicsEnvironment.java:65)
at sun.awt.X11GraphicsEnvironment$1.run(X11GraphicsEnvironment.java:110)
at java.security.AccessController.doPrivileged(Native Method)
at sun.awt.X11GraphicsEnvironment.<clinit>(X11GraphicsEnvironment.java:74)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:195)
at java.awt.GraphicsEnvironment.createGE(GraphicsEnvironment.java:102)
at java.awt.GraphicsEnvironment.getLocalGraphicsEnvironment(GraphicsEnvironment.java:81)
at java.awt.image.BufferedImage.createGraphics(BufferedImage.java:1182)
at java.awt.image.BufferedImage.getGraphics(BufferedImage.java:1172)
at org.pentaho.reporting.libraries.base.util.WaitingImageObserver.waitImageLoaded(WaitingImageObserver.java:154)
at org.pentaho.reporting.engine.classic.core.DefaultImageReference.<init>(DefaultImageReference.java:121)
at org.pentaho.reporting.engine.classic.core.filter.types.ContentType.filter(ContentType.java:181)
at org.pentaho.reporting.engine.classic.core.filter.types.ContentType.getValue(ContentType.java:84)
at org.pentaho.reporting.engine.classic.core.layout.build.DefaultLayoutBuilderStrategy.computeValue(DefaultLayoutBuilderStrategy.java:221)
at org.pentaho.reporting.engine.classic.core.layout.build.DefaultLayoutBuilderStrategy.processContent(DefaultLayoutBuilderStrategy.java:170)
at org.pentaho.reporting.engine.classic.core.layout.build.DefaultLayoutBuilderStrategy.addBandInternal(DefaultLayoutBuilderStrategy.java:115)
at org.pentaho.reporting.engine.classic.core.layout.build.DefaultLayoutBuilderStrategy.add(DefaultLayoutBuilderStrategy.java:78)
at org.pentaho.reporting.engine.classic.core.layout.build.ReportRenderModelBuilder.add(ReportRenderModelBuilder.java:204)
at org.pentaho.reporting.engine.classic.core.layout.AbstractRenderer.add(AbstractRenderer.java:342)
at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.print(DefaultOutputFunction.java:1150)
at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.updatePageHeader(DefaultOutputFunction.java:613)
at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.updateHeaderArea(DefaultOutputFunction.java:558)
at org.pentaho.reporting.engine.classic.core.layout.output.DefaultOutputFunction.pageStarted(DefaultOutputFunction.java:515)
at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.firePageStartedEvent(ExpressionEventHelper.java:545)
at org.pentaho.reporting.engine.classic.core.states.datarow.ExpressionEventHelper.fireReportEvent(ExpressionEventHelper.java:38)
at org.pentaho.reporting.engine.classic.core.states.InitialLayoutProcess.fireReportEvent(InitialLayoutProcess.java:176)


Any idea what is happening? The job is sending empty reports...


Thanks in advance

Need Help to implement Rest Client

Hi All,
I am struggling to get an HTTP POST implementation working in Pentaho. Initially I tried the HTTP Post step, which did not work at all, so I started trying the REST Client step.

Could you kindly tell me what exactly is meant by:

- the Body field? In which format does it have to be?
- the Parameters and Headers tabs?
(Attached images: Restclient_Settings_General.jpg, Transformation.jpg)
Can someone kindly help me here?
I am using Generate Rows to provide the URL, then passing it to the REST Client step to read the JSON file with a POST request.

I would appreciate it if someone could send me some links to working examples of how to use the REST Client with POST requests.
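On the Body field in particular: the REST Client step takes the raw request body from a String field in the stream, so for a JSON POST that field would typically be built beforehand, e.g. in a Modified Java Script Value step, with a "Content-Type: application/json" header added on the Headers tab. A sketch (the payload fields here are made up):

```javascript
// Serialize a payload object into the string field that the REST Client
// step's Body option will point at.
function buildBody(payload) {
  return JSON.stringify(payload);
}

// Example: this string would be the value of the stream field chosen as Body.
var bodyField = buildBody({ customerId: 42, action: "activate" });
```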

Regards,
Krishna

Transformation step metrics inside Job with Java API

I'm running a Pentaho job, which contains several transformations, through the Java API. How can I retrieve the step metrics for each transformation?

Understanding Merge Join

Hello All,

I wish to understand the Merge Join step. I have two streams of data (say Stream A and Stream B) with varying columns, and I wish to pick a few columns from Stream B after performing a join between the two streams.

The documentation asks for the join columns of each stream to be sorted; my question is, if there is additional data in either of the streams, is the join performed sequentially?


Stream A

Col1|Col2|Col3
-------------------
C|D|2
A|A|4
A|B|1
E|F|3


Stream B
Col11|Col12|Col13
----------------------
P|Q|12
X|Y|13
A|B|11


With this example, I want my output to be


Col1|Col2|Col3|Col4
----------------------
A|A|4|(null)
A|B|1|11
C|D|2|(null)
E|F|3|(null)

I tried to use Merge Join (left outer), but the join for A|B does not work and Col4 always remains blank.
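For reference, a minimal sketch of what a sorted left-outer merge join does (mirroring the Merge Join step; helper names are illustrative):

```javascript
// Left-outer merge join on a key: BOTH inputs must already be sorted
// ascending on the join keys, otherwise matches are silently missed.
function mergeLeftOuter(left, right, keyOf, combine) {
  var out = [];
  var j = 0;
  for (var i = 0; i < left.length; i++) {
    // skip right-hand rows whose key sorts before the current left row
    while (j < right.length && keyOf(right[j]) < keyOf(left[i])) {
      j++;
    }
    if (j < right.length && keyOf(right[j]) === keyOf(left[i])) {
      out.push(combine(left[i], right[j])); // matched
    } else {
      out.push(combine(left[i], null));     // left-outer: keep with nulls
    }
  }
  return out;
}
```

As the sketch suggests, the merge walks both inputs sequentially, so if either stream arrives unsorted on the join keys (the Stream A sample above is unsorted), matches can be skipped and the right-hand columns stay blank; a Sort rows step on the join keys before the Merge Join may be worth checking.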

Updating all objects of a SQL DB

Hi to all,
I have a question. I have 3 DBs on a server and I have made a copy of them on another server. Now, for each table, I have prepared many Kettle scripts for updating and for inserting new rows.
But if I want to copy or update other objects such as stored procedures, views, functions and logins, how can I do it?
Thanks a lot for every reply.

Giorgio

Installing Report Designer: not connecting to MySQL

I'm very new to Pentaho but experienced with MySQL and PHP. I have just recently set up PDI on my MacBook Air. PDI has a folder with a bunch of files, including the important lib/ folder, into which I added the JAR file that enables Pentaho to talk to MySQL. PDI is connecting to MySQL just fine, and I've run some transformations. However, Report Designer is not. Unlike PDI, on the Mac RD appears to be a single-file application, with no file folders visible in Finder. Can you tell me how to either get Report Designer to recognize the existing JAR file, or where to put a copy of it so RD can use it to connect to MySQL?

