Channel: Pentaho Community Forums

Conditional update in Pentaho Kettle for NoSQL JSON MongoDB

How do we do a conditional update in Pentaho to update a NoSQL MongoDB database?

Normally in Informatica we have an Update Strategy transformation to update the target.

Scenario: I have a target that contains JSON documents, with fields I need to update based on a condition.

{
  "_id": ObjectId("12344"),
  "field": "10-01-2014"
}

I am using a Formula step to generate a flag based on an if condition (if [field] < now(), set [field] = now(); otherwise keep [field] as is), and then a Filter Rows step on the resulting boolean flag (0 or 1). If it is 1, the true branch goes to one MongoDB target; if it is 0, the false branch goes to the other MongoDB target.

However, the conditional update does not work: the field does not get updated, even though I have the field's modifier operation set to $set in the MongoDB Output step and modifier updates enabled.

Any advice on how a conditional update is done in Pentaho for MongoDB documents would be appreciated.

Distributing application specific databases with Sparkl

The challenge

Did I say that Sparkl is amazing? Probably a few times :)

When we're developing operational applications, utilities, or even dashboards, there's a common requirement that often comes into play - the ability to have a storage engine available.

Sparkl is itself a plugin that runs on the Pentaho BA server (check this older blog post for the Sparkl fundamentals), and the BA server needs to be connected to a database - and is usually connected to several.

However, this poses a problem if we're distributing our Sparkl app, as we would need to know the database configuration in advance so we could connect to it; on top of that, we'd need to initialize the database, create tables, etc.

Francesco Corti took a great approach to this in the AAAR plugin, a Pentaho-Alfresco integration, where he built a configuration screen to set up all the necessary dependencies.

Embedding a database

But wouldn't it be cool if we could remove that dependency and somehow have a database that would just work (tm)?

Not only would it be cool, it actually works like a charm. We did exactly that when we built the SDR sample form that was part of the Pentaho 5.2 release.

What we did was use the H2 database in embedded mode; just by playing with the connection string, we can create a database on the fly that's persisted to a file. Obviously this is hardly something that should be used to store huge amounts of data. I'm giving you the tool - use it wisely.

So, how does it work? The trick, like I mentioned, is in the connection string. All we need to do is create a connection in PDI with the following connection string:

jdbc:h2:${cpk.solution.system.dir}/${cnp_db};DB_CLOSE_DELAY=-1;AUTO_SERVER=TRUE;INIT=RUNSCRIPT FROM '${cpk.plugin.dir}/resources/ddl/cnp_db.ddl'
This uses some of the CPK magic variables. CPK, which stands for Community Plugin Kickstarter, is the library that powers Sparkl. At runtime it passes in variables that expose system environment information, which is going to be very useful for this connection. Using this connection string, we can define a new database connection in Spoon:

New connection in PDI


The available variables can be seen in the transformation properties of the Sparkl endpoints (when the UI is used to create them).

CPK Variables

Here you can see the available variables. By default they are "commented out" (a Sparkl convention), and to use them all you need to do is un-comment them. Their meaning is described in the description field. Obviously, standard parameters can also be used, and here you can see that I defined our database name as _cnp_db. There's another Sparkl convention here: this parameter, which holds the database name, starts with an underscore, which means it is internal only - it can't be passed in from endpoint calls.

Initializing the database

We're building applications, utilities, tools; while it's cool to have access to a database connection, it's kind of pointless to have an empty database. That's what the following instruction in the connection string's INIT clause is for:

INIT=RUNSCRIPT FROM '${cpk.plugin.dir}/resources/ddl/cnp_db.ddl'
I literally have a cnp_db.ddl file in my Sparkl app with the initialization script to be run at startup, and it's simply a set of create ... if not exists instructions. My script currently reads:

create table if not exists notifications (
id INT PRIMARY KEY AUTO_INCREMENT,
eventtype VARCHAR(64) NOT NULL,
author VARCHAR(1024) NOT NULL,
rcpt VARCHAR,
title VARCHAR(2048),
message VARCHAR,
style VARCHAR(64) NOT NULL,
link VARCHAR
);
You can complement this with whatever SQL instructions are needed to make sure the system is properly initialized; see the sketch below.
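
As a minimal, hypothetical sketch (the row values here are purely illustrative, not part of the original script), a seed row can be added idempotently with H2's MERGE ... KEY syntax, so the script stays safe to re-run on every startup:

-- hypothetical seed row: inserted on the first run, updated (not duplicated) on later runs
merge into notifications (id, eventtype, author, title, message, style)
  key (id)
  values (1, 'SYSTEM', 'admin', 'Welcome', 'cnp plugin initialized', 'info');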

Developing the KTR from within spoon

As you won't be surprised to hear, those variables are only populated when the transformation or job is executed from the BA server. It's CPK / Sparkl that's responsible for injecting them; Spoon has no idea what CPK is. But we still need to develop and test the transformation in our own environment.

So what we'll do is simply define in kettle.properties the values that would otherwise be injected by CPK. Here's the snippet from my kettle.properties (I'm working on a plugin called cnp, but more on that later):

cpk.plugin.id = cnp
cpk.plugin.dir = /home/pedro/tex/pentaho/project-sugar/solution/system/cnp
cpk.plugin.system.dir = /home/pedro/tex/pentaho/project-sugar/solution/system/cnp/system
cpk.solution.system.dir = /home/pedro/tex/pentaho/project-sugar/solution/system
cpk.webapp.dir = /home/pedro/tex/pentaho/target-dist/server/webapps/pentaho/
cpk.session.username = admin
cpk.session.roles = Administrator,Authenticated
Now we can work as we usually do, and the behaviour will be the same whether we run from within Spoon or from the server.

Exploring the database contents

Both for the development process and for some extra debugging, we need access to the database contents. PDI offers some database exploration abilities, but I don't think that's a replacement for a proper SQL client.

But this is exactly why the flag AUTO_SERVER=TRUE is in the connection string. It allows us to connect to the same database from other clients. I've been a SQuirreL user for a long time, and to use it all I needed to do was use the same connection string.

Configuring SQuirreL to use our database

The only care we need to take is to make sure we use the expanded path, as SQuirreL has no idea what Kettle variables are. So my connection string is:

jdbc:h2:file:/home/pedro/tex/pentaho/project-sugar/solution/system/.cnp_dbx;DB_CLOSE_DELAY=-1;AUTO_SERVER=TRUE;INIT=RUNSCRIPT FROM '/home/pedro/tex/pentaho/project-sugar/solution/system/cnp/resources/ddl/cnp_db.ddl'
This allows me to use this database as if it were a regular server, from within Spoon, the BA server or SQuirreL.
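
From there, any throwaway query can be used to sanity-check the plugin's data - for example, against the notifications table defined above:

-- quick look at the embedded database contents from any SQL client
select count(*) from notifications;
select * from notifications order by id desc limit 10;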

Putting it all together

In the end I was able to achieve what I wanted - a plugin that includes a storage layer and... just works.

Plugin using H2 embedded database


Cheers!


-pedro




Pivotal Greenplum useful DBA SQL queries

1. Viewing table data distribution across segment servers. To view the data distribution of a table's rows (the number of rows on each segment), you can run a query such as: … Continue reading →
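
The excerpt is cut off here, but a typical query of this kind (an illustrative sketch, not necessarily the exact query from the original article; my_table is a placeholder) groups rows by Greenplum's hidden gp_segment_id system column:

-- count rows per segment to check how a table's data is distributed
select gp_segment_id, count(*) as row_count
from my_table
group by gp_segment_id
order by gp_segment_id;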


Troublemaker BTable

BTable is giving me a hard time.

I am using BI Server 5.2 with CDE/CDA/CDF version 14.10.15 and BTable version 2.1 stable. I have run into trouble two or three times and never found out why BTable sometimes just doesn't work.

It's not throwing any exception, apart from a message about a version issue. I have tried all the usual tricks, like uninstalling all the C-Tools and uninstalling BTable.

Even after a fresh install, BTable still doesn't work.

I just have a question for the BTable developers: which version of BTable is stable, and which version of the C-Tools does it work with perfectly?

Any help will be appreciated.

UI/UX Analyzer bug in IE9

I am the UI designer on a team integrating Pentaho into an analytics application. One of my tasks has been to create a new visual theme for Pentaho Analyzer, as this piece will be integrated with our product via an iframe. I created the theme by first making a duplicate of Crystal. All is working well aside from one bug in Analyzer that I have verified is also present in the Crystal theme.

In IE9, when you collapse the properties panel, the panel disappears completely. The only way to view this collapsed panel again is to first collapse the layout panel. Along with my QA team, I have tested this on 4 machines with Win7/IE9. 3 out of 4 machines exhibited this issue. See below:

IE9-Analyzer-Cristal-Issue.jpg

In other modern browsers (Chrome, Firefox, IE10 and 11), the collapsed properties panel sits at the bottom of the viewport until opened. It appears to be controlled by JavaScript that sets the inline heights of both the layout and properties panels when either panel's collapse button is clicked. It seems this JavaScript needed to be adjusted for IE when the Crystal theme was developed.

Is this a bug I should report, or can it be fixed on my end? If it's a bug, where should I report it?

Japanese Language Pack installation fails

Hello,
As the title says, installation of the Japanese Language Pack fails on BI Server CE. The BI Server version is 5.0.1, but it fails the same way on 5.1.
As for the OS, I have tried both Windows and Linux, and it fails the same way on both. Java is 1.7.0_45.

Actually, on a 5.0.1 BI Server I set up some time ago, I did manage to install Japanese properly after a bit of a struggle, but trying it now it just won't install. Was there some trick to it?

The installation steps are:
(1) Install biserver-ce (simply extract the Zip)
(2) Start the BI Server
(3) Select the Japanese LangPack from the Marketplace
  -> the success message is confirmed
(4) Restart the BI Server
(5) Select Install Japanese Language Pack from the Tools menu
  -> an error occurs

I am posting the log of that final error below.
Could someone please advise me?

(Below is the error message produced when running the install)
java.lang.NullPointerException
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.getBeanFactory(DefaultPluginManager.java:606)
at pt.webdetails.cpf.InterPluginCall.getBeanObject(InterPluginCall.java:185)
at pt.webdetails.cpf.InterPluginCall.run(InterPluginCall.java:215)
at pt.webdetails.cpf.InterPluginCall.call(InterPluginCall.java:365)
at pt.webdetails.cpf.InterPluginCall.call(InterPluginCall.java:67)
at pt.webdetails.cpk.InterPluginBroker.run(InterPluginBroker.java:38)
at pt.webdetails.cpk.elements.impl.DashboardElement.callCDE(DashboardElement.java:92)
at pt.webdetails.cpk.elements.impl.DashboardElement.processRequest(DashboardElement.java:44)
at pt.webdetails.cpk.CpkCoreService.createContent(CpkCoreService.java:81)
at pt.webdetails.cpk.CpkApi.callEndpoint(CpkApi.java:382)
at pt.webdetails.cpk.CpkApi.genericEndpointGet(CpkApi.java:102)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$VoidOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:167)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:62)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:67)
at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:161)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:83)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:59)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:744)

Book recommendations for Pentaho Data Integration

Hi,

Could you please suggest the best books for Pentaho?

We are planning to work with Pentaho on AWS, so we would like a few good books to keep on hand.
I would like a list of the best books for:

Pentaho Data Integration.
Continuous Integration for Pentaho on AWS.

Pentaho 5.0.1 stable - Unable to run PRPT file

Hi,

For any kind of PRPT file I try to execute, I get the message:

WARN [org.pentaho.platform.web.http.api.resources.RepositoryResource] End of the resolution chain. No resource [cdf/uriQueryParser/jquery-queryParser.js] found in context [:home:admin:XXX.prpt].


and no content is shown. Even with a very simple report with static content (a label) and no data connection, I get the same issue.

Log level is DEBUG.

Regards,
Fabrizio

JNDI connection problem with SQL Server in Pentaho 5.2

Hi, I have a problem creating a JNDI connection to SQL Server 2012 in Pentaho 5.2. I followed the same procedure I used in version 5.1, where it works correctly, but in the new version I keep getting error 0009 when I run the connection test. Looking at the log files I saw that the error is: CAN'T CREATE JDBC DRIVER. Has anyone had the same error and found a solution to my problem?

Thanks in advance!

Embedding Saiku 3 (as a Pentaho plugin) in a website

Hi. I use Saiku as a plugin for Pentaho, and I need to embed Saiku in a website.
Earlier (Saiku 2.6) I used the URL /pentaho/content/saiku/ui/index.html?biplugin5=true, but now (Saiku 3.0) it no longer works. Which URL should I use?
Also, how can I remove the home page, so that the query editor is displayed on the site right away?

Problem with JNDI connection in Pentaho 5.2

Hi,
I have a problem creating a JNDI connection to SQL Server 2012 in Pentaho 5.2. I followed the same steps I used in Pentaho 5.1, where the connection works fine, but in Pentaho 5.2 I get error_0009.
In the log file I found the error: CAN'T CREATE JDBC DRIVER... The sqljdbc4.jar is present in the \tomcat\lib folder. Does anyone have the same problem or know how to resolve it?

Thank you.

Need to integrate CDE/CDF in Pentaho EE 5.2

Hi Team,

I have downloaded and installed 'pentaho-business-analytics-5.2.0.0-209-x64'.

I want to design customised Dashboards using CDE/CDF.

Can I integrate CDF/CDE into this Pentaho Enterprise Edition 5.2? If yes, can you please help, or share a link/document on how to do this? Thank you.

Regards,
Naseer

Error with PRD opening data sets from OpenERP

Hello,

I'm trying to look at PRD for use with OpenERP. Under "Data Sets" > "Advanced" > "OpenERP Data Access", I can get data from the server: it lists my databases and the model names (objects in OpenERP), but quite a few important ones like res.partner return an error when tested: "java.lang.String cannot be cast to java.lang.Integer". Also, under "Search Fields" the model name is listed under available fields, but the list does not display any fields to select. On the other hand, some objects do open, yet I can't find a rhyme or reason to which objects work and which throw the error.

Windows 7 64
PRD 5.2 AND 3.9 (i've tried both, same error)
openERP 7 on a linux server


EDIT: Added a period.

Kitchen issue with logging

Hi All,

I am using Pentaho 5.1.0 on a Windows machine. I created a job using Spoon and tried to run it from Kitchen.bat like below:
kitchen.bat /rep:"pentahoid" /job:"JOB_MASTER" /dir:/Transit/Jobs /level:Basic > C:\pentahoerr.log
In the log file output I am seeing unwanted information (Fields, Columns, Filters, etc., for each row), as shown in red.
I tried changing the log level to Nothing; then it shows only the lines in red.
By the way, I am using the SAP Input step in my transformation. Could it be that this global log level does not apply to that step?
I am not sure where the issue is. Any help would be highly appreciated.

Code:

DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_55
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.7.0_55\bin\java.exe
2014/11/12 01:46:39 - Kitchen - Logging is at level : Basic logging
2014/11/12 01:46:39 - Kitchen - Start of run.
2014/11/12 01:46:39 - RepositoriesMeta - Reading repositories XML file: C:\Users\kumaravi\.kettle\repositories.xml
2014/11/12 01:46:43 - JOB_MASTER - Start of job execution
2014/11/12 01:46:43 - JOB_MASTER - Starting entry [JOB_AUFK_AFKO_AFPO_AFFV_STXL_LOAD]
2014/11/12 01:46:43 - JOB_AUFK_AFKO_AFPO_AFFV_STXL_LOAD - Starting entry [SAP_AUFK_AFKO_AFPO_AFFV_STXL_LOAD_RUN_STDT_UPDATE]
2014/11/12 01:46:43 - SAP_AUFK_AFKO_AFPO_AFFV_STXL_LOAD_RUN_STDT_UPDATE - Loading transformation from repository [SAP_AUFK_AFKO_AFPO_AFFV_STXL_LOAD_RUN_STDT_UPDATE] in directory [/TRANSIT/Transformations/SAP_AUFK_AFKO_AFPO_AFFV_STXL]
2014/11/12 01:46:45 - SAP_AUFK_AFKO_AFPO_AFFV_STXL_LOAD_RUN_STDT_UPDATE - Dispatching started for transformation [SAP_AUFK_AFKO_AFPO_AFFV_STXL_LOAD_RUN_STDT_UPDATE]
2014/11/12 01:46:46 - Update Run Start Date AUFK AFKO AFPO AFFV STXL.0 - Finished reading query, closing connection.
2014/11/12 01:46:46 - Update Run Start Date AUFK AFKO AFPO AFFV STXL.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:46:46 - JOB_AUFK_AFKO_AFPO_AFFV_STXL_LOAD - Starting entry [SAP_AUFK_STG_LOAD]
2014/11/12 01:46:46 - SAP_AUFK_STG_LOAD - Loading transformation from repository [SAP_AUFK_STG_LOAD] in directory [/TRANSIT/Transformations/SAP_AUFK_AFKO_AFPO_AFFV_STXL]
2014/11/12 01:46:46 - SAP_AUFK_STG_LOAD - Dispatching started for transformation [SAP_AUFK_STG_LOAD]
2014/11/12 01:46:46 - Load into STG_SAP_AUFK.0 - Connected to database [TRANSIT_ORA_DB] (commit=1000)
2014/11/12 01:46:47 - Generate Single Row.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:46:47 - Generate Rows.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:46:47 - Get Last Load Details.0 - Finished reading query, closing connection.
2014/11/12 01:46:47 - Get Last Load Details.0 - Finished processing (I=1, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:46:47 - AUFK Pay Load.0 - Optimization level set to 9.
2014/11/12 01:46:48 - AUFK Pay Load.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2014/11/12 01:46:48 - Formula.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
Singlevalue Name: DELIMITER, Value: @
Singlevalue Name: QUERY_TABLE, Value: AUFK
Singlevalue Name: ROWCOUNT, Value: 0
Singlevalue Name: ROWSKIPS, Value: 0
Table: FIELDS, Name: FIELDNAME, Row: 0, Value: MANDT
Table: FIELDS, Name: FIELDNAME, Row: 1, Value: AUFNR
Table: FIELDS, Name: FIELDNAME, Row: 2, Value: AUART
Table: FIELDS, Name: FIELDNAME, Row: 3, Value: AUTYP
Table: FIELDS, Name: FIELDNAME, Row: 4, Value: ERDAT
Table: FIELDS, Name: FIELDNAME, Row: 5, Value: ERFZEIT
Table: FIELDS, Name: FIELDNAME, Row: 6, Value: AEDAT
Table: FIELDS, Name: FIELDNAME, Row: 7, Value: AEZEIT
Table: FIELDS, Name: FIELDNAME, Row: 8, Value: KTEXT
Table: FIELDS, Name: FIELDNAME, Row: 9, Value: ASTNR
Table: FIELDS, Name: FIELDNAME, Row: 10, Value: STDAT
Table: OPTIONS, Name: TEXT, Row: 0, Value:  ( ERDAT >= '20141110' AND ERDAT <= '20141112' )
Table: OPTIONS, Name: TEXT, Row: 1, Value:  OR ( AEDAT >= '20141110' AND AEDAT <= '20141112' )
2014/11/12 01:46:55 - Format AUFK Fields.0 - Optimization level set to 9.
2014/11/12 01:46:56 - RFC READ TABLE AUFK.0 - Finished processing (I=0, O=0, R=1, W=13303, U=0, E=0)
2014/11/12 01:46:56 - Split AUFK Work Area.0 - Finished processing (I=0, O=0, R=13303, W=13303, U=0, E=0)
2014/11/12 01:46:58 - Cartesian Join.0 - Finished processing (I=0, O=0, R=13304, W=13303, U=0, E=0)
2014/11/12 01:46:58 - Format AUFK Fields.0 - Finished processing (I=0, O=0, R=13303, W=13303, U=0, E=0)
2014/11/12 01:46:58 - Select AUFK Columns.0 - Finished processing (I=0, O=0, R=13303, W=13303, U=0, E=0)
2014/11/12 01:47:02 - Load into STG_SAP_AUFK.0 - Finished processing (I=0, O=13303, R=13303, W=13303, U=0, E=0)
2014/11/12 01:47:02 - JOB_AUFK_AFKO_AFPO_AFFV_STXL_LOAD - Starting entry [SAP_AFKO_STG_LOAD]
2014/11/12 01:47:02 - SAP_AFKO_STG_LOAD - Loading transformation from repository [SAP_AFKO_STG_LOAD] in directory [/TRANSIT/Transformations/SAP_AUFK_AFKO_AFPO_AFFV_STXL]
2014/11/12 01:47:02 - SAP_AFKO_STG_LOAD - Dispatching started for transformation [SAP_AFKO_STG_LOAD]
2014/11/12 01:47:03 - Load into STG_SAP_AFKO.0 - Connected to database [TRANSIT_ORA_DB] (commit=1000)
2014/11/12 01:47:03 - Generate Rows 2.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:47:03 - Generate Single Row.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2014/11/12 01:47:03 - Formula 2.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2014/11/12 01:47:03 - AFKO Pay Load.0 - Optimization level set to 9.
Singlevalue Name: DELIMITER, Value: @
Singlevalue Name: QUERY_TABLE, Value: AFKO
Singlevalue Name: ROWCOUNT, Value: 0
Singlevalue Name: ROWSKIPS, Value: 0
Table: FIELDS, Name: FIELDNAME, Row: 0, Value: MANDT
Table: FIELDS, Name: FIELDNAME, Row: 1, Value: AUFNR
Table: FIELDS, Name: FIELDNAME, Row: 2, Value: GSTRI
Table: FIELDS, Name: FIELDNAME, Row: 3, Value: GSUZI
Table: FIELDS, Name: FIELDNAME, Row: 4, Value: AUFPL
Table: OPTIONS, Name: TEXT, Row: 0, Value:  MANDT = '100' AND AUFNR = '000050520377'
Singlevalue Name: DELIMITER, Value: @
Singlevalue Name: QUERY_TABLE, Value: AFKO
Singlevalue Name: ROWCOUNT, Value: 0
Singlevalue Name: ROWSKIPS, Value: 0
Table: FIELDS, Name: FIELDNAME, Row: 0, Value: MANDT
Table: FIELDS, Name: FIELDNAME, Row: 1, Value: AUFNR
Table: FIELDS, Name: FIELDNAME, Row: 2, Value: GSTRI
Table: FIELDS, Name: FIELDNAME, Row: 3, Value: GSUZI
Table: FIELDS, Name: FIELDNAME, Row: 4, Value: AUFPL
Table: OPTIONS, Name: TEXT, Row: 0, Value:  MANDT = '100' AND AUFNR = '000050520380'
Singlevalue Name: DELIMITER, Value: @
Singlevalue Name: QUERY_TABLE, Value: AFKO
Singlevalue Name: ROWCOUNT, Value: 0
Singlevalue Name: ROWSKIPS, Value: 0
Table: FIELDS, Name: FIELDNAME, Row: 0, Value: MANDT
Table: FIELDS, Name: FIELDNAME, Row: 1, Value: AUFNR
Table: FIELDS, Name: FIELDNAME, Row: 2, Value: GSTRI
Table: FIELDS, Name: FIELDNAME, Row: 3, Value: GSUZI
Table: FIELDS, Name: FIELDNAME, Row: 4, Value: AUFPL
Table: OPTIONS, Name: TEXT, Row: 0, Value:  MANDT = '100' AND AUFNR = '000050519741'
Singlevalue Name: DELIMITER, Value: @
Singlevalue Name: QUERY_TABLE, Value: AFKO
Singlevalue Name: ROWCOUNT, Value: 0
Singlevalue Name: ROWSKIPS, Value: 0
Table: FIELDS, Name: FIELDNAME, Row: 0, Value: MANDT
Table: FIELDS, Name: FIELDNAME, Row: 1, Value: AUFNR
Table: FIELDS, Name: FIELDNAME, Row: 2, Value: GSTRI
Table: FIELDS, Name: FIELDNAME, Row: 3, Value: GSUZI
Table: FIELDS, Name: FIELDNAME, Row: 4, Value: AUFPL
Table: OPTIONS, Name: TEXT, Row: 0, Value:  MANDT = '100' AND AUFNR = '000050516821'

Thanks,
Ravi Kumar

Biserver 5.2.0 and MySQL


Display row with condition in pentaho data-integration

Hi,

We have created a Pentaho transformation that generates output with the Microsoft Excel Output step. Now we need to highlight a cell in amber if the value of a column is < 54,000 and in red if it is < 50,000.

Please suggest which transformation option to use for that.

Thanks and regards,
Rupam

crash sometimes

Hi, I hope you are well. I need your help with PDI and PostgreSQL. On my client machine I have PDI version 5.1, and my OS is Windows XP with Java 7 (1.7.0_67-b01).


The problem is that sometimes PDI crashes, leaving a file named hs_err_pid784.log with "EXCEPTION_ACCESS_VIOLATION" in the header. I have attached the file, but I don't understand what the cause could be; any idea, help, or way to test something is truly welcome.


Regards and thanks.

Merge Rows Diff - Date Issue

Hello, I am trying to compare two streams of data (a table in DB2 and another in SQL Server) using the Merge Rows (diff) step, and I am running into an issue with date value comparison. In the DB2 stream I have a column "CREATED" defined as timestamp. In SQL Server I have the column "CREATED" defined as datetime2. In PDI I used a Select Values step to read this column and convert the metadata to the Kettle Timestamp data type, using the format yyyy-MM-dd HH:mm:ss.SSS. No matter what format I use, Merge Rows (diff) treats the values as different and marks the output as "changed". I noticed SQL Server is rounding the milliseconds, so I changed my Select Values metadata format to yyyy-MM-dd, but Merge Rows (diff) still marks the rows as changed even though the date part is the same. Can someone shed some light on this issue? What would you suggest for comparing datetime values from different data sources? Thanks.

Custom Step Classpath horribleness (linkage errors)

Hello

I'm trying to bootstrap Tika and have it read its MIME type files, but whenever Tika does

Code:

Document document = builder.parse(new InputSource(stream));
It freaks out and throws
Code:

java.lang.LinkageError: loader constraint violation: when resolving method "javax.xml.parsers.DocumentBuilder.parse(Lorg/xml/sax/InputSource;)Lorg/w3c/dom/Document;" the class loader (instance of org/pentaho/di/core/plugins/KettleURLClassLoader) of the current class, org/apache/tika/mime/MimeTypesReader, and the class loader (instance of <bootloader>) for resolved class, javax/xml/parsers/DocumentBuilder, have different Class objects for the type ;)Lorg/w3c/dom/Document; used in the signature
I've looked into the code: builder.parse comes from xml-apis, and Tika and PDI seem to depend on the same one, so I'm struggling to work out why it's freaking out. I've tried shipping my own copy in the lib directory and it didn't make a difference. Has anyone got any ideas?

Tom

Pentaho CE 5.2 OpenI Plugin Throws error "Could not get the slicer value null"

I created a cube in BI 5.2 which seems to be working great in Saiku Analytics, yet when I try to run the OpenI visualization plugin I get a dialog box with the error "Could not get the slicer value null".


I am attaching the log file "OpenI_error.txt". I see various types of errors, one of them saying that my Customer dimension "contains more than one hierarchy". I'm definitely not an OLAP expert, so this does not mean a lot to me, yet the dimension does not appear to have more than one hierarchy.


I do not have any problems or errors with Saiku and the cube.


Any help or hints would be greatly appreciated!


Thx!


Brig