Channel: Pentaho Community Forums

Inter-panel communication between two table components

Hi,

I am supposed to provide a drill-down between two table components, such that the second table is updated/refreshed when I click on the first column of the first table. Can anyone help me with code to achieve this?
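
For reference, this is the kind of clickAction I have in mind for the first table (a minimal sketch only; the parameter name selectedKey is made up, and the exact shape of the click event object depends on the CDF version):

// clickAction of the first table component in CDE ("Clickable" enabled).
// "selectedKey" is a hypothetical dashboard parameter name; the second table
// component would list selectedKey in its Listeners and map it to a query
// parameter so that it refreshes whenever the value changes.
function(e) {
    // treat e.category / e.value as assumptions about the click event object
    var clickedValue = (e && e.category !== undefined) ? e.category : e.value;
    // fireChange updates the parameter and triggers every listening component
    Dashboards.fireChange("selectedKey", clickedValue);
}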

Thanks in advance.

Pentaho 5.2 DI Server Logs

I see the errors below when I start the Pentaho DI server. Can anyone help me resolve these issues?

The attached log is the pentaho.log for the DI server.
Attached Files

Where can I download PDI 4.4.1?

Dear all,

I have run into a problem and need some help.

We are using PDI 4.4.0 and we need the fix for PDI-9292, but we don't want to upgrade to PDI 5.0.0 at this stage.

I notice that the fix for PDI-9292 was committed to 4.4.1 as PDI-9717; however, I cannot find a download for PDI 4.4.1.

Does anyone know where I can find it?

Your help will be highly appreciated.

IO Error: Socket read timed out when using the Table Input in transformation

Hello:

The Table Input step in my transformation is failing with the following error:

ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error
2014/11/07 20:07:39 - - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2014/11/07 20:07:39 - - Couldn't get row from result set
2014/11/07 20:07:39 - - IO Error: Socket read timed out
2014/11/07 20:07:39 -
2014/11/07 20:07:39 - at org.pentaho.di.core.database.Database.getRow(Database.java:2302)
2014/11/07 20:07:39 - at org.pentaho.di.core.database.Database.getRow(Database.java:2270)
2014/11/07 20:07:39 - at org.pentaho.di.trans.steps.tableinput.TableInput.processRow(TableInput.java:153)
2014/11/07 20:07:39 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2014/11/07 20:07:39 -- at java.lang.Thread.run(Unknown Source)
2014/11/07 20:07:39 - - Caused by: java.sql.SQLException: IO Error: Socket read timed out
2014/11/07 20:07:39 - - at oracle.jdbc.driver.T4CStatement.fetch(T4CStatement.java:1083)
2014/11/07 20:07:39 - at oracle.jdbc.driver.OracleResultSetImpl.close_or_fetch_from_next(OracleResultSetImpl.java:369)
2014/11/07 20:07:39 - - at oracle.jdbc.driver.OracleResultSetImpl.next(OracleResultSetImpl.java:273)
2014/11/07 20:07:39 - - at org.pentaho.di.core.database.Database.getRow(Database.java:2290)
2014/11/07 20:07:39 - - ... 4 more
2014/11/07 20:07:39 - - Caused by: oracle.net.ns.NetException: Socket read timed out

There are more than a million records in the database, and this error occurs after the Table Input step has fetched around 200,000 records and has been running for about 1 hour 30 minutes. I am using Pentaho 5.1 and connect to the Oracle database through JDBC, over a VPN.

Is there a configuration setting I can change in Pentaho to fix this error?

Any help will be much appreciated.

Thanks.

Problem publishing OLAP cubes in JPivot

Good morning,
I am new to Pentaho and I have created a small cube with 2 dimensions to try out JPivot, but it is giving me problems. JPivot tells me that the page cannot be accessed directly. I have been reviewing the schema and cannot find the problem; I don't know whether the schema is badly designed or whether some data type is wrong.
Here is the XML together with the log.
The XML is:
<Schema name="shemaprueba1">
  <Cube name="cubo1" visible="true" cache="true" enabled="true">
    <Table name="fact_ventas" alias="">
    </Table>
    <Dimension type="StandardDimension" visible="true" foreignKey="id_producto" name="producto">
      <Hierarchy name="producto" visible="true" hasAll="true" primaryKey="id_producto">
        <Table name="dim_producto" alias="">
        </Table>
        <Level name="articulo" visible="true" table="dim_producto" column="ARTICULO" nameColumn="ARTICULO" internalType="String" uniqueMembers="false">
        </Level>
      </Hierarchy>
    </Dimension>
    <Dimension type="StandardDimension" visible="true" foreignKey="id_poblacion" name="New Dimension 1">
      <Hierarchy name="fecha" visible="true" hasAll="true" primaryKey="id_fecha">
        <Table name="dim_fecha" alias="">
        </Table>
        <Level name="a&#241;o" visible="true" table="dim_fecha" column="a&#241;o" nameColumn="a&#241;o" internalType="int" uniqueMembers="false">
        </Level>
      </Hierarchy>
    </Dimension>
    <Measure name="New Measure 1" column="VENTAS" datatype="Integer" aggregator="max" visible="true">
    </Measure>
  </Cube>
</Schema>

And the Pentaho log shows this:

2014-11-09 09:57:15,750 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2014-11-09 09:57:15,751 ERROR [org.pentaho.platform.util.logging.Logger] misc-MondrianModelComponent: MondrianModel.ERROR_0001 - [es_82] getInitialQuery(): Connection is not valid: {DataSource=FoodMart, PoolNeeded=false, EnableXmla=false, Provider=mondrian, Catalog=mondrian:/shemaprueba1}
2014-11-09 09:57:15,751 ERROR [org.pentaho.jpivot.PivotViewComponent] 6705d423-67ee-11e4-b008-701a04f1ff79:COMPONENT:context-7912574-1415523435627:PivotView.ERROR_0010 - !PivotView.ERROR_0010_QUERY_GENERATION_FAILED!
2014-11-09 09:57:15,759 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] 6705d423-67ee-11e4-b008-701a04f1ff79:SOLUTION-ENGINE:default.xjpivot: Action Sequence execution failed, see details below
| Error Time: domingo 9 de noviembre de 2014 09H57' CET
| Session ID: admin
| Instance Id: 6705d423-67ee-11e4-b008-701a04f1ff79
| Action Sequence:
| Execution Stack:
EXECUTING ACTION: Pivot View (PivotViewComponent)
| Action Class: PivotViewComponent
| Action Desc: Pivot View
| Loop Index: 0
Stack Trace:org.pentaho.platform.api.engine.ActionExecutionException: RuntimeContext.ERROR_0017 - [es_18] Activity failed to execute
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeComponent(RuntimeContext.java:1211)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeAction(RuntimeContext.java:1151)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1063)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1013)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:895)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:797)
at org.pentaho.platform.engine.services.solution.SolutionEngine.executeInternal(SolutionEngine.java:352)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:282)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:188)
at org.pentaho.jpivot.AnalysisViewService.getNewAnalysisViewRuntime(AnalysisViewService.java:553)
at org.pentaho.jpivot.Pivot_jsp._jspService(Pivot_jsp.java:472)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:617)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1774)
at java.lang.Thread.run(Thread.java:745)

Error when executing MDX against my Schema Workbench schema

Good morning,
When I execute MDX against my schema in Schema Workbench, it gives me the following Mondrian error:
Mondrian Error: Failed to parse query ' '
Mondrian Error: Internal error: While parsing
Mondrian Error: Syntax error at line, column, 1, token ' '

This is the schema I have built. I created 2 simple dimensions, one with the years and another with the article types; as the aggregate I set a sum of the sales. It is very simple, but I don't know what the problem is.
Below is my XML:
<Schema name="shemaprueba1">
  <Cube name="cubo1" visible="true" cache="true" enabled="true">
    <Table name="fact_ventas" alias="">
    </Table>
    <Dimension type="StandardDimension" visible="true" foreignKey="id_producto" name="producto">
      <Hierarchy name="producto" visible="true" hasAll="true" primaryKey="id_producto">
        <Table name="dim_producto" alias="">
        </Table>
        <Level name="articulo" visible="true" table="dim_producto" column="ARTICULO" nameColumn="ARTICULO" internalType="String" uniqueMembers="false">
        </Level>
      </Hierarchy>
    </Dimension>
    <Dimension type="StandardDimension" visible="true" foreignKey="id_poblacion" name="New Dimension 1">
      <Hierarchy name="fecha" visible="true" hasAll="true" primaryKey="id_fecha">
        <Table name="dim_fecha" alias="">
        </Table>
        <Level name="año" visible="true" table="dim_fecha" column="año" nameColumn="año" internalType="int" uniqueMembers="false">
        </Level>
      </Hierarchy>
    </Dimension>
    <Measure name="New Measure 1" column="VENTAS" datatype="Integer" aggregator="max" visible="true">
    </Measure>
  </Cube>
</Schema>

Please, I need some help, because I am stuck here and I don't know what the problem is.

Thanks.

Not able to install Pentaho CE, spoon.bat

I'm trying to install pdi-ce-5.2.0.0-2091.zip, which I downloaded from SourceForge. When I click on spoon.bat the file unzips, but then nothing else happens.

When I go to the command prompt and type spoon.bat, I get:

C:\Users\JeffDT\Pentaho\DATA-I~1>spoon
WARNING: Using java from path
DEBUG: _PENTAHO_JAVA_HOME=
DEBUG: _PENTAHO_JAVA=javaw.exe
The system cannot find the path specified.
The system cannot find the path specified.

C:\Users\JeffDT\Pentaho\DATA-I~1>start "Spoon" "javaw.exe" "-Xmx512m" "-XX:MaxPermSize=256m" "-Djava.library.path=libswt\win32" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-5.2.0.0-209.jar -lib ..\libswt\win32

C:\Users\JeffDT\Pentaho\DATA-I~1>

I have Java installed (jre7, jdk1.8.0_20)

I'm not an IT person, so please dumb any answer all the way down so a newbie like me can understand it. Thanks in advance.

Set variable for every row read of a CSV File

Hi all, can someone help me with the problem below?

I've got a CSV file (or a table in a database) which holds database connection details for multiple sources (MySQL in this case), like below:

hostname         database   username   password
12.345.678.90    dbname     abcd       1234
01.234.567.89    dbname     abcd       1234
90.123.456.78    dbname     abcd       1234


What I'd like to do is read the CSV file, set the parameters for every record, and then use a (parameterized) database connection to extract the data (say, select * from <table>) into another CSV file.


So far, I have been able to create a job with two transformations:
a. transformation #1 - reads the CSV and sets the variables
b. transformation #2 - runs a query against the database (the connection is parameterized) and extracts the data into a CSV file

Now I'd like the job/transformation to run for every row in the CSV file and extract the data into a CSV file, but I can't get it working.

Limit number of rows in mongodb input of Kettle

I want to retrieve data with the MongoDB Input step while limiting the number of rows, but I found that the $limit operator does not work in Kettle.
There is a similar post that uses $maxScan to solve this (link), but my query has additional conditions.

For example:

{ "$query" : { "type" : " view " } , $orderby : { "time" : -1}, $limit : 100 } // not working

The results using $maxScan are completely different; $maxScan may return only 10 rows.

How can I solve this problem? And why is the $limit operator not supported in Kettle? I think it is a basic operation.
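
For comparison, this is the mongo shell form of what I am trying to express (a sketch only; the collection name events is made up):

// mongo shell equivalent of the query above
db.events.find({ type: "view" })   // same condition as the $query part
         .sort({ time: -1 })       // same ordering as $orderby
         .limit(100);              // the limit I cannot get to work via $limit
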
Many thanks!

Login Error after installation

Hello all,

I have installed Pentaho CE 5.2 from http://community.pentaho.com/ on Windows 8, using Java 7, and I have set the PENTAHO_JAVA_HOME variable.
When accessing http://localhost:8080/ and trying to log in with "Admin" and "passwort", I always get a login error message.

Did I miss an installation step?
Any idea how to fix this?
Where can I find an official installation document for the Community Edition?


Thanks for your help!

David

Installation problem, Pentaho won't run

I downloaded pdi-cd-5.0.1.A-stable.zip and ran the spoon.bat file. It unzipped, but nothing more happens.

When I run spoon.bat from the command prompt I get the following:

C:\Users\JeffDT\Pentaho\DATA-I~1>spoon
DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jre7
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jre7\bin\javaw.exe

C:\Users\JeffDT\Pentaho\DATA-I~1>start "Spoon" "C:\Program Files\Java\jre7\bin\javaw.exe" "-Xmx512m" "-XX:MaxPermSize=256m" "-Djava.library.path=libswt\win32" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-5.2.0.0-209.jar -lib ..\libswt\win32

I'm running Windows 7 32 bit. Any thoughts appreciated.

SFTP Filename Wildcard

Hi, I'm having trouble creating a wildcard to grab specific files from an SFTP site.

The file I'm trying to get will be named as follows:

productiondb_out_f_monthly_20141110.csv

The date within the file name will change, and it is not known in advance (i.e. it is not necessarily today's date).

A wildcard that identifies files beginning with "productiondb_out_f_monthly_" should work, but I'm not sure how to form that wildcard.
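
As far as I know the wildcard field takes a Java-style regular expression, so something like the sketch below is what I am aiming at (the exact pattern is an assumption on my part):

// quick check of the pattern in JavaScript (same regex syntax for this case)
var pattern = /^productiondb_out_f_monthly_\d{8}\.csv$/;
pattern.test("productiondb_out_f_monthly_20141110.csv");   // true
// a looser variant that accepts anything after the prefix:
//   productiondb_out_f_monthly_.*\.csv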

Thanks,

Adding a label to a bar chart

How do I change the x-axis font size, and how do I display the x-axis label (and the same for the y-axis)?

I have tried the extension point xAxisLabel_font with "13px Arial", but have not had any success so far.

Can anyone tell me how to add an x-axis label?
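
For context, this is roughly what I am trying to express, written as plain CCC JavaScript (a minimal sketch; the option names baseAxisTitle, orthoAxisTitle and baseAxisFont are assumptions that may differ between CCC versions, and chartDiv / myData are made-up placeholders):

new pvc.BarChart({
    canvas: "chartDiv",            // target div (placeholder id)
    baseAxisTitle:  "Month",       // x-axis title text (assumed option name)
    baseAxisFont:   "13px Arial",  // font of the x-axis tick labels (assumed)
    orthoAxisTitle: "Sales"        // y-axis title text (assumed option name)
})
.setData(myData)                   // myData: a CCC-compatible dataset (placeholder)
.render();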

Pentaho Report Parameters not Displaying Correctly (v5.2)

Hi,

I have two problems with how the PRD reports integrated in the BI server are behaving:

1. String parameters reformat themselves with a #,### pattern after tabbing out of them.
This didn't happen before: 2014 -> 2,014
2. Multi-select lists don't resize vertically to the number of display lines.
I hadn't tried this before, but 1.5 visible lines are not comfortable to browse through.

They work, but the behaviour is strange.

Any suggestion will be appreciated, even funny ones.

Table Component - dataBar with different colors

Hi everyone!
I'm building a table component and I'm trying to use different colors in a dataBar value.
Is it possible?

Thanks and regards

Kitchen not working in 5.2

Kitchen.bat does not work in the new PDI version 5.2. I'm trying to run a job, but it does not work; it seems to be a problem with "pentaho-application-launcher-5.2.0.0-209.jar".

Reading the HTML in a transformation

I am new to the Pentaho world.
I need a suggestion regarding reading HTML in a transformation (loading the file content into memory).


How to reset the Dashboards.fireChange call based on the filter

Hi Forum
I have created 1 filter like a in my interactive dashboard.

Right now i am doing inter panel communication between bar chart to table its working fine.but problem is if i select different value from the filter

then i want to reset the dashboard.firechange function,so it should work as like selected value from filter .

could you please guide me how i can do that
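
For reference, this is the kind of thing I imagine in the filter's postChange (a minimal sketch; drillParam is a made-up parameter name for whatever the bar chart's clickAction fires):

// postChange of the select/filter component: whenever the filter value
// changes, reset the drill-down parameter set by the bar chart click,
// so the table listening to both parameters falls back to its default view.
function(newFilterValue) {
    Dashboards.fireChange("drillParam", "");   // or whatever the default is
}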

Is PDI the right tool?

We are planning to rewrite an old process that creates an output file for an external system (outside our control).

It's a very classic process that writes several output rows for each input row, and I was wondering whether PDI can handle this behaviour easily.

Every single input row has to produce several output lines in the same text file; depending on several conditions, some of those lines are created. The conditions themselves are not the problem, but I cannot figure out a reasonable way to create several lines in the same file for each input row.

One (crazy) idea was to create different files for each kind of output line and, as a final step, merge all the files, stripping out the keys used to sort them.
Another (not so crazy?) idea was to build a row with a bunch of columns and then use a Row Normalizer step.
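
A third variant I have been looking at is a Modified Java Script Value step that emits the extra rows itself. A sketch only: I am assuming the step's special functions createRowCopy()/putRow() behave as described in the samples, and the input fields key and amount are made up.

// the incoming row is still passed through as one output row;
// extra lines for the same input row are pushed explicitly
var newRow;

// first extra output line derived from the current row
newRow = createRowCopy(getOutputRowMeta().size());
newRow[getInputRowMeta().indexOfValue("key")] = key + "_DETAIL";
putRow(newRow);

// second extra output line, only under some condition
if (amount > 0) {
    newRow = createRowCopy(getOutputRowMeta().size());
    newRow[getInputRowMeta().indexOfValue("key")] = key + "_AMOUNT";
    putRow(newRow);
}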

And my real feeling is that PDI is not the right tool for this kind of job ;) ;) (something we could certainly do easily in any programming language).

Any ideas?

Access session variable with webservice
