Channel: Pentaho Community Forums

Weka support and confidence

Hi,

I have been working with Weka and, although I am getting the results I want, this work has been based on trial and error. I would therefore like to better understand the options LowerBoundMinSupport, UpperBoundMinSupport and minMetric (confidence).

I know more or less what each of them means, but I would like to know the following:


What is the relationship between LowerBoundMinSupport and UpperBoundMinSupport?

Is minMetric independent of support, or is there some correlation between them?

How does the combination of these measures influence the quality of the rules I get?
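
For reference, this is roughly how those three options map onto Weka's Apriori API; a minimal sketch, assuming Weka 3.6+ and an illustrative ARFF file name:
Code:

import weka.associations.Apriori;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class AprioriDemo {
  public static void main(String[] args) throws Exception {
    Instances data = DataSource.read("market-basket.arff");  // illustrative file name

    Apriori apriori = new Apriori();
    // Apriori starts at UpperBoundMinSupport and repeatedly lowers the support
    // threshold (by "delta") until it has found numRules rules or it reaches
    // LowerBoundMinSupport; minMetric is the minimum confidence a rule must have.
    apriori.setUpperBoundMinSupport(1.0);  // start the search at 100% support
    apriori.setLowerBoundMinSupport(0.1);  // never go below 10% support
    apriori.setMinMetric(0.9);             // keep only rules with confidence >= 0.9
    apriori.setNumRules(20);               // stop once 20 rules have been found

    apriori.buildAssociations(data);
    System.out.println(apriori);           // prints the discovered rules
  }
}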


Thanks for your help

Attached Images: weka.jpg

Newbie - Help appreciated in setting up a job

Hello,

I am very new to Kettle and would like to ask for help in setting up a job/transformation with the following capability:
1. Check the database table to see if any rows currently exist
2. If rows exist, move the data to a backup table and clear the current table
3. Read the Excel file
4. Store the Excel data in the table that was cleared
5. Done

My problem has been with the parallel execution of the transformation. I have tried flow control, but I guess I am not doing it correctly.
What is the best way of performing these actions?

Thanks

This prompt value is of an invalid value

Hi all,
I got this error: "Element overlaps with other content and will not be printed in table-exports."
Do you have any idea what it means? When I try to publish the report, another error appears in the User Console: "This prompt value is of an invalid value."
Here's the log:
2013-06-02 20:17:58,577 WARN [org.pentaho.reporting.libraries.base.boot.PackageManager] Unresolved dependency for package: org.pentaho.reporting.engine.classic.extensions.datasources.cda.CdaModule
2013-06-02 20:17:58,658 WARN [org.pentaho.reporting.libraries.base.boot.PackageSorter] A dependent module was not found in the list of known modules.
2013-06-02 20:18:12,803 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] c8efd75d-cbb0-11e2-a425-2d49cf696209:SOLUTION-ENGINE:scheduler.xaction: Action Sequence execution failed, see details below
| Error Time: dimanche 2 juin 2013 20 h 18 CEST
| Session ID: scheduler.xaction
| Instance Id: c8efd75d-cbb0-11e2-a425-2d49cf696209
| Action Sequence: scheduler.xaction
| Execution Stack:
EXECUTING ACTION: Scheduler (org.pentaho.platform.engine.services.solution.PojoComponent)
| Action Class: org.pentaho.platform.engine.services.solution.PojoComponent
| Action Desc: Scheduler
| Loop Index (1-based): 0
Stack Trace:org.pentaho.platform.api.engine.ActionExecutionException: RuntimeContext.ERROR_0017 - [fr_18] Activity failed to execute
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeComponent(RuntimeContext.java:1325)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeAction(RuntimeContext.java:1262)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1161)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1105)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:987)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:897)
at org.pentaho.platform.engine.services.solution.SolutionEngine.executeInternal(SolutionEngine.java:399)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:317)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:193)
at org.pentaho.platform.engine.services.BaseRequestHandler.handleActionRequest(BaseRequestHandler.java:159)
at org.pentaho.platform.scheduler.QuartzExecute.execute(QuartzExecute.java:198)
at org.quartz.core.JobRunShell.run(JobRunShell.java:203)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:520)


2013-06-02 20:18:16,280 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:18,724 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:19,127 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:20,117 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:20,713 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:21,090 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:21,447 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:21,702 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]
2013-06-02 20:18:21,737 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]
2013-06-02 20:18:21,897 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:22,300 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:23,595 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:23,789 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:24,076 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:24,476 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:24,931 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:25,300 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:25,623 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:25,847 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]
2013-06-02 20:18:25,875 WARN [org.apache.axis2.description.java2wsdl.DefaultSchemaGenerator] We don't support method overloading. Ignoring [public java.lang.String serializeModels(org.pentaho.metadata.model.Domain,java.lang.String,boolean) throws java.lang.Exception]
2013-06-02 20:18:26,048 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:18:26,370 WARN [org.apache.axis2.description.AxisService] Unable to generate EPR for the transport : http
2013-06-02 20:24:37,177 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 20:24:37,277 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Libelle commission' with value 'null'
2013-06-02 20:24:37,492 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 20:24:37,493 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Libelle commission' with value 'null'
2013-06-02 20:24:42,602 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:24:42,605 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Libelle commission' with value 'null'
2013-06-02 20:24:45,292 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 20:24:52,814 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:25:08,352 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:25:11,230 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2013'
2013-06-02 20:25:23,560 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:25:26,901 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2013'
2013-06-02 20:25:32,506 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 20:34:22,148 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:34:25,339 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2013'
2013-06-02 20:34:28,289 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:50:39,415 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
2013-06-02 20:54:11,248 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2013'
2013-06-02 21:03:08,262 ERROR [org.pentaho.platform.repository.solution.SolutionRepositoryBase] SolutionRepository.ERROR_0023 - Invalid publish location: Root Folder
2013-06-02 21:05:02,375 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 21:05:02,383 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Libelle commission' with value 'null'
2013-06-02 21:05:02,524 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 21:05:02,526 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Libelle commission' with value 'null'
2013-06-02 21:05:06,516 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value 'null'
2013-06-02 21:05:09,375 WARN [org.pentaho.reporting.engine.classic.core.parameters.DefaultReportParameterValidator] Parameter validation error: No such value in the result for 'Année' with value '2012'
Any help please?
Thanks in advance

Intermittent error: Exception while loading class

Hello:

I am getting this error most of the time when I run this job. It seems to work most often (but not always) from the Pentaho designer, and never seems to work when I run the job from a batch file. Here is a sample of the log. The driver for the failing Table input step is net.sourceforge.jtds.jdbc.Driver, which points to a SQL Server database. Pentaho is version 4.2. Since it works sometimes, the variable is obviously OK and the class does exist.

Code:

INFO  02-06 22:35:49,717 - write_to_csv_files - Dispatching started for transformation [write_to_csv_files]
ERROR 02-06 22:35:49,763 - dm_get_prev_quarter - An error occurred, processing will be stopped:
Error occured while trying to connect to the database


Exception while loading class
${CON_DM_DRIVER}




ERROR 02-06 22:35:49,764 - dm_get_prev_month - An error occurred, processing will be stopped:
Error occured while trying to connect to the database


Exception while loading class
${CON_DM_DRIVER}




ERROR 02-06 22:35:49,765 - dm_get_prev_week - An error occurred, processing will be stopped:
Error occured while trying to connect to the database


Exception while loading class
${CON_DM_DRIVER}




ERROR 02-06 22:35:49,765 - dm_get_prev_month - Error initializing step [dm_get_prev_month]
ERROR 02-06 22:35:49,765 - dm_get_prev_week - Error initializing step [dm_get_prev_week]
ERROR 02-06 22:35:49,765 - dm_get_prev_quarter - Error initializing step [dm_get_prev_quarter]
ERROR 02-06 22:35:49,766 - write_to_csv_files - Step [dm_get_prev_month.0] failed to initialize!
ERROR 02-06 22:35:49,766 - write_to_csv_files - Step [dm_get_prev_quarter.0] failed to initialize!
ERROR 02-06 22:35:49,766 - write_to_csv_files - Step [dm_get_prev_week.0] failed to initialize!
INFO  02-06 22:35:49,766 - dm_get_prev_month - Finished reading query, closing connection.
INFO  02-06 22:35:49,766 - dm_get_prev_quarter - Finished reading query, closing connection.
INFO  02-06 22:35:49,767 - dm_get_prev_week - Finished reading query, closing connection.
ERROR 02-06 22:35:49,775 - write_to_csv_files - Unable to prepare for execution of the transformation
ERROR 02-06 22:35:49,775 - write_to_csv_files - org.pentaho.di.core.exception.KettleException:
We failed to initialize at least one step.  Execution can not begin!

How to have ranking in my report

Hi,

I need to create a report with ranks. I have heard that this is not possible in Pentaho.

Could you please let me know if this is really not possible?

Also, could you let me know how to enable drilling in my report?

running Pentaho from a .exe file instead of .bat files

Hi everyone,
I need to replace the .bat files with .exe files and run the Pentaho solution that way. I have used some tools to do so, but each had a problem. Is there any solution for that, or any tool that can do the job?


These are links to software I have used:
1. http://download.cnet.com/Bat-To-Exe-...-10555897.html : This one is the most suitable, but start-up fails at the first steps. I think this is because the dependencies need to be added to the .exe file, and those would be the whole solution's directories and sub-directories.
2. http://sourceforge.net/projects/bat2exe/?source=dlp : a command-line/bash-based tool. Everything is fine with it, except that the connection between the BI server and the admin console is lost after the solution is started.
3. http://www.computerhope.com/dutil.htm#00 : another Windows-based tool, which has the same problems as the first one.


regards,

How to design the schema.xml for my situation

Sorry, I have no idea how to design the schema.xml for my situation.

I have three dimension tables:

Account
accountID AccountName
A1 AAA
B1 BBB
C1 CCC

Customer
CustomerID CustomerName
A2 AA2
A3 AA3
A4 AA4
B2 BB2
C2 CC2

Product
ProductID ProductName
A Apple
B Ball
C Cat
D Dog
E Egg

and a fact table:
accountID CustomerID Product QTY
A1 A2 A 5
A1 A3 C 3
A1 A4 D 4
B1 B2 C 10
B1 B2 D 20
B1 B2 A 33
C1 C2 D 40

Now I create a cube composed of the dimensions' IDs and the fact table,
and the data shown in the BI server is this:

accountID CustomerID Product QTY
A1 A2 A 5
A1 A2 B 0
A1 A2 C 0
A1 A2 D 0
A1 A2 E 0
A1 A3 A 0
A1 A3 B 0
A1 A3 C 2
A1 A3 D 0
A1 A3 E 0
A1 A4 A 0
A1 A4 B 0
A1 A4 C 0
A1 A4 D 4
A1 A4 E 0
A1 B2 A 0
A1 B2 B 0
A1 B2 C 0
A1 B2 D 0
A1 B2 E 0
A1 C2 A 0
A1 C2 B 0
A1 C2 C 0
A1 C2 D 0
A1 C2 E 0
B1 A2 A 0
B1 A2 B 0
B1 A2 C 0
B1 A2 D 0
B1 A2 E 0
B1 A3 A 0
B1 A3 B 0
B1 A3 C 0
B1 A3 D 0
B1 A3 E 0
B1 A4 A 0
B1 A4 B 0
B1 A4 C 0
B1 A4 D 0
B1 A4 E 0
B1 B2 A 0
B1 B2 B 0
B1 B2 C 10
B1 B2 D 20
B1 B2 E 0
B1 C2 A 0
B1 C2 B 0
B1 C2 C 0
B1 C2 D 0
B1 C2 E 0
C1 A2 A 5
C1 A2 B 0
C1 A2 C 0
C1 A2 D 0
C1 A2 E 0
C1 A3 A 0
C1 A3 B 0
C1 A3 C 0
C1 A3 D 40
C1 A3 E 0
C1 A4 A 0
C1 A4 B 0
C1 A4 C 0
C1 A4 D 4
C1 A4 E 0
C1 B2 A 0
C1 B2 B 0
C1 B2 C 0
C1 B2 D 0
C1 B2 E 0
C1 C2 A 0
C1 C2 B 0
C1 C2 C 0
C1 C2 D 0
C1 C2 E 0

But I want the data to be shown like the fact table:

if a combination does not exist in the fact table, the cube should not show it.

If my description is not clear, please leave a message and let me know.

Or please give me some suggestions about how to design the schema.xml.

XMLHttpRequest (XHR) within Pentaho Data Integration

I'm using the "REST Client" step to call a web service that returns the initial data in JSON format.
The number of returned records is limited to 50.

Code:

{

Therefore I have to read the nested property "nextPage" within the JSON object to get access to the next 50 records.
I tried to work with an XMLHttpRequest in JavaScript, but it failed with the following log message: "ReferenceError: "XMLHttpRequest" is not defined".

Code:

var jsonObject = JSON.parse(jsonAllJobsStream.getString());
var nextPage = jsonObject.metadata.links.nextPage;
var xmlhttp = new XMLHttpRequest();

So how can I do an XMLHttpRequest within Pentaho Data Integration?
Or is there any other way to follow nested links within a JSON stream?
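
XMLHttpRequest is a browser object, so it is not available in PDI's Rhino-based JavaScript step. One alternative is to follow the links with plain Java, for example in a User Defined Java Class step or a small helper class. A minimal sketch, assuming the service needs no authentication and returns the nextPage URL as shown above; the start URL and the string-based extraction are placeholders, and a real JSON parser would be more robust:
Code:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PagedRestClient {
  // Perform a plain HTTP GET and return the response body as a string.
  static String httpGet(String urlString) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
    conn.setRequestProperty("Accept", "application/json");
    StringBuilder body = new StringBuilder();
    BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
    try {
      String line;
      while ((line = in.readLine()) != null) {
        body.append(line);
      }
    } finally {
      in.close();
    }
    return body.toString();
  }

  public static void main(String[] args) throws Exception {
    String next = "http://example.com/api/jobs";  // placeholder start URL
    while (next != null && next.length() > 0) {
      String json = httpGet(next);
      // ... hand this page of records to the rest of the transformation here ...

      // Crude extraction of metadata.links.nextPage from the raw text;
      // a JSON library (e.g. org.json or Jackson) would be more robust.
      int key = json.indexOf("\"nextPage\"");
      if (key < 0) {
        break;  // no further page
      }
      int start = json.indexOf('"', json.indexOf(':', key) + 1) + 1;
      int end = json.indexOf('"', start);
      next = json.substring(start, end);
    }
  }
}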

hello

Newbie from England, joined to share some useful suggestions and also to be an active user in this community.

How to set up non mandatory parameters?

Hi,

Could anybody explain to me how to set up non-mandatory parameters in Report Designer?

Latest date from source file

PRD Open formula Substitute more than one character

Hi everyone
In one of my reports I have text without spaces. Instead of spaces it has _ or -.
So I'm using the SUBSTITUTE function to replace them:
Code:

=SUBSTITUTE([nom_Projet]; "_" ;" ")
With this I only replace _.
So the question is: can I substitute both characters with a single function?


The other solution that I found is kind of dirty (in my opinion): I do it with two functions. The first function replaces the "_" and the second function uses the output of the first one and replaces the "-".
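
A single-formula alternative (a sketch, assuming SUBSTITUTE here behaves like the standard OpenFormula/Excel function) is to nest the two calls, so the inner call replaces the underscores with spaces and the outer call then replaces the hyphens in its result:
Code:

=SUBSTITUTE(SUBSTITUTE([nom_Projet]; "_"; " "); "-"; " ")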

Publish xaction, cda, wcdf, wcde

Hi,

I was looking for a way to publish files (xaction, cda, wcdf, wcde) to the server without having to access it through SSH or FTP (I mean without third-party tools), and I ended up finding "RepositoryFilePublisher" as something that could work... I read the source from:
http://grepcode.com/file/repository.pentaho.org$artifactory$pentaho@pentaho$pentaho-bi-platform-web-servlet@4.4-SNAPSHOT@org$pentaho$platform$web$servlet$RepositoryFilePublisher.java

And as far as I can tell, it already works for PME (Pentaho Metadata Editor)... My guess is that PME posts an xmi file plus some fields:

Code:

  String publishPath = request.getParameter("publishPath"); //$NON-NLS-1$
  String publishKey = request.getParameter("publishKey"); //$NON-NLS-1$
  String jndiName = request.getParameter("jndiName"); //$NON-NLS-1$
  String jdbcDriver = request.getParameter("jdbcDriver"); //$NON-NLS-1$
  String jdbcUrl = request.getParameter("jdbcUrl"); //$NON-NLS-1$
  String jdbcUserId = request.getParameter("jdbcUserId"); //$NON-NLS-1$
  String jdbcPassword = request.getParameter("jdbcPassword"); //$NON-NLS-1$
  boolean overwrite = Boolean.valueOf(request.getParameter("overwrite")).booleanValue(); //$NON-NLS-1$
  boolean mkdirs = Boolean.valueOf(request.getParameter("mkdirs")).booleanValue(); //$NON-NLS-1$

So... I set up a web server on my machine and tried to make a simple form to send a "file" (a cda) with the fields filled in... like:

Code:

    <form  action="http://MY_URL/pentaho/RepositoryFilePublisher?userid=MYUSER&password=MYPASS" method="post" enctype="multipart/form-data">
        Publish Path: <input type="text" name="publishPath" value="MY_FOLDER" />
        Publish pass: <input type="text" name="publishKey" value="MY_PUBLISH_PASS" /><BR>
        Userid: <input type="text" name="userid" value="MY_USERID" />
        Password: <input type="text" name="password" value="MY_USER_PASSWORD"/><BR>
        JNDI: <input type="text" name="jndiName" value="JNDINAME"/>
        JDBC Driver: <input type="text" name="jdbcDriver" value="org.postgresql.Driver"/><BR>
        JDBC URL: <input type="text" name="jdbcUrl" value="jdbc:postgresql://MYIP/MYDB"/>
        JDBC User: <input type="text" name="jdbcUserId" value="DBUSER"/><BR>
        JDBC Pass: <input type="text" name="jdbcPassword" value="DBPASS"/><BR> 
        Overwrite: <input type="checkbox" name="overwrite" />
        Mkdir: <input type="checkbox" name="mkdirs" /><BR>             
        <input type="file" name="fileItems" /><BR>
        <input type="submit"/>
    </form>

The publish process works fine from PME (with the same passwords etc.), so the information is correct... but when I submit I get:

HTTP Status 500 -


type Exception report
message
description The server encountered an internal error () that prevented it from fulfilling this request.
exception

java.lang.NullPointerException org.pentaho.platform.web.servlet.RepositoryFilePublisher.doPublish(RepositoryFilePublisher.java:133) org.pentaho.platform.web.servlet.RepositoryFilePublisher.doGet(RepositoryFilePublisher.java:107) org.pentaho.platform.web.servlet.RepositoryFilePublisher.doPost(RepositoryFilePublisher.java:76) javax.servlet.http.HttpServlet.service(HttpServlet.java:637) javax.servlet.http.HttpServlet.service(HttpServlet.java:717) org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:92) org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378) org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109) org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) 
org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91) org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53) org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390) org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175) org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99) org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60) org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113) note The full stack trace of the root cause is available in the Apache Tomcat/6.0.29 logs.

Apache Tomcat/6.0.29



So... I am missing something... but what? I see no file-type validation, and there is no error in catalina.out.

Accessing getSolutionRepositoryDoc Webservice using Spring Security with preAuthFilter

Hi,

I am using Spring Security with a pre-authentication filter to get SSO working (Pentaho 4.8).
I can now log in to Pentaho pre-authenticated from my other web application.
But my solution repository browser throws a NullPointerException and I don't know why.

Code:

ERROR [[SolutionRepositoryService]] Servlet.service() for servlet SolutionRepositoryService threw exception
java.lang.NullPointerException
        at org.pentaho.platform.repository.solution.SolutionRepositoryServiceImpl.getSolutionRepositoryDoc(SolutionRepositoryServiceImpl.java:555)


I removed the securityContextHolderAwareRequestFilter from the Spring configuration to get the pre-authentication working. If I add the securityContextHolderAwareRequestFilter again, the repository service works (but SSO does not anymore).


This is my current Spring filter config:
Code:

  <bean id="filterChainProxy" class="org.springframework.security.util.FilterChainProxy">
    <property name="filterInvocationDefinitionSource">
      <!--
          You can safely remove the first pattern starting with /content/dashboards/print, if you're not using
          Enterprise Dashboards or not allowing printing of Dashboards,
      -->
      <value>
        <![CDATA[CONVERT_URL_TO_LOWERCASE_BEFORE_COMPARISON
        PATTERN_TYPE_APACHE_ANT
        /**=httpSessionContextIntegrationFilter,httpSessionReuseDetectionFilter,logoutFilter,j2eePreAuthFilter,authenticationProcessingFilter,basicProcessingFilter,requestParameterProcessingFilter,anonymousProcessingFilter,pentahoSecurityStartupFilter,exceptionTranslationFilter,filterInvocationInterceptor]]>
      </value>
    </property>
  </bean>

and here's my auth config:

Code:

<bean id='j2eePreAuthFilter' class='org.springframework.security.ui.preauth.j2ee.J2eePreAuthenticatedProcessingFilter'>
    <property name='authenticationManager' ref='authenticationManager' />
    <property name='authenticationDetailsSource' ref='authenticationDetailsSource' />
  </bean>
 
  <bean id='preAuthenticatedAuthenticationProvider' class='org.springframework.security.providers.preauth.PreAuthenticatedAuthenticationProvider'>
    <property name='preAuthenticatedUserDetailsService' ref='preAuthenticatedUserDetailsService' />
  </bean>
  <bean id='preAuthenticatedUserDetailsService' class='de.test.MyUserDetailsService' />
   
  <bean id='authenticationDetailsSource' class='org.springframework.security.ui.preauth.j2ee.J2eeBasedPreAuthenticatedWebAuthenticationDetailsSource'>
    <property name='mappableRolesRetriever' ref='j2eeMappableRolesRetriever' />
    <property name='userRoles2GrantedAuthoritiesMapper' ref='j2eeUserRoles2GrantedAuthoritiesMapper' />
  </bean>
 
  <bean id='j2eeUserRoles2GrantedAuthoritiesMapper' class='org.springframework.security.authoritymapping.SimpleAttributes2GrantedAuthoritiesMapper'>
    <property name='convertAttributeToUpperCase' value='false' />
    <property name='attributePrefix' value='' />
  </bean>
 
  <bean id='j2eeMappableRolesRetriever' class='org.springframework.security.ui.preauth.j2ee.WebXmlMappableAttributesRetriever'>
  <property name='webXmlInputStream'>
    <bean factory-bean='webXmlResource' factory-method='getInputStream' />
  </property>
</bean>


 <bean id='webXmlResource' class='org.springframework.web.context.support.ServletContextResource'>
  <constructor-arg ref='servletContext' />
  <constructor-arg value='/WEB-INF/web.xml' />
 </bean>
    <bean id='servletContext' class='org.springframework.web.context.support.ServletContextFactoryBean' />
  <bean id="authenticationManager" class="org.springframework.security.providers.ProviderManager">
    <property name="providers">
      <list>
        <ref bean='preAuthenticatedAuthenticationProvider' />
        <ref bean="daoAuthenticationProvider" />
        <ref local="anonymousAuthenticationProvider" />
      </list>
    </property>
  </bean>




Can anybody help me get the solution repository service working with the Spring pre-authentication configuration?



Thx in advance,
student

MS Excel Writer - formula problems

So, new week, new problems needing help =p I'm now playing with formulas in the Excel Writer output... but I can't seem to get it right.

The first few problems were kinda dumb, actually... Excel in pt-BR uses ; instead of , to separate arguments inside functions, but that was easy to find out. Translating the function names from pt-BR to English was also easy... but now I'm getting a strange message, and searching Google for it returns almost nothing but the source code itself.

Formulas used:
Code:

VLOOKUP("100",$B:$F,5,0) - VLOOKUP("105",$B:$F,5,0)
ROUND(VLOOKUP("100",PR!B:F,5,0),2)
IF(VLOOKUP("960",'ATIVO PERMANENTE - LIMITE'!B:D,3,0)>=0,0,ABS(VLOOKUP("960",'ATIVO PERMANENTE - LIMITE'!B:D,3,0)))
VLOOKUP("720",$B:$F,5,0) + VLOOKUP("800",$B:$F,5,0) + VLOOKUP("810",$B:$F,5,0) + VLOOKUP("820",$B:$F,5,0) + VLOOKUP("830",$B:$F,5,0) + VLOOKUP("840",$B:$F,5,0) + VLOOKUP("850",$B:$F,5,0) + VLOOKUP("860",$B:$F,5,0) + VLOOKUP("870",$B:$F,5,0) + VLOOKUP("880",$B:$F,5,0)
VLOOKUP("720",PEPR!$B:$K,10,0)
VLOOKUP("870",POPR!B15:N49,12,0)
VLOOKUP("100",$B:$F,5,0)*0.11/VLOOKUP("900",$B:$F,5,0)
VLOOKUP("890",RBAN!B7:H26,7,0)

Errors received when running (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy):
Code:

ERROR : Unexpected errorERROR : java.lang.IllegalStateException: evaluation stack not empty
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateFormula(WorkbookEvaluator.java:541)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateAny(WorkbookEvaluator.java:288)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateReference(WorkbookEvaluator.java:702)
ERROR :    at org.apache.poi.ss.formula.SheetRefEvaluator.getEvalForCell(SheetRefEvaluator.java:51)
ERROR :    at org.apache.poi.ss.formula.LazyAreaEval.getRelativeValue(LazyAreaEval.java:51)
ERROR :    at org.apache.poi.ss.formula.eval.AreaEvalBase.getValue(AreaEvalBase.java:109)
ERROR :    at org.apache.poi.ss.formula.functions.LookupUtils$ColumnVector.getItem(LookupUtils.java:99)
ERROR :    at org.apache.poi.ss.formula.functions.Vlookup.evaluate(Vlookup.java:61)
ERROR :    at org.apache.poi.ss.formula.functions.Var3or4ArgFunction.evaluate(Var3or4ArgFunction.java:36)
ERROR :    at org.apache.poi.ss.formula.OperationEvaluatorFactory.evaluate(OperationEvaluatorFactory.java:132)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateFormula(WorkbookEvaluator.java:525)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateAny(WorkbookEvaluator.java:288)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateReference(WorkbookEvaluator.java:702)
ERROR :    at org.apache.poi.ss.formula.SheetRefEvaluator.getEvalForCell(SheetRefEvaluator.java:51)
ERROR :    at org.apache.poi.ss.formula.LazyAreaEval.getRelativeValue(LazyAreaEval.java:51)
ERROR :    at org.apache.poi.ss.formula.eval.AreaEvalBase.getValue(AreaEvalBase.java:109)
ERROR :    at org.apache.poi.ss.formula.functions.LookupUtils$ColumnVector.getItem(LookupUtils.java:99)
ERROR :    at org.apache.poi.ss.formula.functions.Vlookup.evaluate(Vlookup.java:61)
ERROR :    at org.apache.poi.ss.formula.functions.Var3or4ArgFunction.evaluate(Var3or4ArgFunction.java:36)
ERROR :    at org.apache.poi.ss.formula.OperationEvaluatorFactory.evaluate(OperationEvaluatorFactory.java:132)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateFormula(WorkbookEvaluator.java:525)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluateAny(WorkbookEvaluator.java:288)
ERROR :    at org.apache.poi.ss.formula.WorkbookEvaluator.evaluate(WorkbookEvaluator.java:230)
ERROR :    at org.apache.poi.xssf.usermodel.XSSFFormulaEvaluator.evaluateFormulaCellValue(XSSFFormulaEvaluator.java:264)
ERROR :    at org.apache.poi.xssf.usermodel.XSSFFormulaEvaluator.evaluateFormulaCell(XSSFFormulaEvaluator.java:151)
ERROR :    at org.pentaho.di.trans.steps.excelwriter.ExcelWriterStep.recalculateAllWorkbookFormulas(ExcelWriterStep.java:240)
ERROR :    at org.pentaho.di.trans.steps.excelwriter.ExcelWriterStep.closeOutputFile(ExcelWriterStep.java:217)
ERROR :    at org.pentaho.di.trans.steps.excelwriter.ExcelWriterStep.processRow(ExcelWriterStep.java:172)
ERROR :    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
ERROR :    at java.lang.Thread.run(Unknown Source)


Programming in Weka

Hi everyone
I have some questions about programming and about how Weka's components work together (for example, attribute selection with a classifier).
If I want to use a classifier inside attribute selection, how should I do that?

How should I write my own program in Weka and plug it in as a classifier or attribute-selection method?
In my research I want to mix a genetic algorithm and ACO for feature selection, and then use an SVM classifier.
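
For orientation, here is a minimal sketch of wiring attribute selection into a classifier with the Weka API (assuming Weka 3.6/3.7 on the classpath and an illustrative ARFF file name; Weka does not ship an ACO search, so GeneticSearch stands in, and a custom search strategy would be added by extending weka.attributeSelection.ASSearch):
Code:

import weka.attributeSelection.GeneticSearch;
import weka.attributeSelection.WrapperSubsetEval;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.classifiers.meta.AttributeSelectedClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class FeatureSelectionDemo {
  public static void main(String[] args) throws Exception {
    Instances data = DataSource.read("mydata.arff");  // illustrative file name
    data.setClassIndex(data.numAttributes() - 1);

    // Wrapper evaluation: candidate attribute subsets are scored by
    // cross-validating the classifier of interest (here an SVM).
    WrapperSubsetEval evaluator = new WrapperSubsetEval();
    evaluator.setClassifier(new SMO());

    GeneticSearch search = new GeneticSearch();  // GA-driven subset search

    // Meta-classifier that performs the selection and then trains SMO
    // on the reduced attribute set.
    AttributeSelectedClassifier classifier = new AttributeSelectedClassifier();
    classifier.setEvaluator(evaluator);
    classifier.setSearch(search);
    classifier.setClassifier(new SMO());

    Evaluation evaluation = new Evaluation(data);
    evaluation.crossValidateModel(classifier, data, 10, new java.util.Random(1));
    System.out.println(evaluation.toSummaryString());
  }
}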
Can anybody help me?
thanks a lot

How do I get "Get rows from result" to work when running transformation alone?

I have a simple job that calls two transformations, passing results from one to the other. I cannot figure out how to run the 2nd transformation on its own in Spoon, "faking" the result row.

First transformation calls "Get file names" to find a file, and then passes the filename, directory, and full path to the result stream using "Copy rows to result".

The second transformation reads the result stream using "Get rows from previous result", then reads the file and transforms the data (XML query).

It works when I run the job, which calls the two transformations in turn. However, for debugging, I need to be able to run the second transformation by itself. I can't figure out where/how to enter the three result fields and values (in my case: directory, short_filename, filename). When I run the 2nd transformation from Spoon, I see grids for Parameters, Arguments, and Variables, but nothing for results. What am I missing?

Out of the box time dimension table?

I'm trying to create a time dimension table in my database to connect to my fact table, with several time metrics. However, using the date field in my data and several date_format() expressions to convert the timestamp, I'm finding that I am missing certain hours of the day.

Is there an out of the box transformation for PDI for time dimensions?
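
One way to avoid such gaps is to generate the dimension rather than derive it from the timestamps present in the fact data, since hours with no activity simply never appear there. A sketch in plain Java (java.time, so Java 8+; the date range and the plain printout are placeholders for the real load) of a gap-free hourly series:
Code:

import java.time.LocalDate;
import java.time.LocalDateTime;

public class TimeDimensionGenerator {
  public static void main(String[] args) {
    // Placeholder range: every hour of 2013.
    LocalDateTime cursor = LocalDate.of(2013, 1, 1).atStartOfDay();
    LocalDateTime end = LocalDate.of(2014, 1, 1).atStartOfDay();

    while (cursor.isBefore(end)) {
      // One row per hour: emit whatever attributes the dimension needs
      // (date, year, month, day, hour, ...); printing stands in for the load.
      System.out.printf("%s,%d,%d,%d,%d%n",
          cursor.toLocalDate(),
          cursor.getYear(),
          cursor.getMonthValue(),
          cursor.getDayOfMonth(),
          cursor.getHour());
      cursor = cursor.plusHours(1);
    }
  }
}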

Pentaho MapReduce - Run as user

How do I specify which user to run a MapReduce job as? In the Pentaho MapReduce job step I can configure the cluster information, but there isn't a place to provide the user to run under.

It appears to be getting my Windows user and submitting that to the Hadoop cluster, but that's not what I want.
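
For context, this is the underlying Hadoop client mechanism rather than a Pentaho setting: on an unsecured (simple-auth) cluster the submitting user can often be overridden with the HADOOP_USER_NAME environment variable, or programmatically by wrapping the submission in UserGroupInformation.doAs(...). A minimal sketch; the user name and the job body are placeholders:
Code:

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.security.UserGroupInformation;

public class SubmitAsUser {
  public static void main(String[] args) throws Exception {
    // Create a UGI for the desired user; on a simple-auth cluster no credentials are checked.
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("etl_user");  // placeholder user

    ugi.doAs(new PrivilegedExceptionAction<Void>() {
      public Void run() throws Exception {
        // Configure and submit the MapReduce job here; everything inside
        // run() executes as "etl_user" from the cluster's point of view.
        return null;
      }
    });
  }
}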

Thanks.

-Barry

Gathering requirements for olap4j 2.0

It's time to start thinking about olap4j version 2.0.

My initial goal for olap4j version 1.0 was to decouple application developers from Mondrian's legacy API. We've far surpassed that goal. Many applications are using olap4j to connect to OLAP servers like Microsoft SQL Server Analysis Services, Palo and SAP BW. And projects are leveraging the olap4j-xmlaserver sister project to provide an XMLA interface on their own OLAP server. The need is greater than ever to comply with the latest standards.

The difference between products and APIs is that you can't change APIs without pissing people off. Even if you improve the API, you force the developers of the drivers to implement the improvements, and the users of the API get upset because they don't have their new drivers yet. There are plenty of improvements to make to olap4j, so let's try to do it without pissing too many people off!

Since olap4j version 1.0, there has been a new release of Mondrian (well, 4.0 is not released officially yet, but the metamodel and API are very nearly fully baked) and a new release of SQL Server Analysis Services, the home of the de facto XMLA standard.

Also, the Mondrian team have spun out their XMLA server as a separate project (olap4j-xmlaserver) that can run against any olap4j driver. If this server is to implement the latest XMLA specification, it needs the underlying olap4j driver to give it all the metadata it needs.

Here's an example of the kind of issue that we'd like to fix. In olap4j 1.x, you can't tell whether a hierarchy is a parent-child hierarchy. People have asked for a method

boolean isParentChild();
Inspired by the STRUCTURE attribute of the MDSCHEMA_HIERARCHIES XMLA request, we instead propose to add

enum Structure {
  FULLYBALANCED,
  RAGGEDBALANCED,
  RAGGED,
  NETWORK
}
Structure getStructure();
We can't add this without requiring a new revision of all drivers, but let's be careful to gather all the requirements so we can do it just this once.

Here are my goals for olap4j 2.0:


  • Support Analysis Services 2012 metamodel and XMLA as of Analysis Services 2012.
  • Create an enum for each XMLA enum. (Structure, above, is an example.)
  • Support Mondrian 4.0 metamodel. Many of the new Mondrian features, such as measure groups and attributes, are already in SSAS and XMLA.
  • Allow user-specified metadata, such as those specified in Mondrian's schema as annotations, to be passed through the olap4j API and XMLA driver. We'll know that we've done the right thing if we can remove MondrianOlap4jExtra.

I'd also like to maintain backwards compatibility. As I already said, drivers will need to be changed. But any application that worked against olap4j 1.1 should work against olap4j 2.0, and any driver for olap4j 2.0 should also function as an olap4j 1.x driver. That should simplify things for the users.

I'll be gathering a detailed list of API improvements in the olap4j 2.0 specification. If you have ideas for what should be in olap4j version 2.0, now is the time to get involved!
