Pentaho Community Forums

Calculator Locale

I'm trying to use the Calculator step's "Week of year of date A" function to figure out the week number for a particular date (i.e. January 1 is in week 1 and December 31 is usually in week 52). The problem is that my Kettle server is running under some strange Java Locale, and the function returns week 1 for December 30, 2015. This is a documented problem on Stack Overflow when the Locale is set incorrectly in Java. I do not have the ability to control the Locale on my server. Is there any way to set the Locale through the Kettle job?

I glanced at the source code of ValueDataUtil, which is used by the Calculator, and here is the problematic method:

Code:

    public static Object weekOfYear( ValueMetaInterface metaA, Object dataA ) throws KettleValueException {
      if ( dataA == null ) {
        return null;
      }

      if ( metaA.isDate() ) {
        // Calendar.getInstance() with no arguments uses the JVM's default
        // Locale, whose week rules determine WEEK_OF_YEAR.
        Calendar calendar = Calendar.getInstance();
        calendar.setTime( metaA.getDate( dataA ) );
        return new Long( calendar.get( Calendar.WEEK_OF_YEAR ) );
      }

      throw new KettleValueException( "The 'weekOfYear' function only works with dates" );
    }

Source: https://github.com/cwarden/kettle/bl...eDataUtil.java (see lines 978-990)

Calendar.getInstance() uses the default Locale unless one is passed in via the overload Calendar.getInstance(requestedLocale). ValueDataUtil never passes one, and I did not see any way to supply a Locale to this method through the Calculator step. :(
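To see the Locale dependence in isolation, here is a minimal standalone sketch (not Kettle code) showing how Calendar.getInstance(Locale) changes WEEK_OF_YEAR for the same date. With US-style week rules, December 30, 2015 already falls into week 1 of 2016, while ISO-style rules (e.g. Locale.GERMANY) should report week 53:

Code:

    import java.util.Calendar;
    import java.util.GregorianCalendar;
    import java.util.Locale;

    public class WeekOfYearLocaleDemo {
        public static void main(String[] args) {
            // December 30, 2015 -- the date that comes back as "week 1".
            GregorianCalendar date = new GregorianCalendar(2015, Calendar.DECEMBER, 30);

            for (Locale locale : new Locale[] { Locale.getDefault(), Locale.US, Locale.GERMANY }) {
                Calendar cal = Calendar.getInstance(locale);
                cal.setTime(date.getTime());
                // US rules (week starts Sunday, minimum 1 day in the first week)
                // put Dec 27-31, 2015 into week 1 of 2016; ISO rules (Germany:
                // week starts Monday, minimum 4 days) keep it in week 53 of 2015.
                System.out.printf("%-10s firstDay=%d minDays=%d -> week %d%n",
                    locale, cal.getFirstDayOfWeek(), cal.getMinimalDaysInFirstWeek(),
                    cal.get(Calendar.WEEK_OF_YEAR));
            }
        }
    }

Until the step itself accepts a Locale, a User Defined Java Class or Modified Java Script Value step that builds its own Calendar with an explicit Locale is one possible workaround.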

Support (paid) request for community edition

Hi. The story: I quite like Pentaho. I live in Australia and have a small consulting business serving small to medium sized businesses. My workload is growing fast, particularly in migrations of core systems to cloud accounting/supply chain stacks.

Pentaho is a really effective tool once you've learnt MDX; I've been using it for a few years, and a recent survey of alternatives led me back to Pentaho.
But I find it very hard to look after. Upgrading is beyond a nightmare: I just can't work out how to do it (i.e. how to migrate my "repository" of queries). I have never, ever worked out how to upgrade from one version to the next, despite hours of trying.

I also find setting it up a bit challenging, particularly on Windows.
I have asked Pentaho how much an enterprise version would cost, but I never get a reply, despite several requests over the years. I suspect that my client base, with one or two users per business, does not fit the enterprise business model. I need something priced according to this client base (fees of a few hundred dollars are OK, but not thousands).

So at the moment, only the developers of the Saiku plugin have ever received any money from me and my clients (apart from an excellent community dashboard tutorial I bought once).

So I wonder if I can pay an expert to help me look after Pentaho CE. This would be knowledge transfer: I want to learn as well as get things done. It would be for specific jobs, not an ongoing monthly fee. I would prefer to fund someone working on Pentaho CE development.

Right now, I have a 5.1 community edition which has mysteriously just stopped working. The log files indicate that the Jackrabbit repository is read-only, although there are many entries in the log. This happened spontaneously, and there is almost no chance it was due to administrator activity on the Windows server; I suspect corruption. I need help migrating to 6.0.1 (which also means a move to 64-bit Java). The repositories use the defaults: they don't sit on an underlying SQL database. The instructions for changing that look very complicated, and I have no idea whether it would make migrations between versions easier.

If there is any interest, please reply here or to tim@growthpath.com.au
I'd pay with PayPal.

Web application

I know that my company wants to use both the PDI and reporting capabilities. Given how much this product already offers, I was wondering what path to take in turning this into a web application. I believe that, at this point, CTools and Sparkl can be used to essentially create what has traditionally been thought of as a web application in terms of entering data, displaying it, etc. Is that correct, and if not, would someone point me in the right direction?

Thank You

J48 output

Hello there,
In the TextArea of the ClassifierPanel we can see the model in the form of a decision tree when we train the J48 algorithm.
In this decision tree, (x/y) is depicted in each leaf.
My question is: is it possible to get the same decision tree, but with the (x/y) values representing the results of the classifier after testing it on a test dataset?
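For reference, a minimal sketch using the Weka API (file names are placeholders) that prints the J48 tree, whose leaf counts come from the training data, and then evaluates the same model on a separate test set. Evaluation reports per-class test results and a confusion matrix, but it does not re-label the leaves; that would require traversing the tree manually:

Code:

    import weka.classifiers.Evaluation;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class J48TestEval {
        public static void main(String[] args) throws Exception {
            Instances train = DataSource.read("train.arff"); // placeholder paths
            Instances test  = DataSource.read("test.arff");
            train.setClassIndex(train.numAttributes() - 1);
            test.setClassIndex(test.numAttributes() - 1);

            J48 tree = new J48();
            tree.buildClassifier(train);
            // toString() prints the tree with (x/y) counts from the TRAINING data.
            System.out.println(tree);

            // Test-set performance, reported per class rather than per leaf.
            Evaluation eval = new Evaluation(train);
            eval.evaluateModel(tree, test);
            System.out.println(eval.toSummaryString());
            System.out.println(eval.toMatrixString());
        }
    }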

Best regards,
Adrian

Metadata and logging size

Hello,

We are using PDI 6, and I would like to know how large the metadata and logging tables typically get. Guesstimates would be okay.

Thank you in advance.

Running multiple independent Jobs (JVMs) in parallel

Hi all,

I've got a problem when running multiple jobs in parallel. A PowerShell script starts 10 threads, each thread being a separate kitchen.bat invocation, so I start 10 JVMs at the same time. Every job loads one table and stores info about its own processing (number of records, time, etc.) in metadata. I'm loading over 8000 tables every day, and everything is almost OK, except for one thing: every day about 20 tables (random ones) are not loaded because of a Java failure. I've noticed that Java sometimes throws a NullPointerException and the process is terminated, and that this failure always occurs in the Set Variables step. When I run the jobs sequentially (just one thread), all tables load properly, but of course it takes much more time. So it looks like the problem occurs when several Set Variables steps run at exactly the same time. Do you have any ideas what I can do to fix it? (One isolation idea is sketched at the end of this post.)

Kitchen.bat : java.lang.NullPointerException
At line:11 char:4
+ & "$($pdiApp)\Kitchen.bat" /file:$($dwhApp)'\etl\jobs\Main.kjb' /param:'set_D ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (java.lang.NullPointerException:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError

at org.pentaho.di.trans.steps.setvariable.SetVariable.processRow(SetVariable.java:84)
2016/02/09 01:40:43 - metadata - Connection to database closed!
2016/02/09 01:40:43 - Table input.0 - Finished processing (I=1, O=0, R=0, W=2, U=0, E=0)


at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)


at java.lang.Thread.run(Unknown Source)


2016/02/09 01:40:43 - Set Variables.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Unexpected error child index = 2, logging object : org.pentaho.di.core.logging.LoggingObject@1286868 parent=af8f2726-b66a-4171-83be-211b375b5cf0
2016/02/09 01:40:43 - Set Variables.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : java.lang.NullPointerException
2016/02/09 01:40:43 - Set Variables.0 - at org.pentaho.di.trans.steps.setvariable.SetVariable.processRow(SetVariable.java:84)
2016/02/09 01:40:43 - Set Variables.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/02/09 01:40:43 - Set Variables.0 - at java.lang.Thread.run(Unknown Source)


transformation with this step looks like:

setvariables.png
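The NullPointerException in SetVariable.processRow when several JVMs start at once makes me suspect shared state such as a common KETTLE_HOME (kettle.properties and related files). Purely as a guess, one thing worth trying is giving each kitchen.bat process its own KETTLE_HOME. A minimal Java sketch of such a launcher (paths and job names are made up; this is an assumption about the cause, not a confirmed fix):

Code:

    import java.io.IOException;
    import java.util.List;

    public class IsolatedKitchenLauncher {
        public static void main(String[] args) throws IOException {
            // Hypothetical paths -- substitute your PDI install and job files.
            String kitchen = "C:\\pdi\\data-integration\\Kitchen.bat";
            List<String> jobs = List.of("load_table_a.kjb", "load_table_b.kjb");

            for (int i = 0; i < jobs.size(); i++) {
                ProcessBuilder pb = new ProcessBuilder(
                    "cmd.exe", "/c", kitchen, "/file:C:\\etl\\jobs\\" + jobs.get(i));
                // Each JVM gets its own KETTLE_HOME, so kettle.properties and
                // related files are no longer shared between concurrent runs.
                pb.environment().put("KETTLE_HOME", "C:\\etl\\homes\\worker" + i);
                pb.inheritIO().start();
            }
        }
    }

The existing PowerShell launcher could do the same by pointing KETTLE_HOME at a different directory before starting each kitchen.bat instance.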

Pentaho Data Service run through PUC generates SQL errors

Hi all,

I have set up two Kettle transformations, one with a Pentaho Data Service called test2 and one with a Pentaho Data Service called test3.
Both are just Data Grid Steps outputting a few rows to a dummy step to illustrate my problem.

Both services work individually, both when I run the Test Data Service and when I run the transformations.

But when I create a data source in the PUC with both services and try to use a join between the two tables, I get the following errors:

Code:

ERROR [org.pentaho.di] 2016/02/10 11:57:07 - Servlet - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : Error executing SQL query:
  SELECT DISTINCT
    "BT_TEST3_TEST3"."sku"  AS "COL0",
    "BT_TEST3_TEST3"."name" AS "COL1",
    "BT_TEST2_TEST2"."xyz"  AS "COL2"
  FROM
    "Kettle"."test2" "BT_TEST2_TEST2",
    "Kettle"."test3" "BT_TEST3_TEST3"
  WHERE
    ( "BT_TEST2_TEST2"."sku" = "BT_TEST3_TEST3"."sku" )
ERROR [org.pentaho.di] 2016/02/10 11:57:07 - Servlet - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : org.pentaho.di.core.exception.KettleSQLException:
2016/02/10 11:57:07 - Servlet - Found 4 parts for the FROM clause when only a table name and optionally an alias is supported: "Kettle"."test2" "BT_TEST2_TEST2", "Kettle"."test3" "BT_TEST3_TEST3"

I can preview the data if I restrict it to only one table; if I try the join, it errors out as above. The join setup is also attached.
previewdata_onetable.jpg previewdata_twotables.jpg setting up join.jpg

I have also tried to find out which syntax would let me execute this as SQL directly, but everything I try produces the same error about the FROM clause; the message suggests the data service SQL parser accepts only a single table name (with an optional alias) in FROM.

Is this a bug or have I done something wrong here?

Need to insert an identity column into an MSSQL DB using Pentaho 5.3

Hi

I am using Pentaho 5.3. My input step is a Table Input and my output step is an Insert/Update step. It is a very simple mapping, just reading from the source and writing to the target, and these are the only two steps in my transformation.

Both the source and the target are MSSQL DBs. I need to read the identity column from the source and insert the same identity column value into the target. I have tried many ways, but I am not able to achieve this. I can write SET IDENTITY_INSERT <table> ON in the advanced tab of the output step, and it works fine, but only when I store my KTR files in a local folder or a file repository. If I try the same in a database repository, the KTR file does not pick up the information from the advanced tab: when I save the KTR file to a database repository, the advanced tab changes alone are not saved, although I can save and run the same KTR file in a file repository. Why am I not able to do this in a database repository? Is there any setup needed to achieve this in a database repository?

If I try to use an Execute SQL Script step and write SET IDENTITY_INSERT <table> ON before the Insert/Update step, it says "[field 1] is required and can not be found".

What is the best way to do this in a database repository?

Thanks
Shekar

Interpretation of CfsSubsetEval with Genetic Search in WEKA

Dear All,

I have carried out feature selection on my training set using genetic search with the CfsSubsetEval evaluator. However, I am a little confused by the output from the generations in WEKA and how this ties in with the final subset selected and output by WEKA.

Using this procedure I obtain 65 features. However, none of the subsets in my last generation (population size 20) contains this combination of features. So I am wondering how WEKA arrived at these 65 features, given that CfsSubsetEval evaluates a subset as a whole and not individual features. Surely it would select the subset in my final generation with the highest merit?

Any help would be much appreciated!

Regards
Danielle

"Auto Size Columns" does not work as expected on server where there is no MS Excel

Hello,

I have created a transformation which writes an Excel report using the Excel Writer step. The "Auto Size Columns" feature works as expected on a server where MS Excel is installed, whereas the same transformation does not work on another server where MS Excel is not installed. I've verified the PDI version on both servers: it is the same (4.4).

Here are other details:
OS: Windows 2008 R2 SP1 64bit
Java: 1.7.0_25 64 bit

Kindly help me understand what the issue could be. I checked the documentation to see whether the step depends on MS Excel being installed on the server, but I could not find anything.

Thanks a ton in advance!

Checking if there are no rows in result

I have a transformation that has these steps:

- Get rows from results
- Set Variables

It can happen that no rows are copied to the result; then Get rows from result produces an empty stream and the Set Variables step breaks. I tried to put a Detect Empty Stream step in the transformation, but it did not work as I expected. Is there another way to check whether Get rows from result is empty? The job, the transformation and the txt file are attached.
tra.rar
Thanks!

Excel writer in Pentaho Data Integration - Password protection not working

I am using a simple transformation to get data from a CSV file, which is then copied into an Excel file using the Microsoft Excel Writer step, which in turn is mailed to users as an attachment. I have set password protection in the Excel Writer step. Now the issue: the password-protected attachment asks for a password when I open the Excel file after saving it locally from the mail, but if I preview the attachment directly in my mail client, it does not ask for any password. If anyone has some idea about this, it would be very helpful.

Using a series of variables for table names in Table Input

Hi,

New to this community and somewhat new to Pentaho. I have a problem I am solving where I am transferring data from a SQL Server DB to Vertica, across about 50 tables. I've figured out how to do this table by table, but now I want to write the job so I don't have to have 50 different table input/bulk loader combos.

My solution is to extract a list of tables from the Vertica system table, and then use this list to feed the Table Input, such as "select * from ${table_name}". The problem is, I am a bit stuck on how to do this. I'll describe my current process:

Step 1: Get list of tables using a table input with a query "select table_name from tables".
Step 2: Feed these tables into a "Copy rows to result" step.
Step 3: Get these rows using "get rows from result" and set a variable called "table_name"
Step 4: Execute data extraction using table input and the query "select * from ${table_name}"
Step 5: Use the vertica bulk loader and set the target table as ${table_name}

What I can tell is that I am successfully pulling the list of table names and something is looping, but not correctly, because no data loads. I feel the problem has something to do with how I am setting the variable: I don't properly understand how "Copy rows to result" and "Get rows from result" interact with the variable setting. Really, any sort of for-each loop over my list of table names would be fine.

Any help appreciated!

rdudejr

Duplicate entry for key 'PRIMARY' error on empty table

Hi all. I have the transformation below:

Screen Shot 2016-02-10 at 15.09.19.jpg Screen Shot 2016-02-10 at 15.09.26.jpg

When I run it, I get the following error: Duplicate entry '2211-94683-43' for key 'PRIMARY'

The problem is that my table FT_VENDA_DIARIA is empty, and when I try to insert these values manually, it works! Also, when I check, the query returns only 1 row, so it shouldn't be raising this error.

Table DDL is:

Screen Shot 2016-02-10 at 15.16.55.jpg

Any suggestion? Please!

Help me understand community edition

On the Pentaho website, under Community, I click the BA Platform. It redirects me to a download site and starts downloading BI Server. I then clicked the Pentaho link below it and downloaded pdi-ce-6.0, which includes Spoon, Carte, and Kitchen. It would seem that I have the BI community version, but I need the BA community edition. I need to be able to create a custom dashboard with maps and charts. Is what I downloaded the correct thing for what I am looking to do? I will also need to connect to AWS Redshift to get data.
So you can get the big picture of what we are trying to do: the client wants us to connect a Pentaho BI front end to their back-end API through a live data stream hosted on Amazon's cloud. We would appreciate any tips, as we are time-boxed on this project and really not sure about many things.

Right now our idea is Pentaho queries hitting a web API we create, connected to a consumer application we design hosted on an EC2 instance that queries their database, which gives us a JSON object, which goes into a Kinesis Firehose, is pulled out by Lambda code into S3, transferred into Redshift, and then visualized in Pentaho.

Please get back to me on what I need to download and how to get started in the right direction.

SEVERE [http-apr-8080-exec-3] on SAIKU (or Mondrian configuration)

Hi all. I am trying to load account names on rows and gross sales as the measure in Saiku, but I get the following log:

Code:

10-Feb-2016 18:58:43.858 SEVERE [http-apr-8080-exec-1] null.null The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
 java.lang.NullPointerException
    at org.saiku.plugin.resources.PentahoQueryResource.execute(PentahoQueryResource.java:55)
    at sun.reflect.GeneratedMethodAccessor215.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
    at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:112)
    at org.saiku.plugin.resources.ExtendedJAXRSPluginServlet.service(ExtendedJAXRSPluginServlet.java:58)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
    at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:117)
    at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:399)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:191)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
    at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
    at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:188)
    at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
    at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
    at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2503)
    at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2492)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:745)

My Schema Workbench is configured like this:

Screen Shot 2016-02-10 at 19.11.39.jpg Screen Shot 2016-02-10 at 19.11.44.jpg Screen Shot 2016-02-10 at 19.11.47.jpg Screen Shot 2016-02-10 at 19.11.52.jpg

I am sure this is a big dimension, but if this is a "too much data" error, I think Saiku should tell me so, because from this log I can't figure out what is happening.

Has anyone ever experienced this? Thanks in advance!

LDAP Authentication Methods

Does the LDAP Input step support the DIGEST-MD5 (SASL) or CRAM-MD5 authentication methods?
If so, how do you configure the step?

thank you

Update step -- batch update much faster -- getting stuck though?

Hi there--

I find that when I use the Update step, it is much faster if I set a large commit size (1000, 5000, whatever) and check batch updates.


However, there's a problem.

For some odd reason, the step freezes/stops/gets stuck one record before the end of the batch.

For instance, for the Update step, the numbers Read, Written, Input, and Updated gradually increase, and then Read/Input stops at 1000 while Written/Updated gets stuck at 999.

The same thing happens if I set the commit size to 5000: on the first pass, it gets stuck at 5000 read and 4999 updated. What's going on here?

The other steps (inputs and other branches) continue to move along just fine, and I'm not receiving any error messages. I'm using SQL Server, by the way.


EDIT: Hmm... okay, I guess it's not stuck; it just counts from 0 to 999, then waits a LONG time before increasing again. Maybe it's busy during the batch update or waiting on some other feedback (i.e. SQL Server actually writing the 1000 records). See the sketch below.
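That reading matches how JDBC batch updates generally behave. Below is a minimal standalone sketch of the pattern (not Kettle's actual code; the table, columns, and connection string are made up): rows accumulate client-side until the commit size is reached, then the whole batch goes to the server in one round trip, which is where the long pause appears in the step metrics.

Code:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class BatchUpdateDemo {
        public static void main(String[] args) throws SQLException {
            final int commitSize = 1000; // mirrors the step's "Commit size"
            try (Connection con = DriverManager.getConnection(
                     "jdbc:sqlserver://localhost;databaseName=demo", "user", "secret");
                 PreparedStatement ps = con.prepareStatement(
                     "UPDATE accounts SET balance = ? WHERE id = ?")) {
                con.setAutoCommit(false);
                int buffered = 0;
                for (int id = 1; id <= 5000; id++) {
                    ps.setDouble(1, id * 1.5);
                    ps.setInt(2, id);
                    ps.addBatch();         // rows only accumulate client-side here,
                    buffered++;            // so the counters climb quickly
                    if (buffered == commitSize) {
                        ps.executeBatch(); // the whole batch hits the server at once:
                        con.commit();      // this is the long pause seen at 999/1000
                        buffered = 0;
                    }
                }
                if (buffered > 0) {        // flush the final partial batch
                    ps.executeBatch();
                    con.commit();
                }
            }
        }
    }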

How to clear selection in Select Component

$
0
0
Hi everyone,

I'm new to Pentaho. I'm working on a dashboard with a filter (a Select Component).
By default this component returns all the members of the query, but once a value is selected, it can't return to the original state. Is it possible to add an "All" option to the selector, or to create a button that clears the selection?

Column Label name and position changes based on parameter passed

I am a beginner with Pentaho Reporting. Please help me by suggesting whether the following is feasible.

I am using Pentaho Report Designer for a report. Based on a parameter passed in, I need different header names for the same column. E.g., if the parameter 'TEMPLATE' is 'A', the label of column 1 of the report should be 'Serial Number', but if 'TEMPLATE' is 'B', the label of column 1 should be 'Serial Number/BBN Number'. I need to do this for all columns in the report.
Also, is it possible to change the sequence of the columns based on the parameter? E.g., column 1 should be in the first position if 'TEMPLATE' is 'A', but in the 4th position if 'TEMPLATE' is 'B'.
Please help me with these two questions.
Thanks,
Megha