Channel: Pentaho Community Forums

EDI X12 Output step

Hi,

I wanted to know if there is a step that can convert stream data into EDI X12 output.
I found that there is the XML to EDI step, but can we do the reverse?
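
For reference, a minimal sketch of what building an X12 segment from row fields could look like, e.g. inside a scripting step (the segment type, values, and separators shown are illustrative assumptions, not an existing PDI step):

Code:

public class X12SegmentSketch {
    public static void main(String[] args) {
        // X12 conventionally separates elements with '*' and terminates segments with '~'.
        // Hypothetical row values for a PO1 (baseline item data) segment:
        String[] elements = { "1", "10", "EA", "9.95" };

        StringBuilder segment = new StringBuilder("PO1");
        for (String element : elements) {
            segment.append('*').append(element);
        }
        segment.append('~');

        System.out.println(segment);  // prints: PO1*1*10*EA*9.95~
    }
}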


Thanks in Advance!

Running Pentaho 6 on Mac OS X 10.11 El Capitan

The problem - and the fix

When El Capitan was released we immediately noticed problems running parts of the Pentaho stack.

PDI

PDI-14470 tracked this issue. Spoon doesn't even start, which is... let's say... somewhat inconvenient. This wasn't actually an issue with Pentaho, but with SWT itself, and a lot more people were affected.

Our devs got a lot of help from the community (thanks André, Scott and Christian, among others). In the end, what you want to know is this (courtesy of MDamour):
Remove (or move away): data-integration/lib/pentaho-xul-swt-6.0.0.0-353.jar
Replace it with the file (pentaho-xul-swt-EXPERIMENTAL.jar).
We have tested all of the reported scenarios above: Spoon won't start, can't explore the repository, can't edit a database connection, etc. Please report any additional findings.
Our EE customers will have this available in the next service patch.

As you may understand, all our focus was on getting this to work on Pentaho 6.0. But we've heard from users who did exactly the same on 5.x versions and reported it to work as well. Use at your own risk.

If you still find any issues, please add comments to case PDI-14470 or create a new one.

Platform installer

A new feature of El Capitan, System Integrity Protection (SIP), also broke our installers. Simply put, PostgreSQL couldn't be added to the init scripts. BISERVER-12894 tracks this issue.

This has now been successfully resolved, and as soon as we finish the last set of tests we'll be updating the download bundles.

Going forward

You may be asking yourselves, "How could this take them by surprise?". Well, don't worry, we ask ourselves the same question... The truth is that we internally tested a beta version back in July and everything worked great; all these issues appeared afterwards.

But regardless - we _have_ to get better!


-pedro

Time Series forecasting

Hi, I am a postgraduate student doing research in data mining. I find the time series feature in Weka very useful.

What I want to do is implement a Vote ensemble for time series. I can do it in the GUI, but I do not know how to code it in Java. I have googled for sample code but couldn't find any. I did try to adapt code that uses the normal classifier approach with Vote, but I get an error: setClassifiers() does not work with WekaForecaster.

"The method setClassifiers(Classifier[]) in the type MultipleClassifiersCombiner is not applicable for the arguments(WekaForecaster[])"

Thank You

Delay from load repository and allocate job

Hi guys,
I'm using PDI 6.0 and I have a job defined in a file repository which is executed via kitchen.sh.


I noticed a strange (and non-deterministic) delay between these two log lines:
2015/11/10 18:15:06 - Kitchen - Load the repositories defined on this system.
2015/11/10 18:16:13 - Kitchen - Allocate job...


This time varies from 10 seconds up to one minute.


Could someone explain the possible cause of this behaviour?


Nico

Ignore or delete a row when a field has a certain value

Hi again!

I keep working on my Excel files (joining several files into one, taking into account only certain rows and columns from the original files), and now I need to ignore a row when a certain field in that row has a certain value.

I've looked into Select Values and Filter Rows, but couldn't find a way to do precisely that.

Is there a way to do it in one step, or shall I look into performing several steps?

Thanks,

Newbie thoughts about transformations

Hi all,
I've been a Kettle user for just 3 weeks and I'm starting more complex experiments.
I have to build the ODS layer of a data warehouse, and at the moment I have built a job which contains n transformations running in parallel.
Each one is responsible for copying tables into the ODS database. In detail, each transformation has to do the following operations:
1. insert a record into an audit table with some information about the new execution starting
2. read data from the ERP database
3. insert the data into the ODS table
4. update the record in the audit table with other information about the execution finishing
I need more information about how I can organize my project.

First, what is the difference between the OpenERP Object Input step and a simple Table Input step?
Can I share a DB connection among transformations?
I use variables (job scope), but I can't figure out the difference between arguments and named parameters.

Thanks to whoever can give me some help!

J

BIRTOutput Step for PDI

I'm using the following plugin for bursting BIRT reports out of Kettle: http://b-e-o.blogspot.com/search/lab...IRT%20bursting
While this works well when executing from Kettle, I'd like to be able to also use it when running Pan or Kitchen for scheduled jobs/transforms. Unfortunately, I'm seeing the following errors in my log. Any input into how I could resolve the errors below would be very appreciated:

Code:

2015/11/13 09:00:55 - Pan - Start of run.
2015/11/13 09:00:55 - RESULTS_REPORT - Dispatching started for transformation [RESULTS_REPORT]
2015/11/13 09:00:55 - REPORT INFO.0 - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) : Unexpected error
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) : There was an unexpected error processing report '\\fac-storage\Scripts\Automation\DMV\SETUP\DMV_REPORT.rptdesign' to produce file '\\fac-storage\Scripts\Automation\DMV\RESULTS\DMV_REPORT.pdf' with processor: PDF.
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) : Unable to parse the document: ResourceKey{schema=org.pentaho.reporting.libraries.resourceloader.loader.URLResourceLoader, identifier=file:////fac-storage/Scripts/Automation/DMV/SETUP/DMV_REPORT.rptdesign, factoryParameters={}, parent=null}
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:246)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:115)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at java.lang.Thread.run(Unknown Source)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) : Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceCreationException: Unable to parse the document: ResourceKey{schema=org.pentaho.reporting.libraries.resourceloader.loader.URLResourceLoader, identifier=file:////fac-storage/Scripts/Automation/DMV/SETUP/DMV_REPORT.rptdesign, factoryParameters={}, parent=null}
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.reporting.libraries.xmlns.parser.AbstractXmlResourceFactory.create(AbstractXmlResourceFactory.java:249)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.create(DefaultResourceManagerBackend.java:272)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:434)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:370)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createDirectly(ResourceManager.java:207)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.loadMasterReport(PentahoReportingOutput.java:151)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:167)
2015/11/13 09:00:58 - BIRT Output.0 - ERROR (version 4.4.2-GA, build 18750 from 2013-07-19 11.23.04 by buildguy) :    ... 3 more
2015/11/13 09:00:58 - BIRT Output.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2015/11/13 09:00:58 - RESULTS_REPORT - RESULTS_REPORT
2015/11/13 09:00:58 - RESULTS_REPORT - RESULTS_REPORT
2015/11/13 09:00:58 - Pan - Finished!
2015/11/13 09:00:58 - Pan - Start=2015/11/13 09:00:55.617, Stop=2015/11/13 09:00:58.902
2015/11/13 09:00:58 - Pan - Processing ended after 3 seconds.

Distinct Clause issue for MySQL in PDI-CE-5.3

Hi Sir/Madam,

I am using PDI-CE-6.0, Java 1.7 64-bit, and a MySQL database, trying to load nearly 4,000,000 records from source to target. To avoid duplicate records I put a DISTINCT clause in my extraction query. According to my DBA, with the DISTINCT clause the query takes a long time to process the records and CPU usage goes to 100%.

We can use the DISTINCT clause to avoid duplicate records in SQL queries, and an alternative solution is to write sub-select queries, but my data is 4,000,000 records, so it is not a good idea to use sub-selects for my joins (because I am using nearly 12 tables in LEFT JOINs).

What is the alternative to the DISTINCT clause in PDI? I expect almost all MySQL database users face this issue... am I correct?

Below is the process I am using to load the data. Please advise.

(Attached image: data.jpg)

How to modify all transformations/jobs of a repository (e.g. edit all log config)

Hi folks,

We're migrating from PDI 3.2 CE to PDI 5.4 EE.
We have imported everything into the repository, and everything's fine.
However, now we need to configure database logging for all jobs and transformations.
Doing it by hand is cumbersome and time-consuming, having thousands of objects to edit.

So the question is: is there a way to edit some configuration of all transformations/jobs?

I thought I could export the repository to XML, write XSLT to tweak some parts like the logging configuration,
and then import the result into the repository.
I could write a proof-of-concept XSLT, but I have two problems:
1. The numeric entity references are translated in the output. I mean, stuff like &#36; gets converted
into the real character ($) after the XSLT transform, which can cause problems with quotes or the content
of scripts!

2. The repository XML is too big (hundreds of MB), and I get a Java heap space error.

Do you have any ideas to address these problems, or another method to do it?
I was also thinking of using the Java API to load the TransMeta objects, tweak them with the API,
then save them back into the repository. That sounds risky, but also more powerful!
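
For reference, a minimal sketch of that Java API approach, assuming an already-connected PDI 5.x Repository (the directory path, logging connection, and table name are hypothetical):

Code:

import org.pentaho.di.repository.Repository;
import org.pentaho.di.repository.RepositoryDirectoryInterface;
import org.pentaho.di.trans.TransMeta;

public class BulkLogConfigSketch {
    // rep: an already-connected Repository (connection setup elided)
    static void updateTransLogConfig(Repository rep) throws Exception {
        RepositoryDirectoryInterface dir =
            rep.loadRepositoryDirectoryTree().findDirectory("/etl");   // hypothetical path
        for (String name : rep.getTransformationNames(dir.getObjectId(), false)) {
            TransMeta meta = rep.loadTransformation(name, dir, null, true, null);
            // Point the transformation log table at the logging database.
            meta.getTransLogTable().setConnectionName("log_db");       // hypothetical connection
            meta.getTransLogTable().setTableName("trans_log");         // hypothetical table
            rep.save(meta, "bulk update of logging configuration", null);
        }
    }
}

Jobs would presumably need an analogous loop using getJobNames(), loadJob(), and getJobLogTable().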

Anonymous access in LDAP/JDBC configuration?

I have been attempting to allow anonymous access to one of my reports. I have an LDAP/JDBC security config with AD providing authentication and JDBC as my role provider. I added an anonymousUser to my USERS table and gave it the Anonymous role in GRANTED_AUTHORITIES. I then followed the instructions in the link below.

http://stackoverflow.com/questions/2...without-log-in

Now when I go to view my report anonymously it gives me a 500 error ("Sorry. We really did try...") and I get the following error in catalina.out and pentaho.log.

Code:

2015-11-13 09:38:43,582 ERROR [org.pentaho.platform.repository2.unified.jcr.sejcr.GuavaCachePoolPentahoJcrSessionFactory] Error obtaining session from cache. Creating one directly instead: javax.jcr.SimpleCredentials@6502ad92
java.util.concurrent.ExecutionException: javax.jcr.LoginException: LoginModule ignored Credentials
        at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
        at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
        at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
        at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
        at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2346)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2318)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2280)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2195)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3934)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3938)
        at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4821)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.GuavaCachePoolPentahoJcrSessionFactory.getSession(GuavaCachePoolPentahoJcrSessionFactory.java:93)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.CredentialsStrategySessionFactory.getSession(CredentialsStrategySessionFactory.java:355)
        at org.springframework.extensions.jcr.jackrabbit.LocalTransactionManager.doBegin(LocalTransactionManager.java:120)
        at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:372)
        at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:417)
        at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:255)
        at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
        at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
        at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
        at com.sun.proxy.$Proxy78.getFile(Unknown Source)
        at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator$20.call(ExceptionLoggingDecorator.java:262)
        at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator$20.call(ExceptionLoggingDecorator.java:260)
        at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.callLogThrow(ExceptionLoggingDecorator.java:489)
        at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.getFile(ExceptionLoggingDecorator.java:260)
        at org.pentaho.platform.web.http.api.resources.RepositoryResource.doService(RepositoryResource.java:651)
        at org.pentaho.platform.web.http.api.resources.RepositoryResource.doGet(RepositoryResource.java:598)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
        at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
        at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
        at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
        at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
        at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
        at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
        at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
        at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
        at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
        at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
        at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
        at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
        at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
        at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
        at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:109)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
        at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:114)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:399)
        at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
        at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:191)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
        at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
        at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
        at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
        at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:188)
        at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
        at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1527)
        at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1484)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
        at java.lang.Thread.run(Thread.java:745)
Caused by: javax.jcr.LoginException: LoginModule ignored Credentials
        at org.apache.jackrabbit.core.RepositoryImpl.login(RepositoryImpl.java:1526)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.NoCachePentahoJcrSessionFactory.getSession(NoCachePentahoJcrSessionFactory.java:24)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.GuavaCachePoolPentahoJcrSessionFactory.access$001(GuavaCachePoolPentahoJcrSessionFactory.java:29)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.GuavaCachePoolPentahoJcrSessionFactory$1.load(GuavaCachePoolPentahoJcrSessionFactory.java:77)
        at org.pentaho.platform.repository2.unified.jcr.sejcr.GuavaCachePoolPentahoJcrSessionFactory$1.load(GuavaCachePoolPentahoJcrSessionFactory.java:75)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3524)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2317)
        ... 110 more
Caused by: javax.security.auth.login.FailedLoginException: LoginModule ignored Credentials
        at org.apache.jackrabbit.core.security.authentication.LocalAuthContext.login(LocalAuthContext.java:87)
        at org.apache.jackrabbit.core.RepositoryImpl.login(RepositoryImpl.java:1498)
        ... 116 more

Best step to insert or update records

Hi,

I am using Windows, PDI-CE-6.0, Java 1.7 64-bit, and MySQL InnoDB. I am facing a performance issue inserting or updating 5,000,000 records in my target database. I am following the approaches below; could you please suggest if there is another approach?

I decreased "Nr of rows in rowset" from 10000 to 1000 and the feedback size from 50000 to 5000 in the .ktr settings. I hope that is good, is it right?

Case 1: Initially I tried Table input step ----> Insert/Update step; I would say it is only suitable for a minimal number of records.
Case 2: After that I tried Synchronize after merge in combination with Merge rows (diff); I would say it is better than case 1.
Case 3: After that I tried Table input step ----> Serialize to file, then De-serialize from file ---> Insert/Update step; I would say it is excellent, but as per the PDI wiki it is not suitable for huge record counts, so we had better ignore this.
Case 4: After that I tried the MySQL Bulk Loader, but as per the PDI wiki it is not available on MS Windows.

Can someone suggest which steps are best to load huge numbers (millions) of records?
Or do I need to change my Amazon cloud system configuration?
(Attached image: oltp.jpg)

weka.classifiers.functions.LibSVM: Cannot handle numeric class!

The documentation states that SVM can be used for regression.

Why then am I getting this error when I try it:

weka.classifiers.functions.LibSVM: Cannot handle numeric class!
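
For reference, a minimal sketch of the likely fix, assuming the weka.classifiers.functions.LibSVM wrapper: its default SVMType is C-SVC (a classifier), which rejects a numeric class, so regression needs epsilon-SVR or nu-SVR (the dataset file here is hypothetical):

Code:

import weka.classifiers.functions.LibSVM;
import weka.core.Instances;
import weka.core.SelectedTag;
import weka.core.converters.ConverterUtils.DataSource;

public class LibSVMRegressionSketch {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("numeric_target.arff");  // hypothetical dataset
        data.setClassIndex(data.numAttributes() - 1);             // numeric class attribute

        LibSVM svm = new LibSVM();
        // The default SVMType is C-SVC (classification); switch to epsilon-SVR for regression.
        svm.setSVMType(new SelectedTag(LibSVM.SVMTYPE_EPSILON_SVR, LibSVM.TAGS_SVMTYPE));
        svm.buildClassifier(data);
    }
}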

Drillthrough: org.olap4j.OlapException: Cannot do DrillThrough operation on the cell

Hi,

I'm trying to run a DRILLTHROUGH query on my cube but I'm getting an exception:

Code:

org.olap4j.OlapException: Cannot do DrillThrough operation on the cell
  mondrian.olap4j.MondrianOlap4jStatement.executeQuery2(mondrian/olap4j/MondrianOlap4jStatement.java:107)
  mondrian.olap4j.MondrianOlap4jStatement.executeQuery(mondrian/olap4j/MondrianOlap4jStatement.java:65)

The query is rather simple:

Code:

DRILLTHROUGH
SELECT {[Measures].[Vigente]} ON COLUMNS,
[Fecha].Year.Members ON ROWS
FROM [dataset_ejecucion_gastos_bahia_user_bahia]

The schema can be found here: https://gist.github.com/jazzido/62f1d20986360b9d7c42

Any help would be greatly appreciated. I've been scratching my head about this for a couple of days :confused:

Thanks!

M

Setting default values in selectors

Hi everybody,

I'm new to Pentaho and I'm having some trouble with my dashboard. I'm creating it with CDE 6.0. I'll try to explain my problem. I have:
  • 2 Datasources (years and months).
  • 2 Simple Parameters (Param_Year, Param_Month).
  • 2 Simple Selectors (Selector_Year, Selector_Month).


The problem is that I can't set a default value on the selectors. I want the selection to be the current year and the current month, but it always chooses the first one. Example:
  • DS_Years = 2013, 2014, 2015
  • DS_Month = 1, 2, 3, ..., 11, 12
  • Selections I want: 2015; 11;
  • Selections I have: 2013; 1;


Can anyone help me with this? Regards.

Documentation for RandomCommittee

Hi senior members and Mark,

I'm a computer science student. I'm researching ensembles in time series. I found that there are two learners in Weka that had good accuracy: RandomForest and RandomCommittee.
I found a reference for RandomForest in Weka: "Leo Breiman (2001). Random Forests. Machine Learning. 45(1):5-32".
But I cannot find any documentation for RandomCommittee.

My questions:
What references (papers, authors) can I find for RandomCommittee?
I saw that multivariate forecasting is also available in Weka in order to forecast multiple series. Where can I find references for the multivariate forecasting implemented in Weka?

Thanks for your help.
RG

converting to date problem

Hi,

Pentaho DI 5.4.0.1-130.
In a Get Variables step, when putting a variable value into a field of data type Date with format yyyy/MM/dd HH:mm:ss.SSSSSSSSS, I get this issue:

2015/11/15 14:24:41 - Get Variables.0 - last_update String(29) : couldn't convert string [2015/11/15 12:58:45.004000000] to a date using format [yyyy/MM/dd HH:mm:ss.SSSSSSSSS] on offset location 29

The variable actually contains 2015/11/15 12:58:45.004000000.

What's wrong?
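
For reference: Java's SimpleDateFormat, which PDI uses for its Date format masks, counts every S as a millisecond digit and has no nanosecond pattern, so a nine-digit fraction does not parse as intended. A minimal workaround sketch, assuming the fraction can simply be truncated to milliseconds before conversion:

Code:

import java.text.SimpleDateFormat;
import java.util.Date;

public class NanosToDateSketch {
    public static void main(String[] args) throws Exception {
        String raw = "2015/11/15 12:58:45.004000000";
        // Keep only the first three fractional digits (milliseconds).
        String trimmed = raw.substring(0, raw.indexOf('.') + 4);
        Date parsed = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss.SSS").parse(trimmed);
        System.out.println(parsed);
    }
}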

Mikhail

Pentaho adds empty columns when exporting to CSV

Hello!
I use Pentaho Report Designer 3.9.1 to design a simple report. As soon as the number of string-field controls in Group Header \ Details Body \ Details exceeds a certain amount, the CSV export starts adding empty columns (which can be identified by ;; in the result file).
Any ideas how to prevent PRD from doing such a stupid thing?
Thanks!
(Attached images: P_CSV_Issue 01.jpg, P_CSV_Issue 02.jpg)

spoon.sh crashes with WebjarsURLConnection error

spoon.sh won't start. It errors out (errors listed below) with a [WebjarsURLConnection] error (Error Transforming Zip).

It looks like others have seen this.

Any ideas what I might be doing wrong?

Kettle
Version: pdi-ce-6.0.0.0-353.zip
Location: /opt/kettle/data-integration
Permissions:
owner cloudera, group cloudera (recursive, including /opt/kettle/)
changed permissions on *.sh per advice (for easier kickoff)


Java
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

VM:
Linux quickstart.cloudera 2.6.32-573.3.1.el6.x86_64 #1 SMP Thu Aug 13 22:55:16 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Centos 6.7
Memory 12 GB, 3 processors, 64 GB disk


Command: ./SpoonDebug.sh (Y,Y as input)

Results:
...
Nov 15, 2015 9:36:58 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
INFO: New Caching Service registered
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/kettle/data-integration/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/kettle/data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
09:36:59,528 ERROR [WebjarsURLConnection] Error Transforming zip
java.io.IOException: Pipe closed
at java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:261)
at java.io.PipedInputStream.receive(PipedInputStream.java:227)
at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:253)
at java.util.zip.ZipOutputStream.closeEntry(ZipOutputStream.java:238)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection.transform(WebjarsURLConnection.java:190)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection.access$000(WebjarsURLConnection.java:54)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection$2.call(WebjarsURLConnection.java:90)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection$2.call(WebjarsURLConnection.java:87)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Nov 15, 2015 9:37:00 AM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /lineage
09:37:00,523 ERROR [WebjarsURLConnection] Error Transforming zip
java.io.IOException: Pipe closed
at java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:261)
at java.io.PipedInputStream.receive(PipedInputStream.java:227)
at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:253)
at java.util.zip.ZipOutputStream.closeEntry(ZipOutputStream.java:238)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection.transform(WebjarsURLConnection.java:190)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection.access$000(WebjarsURLConnection.java:54)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection$2.call(WebjarsURLConnection.java:90)
at org.pentaho.osgi.platform.webjars.WebjarsURLConnection$2.call(WebjarsURLConnection.java:87)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Nov 15, 2015 9:37:00 AM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /marketplace
2015/11/15 09:37:04 - Spoon - Logging is at level : Debugging
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x0000003fef00e0fc, pid=20759, tid=139662808360704
#
# JRE version: Java(TM) SE Runtime Environment (7.0_67-b01) (build 1.7.0_67-b01)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (24.65-b04 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# C [ld-linux-x86-64.so.2+0xe0fc]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /opt/kettle/data-integration/hs_err_pid20759.log
#
# If you would like to submit a bug report, please visit:
# http://bugreport.sun.com/bugreport/crash.jsp

Parquet Snappy Plugin

Query won't finish with too many members

Hey all, I've been having a hell of a time trying to figure out the cause of my extremely poor performance on a very small data set.

I made a post about this on the Saiku forum first, but this is really a Mondrian issue so I was directed here. https://groups.google.com/a/saiku.me...er/tEQMjtVAc0M

First of all, my fact table is about 120k rows at the moment. Very small; it fits entirely in memory.
I left the query running for over an hour, and it didn't finish. The issue seems to appear when I have company name and company number in the result set. They should be pretty close to 1-to-1 (there are about 1200 distinct company names and 1250 distinct company numbers), but Mondrian seems to just die when both are together in the same query.

With logging on, I can see that all the queries my DB is asked to run complete within a few hundred ms. The DB is not the issue in this instance.

I don't know if it's an issue with how I set up my schema, or if it's a Mondrian configuration issue... any help as to what to look at would be greatly appreciated.

The query in question is:

Code:

WITH
SET [~ROWS_Company_Company Name] AS
    {[Company].[Company Name].[Company Name].Members}
SET [~ROWS_Company_Company Number] AS
    {[Company].[Company Number].[Company Number].Members}
SET [~ROWS_Company_Internal Company Type] AS
    {[Company].[Internal Company Type].[Internal Company Type].Members}
SELECT
NON EMPTY {[Measures].[Net Sales]} ON COLUMNS,
NON EMPTY ([~ROWS_Company_Company Name]  *  [~ROWS_Company_Company Number]  *  [~ROWS_Company_Internal Company Type]) ON ROWS
FROM [Demo - Sale]


I did turn on debug logging for Mondrian, and I can see what it's spending almost all its time on. It generated about 3 GB in the mondrian.log file after a couple of minutes of running the query, but it was pretty much this repeating:
Code:

2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_aryz] because of a union of roles.
2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_gosimple_llc] because of a union of roles.
2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_isl_oa] because of a union of roles.
2015-11-14 12:52:09,288 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_krn] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_mff] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_pin] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_test_client_1] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[Unknown] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_aryz] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_gosimple_llc] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_isl_oa] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_krn] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_mff] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_pin] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_test_client_1] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[Unknown] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_aryz] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_gosimple_llc] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_isl_oa] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_krn] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_mff] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_pin] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[client_test_client_1] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level NONE granted to member [Client].[Client Name].[Unknown] because of a union of roles.
2015-11-14 12:52:09,289 DEBUG [mondrian.olap.UnionRoleImpl] Access level ALL granted to member [Client].[Client Name].[client_demo] because of a union of roles.

Attached is the schema this is running against: sales.xml

Any help is greatly appreciated,
-Adam