Channel: Pentaho Community Forums

Reading arguments from a custom properties file

Hi All,

I badly need a solution for a problem I am facing. I have a use case where I have to integrate Kettle into my web application as a library and execute the Kettle jobs through a cron. I want the database connection parameters to be configurable and need that configuration file somewhere in my application.

Until now, I was reading these connection inputs from a properties file and setting them as JVM variables using the "Set Variables" step. This worked for me in the development environment, i.e. on all the systems where Spoon was used and a kettle.properties existed in the users/xyz/.kettle directory. But as soon as we tried it on a fresh system, it failed, saying that the metadata for the connection is not correct. A .kettle directory was created in the user directory but was empty; adding the connection parameters to this file got it working again.

What's the best way to read arguments from a custom properties file?
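One approach when embedding Kettle in a web application is to load the connection settings yourself and publish them as JVM system properties, which Kettle variables can typically fall back to (e.g. ${DB_HOST}), independently of any kettle.properties file. A minimal sketch — the file path, property names, and class name here are hypothetical:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class ConnectionConfig {
    // Loads database connection settings from an external properties file
    // and exposes them as JVM system properties, so a job/transformation
    // can reference them as ${DB_HOST}, ${DB_PORT}, etc.
    public static Properties load(String path) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            props.load(in);
        }
        // Copy every entry into the JVM system properties
        for (String name : props.stringPropertyNames()) {
            System.setProperty(name, props.getProperty(name));
        }
        return props;
    }
}
```

Calling something like ConnectionConfig.load("/etc/myapp/db.properties") before launching the job keeps the connection parameters outside the .kettle directory entirely.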

Connection reset by peer: socket write error

Hello All,

I have a few dashboards developed in Pentaho CE 5.0 against an HP Vertica database server. I frequently get a "Connection reset by peer: socket write error"; earlier I was getting the same error with InfiniDB.

I have attached the log file for your reference.

Please assist me in resolving this issue.

Thanks & Regards

Nitesh


Log details:

2015-01-16 06:28:11,404 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0004 - Resource /pentaho-cdf-dd/lang/messages_en.properties not found in plugin pentaho-cdf-dd
2015-01-16 06:28:11,417 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0004 - Resource /pentaho-cdf-dd/lang/messages_en-GB.properties not found in plugin pentaho-cdf-dd
2015-01-16 07:31:11,543 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/pentaho
2015-01-16 07:31:14,531 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/pentaho
2015-01-16 07:31:14,684 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/pentaho
2015-01-16 09:24:10,155 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0004 - Resource /pentaho-cdf-dd/lang/messages_en.properties not found in plugin pentaho-cdf-dd
2015-01-16 09:24:10,161 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0004 - Resource /pentaho-cdf-dd/lang/messages_en-GB.properties not found in plugin pentaho-cdf-dd
2015-01-16 09:29:31,680 ERROR [org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput] Error generating content from content generator with id [generatedContent]
ClientAbortException: java.net.SocketException: Connection reset by peer: socket write error
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:369)
at org.apache.tomcat.util.buf.ByteChunk.append(ByteChunk.java:368)
at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:392)
at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:381)
at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:89)
at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:83)
at org.apache.commons.io.IOUtils.write(IOUtils.java:1140)
at pt.webdetails.cdf.dd.DashboardDesignerContentGenerator.createContent(DashboardDesignerContentGenerator.java:98)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.generateContent(GeneratorStreamingOutput.java:229)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.write(GeneratorStreamingOutput.java:156)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:58)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:37)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1479)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:111)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:116)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:161)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:83)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:59)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Unknown Source)
Caused by: java.net.SocketException: Connection reset by peer: socket write error
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(Unknown Source)
at java.net.SocketOutputStream.write(Unknown Source)
at org.apache.coyote.http11.InternalOutputBuffer.realWriteBytes(InternalOutputBuffer.java:756)
at org.apache.tomcat.util.buf.ByteChunk.flushBuffer(ByteChunk.java:448)
at org.apache.tomcat.util.buf.ByteChunk.append(ByteChunk.java:363)
at org.apache.coyote.http11.InternalOutputBuffer$OutputStreamOutputBuffer.doWrite(InternalOutputBuffer.java:780)
at org.apache.coyote.http11.filters.ChunkedOutputFilter.doWrite(ChunkedOutputFilter.java:126)
at org.apache.coyote.http11.InternalOutputBuffer.doWrite(InternalOutputBuffer.java:593)
at org.apache.coyote.Response.doWrite(Response.java:560)
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:364)
... 75 more

Any simple Choropleth Maps available for CDE?

Hello all,

I have a dashboard requirement for a very basic US state map shaded by the number of customers per state. I have been trying to convert d3/Protovis examples I find on the web, but I am not a JavaScript guy, so I am not having any luck.

Any of you have tips on how to accomplish this?

${Internal.Job.Filename.Directory} is empty

Hi guys,

I'm trying to embed a Python script that downloads CSV files, which I attempt to do by executing a Shell script step. I put the Python script in a folder relative to the current job path (say, ../scripts/download_reports.py). Then, in the Shell script step dialog, I use

Code:

python ${Internal.Job.Filename.Directory}/../scripts/download_reports.py
However, it seems that the variable ${Internal.Job.Filename.Directory} is empty. Can someone help me, or is there a better solution for this?

Thanks!
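For context, ${Internal.Job.Filename.Directory} is generally only populated when the job is run from a .kjb file, not from a repository. When the variable stays empty, one workaround is to compute the script path in your own wrapper code relative to the known job file path and pass it in as a parameter. A sketch — the class name and paths are hypothetical:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class ScriptPathResolver {
    // Given the absolute path of the running .kjb file, resolve a script
    // stored relative to it (e.g. ../scripts/download_reports.py).
    public static String resolve(String jobFilePath, String relativeScript) {
        Path jobDir = Paths.get(jobFilePath).getParent();
        // normalize() collapses the ".." segments into a clean absolute path
        return jobDir.resolve(relativeScript).normalize().toString();
    }
}
```

The resolved path can then be handed to the Shell step as a named parameter instead of relying on the internal variable.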

Replace by value field is missing at position 0

Greetings experts,

As a complete novice with Pentaho, I again seek your help. This issue may be extremely simple for you, but I have found the documentation lacking and perhaps incomplete.

My environment:
Pentaho: 5.2.0 on Windows 7 Pro SP1
Source Database: Oracle 11.2 on AIX 7.1
Target Database: DB2 10.5 on AIX 7.1

My requirement is to replace a field sourced from a CSV file with a sequential number.

Currently I have this, which adds the sequence as an additional field at the end of the record:

CSV file input -> Add sequence

For the next step I need to replace a field in the CSV stream with the computed sequence. I tried using the "Set field value" step, but it gives me the error above: "Replace by value field is missing at position 0".

I am sure there is a simple solution for such an elementary task.
Please enlighten me.
:)

Data integration server (community edition)

I have evaluated DI 5.2.0 with the Spoon designer and created two or three transformations and jobs. Now I want to run them on a JBoss/Tomcat server.

My question is: is there a community edition of the Data Integration Server?

If not, how can I run Kettle files on a remote server? Is there any other way to run them?

Classifiers in the classify tab

Hi. I've just downloaded and started with WEKA and I'm following Ian Witten's excellent tutorials. Why does only a fraction of the classifiers show up under my Classify tab? Only ZeroR shows up under the rules folder, and no bayes, meta, lazy, etc.

Unable to connect to the SAP ERP server:

I am using a Linux system. I have already copied the sapjco3.jar file to the JRE location and the libext folder.
But I get the error below:
Unable to connect to the SAP ERP server:
Possibly the SAP JCo implementation library (e.g. sapjco3.dll) does not exist or cannot be loaded. Please copy it to your libext directory! If you use version 3.0.5 or higher on Windows be sure to have MS Visual C++ 2005 SP1 Redistributable Package ATL Security Update installed.
org.pentaho.di.trans.steps.sapinput.sap.SAPException:
Possibly the SAP JCo implementation library (e.g. sapjco3.dll) does not exist or cannot be loaded. Please copy it to your libext directory! If you use version 3.0.5 or higher on Windows be sure to have MS Visual C++ 2005 SP1 Redistributable Package ATL Security Update installed.
at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.open(SAPConnectionImpl.java:144)
at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.open(SAPConnectionImpl.java:88)
at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.open(SAPConnectionImpl.java:65)
at org.pentaho.di.trans.steps.sapinput.sap.SAPConnectionFactory.getConnectionTestReport(SAPConnectionFactory.java:60)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2685)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:546)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:138)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:389)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:318)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:59)
at org.pentaho.di.ui.trans.step.BaseStepDialog$4.widgetSelected(BaseStepDialog.java:740)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.tableinput.TableInputDialog.open(TableInputDialog.java:435)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:124)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8720)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3027)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:744)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1310)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7931)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9202)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:648)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Please suggest a solution.

Out of memory when using output task

I have a simple transformation that reads from a file, generates a number of checksums, and then writes to a file/table.

In either case, whether writing to a file or a table, memory usage goes through the roof after 10 seconds and crashes Spoon. If I turn off the output, the rest of the transformation runs at around 50k rows/sec and gets through the entire dataset without issue.

I feel like I must be missing something simple, but I am new to Kettle.

I am running JRE 1.8, Kettle 5.2, OS X 10.10.

thanks

Changing XML file with PDI

Hi.

I am building a transformation to add roles to Mondrian XML schema files, basically using the Add XML and Replace in string steps. The lines are being generated correctly, such as the following:
<Role name="Store001"><SchemaGrant access="all"><CubeGrant access="all" cube="Sales"><HierarchyGrant access="custom" hierarchy="[Store.Stores]" rollupPolicy="parcial"><MemberGrant access="all" member="[Store.Stores].[Cambui Porto]"> </MemberGrant></HierarchyGrant></CubeGrant></SchemaGrant></Role>

The problem is inserting it into the XML file. I'm trying to use the XML output step, but instead of just inserting the element, it overwrites the entire file with the roles.
When I open the file in Notepad it looks like this:

Code:

<?xml version="1.0" encoding="ISO-8859-1"?>
<Schema>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja001&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Matriz&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja002&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Ivaipor&#xe3;&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja003&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Paran&#xe1; Max&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja004&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;C.D.&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja005&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Familia&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja006&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Ivaipor&#xe3; Hiper&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja007&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Cianorte&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja050&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Bob&#xb4;s Burgers&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja051&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Bob&#xb4;s Shake&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
 <Role><role>&#x3c;Role name&#x3d;&#x22;Loja055&#x22;&#x3e;&#x3c;SchemaGrant access&#x3d;&#x22;all&#x22;&#x3e;&#x3c;CubeGrant access&#x3d;&#x22;all&#x22; cube&#x3d;&#x22;Movimento&#x22;&#x3e;&#x3c;HierarchyGrant access&#x3d;&#x22;custom&#x22; hierarchy&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;&#x22; rollupPolicy&#x3d;&#x22;parcial&#x22;&#x3e;&#x3c;MemberGrant access&#x3d;&#x22;all&#x22; member&#x3d;&#x22;&#x5b;Unidade.Unidades&#x5d;.&#x5b;Cacau Caf&#xe9;&#x5d;&#x22;&#x3e; &#x3c;&#x2f;MemberGrant&#x3e;&#x3c;&#x2f;HierarchyGrant&#x3e;&#x3c;&#x2f;CubeGrant&#x3e;&#x3c;&#x2f;SchemaGrant&#x3e;&#x3c;&#x2f;Role&#x3e;</role> </Role>
</Schema>


So I have two questions: can a transformation change the XML file to add these roles? And why are the special characters being replaced by entity codes (&#x3c;, &#x3d;, &#x22;) when the Add XML step generates the correct format?
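On the escaping question: a step that writes a string field as element *text* must escape markup characters, which is why the angle brackets and quotes come out as entity codes. To keep the markup, the fragment has to be inserted into the document as XML nodes rather than as a string. Outside PDI, a small DOM program can do this; a sketch using the standard JAXP API (the class name is hypothetical):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

public class RoleInjector {
    // Parses the role fragment as XML and appends it to the <Schema> element
    // as real nodes, so serialization preserves the markup instead of escaping it.
    public static String addRole(String schemaXml, String roleXml) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document schema = builder.parse(new InputSource(new StringReader(schemaXml)));
        Document roleDoc = builder.parse(new InputSource(new StringReader(roleXml)));
        // importNode(..., true) deep-copies the fragment into the schema document
        Node imported = schema.importNode(roleDoc.getDocumentElement(), true);
        schema.getDocumentElement().appendChild(imported);
        StringWriter out = new StringWriter();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(new DOMSource(schema), new StreamResult(out));
        return out.toString();
    }
}
```

The same parse-then-append idea applies whichever tool does the insertion: the role string must enter the document as nodes, not as text content.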

Problem with report with parameter

Hello, world
Please help with a problem when using report parameters.
Let me explain. I've created an OLAP data set in Report Designer. The MDX request is valid, and the report generates correctly when the MDX has no parameters. But when I use a parameter in the MDX like this:
.....
FROM [Balance]
WHERE ([Filial].[Name].[${filial}])

in this case, when I do Run -> Print Preview, no result appears even though I change the input parameter.

I think the problem is with SQL caching: Mondrian caches the SQL generated by the first call of the MDX, which was invoked when the data set was created.
Maybe I am mistaken and the problem is not related to the cache.

Anyway, I need some help... :confused:
I don't know if I explained the problem correctly; if you have questions, please ask for more information.

Passing input and output directory value in compressionOutputStream and compressionIn

Hi,


We are using Pentaho Data Integration (PDI) 5.2 on Linux, with Hadoop (HDP 2.1) also installed on Linux. PDI and Hadoop are working fine. I am transferring and loading data to and from HDFS using the "Hadoop file input" and "Hadoop file output" steps.
If we want to compress the data, compression support is provided at Hadoop file input/output -> Content -> Compression (GZip, Snappy, Hadoop-snappy, Zip). But here we want to use a compression type other than those in the drop-down list (a custom compression type). Inside that, we want to pass the input and output directory names. Is there any way to get the directory name?


Is there any other way to do it?


If anyone knows a solution, kindly share it.


Thanks.
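On the custom-compression part of the question: codecs of this kind are generally just stream wrappers, so a custom type can be implemented outside the step by wrapping the raw output stream yourself before handing the data over. A sketch using the standard GZIP classes as a stand-in for a custom codec (the class name is hypothetical):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

public class StreamCompressor {
    // Copies an input stream into a compressing wrapper around the raw
    // output stream; a custom codec would substitute its own wrapper here.
    public static void compress(InputStream in, OutputStream rawOut) throws IOException {
        try (OutputStream out = new GZIPOutputStream(rawOut)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}
```

The input and output directory names would still need to be supplied by the caller, e.g. as constructor or method arguments, since the wrapper itself only sees streams.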

Saiku MDX Mode - no way back

Hi,

I am using Saiku in BI Server CE, and after switching to MDX mode there is no way back to normal mode.

I want to add custom measures in MDX, like [Measures].[%B of A] as [Measures].[B]/[Measures].[A].

This all works, but since I cannot switch back to normal mode, I cannot filter in the GUI because the three bars "Column", "Row" and "Filter" are missing.

The MDX looks like this:

Quote:

WITH MEMBER [Measures].[%B_of_A] AS [Measures].[B] / [Measures].[A], FORMAT_STRING = 'Percent'

SELECT
NON EMPTY {[Measures].[A], [Measures].[B], [Measures].[%B_of_A]} ON COLUMNS,
NON EMPTY CrossJoin([Info1].[Info1].Members, CrossJoin([Info2].[Info2].Members, [Info3].[Info3].Members)) ON ROWS
FROM [my_table]
I would appreciate any suggestions :)

Question about adding a new Instance to an Instances object

I'm having trouble with something that should be pretty straightforward with the WEKA Java API, but I can't for the life of me figure out what I'm doing wrong. I have a number of .arff files generated by WEKA's TextDirectoryLoader that I'm trying to merge into one dataset to apply a StringToWordVector filter to. Each ARFF file has a class attribute and a serialized text attribute.

I'm approaching this by first creating an Instances object from the first ARFF file, then, for each remaining ARFF file, adding its label to the class attribute and appending each instance in the file to the Instances object, as follows:
Code:

Instances data = null;
if (devices.size() > 0) {
    // get the data for the first device
    data = getFromArff(path + devices.get(0) + ".arff");
    // for the rest, add each device's Instances to the merged data set one by one
    for (int i = 1; i < devices.size(); i++) {
        // get the ARFF data for this device
        Instances currentDeviceInstances = getFromArff(path + "/" + devices.get(i) + ".arff");
        // create an AddValues filter to add the class label for this device
        AddValues addv = new AddValues();
        addv.setAttributeIndex("last");
        addv.setLabels(devices.get(i) + "");
        try {
            // extend the class attribute of the merged data set with the new label
            addv.setInputFormat(data);
            data = Filter.useFilter(data, addv);
            // append each instance from this ARFF file to the merged data set
            if (currentDeviceInstances != null) {
                for (int j = 0; j < currentDeviceInstances.numInstances(); j++) {
                    data.add(currentDeviceInstances.instance(j));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
However, every time I call data.add(currentDeviceInstances.instance(j)), it adds a copy of the first instance in the Instances object instead of the new instance, so when I am finished I have an ARFF file of all identical instances. When I step through with a debugger, the instance returned by currentDeviceInstances.instance(j) is different from the (copied) instance that gets appended to the Instances object when calling add.

I initially thought it was some mistake I had made with references/scope, but I tried the copy method, making a new instance and manually copying, etc., and am getting the same result.

Any ideas would be very welcome. I know I'm probably just missing something stupid here, but I can't see the forest for the trees at this point.

Thanks,
Matt

Invalid attribute error.

I have the following part of a Mondrian 4.0 schema:
<Schema name="Sales" metamodelVersion="4.0">
  <PhysicalSchema>
    <AutoGeneratedDateTable name="time_by_day_generated" startDate="2013-01-01" endDate="2013-12-31"/>
  </PhysicalSchema>

  <Cube name="Sales">
    <Dimensions>
      <Dimension name="Time" table="time_by_day_generated" type="TIME" key="Id">
        <Attribute name="Id" keyColumn="time_id"/>
        <Attribute name="Year" keyColumn="the_year" levelType="TimeYears"/>
        <Attribute name="Month" keyColumn="month_of_year" nameColumn="the_month" levelType="TimeMonths"/>
      </Dimension>
      ...
    </Dimensions>
    ...
  </Cube>
</Schema>

When loading, I get this error:
Code:

mondrian.rolap.RolapSchema$MondrianSchemaException: Key attribute 'Id' is not a valid attribute of this dimension (in Dimension 'Time')
I had to change the value of the 'type' attribute on the Dimension from 'TimeDimension' (as shown in the Mondrian in Action book) to 'TIME' to get past an earlier error, and there is no 'TimeDimension' value in the Mondrian 4 schema definition available online.
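One difference I noticed from the Mondrian 4 examples I've seen: they wrap the attributes in an <Attributes> collection element rather than putting <Attribute> directly under <Dimension>. A sketch of that layout (based on those examples, not verified against my data; column names as above):

```xml
<Dimension name="Time" table="time_by_day_generated" type="TIME" key="Id">
  <Attributes>
    <Attribute name="Id" keyColumn="time_id"/>
    <Attribute name="Year" keyColumn="the_year" levelType="TimeYears"/>
    <Attribute name="Month" keyColumn="month_of_year" nameColumn="the_month"
               levelType="TimeMonths"/>
  </Attributes>
</Dimension>
```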

Suggestions?

Getting Chart Color Data from OnClick Event

Hi all,

So after struggling on and off for several weeks, I've decided to turn to the experts for some advice!

I am trying to create a dashboard using Pentaho with CCC charts. The dashboard has a number of interactive charts, where changing a value on one changes the output of the others. To this end, I have managed to make most of the functionality work, but I am having one major problem.

When I click on a data point on one chart, I need a second chart to create a plot that is the same color as the data point I clicked in the original chart. I have had some success by using "this.pvMark.fillStyle().color" in the clickAction, storing that color value in an array, and then plotting the new chart with it. My problem is that this only occasionally works, and I can't figure out why it fails in the other cases: instead of matching the clicked data point, all of the plots come out grey (see the image below).

DashExample.jpg

I haven't been able to find any useful documentation at all to help me even know what functions/properties these objects have. What I have found has just been tidbits of information left on forums here and there. If anyone could point me to some real, complete documentation that would be amazing, and if anyone knows how to deal with this problem specifically, I would really appreciate it! ;)

Here is my onClick function:

Code:

function (e) {
    if (e.vars.series.value != "DNI") {
        var dt = e.vars.category.value; // parsing the datetime into a time string
        h = dt.getHours();
        m = dt.getMinutes();
        s = dt.getSeconds();
        if (h < 10) h = '0' + h;
        if (m < 10) m = '0' + m;
        if (s < 10) s = '0' + s;

        t = h + ":" + m + ":" + s; // building the time string

        var daqval = Dashboards.getParameterValue('aDaqid');
        var newData = e.vars.series.value;
        inArray = false;
        for (i = 0; i < daqval.length; i++) { // ensure the curve isn't already in the list
            if (daqval[i] == newData) {
                inArray = true;
                break;
            }
        }
        if (!inArray) {
            var color = this.pvMark.fillStyle().color;
            console.log(this.pvMark.strokeStyle());
            var colorArray = Dashboards.getParameterValue('aColorArray');
            var newarray = [];
            var newcolorArray = [];
            daqval[daqval.length] = newData;
            colorArray[colorArray.length] = color;
            console.log(colorArray);

            for (i = 0; i < daqval.length; i++) {
                // insert daqval[i] at its sorted position, keeping the color array aligned
                for (j = 0; j < newarray.length; j++) {
                    if (daqval[i] < newarray[j]) {
                        newarray.splice(j, 0, daqval[i]);
                        newcolorArray.splice(j, 0, colorArray[i]);
                        break;
                    }
                }
                if ((newarray.length === 0) || (daqval[i] > newarray[newarray.length - 1])) {
                    newarray[newarray.length] = daqval[i];
                    newcolorArray[newcolorArray.length] = colorArray[i];
                }
            }
            daqval = newarray;
            colorArray = newcolorArray;

            Dashboards.fireChange('aDaqid', daqval);
            Dashboards.fireChange('aColorArray', colorArray);
        }

        Dashboards.fireChange('aTime', t);
        Dashboards.update([render_IV_Chart]);
        Dashboards.update([render_Spectrum_Chart]);
        Dashboards.update([render_Weather_Table]);
    }
}
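In case it helps anyone reading: the sorted-insert bookkeeping above can be factored into a small helper that keeps the two parallel arrays aligned (plain JavaScript, no CDF dependency; `addEntry` is just a name I made up):

```javascript
// Insert a (value, color) pair into two parallel arrays, keeping them
// sorted by value; skips duplicates. Returns true if the pair was inserted.
function addEntry(values, colors, value, color) {
    if (values.indexOf(value) !== -1) { return false; } // curve already listed
    var i = 0;
    while (i < values.length && values[i] < value) { i++; } // find sorted slot
    values.splice(i, 0, value);  // insert the value...
    colors.splice(i, 0, color);  // ...and its color at the same index
    return true;
}
```

With this, the clickAction only needs to call `addEntry(daqval, colorArray, newData, color)` and fire the two `Dashboards.fireChange` calls when it returns true.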

Thanks!

Number of Distinct Cases

Hi Mark,

Does a variable with a higher number of distinct cases have an advantage (as far as ranking in variable importance) over another variable with far fewer distinct cases?
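My intuition for why it might: a variable with many distinct values offers many more ways to partition the training data, so impurity-based scores can reward it even when it carries no signal. A toy sketch (plain JavaScript, function names are mine; this is only an illustration, not what the actual importance computation does):

```javascript
// Gini impurity of an array of 0/1 labels
function gini(labels) {
    if (labels.length === 0) { return 0; }
    var p = labels.filter(function (y) { return y === 1; }).length / labels.length;
    return 2 * p * (1 - p);
}

// Weighted impurity after grouping rows by the values of one variable
function impurityAfterSplit(xs, ys) {
    var groups = {};
    xs.forEach(function (x, i) {
        (groups[x] = groups[x] || []).push(ys[i]);
    });
    var total = 0;
    Object.keys(groups).forEach(function (k) {
        total += (groups[k].length / ys.length) * gini(groups[k]);
    });
    return total;
}

var ys   = [0, 1, 0, 1, 0, 1]; // class labels
var id   = [1, 2, 3, 4, 5, 6]; // unique per row: maximal number of distinct cases
var flip = [0, 1, 0, 1, 1, 0]; // informative but noisy binary variable

impurityAfterSplit(id, ys);   // 0 - "perfect" on training data, but meaningless
impurityAfterSplit(flip, ys); // 4/9 - worse-looking, yet it actually generalizes
```

This is the same bias that motivates gain ratio over plain information gain in C4.5-style trees; whether it affects a given importance ranking depends on the learner.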

Thanks,
Mike

User suzy can't see a report but admin can, in Pentaho BI 5.2

I found that, logged in as suzy, I can only see reports that don't use ORDER BY or WHERE clauses; logged in as admin I can see them all. This is with Pentaho Metadata and Pentaho Reporting on Pentaho BI 5.2.
The log exception:
11:34:26,716 ERROR [SimplePmdDataFactory] error
org.pentaho.pms.core.exception.PentahoMetadataException: QueryXmlHelper.ERROR_0010 - model BV_MODEL_1 not found
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:363)
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:353)
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:339)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.SimplePmdDataFactory.parseQuery(SimplePmdDataFactory.java:207)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.SimplePmdDataFactory.getReferencedFields(SimplePmdDataFactory.java:742)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdDataFactory.getReferencedFields(PmdDataFactory.java:194)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdDataFactoryCore.getReferencedFields(PmdDataFactoryCore.java:44)
at org.pentaho.reporting.engine.classic.core.metadata.DefaultDataFactoryMetaData.getReferencedFields(DefaultDataFactoryMetaData.java:79)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactoryCore.getReferencedFields(CompoundDataFactoryCore.java:46)
at org.pentaho.reporting.engine.classic.core.sorting.SortingDataFactoryCore.getReferencedFields(SortingDataFactoryCore.java:39)
at org.pentaho.reporting.engine.classic.core.metadata.DefaultDataFactoryMetaData.getReferencedFields(DefaultDataFactoryMetaData.java:79)
...


11:34:26,717 ERROR [SimplePmdDataFactory] error
org.pentaho.pms.core.exception.PentahoMetadataException: QueryXmlHelper.ERROR_0010 - model BV_MODEL_1 not found
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:363)
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:353)
at org.pentaho.metadata.query.model.util.QueryXmlHelper.fromXML(QueryXmlHelper.java:339)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.SimplePmdDataFactory.parseQuery(SimplePmdDataFactory.java:207)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.SimplePmdDataFactory.queryData(SimplePmdDataFactory.java:523)
at org.pentaho.reporting.engine.classic.extensions.datasources.pmd.PmdDataFactory.queryData(PmdDataFactory.java:159)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:205)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:505)
...

Sorry, my English is not good, so to describe the problem again more fully: I use suzy, a power user, to view reports built with a Pentaho Metadata model. If the report is just a plain query, suzy can see it on the BI server, but as soon as an ORDER BY or WHERE clause is added she cannot, while the admin super user still can. Reports that use a JDBC data source work fine either way.


Can anyone advise? Thanks!

Kettle Table input step running slow; any way to improve it?

Hello everyone;


I am using an MS SQL Server database and a somewhat complex query. The query takes around 9 seconds to execute on the local machine, using a database client (Navicat).


The Kettle Table input step takes far longer than 9 seconds to run the same query (40 minutes the last time), usually at approx. 10 rows/s. :eek:


At the moment Pentaho and the source database are on the same machine, so network communication should not be the problem.


If anyone has any idea on how to improve this performance, please reply.


If another client can run the same query in 9 seconds, why can't the Table input step? Maybe I have a conceptual misunderstanding of the Table input step :confused:.

No options have been changed in Kettle or the Table input step; everything is at its defaults.


Thanks in advance.

Expandable table locks if changing rows

Hi all,

I just noticed that expandable tables get locked, stuck on a loading animation, if we open a new row while the previously opened one is no longer visible. The steps are the following:

1. Click on a row of an expandable table; a new row with the details will appear below the clicked row.
2. Enter a search term whose results do not include the currently expanded row.
3. Click on one of the found rows. The table will get stuck.

Is this something that happens to anyone else?

Thanks and regards.