Channel: Pentaho Community Forums
Viewing all 16689 articles

How to Show Yesterday's Business date while opening Analyzer Report

Hi All,

I have developed many Analyzer reports, but by default each one displays data for the date on which the report was published. For example, if I published the report on 2015-01-01, it will show only that day's data unless I filter by date.

So how can I display yesterday's date by default in Pentaho 4.3?
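Whatever mechanism the report uses to apply a default filter, it will need yesterday's date as a string. A minimal sketch of computing that value (the format string is an assumption; adjust it to match the cube's date dimension):

```python
from datetime import date, timedelta

def yesterday_str(fmt: str = "%Y-%m-%d") -> str:
    """Return yesterday's date as a string, e.g. for use in a default date filter."""
    return (date.today() - timedelta(days=1)).strftime(fmt)

print(yesterday_str())
```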

slowly changing dimension identical records getting flagged as different

I'm running Pentaho 6.0 and seeing an odd situation. I populate the dimension and, immediately afterwards, run an update. It still flags most records as changed (see below). What could be the problem?

Comparing account_id Integer(9) and account_id Integer(9) (cmp=0) --> identical=true, insert=false, punch=false
Comparing status_id Integer(4) and status_id Integer(9) (cmp=0) --> identical=true, insert=false, punch=false
Comparing date_created Timestamp and date_created Timestamp (cmp=-1) --> identical=false, insert=true, punch=false
Comparing account_name String(60) and account_name String(60) (cmp=0) --> identical=false, insert=true, punch=false
Comparing company String(100) and company String(100) (cmp=0) --> identical=false, insert=true, punch=false

multiple charts and tables on the same dashboard panel

Hi, I need your help! Maybe it's too simple, but I'm not able to do it yet.
I've followed some tutorials to display tables and charts using KTR files as data sources and passing parameters, and now I need to do one more thing.
The idea is to show a different table and chart depending on the list item selected, like this:

tableSample.jpg

The table on the right panel listens to both combos and changes its data (working nicely). Now I need to show another table when the user clicks one of the list items on the left. I hope you understand my issue.

Maybe I'm not searching with the right keywords; if that's the case, please tell me!

Thanks in advance

Documentation for ETL Metadata Injection

Sorry if this reads a bit like "There's a hole in my Bucket".

I'm running PDI v6.0.

I am having trouble streaming data into and out of a transformation generated by the ETL Metadata Injection step.

So I went to the Help page, but it looks like it is out of date: "Streaming source step" and "Streaming target step" are missing, and I think "Template step to read from (optional)" might be a new name for "Source step to read from (optional)".

So I tried to post a comment on the help page, or maybe even update the page itself, but I couldn't log in.

So I tried to register as a user on the wiki, but I couldn't find anywhere to register.

So I thought I would post a query about that on the "Documentation" forum, but it's in the graveyard.

So I'm posting here.

Is there anywhere where the ETL metadata injection step -- and particularly the streaming fields -- is documented?

Hello, is there anyone here? I am Chinese, my English is not good, and I have a problem.

Here is the situation.


We need to use Kettle to obtain data, and then run the .kjb/.ktr files from Java code to acquire that data. We also need to log the data that is retrieved, but there is no need to save it to the database. So I would like to know how Kitchen and Pan actually run jobs and transformations under the hood.

I am not sure whether you can understand my problem, because I am writing with translation software, but I hope someone can help me and explain the principles or the process; it would be best if you could send me some reference material. Thank you!
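One common way to run a .kjb from other code is simply to invoke the Kitchen launcher as an external process. A minimal sketch of assembling that command line (the installation path and job file below are hypothetical; `-file` and `-level` are standard Kitchen options):

```python
import subprocess

def build_kitchen_cmd(kitchen_path: str, job_file: str, log_level: str = "Basic") -> list:
    """Assemble the command line Kitchen expects for running a .kjb job."""
    return [kitchen_path, "-file=" + job_file, "-level=" + log_level]

cmd = build_kitchen_cmd("/opt/pentaho/data-integration/kitchen.sh", "/jobs/load.kjb")
print(cmd)
# subprocess.run(cmd, check=True)  # uncomment on a machine where Kitchen is installed
```

Pan works the same way for .ktr transformations, with `pan.sh` in place of `kitchen.sh`.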

Yield or IRR formulas in Pentaho Spoon

Can I use the YIELD or IRR formulas with Pentaho Spoon 5.4?

How to handle skewed data?

I have a binary classification problem with 9,000 records in total: 1,200 records have class 'Yes' and the rest have class 'No'. I want to do feature selection. Do I need to use SMOTE for balancing? Should I apply SMOTE before or after feature selection? Could someone please elaborate on how to handle this type of data?
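For reference, the simplest balancing technique is random oversampling: duplicating minority-class rows until the classes match. SMOTE goes further by interpolating synthetic samples between minority neighbours, but the class-count arithmetic is the same. A minimal sketch (pure Python, random oversampling only, not SMOTE itself):

```python
import random

def random_oversample(rows, labels, minority="Yes", seed=42):
    """Duplicate minority-class rows until both classes have equal counts.
    (A simpler stand-in for SMOTE, which interpolates synthetic samples instead.)"""
    rng = random.Random(seed)
    minority_rows = [r for r, y in zip(rows, labels) if y == minority]
    majority_n = sum(1 for y in labels if y != minority)
    extra = [rng.choice(minority_rows) for _ in range(majority_n - len(minority_rows))]
    return rows + extra, labels + [minority] * len(extra)

# The class distribution from the question: 1,200 'Yes' vs 7,800 'No'.
X = [[i] for i in range(9000)]
y = ["Yes"] * 1200 + ["No"] * 7800
X2, y2 = random_oversample(X, y)
print(y2.count("Yes"), y2.count("No"))  # 7800 7800
```

Note that any resampling should normally be fitted on training data only, not on the full dataset, to avoid leaking information into evaluation.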

Lock wait timeout exceeded; try restarting transaction

Hi,

Sorry for raising the thread here; I posted the same information in a MySQL forum as well, but I may get better answers here if it is related to PDI, which is why I created this thread in this forum.

I am using the MySQL InnoDB engine, Java 1.7, pdi-ce-6.0, and Windows, and I am facing the error "Lock wait timeout exceeded; try restarting transaction" when trying to load 4,800,000 records from a source to a target database (both MySQL).

I get the error at exactly the 3,600,000-record mark. As a workaround I changed a setting on my database with the command: set innodb_lock_wait_timeout=100

Previously the value was 50 and I increased it to 100, but I suspect this is not a proper solution for long-term use.
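Rather than raising the lock timeout, a common mitigation is to commit in smaller batches so that no single transaction holds row locks across millions of inserts (this is the idea behind the commit size setting on PDI's output steps). A minimal sketch of the pattern, using SQLite from the standard library as a stand-in for MySQL (table and column names are illustrative):

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=10000):
    """Insert rows, committing after every batch so that no single
    transaction holds locks for millions of rows at once."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO target(v) VALUES (?)", rows[i:i + batch_size])
        conn.commit()  # release locks before starting the next batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(v INTEGER)")
load_in_batches(conn, [(i,) for i in range(25000)], batch_size=10000)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 25000
```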

Below is a screenshot; the error occurs at the Synchronize after merge step.
sync after merge.jpg


Thank you

Apache Hive Fetch Size

Does anyone know when there will be a fix that allows the fetch size of Hive queries to stream more than 50 records?

It makes it nearly impossible to use Hive as an input source when trying to run a query and stream data into a transformation. It does appear that the changes have been made in the JDBC driver, but I'm wondering how long it will take for them to be integrated into Data Integration (pardon the redundancy).

If anyone knows a workaround and has been successful streaming data from a Hive query, I'd love to know the secret. At 50 records per fetch, it's like watching paint dry on large result sets.

Appreciate it.

Any compatibility issues running PDI 6.0 transformations on 3.0.0 biserver-ce?

We're using fairly old community Pentaho software in a reporting tool: biserver-ce 3.0.0 and (probably similarly versioned) Kettle jobs and transformations. If we modify some of the Kettle transformations using Spoon 6.0 (CE), will there be any compatibility problems? If so, is the appropriate older version of Kettle (CE) available somewhere?

Thanks

How to append date to the file generated in pentaho report designer?

Hi All,

Does anyone know how to append the date and time to a report file exported in CSV format? For example, the report file name should look like "Report_Jan_2016", and the date and time should change dynamically. Please do the needful.
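The naming pattern being asked for is a simple timestamp format. A minimal sketch of producing the "Report_Jan_2016" style name (the base name is an assumption; in Report Designer itself this would be done with its formula/output-name facilities rather than Python):

```python
from datetime import datetime

def report_filename(when=None, base="Report"):
    """Build a file name like 'Report_Jan_2016' from a timestamp.
    %b gives the abbreviated month name, %Y the four-digit year."""
    when = when or datetime.now()
    return when.strftime(base + "_%b_%Y")

print(report_filename(datetime(2016, 1, 15)))  # Report_Jan_2016
```

Note that `%b` is locale-dependent; in a non-English locale the month abbreviation will differ.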

Regards,
Naveen

Exception while sharing the report to a jdbc role in Pentaho BI server 5.0 CE

Hi,

I am using Pentaho BI Server 5.0 Community Edition, and I recently implemented the manual LDAP/JDBC hybrid configuration by following the steps given in this link: https://help.pentaho.com/Documentati...P0/150/010/050.

Everything works fine: I am able to log in using my LDAP username and password, and I can see the JDBC roles on the administration page.
But the problem is with assigning/sharing a report to a role/user.

I logged into the BI server as an administrator and tried to assign/share a report with one of the JDBC users/roles. While doing this I get a popup saying "Sorry, You can't do that right now." and the following Java exception appears in the Catalina logs.


SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
org.pentaho.platform.api.repository2.unified.UnifiedRepositoryException: exception while updating ACL for file with id "3f216692-5f8e-44b9-934a-5beae9a96f2b"


Reference number: 66910080-5049-405e-bec7-7c6b1466b1ca
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.callLogThrow(ExceptionLoggingDecorator.java:476)
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.updateAcl(ExceptionLoggingDecorator.java:400)
at org.pentaho.platform.repository2.unified.webservices.DefaultUnifiedRepositoryWebService.updateAcl(DefaultUnifiedRepositoryWebService.java:266)
at org.pentaho.platform.web.http.api.resources.FileResource.setFileAcls(FileResource.java:628)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:111)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:116)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:161)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:83)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:59)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:600)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1703)
at java.lang.Thread.run(Unknown Source)


Kindly help if anyone is aware of this problem and knows how to fix it.

Thanks in advance.

cas security and access to pentaho resources

I have successfully configured Pentaho to authenticate users with CAS.
If I try to access the Pentaho home page, I get redirected to the CAS login.
If I try to access a Pentaho resource directly (a report, a dashboard), I do not get redirected to the CAS login.
This seems to me like a gap in functionality.

Why does the second case not work?
Is there any way to make it work (for example, a Spring Security filter)?

thanks for any answer

Error comparing fields - cannot find lookup field

Hi everybody,

I get the following exception when I run an ETL transformation for the second time:

Code:

2016/01/07 15:15:40 - Load dim table.0 - Comparing last_modified Date and last_modified Timestamp (cmp=1) --> identical=false, insert=true, punch=false
2016/01/07 15:15:40 - Load dim table.0 - Comparing description String and description String(5000) (cmp=0) --> identical=false, insert=true, punch=false
2016/01/07 15:15:40 - Load dim table.0 - Comparing creation_date Date and creation_date Timestamp (cmp=1) --> identical=false, insert=true, punch=false
2016/01/07 15:15:40 - Load dim table.0 - Comparing name String and name String(2000) (cmp=0) --> identical=false, insert=true, punch=false
2016/01/07 15:15:40 - Load dim table.0 - Comparing state String and state String(30) (cmp=0) --> identical=false, insert=true, punch=false
2016/01/07 15:15:40 - Load dim table.0 - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : Because of an error this step can't continue:
2016/01/07 15:15:40 - Load dim table.0 - Error comparing fields - cannot find lookup field [account_key]
2016/01/07 15:15:40 - Load dim table.0 - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2016/01/07 15:15:40 - Load dim table.0 - Error comparing fields - cannot find lookup field [account_key]
2016/01/07 15:15:40 - Load dim table.0 -
2016/01/07 15:15:40 - Load dim table.0 -        at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.lookupValues(DimensionLookup.java:645)
2016/01/07 15:15:40 - Load dim table.0 -        at org.pentaho.di.trans.steps.dimensionlookup.DimensionLookup.processRow(DimensionLookup.java:229)
2016/01/07 15:15:40 - Load dim table.0 -        at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/01/07 15:15:40 - Load dim table.0 -        at java.lang.Thread.run(Thread.java:745)
2016/01/07 15:15:40 - Load dim table.0 - Signaling 'output done' to 0 output rowsets.

I am using Kettle version 6.0.0.0-353, build 1. The only thing I have found on the web that resembles my problem is http://jira.pentaho.com/browse/PDI-8341, which was apparently fixed in version 5.0.0.

Any help is highly appreciated.

Thank you.

Issues publishing a cube from mondrian workbench schema to Pentaho BI Server

Hi, good day. I am having a problem publishing a cube schema from Mondrian Schema Workbench to Pentaho 6. It publishes, but when I try to use it in Pentaho as a data source, I am not able to do so. I cannot use it with Saiku either. It pretty much shows up as blank.

How to use Get Variables parameter values in the R Script Executor through PDI

Hi Sir,

I have installed the R plugin in my PDI 5.1 CE and it is working fine. I have written a transformation in which I get a few parameters through the Get Variables step and pass those parameter values into my R script step dynamically, but unfortunately I am unable to pass the parameters into the R script and I get the error mentioned below:

2016/01/07 14:25:27 - Write to log.0 -
2016/01/07 14:25:27 - Write to log.0 - ------------> Linenr 1------------------------------
2016/01/07 14:25:27 - Write to log.0 - log---
2016/01/07 14:25:27 - Write to log.0 -
2016/01/07 14:25:27 - Write to log.0 - input_path = D:\survey\study\urc\input
2016/01/07 14:25:27 - Write to log.0 - processed_path = D:\survey\study\urc\processed
2016/01/07 14:25:27 - Write to log.0 -
2016/01/07 14:25:27 - Write to log.0 - ====================
2016/01/07 14:25:27 - Write to log.0 - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
2016/01/07 14:25:27 - R script executor.0 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : Unexpected error
2016/01/07 14:25:27 - R script executor.0 - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : java.lang.ArrayIndexOutOfBoundsException: -1

2016/01/07 14:25:27 - R script executor.0 - at java.util.ArrayList.get(ArrayList.java:324)
2016/01/07 14:25:27 - R script executor.0 - at org.pentaho.di.trans.steps.rscriptexecutor.RScriptExecutorData.pruneNullRowsFromSample(RScriptExecutorData.java:399)
2016/01/07 14:25:27 - R script executor.0 - at org.pentaho.di.trans.steps.rscriptexecutor.RScriptExecutor.processBatch(RScriptExecutor.java:224)
2016/01/07 14:25:27 - R script executor.0 - at org.pentaho.di.trans.steps.rscriptexecutor.RScriptExecutor.processRow(RScriptExecutor.java:184)
2016/01/07 14:25:27 - R script executor.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/01/07 14:25:27 - R script executor.0 - at java.lang.Thread.run(Thread.java:662)
2016/01/07 14:25:27 - R script executor.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)

I have also included the R script where I use those parameter values:

library(foreign)
library(surveydata)

#####getting variable script start here#####

input_path<-as.character(input_path[1,1])

processed_path<-as.character(processed_path[1,2])

#####getting variable script end here#####

file <- dir(input_path, pattern =".sav")
file_path<- paste(input_path,file[1],sep='/')
file.copy(file_path,processed_path)
file.remove(file_path)
found1<-as.data.frame(input_path)
found1

Let me know where I am going wrong.

Start position of a string

I need to find the start position of a string within a field.

I would like to receive an example.
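In PDI this is typically done in a Modified Java Script Value step with `indexOf`, or with a User Defined Java Expression. The semantics are sketched below in Python (field value and search string are made up for illustration):

```python
def start_position(field_value: str, needle: str) -> int:
    """Return the 0-based start position of needle within field_value,
    or -1 if it is not present (the same semantics as JavaScript's indexOf)."""
    return field_value.find(needle)

print(start_position("customer_name: ACME", "ACME"))  # 15
print(start_position("customer_name: ACME", "xyz"))   # -1
```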

Thanks.

Osmar

401 Unauthorized from REST step, works in CURL

This works in a shell (PowerShell):
$Cred = Get-Credential
$Url = 'http://intranet/some/api'
$Response = Invoke-WebRequest -Uri $Url -Credential $Cred

In Kettle I copy/paste the URL into the REST step,
add a Generate Rows step to produce one row,
and on the Authentication tab add the user and password.

Result: 401 Unauthorized

What am I missing? I tried to search the forums, but the search does not allow searching for "401".
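One thing worth checking: some clients only send Basic credentials after receiving a 401 challenge, while some servers expect them preemptively on the first request (and PowerShell may also be negotiating Windows/NTLM authentication, which Basic auth cannot replicate). As an illustration of the preemptive case, the Authorization header can be built by hand; a minimal stdlib sketch (credentials are placeholders):

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Build the value of a preemptive HTTP Basic 'Authorization' header:
    'Basic ' followed by base64(user:password)."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return "Basic " + token

print(basic_auth_header("user", "pass"))  # Basic dXNlcjpwYXNz
```

In the REST step this header could be supplied via the Headers tab instead of the Authentication tab, to force it onto the first request.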

How to read unicode with the JSON Input?

I am using the JSON Input step to read a JSON file. The file is Unicode: it starts with the 0xFF 0xFE byte order mark (UTF-16 LE). I am getting a bad-character error, which makes me think this step can't read that encoding; the byte order mark makes it stumble.

Are there any workarounds?
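One workaround is to convert the file to BOM-free UTF-8 before feeding it to the step. A minimal sketch of detecting the BOM and decoding accordingly (the function name is illustrative):

```python
import codecs
import json

def load_json_any_bom(path: str):
    """Read a JSON file that may start with a UTF-16 BOM (0xFF 0xFE or
    0xFE 0xFF) or a UTF-8 BOM, and parse it."""
    with open(path, "rb") as f:
        raw = f.read()
    if raw.startswith(codecs.BOM_UTF16_LE) or raw.startswith(codecs.BOM_UTF16_BE):
        text = raw.decode("utf-16")       # the utf-16 codec consumes the BOM
    else:
        text = raw.decode("utf-8-sig")    # strips a UTF-8 BOM if present
    return json.loads(text)
```

Writing `text` back out with plain UTF-8 encoding then yields a file the JSON Input step should accept.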

And how can I search the forums for two search terms in AND mode rather than OR mode?

Thanks
Martin

Multiple database table joins

Is it possible to join data from tables in different databases in a single report?
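For reference, when both tables live in the same server a single SQL query with a JOIN is enough; across separate databases some engines can still join in SQL (e.g. SQLite's ATTACH, MySQL's schema-qualified names), otherwise the streams are merged in the reporting/ETL layer. A minimal sketch of the cross-database SQL case, using SQLite's ATTACH (table names and data are made up):

```python
import sqlite3

# Two separate databases, each with one table (in-memory stand-ins for files).
main = sqlite3.connect(":memory:")
main.execute("CREATE TABLE orders(id INTEGER, customer_id INTEGER)")
main.execute("INSERT INTO orders VALUES (1, 10), (2, 20)")

# Attach a second database under the alias 'crm' and join across the two.
main.execute("ATTACH DATABASE ':memory:' AS crm")
main.execute("CREATE TABLE crm.customers(id INTEGER, name TEXT)")
main.execute("INSERT INTO crm.customers VALUES (10, 'ACME'), (20, 'Globex')")

rows = main.execute(
    "SELECT o.id, c.name FROM orders o "
    "JOIN crm.customers c ON o.customer_id = c.id ORDER BY o.id"
).fetchall()
print(rows)  # [(1, 'ACME'), (2, 'Globex')]
```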