Channel: Pentaho Community Forums

Mobile Pentaho BI

Hi,
How do I set up the configuration to view Pentaho reports on a mobile device?

What steps need to be done to enable the mobile view?


Thanks in advance

Error after installing the Sparkl plugin

I have installed the Sparkl plugin and restarted the server.
When I try to access Tools -> Sparkl I get an error.

In the log file pentaho.log:

2014-01-09 11:50:25,780 ERROR [pt.webdetails.cpk.CpkEngine] Error initializing element type pt.webdetails.cpk.elements.impl.DashboardElement: [ java.lang.IllegalArgumentException ] - wrong number of arguments
2014-01-09 11:50:25,791 ERROR [pt.webdetails.cpk.CpkEngine] Error initializing element type pt.webdetails.cpk.elements.impl.KettleJobElement: [ java.lang.ClassNotFoundException ] - pt.webdetails.cpk.elements.impl.KettleJobElement
2014-01-09 11:50:25,791 ERROR [pt.webdetails.cpk.CpkEngine] Error initializing element type pt.webdetails.cpk.elements.impl.KettleTransformationElement: [ java.lang.ClassNotFoundException ] - pt.webdetails.cpk.elements.impl.KettleTransformationElement

In catalina.log:

SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
pt.webdetails.cpk.NoElementException: Unable to get element!
at pt.webdetails.cpk.CpkCoreService.createContent(CpkCoreService.java:92)
at pt.webdetails.cpk.CpkApi.callEndpoint(CpkApi.java:391)
at pt.webdetails.cpk.CpkApi.genericEndpointGet(CpkApi.java:116)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$VoidOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:167)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:62)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:67)
at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:161)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:83)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:59)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:879)
at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:600)
at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1703)
at java.lang.Thread.run(Thread.java:722)



Thanks for any help.

BTable problems!!!

I'm really excited about this new plugin, but I get an error trying to make it work in BI Server 5.0.1 CE.

FIRST:
In the PUC, when I click on the tab "Tools" --> "BTable" this appears:
"Could not load dashboard: Unknown plugin"

SECOND:
In the PUC, when I click on the button "Create New" --> "BTable Analyzer" and select my catalog, this red warning appears:
"A BTable datasource for the selected catalog does not exists!"
(even though the "Cube", "Measure" and "Dimension" are correctly configured)

Are the first and second errors related to each other? Do you have any idea?

Thank you all

Dashboard design

Hi ,

I am trying to create my own template for Dashboard Designer. Currently all the templates are box-based. I would like to create a template that has different tabs or pages instead of showing all the components on the same page.

I tried modifying the XUL file to add tabs, but it is not working. Has anybody used tabs in a Dashboard Designer template?

Thanks
fabeena

Problem saving to the repository from a workstation other than the database server (5.0.1 CE)

Hi everyone!

I'm working with PDI 5.0.1 CE. Repository type: database (PostgreSQL).
When working on the server everything works fine. When working from my laptop I can do everything except save into the repository.
On save I get this error:
java.lang.reflect.InvocationTargetException: Error saving transformation: org.pentaho.di.core.exception.KettleException:
Unable to save Job in repository, database rollback performed.

Error inserting/updating row
BŁĄD: wartość zbyt długa dla typu znakowego (1)


at org.pentaho.di.ui.spoon.dialog.SaveProgressDialog$1.run(SaveProgressDialog.java:82)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:113)
Caused by: org.pentaho.di.core.exception.KettleException:
Unable to save Job in repository, database rollback performed.

Error inserting/updating row
BŁĄD: wartość zbyt długa dla typu znakowego (1)


at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryJobDelegate.saveJob(KettleDatabaseRepositoryJobDelegate.java:215)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.save(KettleDatabaseRepository.java:381)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.save(KettleDatabaseRepository.java:366)
at org.pentaho.di.repository.AbstractRepository.save(AbstractRepository.java:113)
at org.pentaho.di.ui.spoon.dialog.SaveProgressDialog$1.run(SaveProgressDialog.java:78)
... 1 more
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error inserting/updating row
BŁĄD: wartość zbyt długa dla typu znakowego (1)

at org.pentaho.di.core.database.Database.insertRow(Database.java:1193)
at org.pentaho.di.core.database.Database.insertRow(Database.java:1114)
at org.pentaho.di.core.database.Database.insertRow(Database.java:1098)
at org.pentaho.di.core.database.Database.insertRow(Database.java:1086)
at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryJobDelegate.insertJob(KettleDatabaseRepositoryJobDelegate.java:809)
at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryJobDelegate.saveJob(KettleDatabaseRepositoryJobDelegate.java:149)
... 5 more
Caused by: org.postgresql.util.PSQLException: BŁĄD: wartość zbyt długa dla typu znakowego (1)
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2161)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1890)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:560)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:417)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:363)
at org.pentaho.di.core.database.Database.insertRow(Database.java:1153)
... 10 more

Can anyone help? (The Polish line "BŁĄD: wartość zbyt długa dla typu znakowego (1)" is PostgreSQL's "ERROR: value too long for type character(1)".)

Report showing pictures from addresses stored in the database

Hello,
We work with Pentaho 4.8 and Pentaho Report Designer (PRD) 3.9.
In an Oracle DB some picture addresses (pointing to files on the same server as the BI server) have been inserted.
I am trying to use these addresses to display the pictures in a report.
I put an image object in the details row and use a variable with the address as its parameter.
In PRD, and on the server after publishing, we see only the first picture.
Do you know why?

Do you have any other idea for displaying different pictures from their addresses?

Thank you for the help.


mathieu

Error after upgrading CTools on 5.0.1 CE

As the title says, after launching the script:
./ctools-installer.sh -s /home/pentaho/server/biserver-ce/pentaho-solutions -w /home/pentaho/server/biserver-ce/tomcat/webapps/pentaho -y

and restarting the server, the Marketplace shows everything is up to date, but now none of the dashboards I created work anymore. When I open them it says:
"Sorry we really did try. please try again etc.."

and I'm not able to create a new CDE dashboard, as the button label shows "${Launcher.NEW_CDE}".

Can anyone help me, please?

How do I download the older version of Pentaho EE 4.3.0


PostgreSQL Bulk Loader - don't fail on bad input

Is there a way for the PostgreSQL Bulk Loader to continue doing inserts even if it encounters bad input? That is, it should continue inserting the rows that are valid, and if it encounters an invalid one, drop it and keep processing. When bulk loading large amounts of data, I would prefer it not to die immediately just because of one bad data point. Unchecking "Stop on error" doesn't seem to help: it continues, but I don't see any inserts occurring.

Fixed flat file with records split across 2 rows

I'm very new to Pentaho... I need to read a fixed-width flat file with the following format:

recA ----- fixed column data here, columns in format A -----
recB ----- fixed column data here, columns in format B -----
recA ----- fixed column data here, columns in format A -----
recB ----- fixed column data here, columns in format B -----
recA ----- fixed column data here, columns in format A -----
recB ----- fixed column data here, columns in format B -----
.
.
.

I am trying to figure out a way to read the file and merge each recA/recB pair into one row, and map the fields so that I can then select the fields I want to write to a database table,

so I wind up with:
recA ----- fixed column data here, columns in format A ----- recB ----- fixed column data here, columns in format B -----
recA ----- fixed column data here, columns in format A ----- recB ----- fixed column data here, columns in format B -----
recA ----- fixed column data here, columns in format A ----- recB ----- fixed column data here, columns in format B -----

Then I can write to a database table.
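
In case it helps to show the merge I am after, here is a rough sketch of doing it outside PDI (just an idea of mine, assuming the recA/recB lines always strictly alternate; input.txt and merged.txt are placeholder names):

Code:

# join each recA line with the recB line that follows it into a single output row
awk 'NR % 2 == 1 { a = $0; next } { print a $0 }' input.txt > merged.txt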

Any suggestions?

Cannot create JDBC datasource in Pentaho BI CE 5

Hi there,

I am trying to create a MySQL JDBC datasource in Pentaho 5 CE. After filling in all the info and pressing the Test button, it takes about two minutes and then a Bad Gateway message appears. Pentaho and the MySQL database are on the same machine. I cannot find the logs for this operation. Any help appreciated.


Regards,

Rafael

PS: I also had Pentaho 4.8 installed previously, and datasources were created without problems.

Designers website chat room, any recommendation?

I am a designer and would like to add a chat room with the ability to style it so it matches my current site design. Recently I added a chat room (text chat) that lets me add YouTube videos and images. This made users stay longer.


The issue is that I cannot integrate my user base (around 1,500 users) with the chat room, so everyone needs to log in again.


I checked two good services, RumbleTalk chat and C-Box chat, but the lack of third-party user integration is a problem there. Any recommendations?

Creating buckets by comparing Dimension member against Measure using MDX

I have a Product dimension with the hierarchies product_id, initial_price and product_name, and the measures units_sold, cost_per_unit and revenue_per_unit. For each product_id I need to compare initial_price against revenue_per_unit and then assign the product to a bucket based on the comparison.


Consider the following example:
Input:
product_id initial_price revenue_per_unit
1 10 12
2 20 18
3 30 30


Output:
product_id bucket_name
1 Profit
2 Loss
3 Same


How can I achieve this using MDX?
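
For what it's worth, this rough sketch is the direction I was thinking in, but I am not sure it is right (the cube name [Sales] is a placeholder, I am assuming product_id and initial_price are separate hierarchies as listed above, and I am reading the price from the member name with Val()):

Code:

WITH
  MEMBER [Measures].[initial_price_num] AS
    Val([Product].[initial_price].CurrentMember.Name)
  MEMBER [Measures].[bucket_name] AS
    IIF([Measures].[revenue_per_unit] > [Measures].[initial_price_num], "Profit",
      IIF([Measures].[revenue_per_unit] < [Measures].[initial_price_num], "Loss", "Same"))
SELECT
  {[Measures].[bucket_name]} ON COLUMNS,
  NonEmptyCrossJoin([Product].[product_id].Members, [Product].[initial_price].Members) ON ROWS
FROM [Sales]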

KettleDatabaseException - Invalid URL when trying to connect to Sybase

Hi all

I've been googling for around five hours now and I can't figure out why neither of the two DB connections I created to Sybase ASE 15 is working.


1. Using jconn3.jar (from Sybase jConnect), which I put in \biserver-ce\tomcat\lib (Windows 7 environment):

URL: jdbc:sybase:Tds:host:port (I also tried jdbc:sybase:Tds:host:port/dbname)
Class Name: com.sybase.jdbc3.jdbc.SybDriver
Username: sa
Password: password

Error Message:
Error connecting to DB using class com.sybase.jdbc3.jdbc.SybDriver
Invalid URL: jdbc:sybase:Tds:host:port

I can telnet to the host and port and it works, so it can't be a network problem.


2. Second try, with jtds-1.3.1.jar:

URL: jdbc:jtds:sybase://host:port/dbname
Class name: net.sourceforge.jtds.jdbc.Driver
Username: sa
Password: password

Error Message, the same as above:
Error connecting to DB using class net.sourceforge.jtds.jdbc.Driver
Invalid URL: jdbc:jtds:sybase://host:port/dbname


Another DB connection, to a MySQL DB, works fine.

I would appreciate any kind of help provided on this topic!

make "batch inserts in table output" generate single multi-rows insert statement

Hello All,

The option "Use batch update for inserts" in table output step seem to generate a group of insert statements, as opposed to "single multi-row insert statements", as stated in wiki
"Enable if you want to use batch inserts. This feature groups inserts statements to limit round trips to the database.".

But this feature seems to be a performance bottleneck on columnar DB platforms like Amazon Redshift.
So I would like to explore whether there is anything that could make the Table Output step generate a single multi-row insert statement.
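
To make clear what I mean, here is a rough sketch (the table and column names are made up for the example):

Code:

-- a single multi-row insert: one statement carrying many rows (what I am after)
INSERT INTO sales_fact (id, amount) VALUES (1, 10.0), (2, 20.0), (3, 30.0);

-- versus single-row inserts sent as a JDBC batch (what the step seems to do today)
INSERT INTO sales_fact (id, amount) VALUES (1, 10.0);
INSERT INTO sales_fact (id, amount) VALUES (2, 20.0);
INSERT INTO sales_fact (id, amount) VALUES (3, 30.0);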

Please share your thoughts, or let me know if I am missing something.

Thanks,
Sujen

Problem with the character (á) in CSV file names

Hi,

I have some CSV files with names like ExámenMedico1012011_30062011.csv and ExámenMedico1012013_30612011.csv, and I need to load them into a database.

For processing I have a 64-bit Linux server and PDI 4.2.1.

The problem is that PDI doesn't recognize the files for processing on Linux, although on Windows it works.
The problem is the file name (Exámen), specifically the character (á). I tried replacing (á) with (a) and that works, but the actual file names contain á.
I tried a regular expression like *.csv, or something similar, but it doesn't work.

A solution could be to rename the files on Linux, but I don't know how. I tried the command:
iconv -f latin1 -t utf8 Exámen01012011_30062011.csv > Examen01012011_30062011.csv

but the problem is that I need to rename all the files called Exámen... while keeping essentially the same name,
and the command only works if I give the exact name; it also converts the contents of the file, while I only need to change the name.

I just need PDI to recognize the files for processing on Linux.
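
In case it makes the goal clearer, this is roughly the kind of renaming I have in mind (only a sketch on my side, assuming the names on disk are latin1-encoded while the system expects UTF-8; the glob Ex*.csv is just an example):

Code:

# re-encode only the file NAMES (not the contents) from latin1 to UTF-8,
# so the visible name keeps its á but PDI can match it
for f in Ex*.csv; do
  mv "$f" "$(printf '%s' "$f" | iconv -f latin1 -t utf8)"
done

# the convmv tool, if available, does the same in one go (dry run first, then --notest to apply)
convmv -f latin1 -t utf8 Ex*.csv
convmv -f latin1 -t utf8 --notest Ex*.csv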

Thanks for your help.

Cannot start PDI 4.0 or 4.4 on a Windows XP 32-bit machine

Hi All,

I've been using PDI/Kettle intermittently for about 5 years now, and it's still awesome! However, I'm struggling to get PDI running on my new machine. I've set the JAVA_HOME environment variable to my Java directory. I've tried Java 1.6.0_04 and 1.7.0_45, and I've also tried PDI 4.0 and 4.4.

The error message I'm getting when I try starting with the java command (instead of 'start javaw') is as follows:

Code:

C:\Documents and Settings\me\Desktop\pdi>spoon.bat
DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jre1.6.0_04\bin
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jre1.6.0_04\bin\bin\javaw
 
C:\Documents and Settings\me\Desktop\pdi>java "-Xmx512m" "-XX:MaxPermSize=256
m" "-Djava.library.path=libswt\win32" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-
DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SI
ZE_LIMIT=" -jar launcher\launcher.jar -lib ..\libswt\win32
java.lang.UnsatisfiedLinkError: org.eclipse.swt.internal.C.PTR_sizeof()I
        at org.eclipse.swt.internal.C.PTR_sizeof(Native Method)
        at org.eclipse.swt.internal.C.<clinit>(Unknown Source)
        at org.eclipse.swt.widgets.Display.<clinit>(Unknown Source)
        at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:540)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)

Has anyone got any ideas on how I might solve this?

I appreciate your help!!
Rowan.

Simple Hadoop Copy Files + Pentaho Trial Pack 5.0.2 + Amazon AWS (Elastic MapReduce)

Hello all,

We have Amazon Web Services with Amazon EMR up and running.

To build a prototype, I am just trying to use the Hadoop Copy Files job entry to copy one file from local disk to HDFS.

So I added a Hadoop Copy Files entry and specified the required input and output info, but unfortunately the job fails during execution.

Below is the full stack trace.
Quote:

2014/01/10 10:05:48 - Hadoop Copy Files - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : Couldn't created parent folder hdfs://machine.compute-1.amazonaws.com:9000/mnt
2014/01/10 10:05:48 - Hadoop Copy Files - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : org.apache.commons.vfs.FileSystemException: Could not create folder "hdfs://machine.compute-1.amazonaws.com:9000/".
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.commons.vfs.provider.AbstractFileObject.createFolder(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.commons.vfs.provider.AbstractFileObject.createFolder(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.CreateDestinationFolder(JobEntryCopyFiles.java:667)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:386)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:326)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:678)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:815)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:500)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.di.job.Job.run(Job.java:407)
2014/01/10 10:05:48 - Hadoop Copy Files - Caused by: java.io.IOException: Failed on local exception: java.io.IOException: An established connection was aborted by the software in your host machine; Host Details : local host is: "CHEJX05CQ1/2.0.0.2"; destination host is: "machine.compute-1.amazonaws.com":9000;
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.Client.call(Client.java:1351)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.Client.call(Client.java:1300)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
2014/01/10 10:05:48 - Hadoop Copy Files - at com.sun.proxy.$Proxy100.mkdirs(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.lang.reflect.Method.invoke(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
2014/01/10 10:05:48 - Hadoop Copy Files - at com.sun.proxy.$Proxy100.mkdirs(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:467)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2394)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2365)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:817)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:813)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:813)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:806)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1933)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.pentaho.hdfs.vfs.HDFSFileObject.doCreateFolder(HDFSFileObject.java:81)
2014/01/10 10:05:48 - Hadoop Copy Files - ... 9 more
2014/01/10 10:05:48 - Hadoop Copy Files - Caused by: java.io.IOException: An established connection was aborted by the software in your host machine
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.nio.ch.SocketDispatcher.read0(Native Method)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.nio.ch.SocketDispatcher.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.nio.ch.IOUtil.readIntoNativeBuffer(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.nio.ch.IOUtil.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at sun.nio.ch.SocketChannelImpl.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.io.FilterInputStream.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.io.FilterInputStream.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:457)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.io.BufferedInputStream.fill(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.io.BufferedInputStream.read(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at java.io.DataInputStream.readInt(Unknown Source)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:995)
2014/01/10 10:05:48 - Hadoop Copy Files - at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
2014/01/10 10:05:48 - Hadoop Copy Files - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : Destination folder does not exist!
2014/01/10 10:05:48 - TestLoadEMR - Finished job entry [Hadoop Copy Files] (result=[false])
Any suggestions to make this work?

Thanks in advance.

Regards,
VAP.

Error in pentaho.log: Invalid connection properties

Hi, can someone please help me?
Here is the output from my pentaho.log:
ERROR [org.pentaho.platform.util.logging.Logger] misc-java.lang.String: MDXConnection.ERROR_0002 - Invalid connection properties: PoolNeeded=false; Locale=de_DE; Catalog=solution:/dc/Cube.xml; Provider=mondrian
mondrian.olap.MondrianException: Mondrian Error:Error while loading/reloading aggregates.
at mondrian.resource.MondrianResource$_Def3.ex(MondrianResource.java:1020)
at mondrian.rolap.aggmatcher.AggTableManager.initialize(AggTableManager.java:95)
at mondrian.rolap.RolapSchema.load(RolapSchema.java:430)
at mondrian.rolap.RolapSchema.<init>(RolapSchema.java:216)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:214)
at mondrian.rolap.RolapSchemaPool.get(RolapSchemaPool.java:80)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:167)
at mondrian.rolap.RolapConnection.<init>(RolapConnection.java:90)
at mondrian.olap.DriverManager.getConnection(DriverManager.java:112)
at org.pentaho.platform.plugin.services.connections.mondrian.MDXConnection.init(MDXConnection.java:241)
at org.pentaho.platform.plugin.services.connections.mondrian.MDXConnection.init(MDXConnection.java:147)
at org.pentaho.platform.plugin.services.connections.mondrian.MDXConnection.setProperties(MDXConnection.java:107)
at org.pentaho.platform.engine.services.connection.PentahoConnectionFactory.getConnection(PentahoConnectionFactory.java:129)
at com.pentaho.analyzer.service.impl.OlapConnectionManagerImpl.createConnection(SourceFile:102)
at com.pentaho.analyzer.service.impl.c.getConnection(SourceFile:31)
at com.pentaho.analyzer.service.impl.OlapMetaDataManager.getConnection(SourceFile:49)
....
The complete log is attached to this post.
How can I fix this error? What does "PoolNeeded=false" mean? Mondrian should use a connection pool!
Many thanks in advance
Ayhan

Report Designer - Bar chart - Legend shows only "Series 1" and "Series 2" instead of the names

Hi all

I created a bar chart in Pentaho Report Designer with a data source that counts two different values (open and closed tickets) per month over the last 12 months (see details below).

In the bar chart properties I have selected:
  • category-column: month
  • value columns: open tickets, closed tickets


When running the report, the legend only shows me "Series 1" and "Series 2".

Does anybody know why it shows that, and not "open" and "close" as in the column headers of the SQL?

Thanks a lot for any kind of tips.

Best regards
Michael

SELECT
MONTHNAME(ticket.create_time) as 'MONTH_'
, count(t1.id) as 'open'
, (SELECT count(t2.id) FROM ticket t2
INNER JOIN ticket_history th ON th.ticket_id = t2.id
INNER JOIN ticket_history_type tht ON th.history_type_id = tht.id
WHERE tht.name = 'StateUpdate'
AND th.name LIKE '\%\%%\%\%closed%'
AND MONTHNAME(th.create_time) = MONTHNAME(ticket.create_time)
AND t2.queue_id NOT IN (3,4) /*nicht in Junk und System Message*/
AND th.create_time >= th.create_time - INTERVAL 12 MONTH
) as 'close'
FROM ticket
JOIN ticket t1 ON ticket.id = t1.id
WHERE
ticket.queue_id NOT IN (3,4) /*nicht in Junk und System Message*/
AND ticket.create_time >= ticket.create_time - INTERVAL 12 MONTH
GROUP BY MONTH_
ORDER BY ticket.create_time DESC