Channel: Pentaho Community Forums
Viewing all 16689 articles

Export CDE dashboard to PDF format

Hi all,
I suppose that anyone who has developed a CDE dashboard for a customer has had the request "...and now I would like to export these beautiful dashboards to PDF".
:confused:PANIC!

I have tried the phantomjs/rasterize solution described here: http://joshid.github.io/blog/2014/10...ho-pdf-export/
applying the export functionality to the sample dashboard in "public>Steel Wheels>Dashboards>CTools_dashboard.wcdf"
The mechanism works, but the result is a PDF (see attachment) without graphics, only a few labels :(
Any ideas?
Has someone implemented a working solution?
Since this is a common issue, I hope for the help of all "CDE dashboard lovers"...
Thanks

(note: for the moment I have not tried the solution that involves the use of a CGG/PRPT report)
Attached Files

Images not displayed when embedded in Java application

Hi,
I have created some bar charts and pie charts and integrated them into my web application. When the report is accessed from the web application, the charts are not visible.
I'm getting a 404 error for http://<localhost>/pentaho/getImage?image=name.png

I'm using output-target=table/html;page-mode=stream&dashboard-mode=true

Please help.

Thanks,
Padma Priya.

PDI - Text File Input - RegEx to find most current file in folder

I'm working with a Text File Input step, pulling in a CSV file into my stream. The folder in which the CSV file exists is updated weekly with a new file. I would like the Text File Input step to use the most up to date file each time the transformation is run.
I've tried looking for a regular expression to handle this, but have not found anything directly related on the Pentaho Forums.
I'm also not sure if this is even possible; I have seen other threads discussing a file-names step in a job first, before being able to find the most current file.

A Regular Expression would be the ideal solution if that is possible.
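For what it's worth, here is a sketch of one common arrangement (the file naming and field name below are assumptions for illustration): a regular expression in the Text File Input step can only describe which names match, and the step then reads every matching file, so "most recent" has to come from file metadata instead.

```
1. Get File Names step, with a Wildcard (RegEx) such as:  sales_.*\.csv
2. Sort rows step on the lastmodifiedtime field, descending
3. Sample rows step, keeping only line 1 (the newest file)
4. Text File Input step with "Accept filenames from previous step" checked,
   reading the filename field passed down the stream
```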

Thanks!

PDI Json input Error depending on Tomcat Run Mode

I have a report based on a transformation that includes a JSON Input step. The report and this step seem to work correctly if the BI CE server has been started via start-pentaho.bat (I'm on Windows), but they fail if the BI server runs as a Windows service JVM.
The Tomcat pentaho/WEB-INF/lib folders are identical, so I don't think it's that.
I have checked and there is valid input JSON, as I can create a report which simply outputs the raw JSON successfully.
The JSON step seems to fail silently by simply not parsing the required field.
Any ideas about the potential cause of this?
Thanks

MS Excel Writer 'file closed' ERROR - only with XLSX, still works

I have an error with the Microsoft Excel Writer step. I'm using Java version 1.8.0 and Spoon 5.4.0.1-130 (later versions take forever to load; I may have to test).

When writing to an .xls file, there is no error.

Selecting .xlsx, there's an error that says "File Closed."

Here's the thing -- it still creates/appends to the proper file and writes all the fields correctly. So everything went right, but it still throws this error: a non-error "file closed" error. So I guess it's no problem, but it can be trouble for real error-logging steps, etc., if the step were ever to actually fail.

Any way to fix this? I may try to download the latest PDI and troubleshoot the long load times...

Excel uploads - any way to write upload success per row?

Hey there -- I'm uploading an Excel file to SQL Server. Or rather, others will be uploading these docs (which a kitchen.bat job will be processing automatically).

The Excel docs already have decent validation.
Ultimate validation is done in PDI, though (like checking a list of names in the spreadsheet against a dynamic list in the database).


The thing is -- I figured out a way where, after the Excel is uploaded, a "success" or "failure" entry is written on each row, depending on whether it uploaded successfully or not. However, this is done only by validation checks of the data, and not by any kind of actual results from the SQL statement.

It looks like the "Execute SQL Statement" step will tell you the number of rows written, but there's no possibility of row-by-row success/failure, probably because it can ultimately be executed as one statement, and a myriad other reasons.


Hmm ... maybe I should scrap this idea and just convert the job to an "all-or-nothing" upload, depending on whether the doc is completely clean or not.

BI server 6.1 remote access view report as pdf issue

Hi

I have installed the community Pentaho application suite on my Ubuntu 15.04 Linux server necvmsc01. I have version 6.1 of the BI server, PDI and PRD, as shown below:

mule@necvmsc01:/opt/pentaho$ ls
biserver-ce data-integration eclipse-inst-linux64.tar report-designer
biserver-ce-6.1.0.1-196.zip datasource pdi-ce-6.1.0.1-196.zip reports
ctools-installer-master eclipse prd-ce-6.1.0.1-196.zip repository
ctools-installer-master.zip eclipse-installer README.TXT saiku.tar


I have created simple reports in Report Designer (without parameters) and published them to the BI server without error. My BI server repository is on PostgreSQL.
When I view my reports on the local server via

localhost:8080/pentaho

I can view the reports on the local server necvmsc01 using the Firefox browser, and I can also view the reports in PDF mode. If I try to access the BI server remotely
using an IP address of the form

192.168.xx.yy:8080/pentaho

I can view the reports in HTML mode, but when I try to view them in PDF mode the report is blank. The Pentaho Tomcat logs don't seem to contain any errors; only the
access log is updated. The details are:

Failed access for PDF report display, using my remote laptop, IP 192.168.xx.yy, and Internet Explorer:

/opt/pentaho/biserver-ce/tomcat/logs/localhost_access_log.2016-10-21.txt

192.168.xx.zz - - [21/Oct/2016:11:06:00 +1300] "POST /pentaho/api/repos/%3Ahome%3Areports%3AAPC_ASB_ALL_DAILY_SUM_IN.prpt/parameter HTTP/1.1" 200 28257
192.168.xx.zz - - [21/Oct/2016:11:06:00 +1300] "POST /pentaho/api/repos/%3Ahome%3Areports%3AAPC_ASB_ALL_DAILY_SUM_IN.prpt/report?ts=1477001139795&output-target=pageable%2Fpdf&accepted-page=-1&showParameters=true&renderMode=REPORT&htmlProportionalWidth=false HTTP/1.1" 200 5814


Succeeded access: local mule server, Firefox and localhost used.

127.0.0.1 - - [21/Oct/2016:11:01:09 +1300] "POST /pentaho/api/repos/%3Ahome%3Areports%3AAPC_ASB_ALL_DAILY_SUM_IN.prpt/parameter HTTP/1.1" 200 28257
127.0.0.1 - - [21/Oct/2016:11:01:09 +1300] "POST /pentaho/api/repos/%3Ahome%3Areports%3AAPC_ASB_ALL_DAILY_SUM_IN.prpt/report?ts=1476994854449&output-target=pageable%2Fpdf&accepted-page=-1&showParameters=true&renderMode=REPORT&htmlProportionalWidth=false HTTP/1.1" 200 5812



The output looks the same; the only difference seems to be the way that Pentaho was accessed, local versus remote. Does remote access need to be configured in some way for Pentaho?

Thanks in advance for any help offered

Mike Frampton

Pentaho Data Integration running on macOS Sierra?

I've recently updated to macOS Sierra and the latest PDI, version 6.1.0.1.

Now the Data Integration.app does not open anymore. Running spoon.sh gives the error:

Code:

"Exception in thread "Thread-1" java.lang.UnsupportedClassVersionError: org/pentaho/commons/launcher/Launcher : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)"


So I guess this is a problem with the Java version. Since new macOS versions no longer come with Java preinstalled, I downloaded and successfully installed the latest Java Version 8 Update 11 (as can be seen in the info of the Java preference pane). However, this did not work either, so I assumed I had to install the legacy Java SE 6 version (https://support.apple.com/kb/DL1572?locale=en_US), although macOS Sierra is no longer explicitly mentioned as compatible with this old version. The installation of Java SE 6 did not raise any problem.

Nevertheless PDI is still not working.

Actually, during a Google search I found no hint of any particular problems between macOS Sierra and PDI. So I guess this is not a general problem, but one specific to my configuration?

Is there anyone who can confirm that PDI 6.1 runs on Sierra? If so, with which Java version?

I should probably also mention that java -version in the Terminal says:

Code:

java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-468-11M4833)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-468, mixed mode)


Timeseries forecast failing in junit test

I'm running code similar to the example code, and found that the exact same code runs OK from the main thread but fails in a JUnit test.
Environment: Java 8, Weka 3.9, JUnit 4.12


stacktrace:


java.lang.AssertionError
at weka.core.expressionlanguage.weka.InstancesHelper.setInstance(InstancesHelper.java:159)
at weka.filters.unsupervised.attribute.AddExpression.input(AddExpression.java:409)
at weka.filters.supervised.attribute.TSLagMaker.processInstance(TSLagMaker.java:2810)
at weka.filters.supervised.attribute.TSLagMaker.processInstancePreview(TSLagMaker.java:2692)
at weka.classifiers.timeseries.WekaForecaster.forecast(WekaForecaster.java:1671)
at weka.classifiers.timeseries.WekaForecaster.forecast(WekaForecaster.java:1485)
at com.eniro.content.social.insight.InsightEngine.predict(InsightEngine.java:96)
at com.eniro.content.social.insight.InsightEngineTest.name(InsightEngineTest.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:117)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:42)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:262)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:84)


I dug into the exception and found that one instance has the following attributes:
0 = {Attribute@2467} "@attribute DATE date yyyy-MM-dd"
1 = {Attribute@2468} "@attribute POPULARITY numeric"
2 = {Attribute@2469} "@attribute Month {jan,feb,mar,apr,may,jun,jul,aug,sep,oct,nov,dec}"
3 = {Attribute@2470} "@attribute Quarter {Q1,Q2,Q3,Q4}"
4 = {Attribute@2471} "@attribute DATE-remapped numeric"
5 = {Attribute@2472} "@attribute Lag_POPULARITY-1 numeric"
6 = {Attribute@2473} "@attribute Lag_POPULARITY-2 numeric"
7 = {Attribute@2474} "@attribute Lag_POPULARITY-3 numeric"
8 = {Attribute@2475} "@attribute Lag_POPULARITY-4 numeric"
9 = {Attribute@2476} "@attribute Lag_POPULARITY-5 numeric"
10 = {Attribute@2477} "@attribute Lag_POPULARITY-6 numeric"
11 = {Attribute@2478} "@attribute Lag_POPULARITY-7 numeric"
12 = {Attribute@2479} "@attribute Lag_POPULARITY-8 numeric"
13 = {Attribute@2480} "@attribute Lag_POPULARITY-9 numeric"
14 = {Attribute@2481} "@attribute Lag_POPULARITY-10 numeric"
15 = {Attribute@2482} "@attribute Lag_POPULARITY-11 numeric"
16 = {Attribute@2483} "@attribute Lag_POPULARITY numeric"

The other one has the following:
0 = {Attribute@2467} "@attribute DATE date yyyy-MM-dd"
1 = {Attribute@2468} "@attribute POPULARITY numeric"
2 = {Attribute@2469} "@attribute Month {jan,feb,mar,apr,may,jun,jul,aug,sep,oct,nov,dec}"
3 = {Attribute@2470} "@attribute Quarter {Q1,Q2,Q3,Q4}"
4 = {Attribute@2471} "@attribute DATE-remapped numeric"
5 = {Attribute@2472} "@attribute Lag_POPULARITY-1 numeric"
6 = {Attribute@2473} "@attribute Lag_POPULARITY-2 numeric"
7 = {Attribute@2474} "@attribute Lag_POPULARITY-3 numeric"
8 = {Attribute@2475} "@attribute Lag_POPULARITY-4 numeric"
9 = {Attribute@2476} "@attribute Lag_POPULARITY-5 numeric"
10 = {Attribute@2477} "@attribute Lag_POPULARITY-6 numeric"
11 = {Attribute@2478} "@attribute Lag_POPULARITY-7 numeric"
12 = {Attribute@2479} "@attribute Lag_POPULARITY-8 numeric"
13 = {Attribute@2480} "@attribute Lag_POPULARITY-9 numeric"
14 = {Attribute@2481} "@attribute Lag_POPULARITY-10 numeric"
15 = {Attribute@2482} "@attribute Lag_POPULARITY-11 numeric"
16 = {Attribute@2608} "@attribute Lag_POPULARITY-12 numeric"


Those attributes have to be identical for the weka.core.expressionlanguage.weka.InstancesHelper.setInstance method to continue.

The only difference I can see in the Maven test environment is the JUnit dependency, but even if I remove the <scope>test</scope>, the main thread still runs OK.
Can someone take a look at this? Thanks.


Best


Andrew

Error resolving Pentaho 4.8.0 stable dependencies

Hello guys,

I'm trying to compile Pentaho 4.8.0 Stable 51791.

I've downloaded the source code from the SVN repo, and after that I executed the following command in bi-platform-build:

Code:

ant -f dev_build.xml xresolve
After some time waiting for the dependencies to download, I got BUILD FAILED and the following error... See the images below.

error_resolving.jpg

error_resolve.jpg

Just before executing this command, in the terminal at bi-platform-build I had executed only the command

Code:

ant
and I got BUILD SUCCESSFUL...

I'm following this tutorial to compile Pentaho:

http://wiki.pentaho.com/display/Serv...I+Platform+2.0

I would like to know if anyone can help me with these dependencies, because I would like to compile all the code in order to get the Pentaho Tomcat instance:

Code:

ant -f dev_build.xml dev-rebuild
Attached Images

Jobs will not run in Pentaho 6.1

All,

Apologies if this question has been asked before, but I could not find it. We are trying to move our production jobs from 5.1 to 6.1. None of our jobs will run; they appear to be stuck at:
*******************************************************************************
*** Karaf Instance Number: 1 at C:\pentahodi\6.1.0\data-integration\.\ ***
*** system\karaf\caches\default\data-1 ***
*** Karaf Port:8802 ***
*** OSGI Service Port:9051 ***
*******************************************************************************
08:36:16,483 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled

First question: what the heck is this!?
Second: any suggestions for moving beyond this point?

Thanks for the help

Improve CDE Big Table performance

Hello all,

I'm using CDE 6 and I need to create a dashboard with a table with a large number of columns from different dimensions. I'm using a lot of 'NonEmptyCrossJoin' functions in the MDX query, but one of the columns should bring back some empty values, and when this column has empty values the table doesn't show the entire row. Here is my MDX query:

Quote:

SELECT
NON EMPTY {[Measures].[VNPok], [Measures].[Qtd]} ON COLUMNS,
NON EMPTY NONEMPTYCrossJoin({[Data Emissao.Data].[Data].Members},
NONEMPTYCrossJoin({[Pedido.Documento].[Documento].Members},
NONEMPTYCrossJoin({[Vendedor].[Nome].Members},
NONEMPTYCrossJoin({[Especificador.Nome].[Nome].Members},
NONEMPTYCrossJoin({[Cliente].[Nome].Members},
NONEMPTYCrossJoin({[Produto Codigo].[Codigo].Members},
NONEMPTYCrossJoin({[Produto Descricao].[Descricao].Members},
{[Produto Categoria.Categoria].[Categoria].Members}))))))) ON ROWS
FROM [BI Lojas]
WHERE ({[Cancelado].[N]} * {[Tipo Pedido].[N]})
I would like the dimension marked in red to be able to bring back empty values. Does anyone know how to achieve that with good performance?
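For what it's worth, one direction to experiment with (a sketch only, not tested against this cube; [DimensionInRed].[Level] is a placeholder for whichever dimension and level must keep empty values): NonEmptyCrossJoin removes tuples that have no fact data, so that one dimension can be combined with a plain CrossJoin while the others stay as NonEmptyCrossJoin:

```
NON EMPTY NONEMPTYCrossJoin({[Data Emissao.Data].[Data].Members},
    CrossJoin({[DimensionInRed].[Level].Members},   -- plain CrossJoin keeps empty members
    NONEMPTYCrossJoin({[Cliente].[Nome].Members},
    {[Produto Categoria.Categoria].[Categoria].Members}))) ON ROWS
```

Note that a plain CrossJoin no longer prunes the cross product, so the row count (and query time) can grow quickly; keeping it as deep inside the nesting as possible limits that cost.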

Thanks,

Error 0014

What is the solution to this error?
Thanks !!!


2016-10-21 12:03:23,664 ERROR [org.pentaho.platform.web.hsqldb.HsqlDatabaseStarterBean] HsqlDatabaseStarterBean.ERROR_0006 - The default port of 9001 is already in use. Do you already have HSQLDB running in another process? The HSQLDB Starter cannot continue.
2016-10-21 12:05:22,561 ERROR [org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener] EmbeddedQuartzSystemListener.ERROR_0007_SQLERROR
org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (S1000 General error java.lang.RuntimeException: database alias does not exist)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener.verifyQuartzIsConfigured(EmbeddedQuartzSystemListener.java:158)
at org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener.startup(EmbeddedQuartzSystemListener.java:100)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:421)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:802)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1068)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1060)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:759)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.sql.SQLException: S1000 General error java.lang.RuntimeException: database alias does not exist
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
at org.hsqldb.jdbc.jdbcConnection.<init>(Unknown Source)
at org.hsqldb.jdbcDriver.getConnection(Unknown Source)
at org.hsqldb.jdbcDriver.connect(Unknown Source)
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
... 40 more
2016-10-21 12:05:22,681 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2016-10-21 12:05:22,682 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error mientras se intentaba ejecutar la secuencia de arranque por org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
org.pentaho.platform.api.engine.PentahoSystemException: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error mientras se intentaba ejecutar la secuencia de arranque por org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:348)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:311)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:212)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:802)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1068)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1060)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:759)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error mientras se intentaba ejecutar la secuencia de arranque por org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:430)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:343)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:340)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:391)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:340)
... 27 more
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error mientras se intentaba ejecutar la secuencia de arranque por org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:422)
... 35 more
2016-10-21 12:05:22,696 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2016-10-21 12:19:48,477 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2016-10-21 12:19:48,477 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0014 - Plugin pentaho-cdf failed to properly unload
java.lang.ExceptionInInitializerError
at pt.webdetails.cpf.SimpleLifeCycleListener.unLoaded(SimpleLifeCycleListener.java:42)
at org.pentaho.platform.plugin.services.pluginmgr.PlatformPlugin.unLoaded(PlatformPlugin.java:216)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.unloadPlugins(DefaultPluginManager.java:128)
at org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager.unloadAllPlugins(DefaultPluginManager.java:770)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.shutdown(PluginAdapter.java:47)
at org.pentaho.platform.engine.core.system.PentahoSystem.shutdown(PentahoSystem.java:1027)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextDestroyed(SolutionContextListener.java:243)
at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4249)
at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4890)
at org.apache.catalina.startup.HostConfig.checkResources(HostConfig.java:1276)
at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1382)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:306)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1392)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1656)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1665)
at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1645)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NullPointerException
at pt.webdetails.cpf.PluginEnvironment.repository(PluginEnvironment.java:29)
at pt.webdetails.cpf.CpfProperties.getInstance(CpfProperties.java:45)
at pt.webdetails.cpf.persistence.PersistenceEngine.<clinit>(PersistenceEngine.java:62)
... 18 more
2016-10-21 12:19:48,478 ERROR [org.pentaho.platform.util.logging.Logger] Error end:

BI 6.1 - LDAP and MS Active Directory

Hi,

I am testing Pentaho 6.1 and I configured LDAP to get SSO. Everything is working fine, but I cannot configure the Admin account. I have tried many ways of configuring the Admin account/roles from LDAP, and none of them work.

Kindly, do you know how to get it working?

Thanks in advance.

Bazcor

Get XML (Nested Data)

Hi All,

I am somewhat of an XML neophyte. I have a 500 MB XML file that has a list of people and the locations where they have lived over time. The locations data is "nested" under each person. Using Pentaho 6.1's Get XML step, I can pull out all of the people by pointing the XPath at the top node, and I can separately pull out the list of dated locations. However, I can't find a way to pull out the data that links those two data sets. There is an ID in the people list that needs to be placed on each row of the location list so that I can link these things (and store them in a relational database).

Essentially, I don't think I fully understand XML, XPath and the way Pentaho processes/parses the information. BTW, I have an XSD that describes the XML file. Is there any Pentaho functionality (I am using 6.1 CE) that can make use of the XSD file to pull out what are essentially multiple linked tables?

Thanks for any hints, tips or pointers to additional information!

How to get dashboard tables row count popup alert?

Hi all,
I have a table in my dashboard and I want to display how many rows it contains,
like a JavaScript alert popup.
Note: it must display a popup when the dashboard loads, showing the number of rows the given table has.
So far I have tried this, but it didn't work:
function f() {
    var rowCount = document.getElementById('myTableID').rows.length;
    alert(rowCount);
}

AND

function testClick(e) {
    var id = e.tableData.rows.length;
    alert(id);
    //window.location = '/pentaho/content/pentaho-cdf-dd/Render?solution=XXX&path=XXX&file=XXX.wcdf&ParameterFullName=' + id;
}
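One likely reason both attempts fail is timing: CDF fills the table asynchronously, so code that runs as the dashboard loads sees an empty or missing table. A minimal sketch of a different approach, counting rows from the query result inside the table component's postExecution callback (the rawData.resultset property name is an assumption based on typical CDF table components; verify it against your CDF version):

```javascript
// Pure helper: count rows in a CDF-style resultset (an array of row arrays).
function countRows(resultset) {
  return Array.isArray(resultset) ? resultset.length : 0;
}

// Hypothetical postExecution callback for the table component: by the time
// it runs, the query has returned, so the count reflects the real data.
function tablePostExecution() {
  var n = countRows(this.rawData && this.rawData.resultset);
  alert(n + ' rows in table');
}
```

Setting the table component's postExecution to tablePostExecution in the CDE layout should then pop the alert once the table has finished loading, rather than while the dashboard is still fetching data.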

Setting Mail with Gmail

I tried to set up a mail job in Kettle.

I used the following settings, from the following thread:
http://forums.pentaho.com/showthread...il-using-Gmail

Quote:

SMTP Server: smtp.gmail.com
Port: 465
Authentication: check
Authentication User: youraccount@gmail.com
Authentication Password: *******
Use Secure Authentication: checked
Secure Connection Type: SSL
However, I got the following error:

Quote:

2016/10/23 21:33:23 - Mail - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Problem while sending message: javax.mail.MessagingException: Could not connect to SMTP host: smtp.gmail.com, port: 465; nested exception is:
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
2016/10/23 21:33:23 - Email - Finished job entry [Mail] (result=[false])
I googled around and found this thread:
http://forums.pentaho.com/showthread...ification-path

So I got the cert by using openssl (http://notepad2.blogspot.com/2012/04...into-java.html):

Quote:

openssl s_client -connect smtp.gmail.com:465

and tried to add the cert to the keystore using the keytool command as in this link:
http://www.joyofdata.de/blog/suncert...in-pdi-kettle/

Quote:

keytool -import -alias smtp.gmail.com -keystore "%JAVA_HOME%/jre/lib/security/cacerts" -file C:\Users\wilson\gmail.cert
However, I still get the error. I have no idea what to fix next. Please help.
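For reference, the two quoted commands are typically combined along these lines (a sketch with example file names; "changeit" is the default cacerts password, and the cacerts being modified must belong to the exact JRE that runs Kettle):

```
REM Fetch the certificate Gmail presents on the SMTPS port and save it as PEM:
openssl s_client -connect smtp.gmail.com:465 -showcerts < NUL | openssl x509 -outform PEM > gmail.pem

REM Import it into the trust store of the JRE that actually runs Kettle:
keytool -import -alias smtp.gmail.com -keystore "%JAVA_HOME%/jre/lib/security/cacerts" -storepass changeit -file gmail.pem
```

If the PKIX error persists after the import, it is worth checking which Java installation Spoon/Kitchen is actually launching, since importing into a different JRE's cacerts has no effect.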

:: How does Time Series Analysis and Forecasting with Weka work? ::

Dear sir,

I want to use Time Series Analysis and Forecasting with Weka, from this page:
http://wiki.pentaho.com/display/DATA...ting+with+Weka

But I can't find the documentation.

Can you help me, please?

I want to use the Time Series Analysis and Forecasting with Weka plugin. My dataset has four attributes: Date (YYYY-MM), Energy Consumption, Gross Domestic Product, and Population. My target selection is Energy Consumption. I have 480 instances and I use 3 algorithms: SMOreg, Multilayer Perceptron and Linear Regression. My predictions use 24 as the number of time units to forecast.

My doubt is: how do the algorithms calculate the next 24 months? Do they use all 480 instances to predict the 24 future months, or only the last 5 or 10?
How does the algorithm do this?

Moreover, do you have a link you could pass me so I can study this?

Thanks for everything

Best regards

Table Input Multiple Copies

Hi,

This is the first time I'm trying to explore running multiple copies of Table Input, but I'm running into the exception below. If I reduce the number of copies of the subsequent Filter step from 5 to 1 it runs fine, but a single-threaded Filter step will be a bottleneck w.r.t. processing.

I have attached the KTR for reference; the Pentaho version I'm using is 5.0.1.


select x,y,x
FROM
order_line xo, orders o
WHERE
xo.order_id_fk=o.id
-- and o.created_on >= date_format(now(), '%Y-%m-%d 00:00:00')
and xo.last_modified_on >= ?
and o.ID mod ${Internal.Step.Unique.Number} = 0
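As a side note on the mod clause above (a sketch, assuming Kettle's documented internal variables; the column names are placeholders): the usual pattern for splitting rows across N step copies compares the ID modulo the number of copies against the copy number. Taking the ID modulo ${Internal.Step.Unique.Number} instead means the first copy (copy 0) computes mod 0, and the copies do not receive disjoint row sets:

```
select x, y, z
from order_line xo, orders o
where xo.order_id_fk = o.id
  and xo.last_modified_on >= ?
  -- each copy receives its own residue class of IDs:
  and o.ID mod ${Internal.Step.Unique.Count} = ${Internal.Step.Unique.Number}
```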



org.pentaho.di.core.exception.KettleException:
Step [dim_product lookup] is invalid as target.

at org.pentaho.di.trans.steps.filterrows.FilterRows.processRow(FilterRows.java:104)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
at java.lang.Thread.run(Thread.java:701)
2016/10/24 09:14:41 - Filter rows.4 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error
2016/10/24 09:14:41 - Filter rows.4 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/10/24 09:14:41 - Filter rows.4 - Step [dim_product lookup] is invalid as target.
2016/10/24 09:14:41 - Filter rows.4 -
2016/10/24 09:14:41 - Filter rows.4 - at org.pentaho.di.trans.steps.filterrows.FilterRows.processRow(FilterRows.java:104)
2016/10/24 09:14:41 - Filter rows.4 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2016/10/24 09:14:41 - Filter rows.4 - at java.lang.Thread.run(Thread.java:701)
child index = 46, logging object : org.pentaho.di.core.logging.LoggingObject@650cc20b parent=157edce7-5e77-4e06-9b16-2597ab05fc20
2016/10/24 09:14:41 - fact_orderitem_hourly_insert - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2016/10/24 09:14:41 - fact_orderitem_hourly_insert - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
Attached Files

Report viewer

Is it possible in Pentaho to put a condition in a report so that it chooses a different parameter in the SQL for different users?

