Channel: Pentaho Community Forums

Pentaho 5.0.1 BI Server Startup Issue

I've been diligently trying to get the 5.0.1 server up and running but have been receiving errors. I have tried several troubleshooting steps but still receive the same error. If I could get some guidance on steps for troubleshooting this error, I would appreciate it.

Problem:
I downloaded and unzipped the latest stable versions of the following Pentaho Business Analytics files from sourceforge.net:

  • biserver-ce - biserver-ce-5.0.1-stable.zip
  • data-integration - pdi-ce-5.0.1.A-stable.zip
  • design-studio - pds-ce-win-32-4.0.0-stable.zip
  • metadata-editor - pme-ce-5.0.1-stable.zip
  • PentahoReportingEngineSDK - pre-classic-sdk-5.0.1-stable.zip
  • report-designer - prd-ce-5.0.1-stable.zip


The next step I take, after unzipping and placing each folder into a parent folder called 'Pentaho' in my Program Files directory, is to click on the start-pentaho.bat file. This brings up two different DOS windows and kicks off the Tomcat process.

After it looks like the process is running in my Terminal window, I open a browser and type in localhost:8080.

This hangs and eventually the Terminal Services window with Tomcat running spits out a number of lines showing the process failed.

The browser shows the following error:

HTTP Status 404
type: Status report
message:
Description: The requested resource is not available

Apache Tomcat/6.0.36


I open my log files in the C:\Program Files\Pentaho\biserver-ce\tomcat\logs and the following is output:

pentaho.txt
2014-03-14 10:26:43,946 ERROR [org.springframework.web.context.ContextLoader] Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.h2.tools.Server' defined in file [C:\Program Files\Pentaho\biserver-ce\pentaho-solutions\system\GettingStartedDB-spring.xml]: Invocation of init method failed; nested exception is org.h2.jdbc.JdbcSQLException: Exception opening port "H2 TCP Server (tcp://169.254.31.197:9092)" (port may be in use), cause: "timeout" [90061-131]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1338)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:473)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
at java.security.AccessController.doPrivileged(Native Method)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:380)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:264)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:261)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:185)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:164)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:429)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:728)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:380)
at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
...
Caused by: org.h2.jdbc.JdbcSQLException: Exception opening port "H2 TCP Server (tcp://169.254.31.197:9092)" (port may be in use), cause: "timeout" [90061-131]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:316)
at org.h2.message.DbException.get(DbException.java:167)
at org.h2.tools.Server.start(Server.java:344)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
...
Caused by: org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.pentaho.platform.plugin.services.cache.CacheManager]: Constructor threw exception; nested exception is java.lang.NullPointerException
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:115)
...
Caused by: java.lang.NullPointerException
at org.pentaho.platform.engine.core.system.SystemSettings.getAbsolutePath(SystemSettings.java:182)

localhost.2014-03-14
Mar 14, 2014 10:22:24 AM org.apache.catalina.core.ApplicationContext log
INFO: Initializing Spring root WebApplicationContext
Mar 14, 2014 10:26:43 AM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.h2.tools.Server' defined in file [C:\Program Files\Pentaho\biserver-ce\pentaho-solutions\system\GettingStartedDB-spring.xml]: Invocation of init method failed; nested exception is org.h2.jdbc.JdbcSQLException: Exception opening port "H2 TCP Server (tcp://169.254.31.197:9092)" (port may be in use), cause: "timeout" [90061-131]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1338)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:473)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
at java.security.AccessController.doPrivileged(Native Method)
...
Caused by: org.h2.jdbc.JdbcSQLException: Exception opening port "H2 TCP Server (tcp://169.254.31.197:9092)" (port may be in use), cause: "timeout" [90061-131]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:316)
at org.h2.message.DbException.get(DbException.java:167)
at org.h2.tools.Server.start(Server.java:344)
...
Mar 14, 2014 10:26:44 AM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.pentaho.platform.web.http.context.SolutionContextListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.h2.tools.Server' defined in file [C:\Program Files\Pentaho\biserver-ce\pentaho-solutions\system\GettingStartedDB-spring.xml]: Invocation of init method failed; nested exception is org.h2.jdbc.JdbcSQLException: Exception opening port "H2 TCP Server (tcp://169.254.31.197:9092)" (port may be in use), cause: "timeout" [90061-131]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1338)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:473)
...
Mar 14, 2014 10:26:44 AM org.apache.catalina.core.StandardContext listenerStop
SEVERE: Exception sending context destroyed event to listener instance of class org.pentaho.platform.web.http.context.PentahoCacheContextListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ICacheManager' defined in file [C:\Program Files\Pentaho\biserver-ce\pentaho-solutions\system\pentahoObjects.spring.xml]: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.pentaho.platform.plugin.services.cache.CacheManager]: Constructor threw exception; nested exception is java.lang.NullPointerException
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:883)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:839)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:440)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
at java.security.AccessController.doPrivileged(Native Method)

catalina.2014-03-14
SEVERE: Error listenerStart
Mar 14, 2014 10:26:44 AM org.apache.catalina.core.StandardContext start
SEVERE: Context [/pentaho] startup failed due to previous errors
Mar 14, 2014 10:26:44 AM org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
SEVERE: The web application [/pentaho] registered the JDBC driver [org.h2.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
Mar 14, 2014 10:26:44 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/pentaho] appears to have started a thread named [HSQLDB Timer @57ae58] but has failed to stop it. This is very likely to create a memory leak.
Mar 14, 2014 10:26:44 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/pentaho] appears to have started a thread named [H2 TCP Server (tcp://169.254.31.197:9092)] but has failed to stop it. This is very likely to create a memory leak.
Mar 14, 2014 10:26:44 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [/pentaho] appears to have started a thread named [DefaultQuartzScheduler_Worker-1] but has failed to stop it. This is very likely to create a memory leak.

(I shortened these log files)

The obvious major error is the H2 TCP Server (tcp://169.254.31.197:9092), but there's nothing on port 9092. I ran netstat -an and there is nothing LISTENING or ESTABLISHED on 8080 or 9092.

Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.

C:\Users\>netstat -an

Active Connections

Proto Local Address Foreign Address State
TCP 0.0.0.0:80 0.0.0.0:0 LISTENING
TCP 0.0.0.0:135 0.0.0.0:0 LISTENING
TCP 0.0.0.0:445 0.0.0.0:0 LISTENING
TCP 0.0.0.0:554 0.0.0.0:0 LISTENING
TCP 0.0.0.0:902 0.0.0.0:0 LISTENING
TCP 0.0.0.0:912 0.0.0.0:0 LISTENING
TCP 0.0.0.0:2869 0.0.0.0:0 LISTENING
TCP 0.0.0.0:2968 0.0.0.0:0 LISTENING
TCP 0.0.0.0:3306 0.0.0.0:0 LISTENING
TCP 0.0.0.0:5357 0.0.0.0:0 LISTENING
TCP 0.0.0.0:6515 0.0.0.0:0 LISTENING
TCP 0.0.0.0:6783 0.0.0.0:0 LISTENING
TCP 0.0.0.0:10243 0.0.0.0:0 LISTENING
TCP 0.0.0.0:19995 0.0.0.0:0 LISTENING
TCP 0.0.0.0:19996 0.0.0.0:0 LISTENING
TCP 0.0.0.0:30172 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49152 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49153 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49154 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49161 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49163 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49229 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49230 0.0.0.0:0 LISTENING
TCP 127.0.0.1:5354 0.0.0.0:0 LISTENING
TCP 127.0.0.1:5354 127.0.0.1:49155 ESTABLISHED
TCP 127.0.0.1:5550 0.0.0.0:0 LISTENING
TCP 127.0.0.1:27015 0.0.0.0:0 LISTENING
TCP 127.0.0.1:49155 127.0.0.1:5354 ESTABLISHED
TCP 127.0.0.1:49998 127.0.0.1:49999 ESTABLISHED
TCP 127.0.0.1:49999 127.0.0.1:49998 ESTABLISHED
TCP 169.254.31.197:139 0.0.0.0:0 LISTENING
TCP 192.168.1.71:139 0.0.0.0:0 LISTENING
TCP 192.168.1.71:49178 192.168.1.78:445 ESTABLISHED
TCP 192.168.1.71:49186 74.125.142.125:443 ESTABLISHED
TCP 192.168.1.71:49219 74.125.142.125:5222 ESTABLISHED
TCP 192.168.1.71:49237 207.46.5.158:443 ESTABLISHED
TCP 192.168.1.71:49805 74.125.142.125:5222 ESTABLISHED
TCP 192.168.1.71:49810 157.56.237.70:443 ESTABLISHED
TCP 192.168.1.71:49811 54.219.20.84:443 ESTABLISHED
TCP 192.168.1.71:51926 204.152.18.206:443 TIME_WAIT
TCP 192.168.1.71:51957 204.152.18.206:443 TIME_WAIT
TCP 192.168.1.71:51968 204.152.18.196:443 TIME_WAIT
TCP 192.168.1.71:51985 204.152.18.196:443 TIME_WAIT
TCP 192.168.1.71:51986 204.152.18.206:443 ESTABLISHED
TCP 192.168.1.71:52000 204.152.18.196:443 TIME_WAIT
TCP 192.168.1.71:52010 204.152.18.196:443 ESTABLISHED
TCP 192.168.1.71:52026 204.152.18.196:443 ESTABLISHED
TCP 192.168.58.1:139 0.0.0.0:0 LISTENING
TCP 192.168.117.1:139 0.0.0.0:0 LISTENING
TCP [::]:80 [::]:0 LISTENING
TCP [::]:135 [::]:0 LISTENING
TCP [::]:445 [::]:0 LISTENING
TCP [::]:554 [::]:0 LISTENING
TCP [::]:2869 [::]:0 LISTENING
TCP [::]:3306 [::]:0 LISTENING
TCP [::]:5357 [::]:0 LISTENING
TCP [::]:10243 [::]:0 LISTENING
TCP [::]:30172 [::]:0 LISTENING
TCP [::]:49152 [::]:0 LISTENING
TCP [::]:49153 [::]:0 LISTENING
TCP [::]:49154 [::]:0 LISTENING
TCP [::]:49161 [::]:0 LISTENING
TCP [::]:49163 [::]:0 LISTENING
TCP [::]:49229 [::]:0 LISTENING
TCP [::]:49230 [::]:0 LISTENING
TCP [fe80::cd44:44c6:22b7:f181%12]:52028 [fe80::cd44:44c6:22b7:f181%12]:80 SYN_SENT
TCP [fe80::cd44:44c6:22b7:f181%12]:52029 [fe80::cd44:44c6:22b7:f181%12]:80 SYN_SENT
UDP 0.0.0.0:500 *:*
UDP 0.0.0.0:2968 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:4500 *:*
UDP 0.0.0.0:5004 *:*
UDP 0.0.0.0:5005 *:*
UDP 0.0.0.0:5355 *:*
UDP 0.0.0.0:6514 *:*
UDP 0.0.0.0:6515 *:*
UDP 0.0.0.0:6516 *:*
UDP 0.0.0.0:6783 *:*
UDP 0.0.0.0:30172 *:*
UDP 0.0.0.0:50299 *:*
UDP 0.0.0.0:51270 *:*
UDP 0.0.0.0:51272 *:*
UDP 0.0.0.0:52775 *:*
UDP 0.0.0.0:61250 *:*
UDP 0.0.0.0:61704 *:*
UDP 0.0.0.0:61706 *:*
UDP 0.0.0.0:65137 *:*
UDP 0.0.0.0:65139 *:*
UDP 127.0.0.1:1900 *:*
UDP 127.0.0.1:51269 *:*
UDP 127.0.0.1:51271 *:*
UDP 127.0.0.1:61248 *:*
UDP 127.0.0.1:61249 *:*
UDP 127.0.0.1:62841 *:*
UDP 127.0.0.1:65136 *:*
UDP 127.0.0.1:65138 *:*
UDP 169.254.31.197:137 *:*
UDP 169.254.31.197:138 *:*
UDP 169.254.31.197:1900 *:*
UDP 169.254.31.197:5353 *:*
UDP 169.254.31.197:62840 *:*
UDP 169.254.31.197:64393 *:*
UDP 192.168.1.71:137 *:*
UDP 192.168.1.71:138 *:*
UDP 192.168.1.71:1900 *:*
UDP 192.168.1.71:5353 *:*
UDP 192.168.1.71:62837 *:*
UDP 192.168.1.71:64394 *:*
UDP 192.168.58.1:137 *:*
UDP 192.168.58.1:138 *:*
UDP 192.168.58.1:1900 *:*
UDP 192.168.58.1:5353 *:*
UDP 192.168.58.1:62838 *:*
UDP 192.168.58.1:64396 *:*
UDP 192.168.117.1:137 *:*
UDP 192.168.117.1:138 *:*
UDP 192.168.117.1:1900 *:*
UDP 192.168.117.1:5353 *:*
UDP 192.168.117.1:62839 *:*
UDP 192.168.117.1:64395 *:*
UDP [::]:500 *:*
UDP [::]:3702 *:*
UDP [::]:3702 *:*
UDP [::]:3702 *:*
UDP [::]:3702 *:*
UDP [::]:4500 *:*
UDP [::]:5004 *:*
UDP [::]:5005 *:*
UDP [::]:5355 *:*
UDP [::]:30172 *:*
UDP [::]:49152 *:*
UDP [::]:60882 *:*
UDP [::]:61251 *:*
UDP [::]:61705 *:*
UDP [::]:61707 *:*
UDP [::1]:1900 *:*
UDP [::1]:62836 *:*
UDP [fe80::cc4:5a10:8a52:1fc5%23]:546 *:*
UDP [fe80::cc4:5a10:8a52:1fc5%23]:1900 *:*
UDP [fe80::cc4:5a10:8a52:1fc5%23]:5353 *:*
UDP [fe80::cc4:5a10:8a52:1fc5%23]:62835 *:*
UDP [fe80::45d0:fff7:8c6c:e9d3%11]:546 *:*
UDP [fe80::45d0:fff7:8c6c:e9d3%11]:1900 *:*
UDP [fe80::45d0:fff7:8c6c:e9d3%11]:62832 *:*
UDP [fe80::a988:c9f4:2fa8:e374%13]:546 *:*
UDP [fe80::a988:c9f4:2fa8:e374%13]:1900 *:*
UDP [fe80::a988:c9f4:2fa8:e374%13]:62834 *:*
UDP [fe80::cd44:44c6:22b7:f181%12]:546 *:*
UDP [fe80::cd44:44c6:22b7:f181%12]:1900 *:*
UDP [fe80::cd44:44c6:22b7:f181%12]:62833 *:*

There's an h2-1.2.131.jar file, and I don't have any other H2 services installed. I downloaded h2-1.3.168.jar but haven't placed it in the tomcat\lib directory yet. I deleted all lock files in the pentaho-solutions\system\jackrabbit directory (assuming lock files all have "lock" in the name; there are a few db2.lock files created). I also noticed my network printer is using port 139 via tcp://169.254.31.197, so I disabled those rules on my firewall, but that didn't work.
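Beyond netstat, one quick sanity check is to try binding the exact address and port that H2 reports. Below is a minimal Python sketch (not part of Pentaho; the host and port values are taken from the log above). Note that 169.254.31.197 is a link-local/APIPA address, so if the machine's hostname resolves to that interface first, H2 may be trying to listen on an address that isn't actually usable:

```python
import socket

def can_bind(host, port):
    """Return True if a listening TCP socket can be opened on host:port."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((host, port))
            s.listen(1)
            return True
    except OSError:
        return False

# The address H2 failed on (from the log), plus localhost for comparison.
for host in ("169.254.31.197", "127.0.0.1"):
    print(host, can_bind(host, 9092))
```

If binding 127.0.0.1:9092 succeeds while the link-local address fails, the problem is the address H2 picks rather than a port conflict; forcing the hostname to resolve to 127.0.0.1 (e.g. via the hosts file) is one commonly suggested workaround.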

Some guidance here would be appreciated on how to troubleshoot this issue.

Thank you in advance.

Multiple update statements with same key feeding to a "insert\update" step

Hi:
I am using the "insert\update" step to do incremental loads. In the following scenario, the rows below are fed into "insert\update" against a database table, assuming they are all updates:

Key Column
---- -------
A1 TestA1
A2 TestA2
A3 TestA3
A1 TestNewA1


Expected Output in database
--------------------------------
A1 TestNewA1
A2 TestA2
A3 TestA3

Since processing in Pentaho is parallel, is it possible that the two updates for "A1" are applied in a different order? (I have not been able to make it fail with small datasets.) When can this happen, and how can I serialize the order of the updates?
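If the duplicate key can be collapsed before it ever reaches the step, the ordering question disappears. One approach (a hedged sketch of the logic only, not PDI code; in PDI the equivalent would be e.g. a sort followed by a keep-last-row-per-key pass, or a scripting step) is last-write-wins deduplication:

```python
def last_write_wins(rows):
    """Collapse duplicate keys, keeping the last value seen per key.

    Python dicts preserve insertion order, so the first appearance of each
    key fixes its position while later duplicates overwrite the value.
    """
    collapsed = {}
    for key, value in rows:
        collapsed[key] = value
    return list(collapsed.items())

rows = [("A1", "TestA1"), ("A2", "TestA2"), ("A3", "TestA3"), ("A1", "TestNewA1")]
print(last_write_wins(rows))
# → [('A1', 'TestNewA1'), ('A2', 'TestA2'), ('A3', 'TestA3')]
```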


Thanks!!

Review of Prezi presentation tool

I have been playing around with Prezi (http://prezi.com/), the online presentation tool. It’s a cool thing that lets you create presentations that are visually different from Powerpoint/Keynote. Like all these tools it will let you create bad presentations very quickly. If you want to create something compelling and appealing it will take planning and thought. Looking at the […]


Changing the login background graphic

Hopefully a simple question. I found the jpeg used for the background of the login page here:
./pentaho-solutions/system/common-ui/resources/themes/crystal/images/login-crystal-bg.jpeg

I was hoping to swap out the graphic with another, but found that doing so and restarting the BA server didn't achieve the desired result.

Anyone able to explain how to customize the background?

Thanks!
-Brian

Transforming data based on a second set of data rows

Hello,


I am new to Pentaho Data Integration and still hoping it can meet my data transformation needs. I am completely self-taught so far.


I've tried searching the forum for an answer but do not even know what to search for. :confused:


My problem is that I have two 'streams' of data. The first contains a single string of text. For example:


text
-------------
Hello Mary
Hello James, Goodbye
Liz says Hello to James
Goodbye


The other 'stream' contains two strings. For example:


original | new
--------------------
Hello | Hi
Goodbye | Ciao
James | Jim
Elizabeth | Liz




Using Pentaho Data Integration, what is the best way to take the text from the first stream and look up each word against the 'original' field in stream two? If a word matches an 'original' value, it should be replaced with the corresponding 'new' value, so that the resulting output would look like:


output
-------------------------
Hi Mary
Hi Jim, Ciao
Liz says Hi to Jim
Ciao
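In PDI this kind of lookup-driven replacement is usually done in a scripting step (e.g. Modified Java Script Value) fed by the lookup stream; the core substitution logic can be sketched as follows (Python, illustrative only). Matching whole words avoids accidentally replacing text inside longer words:

```python
import re

# The lookup 'stream' from the example above, as original -> new pairs.
mapping = {"Hello": "Hi", "Goodbye": "Ciao", "James": "Jim", "Elizabeth": "Liz"}

# Whole-word matching, so e.g. 'James' inside 'Jameson' would be left alone.
pattern = re.compile(r"\b(" + "|".join(map(re.escape, mapping)) + r")\b")

def substitute(text):
    return pattern.sub(lambda m: mapping[m.group(1)], text)

for line in ["Hello Mary", "Hello James, Goodbye", "Liz says Hello to James", "Goodbye"]:
    print(substitute(line))
# → Hi Mary
# → Hi Jim, Ciao
# → Liz says Hi to Jim
# → Ciao
```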


Your help is greatly appreciated.

Thank You

BI platform not taking DB2 as datasource

Hello all,

The Pentaho BI 5.0.1 server shows an error while creating a JDBC datasource for a DB2 database, even though I have copied db2jcc-9.5.jar into the tomcat/lib folder. When I hit TEST, the message "ERROR OK" appears.

Can anybody suggest a solution?

thanks
sumit

CDE - ERROR [DashboardDesignerContentGenerator] Could not load dashboard

Hi, when I try to execute a .cdfde dashboard using a role that isn't admin, I get this error:

ERROR [DashboardDesignerContentGenerator] Could not load dashboard: /<solution>/dashboard.cdfde

The file is generated by CDE (CTools), and I'm using Pentaho CE 4.8 on a Linux server.

Can you help me fix this issue?

Regards

Ignazio

NO Dashboard FireChange but the panels data get refresh in the Dashboard

Hi Forum,

When I click on the F bar (Female bar), the data in the next panel (a table) should change: only the panel data, not the entire dashboard.

I have found that the entire dashboard gets refreshed with the code below:
{
Dashboards.fireChange('param1_gender', this.scene.atoms.category.value);
}


But only the panel data should refresh. Is there any way to achieve this? It is a very urgent requirement for us.
Thank you .

Sample image is shown below.

Dashboard.jpg

WEKA supported file format

Which file formats are supported by WEKA? I know the default format is ARFF; I want to know which other formats are supported.

Thanks for replying in advance

Adding JDBC Datasource through REST?

Hey, I was wondering if there is any way to add a JDBC datasource to the 5.0.1 server using REST services? I have figured out how to do it for a Mondrian schema but can't find anything about JDBC datasources. Thanks!

Fetch only wanted columns from the result set of the query on a table component

Hi Forum,

Problem statement:
My query result set returns 4 columns of data, and I should get all 4 columns through CDA.
I need to show only the first 3 columns on the table component, and whenever I hover the mouse over the 2nd column, the 4th column's data has to appear in a tooltip.
How can I do this with a postFetch JavaScript function?


I have tried the following: in the query's Output options I set the indexes 0,1,2 and executed the query through CDA, and it returns 3 columns. But here CDA has to return the result set of the entire query (i.e. 4 columns in this case), while only the first 3 columns (or whichever columns are wanted) are shown on the table component.
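One way to think about the data shape (a Python sketch for clarity only; in CDE the actual wiring would live in the table component's JavaScript hooks, which are not shown here): let CDA return all 4 columns, then split each row into the displayed part and the held-back part, instead of trimming the query output. The sample rows are hypothetical:

```python
def split_result_set(rows, visible=3):
    """Split each row into displayed columns and columns held back for
    tooltips; the full result set stays intact."""
    display = [row[:visible] for row in rows]
    hidden = [row[visible:] for row in rows]
    return display, hidden

# Hypothetical 4-column result set.
rows = [[2, 4, 5, 5], [2, 4, 5, 6], [2, 4, 6, 9]]
display, hidden = split_result_set(rows)
print(display)  # → [[2, 4, 5], [2, 4, 5], [2, 4, 6]]
print(hidden)   # → [[5], [6], [9]]
```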

Thank you in Advance :D

Unrecognized SQL escape 'job' at line position 120.

Hello,

I am using BULK INSERT on MS SQL Server to load the data.

In this insert statement I use 2 variables, as below:

--> (SELECT * FROM OPENROWSET(BULK '${FILE_LOCATION_IN}\${FILE_NAME_IN}',SINGLE_BLOB)AS X)

The database server and the Linux server on which I run the job are different machines.

When I trigger the job from the Linux machine, it throws the error below:

"Unrecognized SQL escape 'job' at line position 120."

I also get another error:

"Cannot bulk load because the file "${FILE_LOCATION_IN}\${FILE_NAME_IN}" could not be opened. Operating system error code 3(The system cannot find the path specified)"

Any idea what might be the problem?

Regards,
Alok
pentaho CE-5.0.1
MSSQLSERVER-2012

Other column values to show on table component tooltip when mouse hover on a column

Hi Forum,

My Query result set is :
A B C D
2 4 5 5
2 4 5 6
2 4 6 9

table component display :
A B C
2 4 5
2 4 5
2 4 6
(to get this, I wrote JavaScript in Post Execution)

When I hover the mouse over the B column, I want to show the D column values in a tooltip. How can I do this?

Help would be appreciated ..

Thank you :D

Weka Time Series Prediction

I have a large dataset of minute-wise stock values for a period of 5 years, and I'm having a hard time forecasting in Weka using time series analysis. I want the dates and times together for prediction. How do I do that? Please help!

OSX Mavericks version of Kettle working?

I have been using Kettle for about one year now and have generally not had interface problems. But I am finding a number of small things that don't seem to work, or don't work consistently, in the new version of Kettle on OS X Mavericks. For example, the Browse button for linking to a file on input files seems to work the very first time it is used, and then no longer works. I'm also finding the Function icon will work sometimes for the same code, and sometimes not. Are others having these problems on OS X? I've updated Java and Kettle to the latest versions.

WEKA classifiers and association Rules Query

Which classifiers or classification algorithms does WEKA use? I would also like to know the association rule and clustering algorithms used by WEKA.


Thanks a lot in advance

Can we integrate HTML 5 charts in CDE ?

Hi Forum,
Is it possible to integrate HTML5 charts in CDE? If so, can anyone give a small example?

Thank you.

SAP Input Step

Hi,

I've programmed an SAP function which I want to use with the SAP Input step. If I use Get Fields, I get the output fields of the table which I defined in the function. I tested the function in SAP, and there it worked.
If I test it in the ETL tool, I get the following error:


2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Unexpected error
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.core.exception.KettleException:
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : org.pentaho.di.trans.steps.sapinput.sap.SAPException:
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : There is no table parameter list. Did you use 'Table' instead of 'Structure'?
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : There is no table parameter list. Did you use 'Table' instead of 'Structure'?
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.sapinput.SapInput.processRow(SapInput.java:121)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.step.RunThread.run(RunThread.java:50)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at java.lang.Thread.run(Unknown Source)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Caused by: org.pentaho.di.trans.steps.sapinput.sap.SAPException:
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : There is no table parameter list. Did you use 'Table' instead of 'Structure'?
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) :
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.getOutputCursor(SAPConnectionImpl.java:455)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.sapinput.sap.impl.SAPConnectionImpl.executeFunctionCursored(SAPConnectionImpl.java:349)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : at org.pentaho.di.trans.steps.sapinput.SapInput.processRow(SapInput.java:119)
2014/03/17 11:15:18 - SAP Input 2.0 - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : ... 2 more


In SAP, under Tables, I created the following parameter:

ZZGT_OUTTAB_TAB LIKE ZST_IDOC (which is a structure that I created).
Do you have any ideas why this does not work?
In SAP I get a result of 20 rows.

Regards

Dominic

How to delete lines from logging tables automatically

Hello

I configured my job for logging: I configured the first section (Log), not the job entry or logging channels sections. It seems to be working well;
my job is writing the logs of all executions into my table. However, I want to delete old lines.
I set the log line timeout to 3, but all lines are still there, even lines from last month.

Could you please help me resolve this problem?

How to add an error attachment to the notification mail

Hi everybody,

I want to send an email notifying of the error, and I want to attach to this mail the trace of the error given by PDI.

I don't know exactly how to do this.

Can you please help me?