Channel: Pentaho Community Forums

Sparkl application sharing

Hi,

I created a new Sparkl application as the admin user in Pentaho 5.2 CE. It has a Kettle endpoint which interacts with LDAP and Hadoop Hive to pull out data based on the user's role.
Everything works great under the admin user, but I cannot figure out how to share this application with other users, or with roles that exist on the Pentaho BA server. The only workaround so far has been to add the authorized users to the Administrator group, which gives them the "Tools" menu option where the application is available.

Question: how do I allow regular users to use the newly created Sparkl application? Or, am I missing some concepts here altogether?

I've been hitting every possible resource about this little challenge I have with Sparkl. No luck so far.

Excellent work with Sparkl, I'm a big fan of projects the community puts out.

Cheers

=======================
Update:

Solution found: once the new Sparkl plugin is functional, its Kettle endpoints (if permitted to "All users") are available as Data Sources in all dashboards. So I just have to rebuild the dashboard and share it with the appropriate users.

Execute SQL Script

Hello,
I created this script:

UPDATE RM_IND_ELAB
SET IND_STATUT = 'REJ'
WHERE NU_CLE = ?

In the parameters list I declared the name of the field in my stream that holds the "NU_CLE" value. I get this error:

2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : An error occurred, processing will be stopped:
2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : Couldn't execute SQL: UPDATE RM_IND_ELAB
2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : SET IND_STATUT = 'REJ'
2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : WHERE NU_CLE = ?
2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) :
2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : ORA-01008: toutes les variables ne sont pas liées


2014/11/25 12:48:05 - Execute SQL script.0 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : Error initializing step [Execute SQL script]
2014/11/25 12:48:05 - RM_Reception_marchandises3 - ERROR (version 4.1.0, build 1 from 2012-11-06 13.20.53) : Step [Execute SQL script.0] failed to initialize!

Thanks.
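
For reference, ORA-01008 ("toutes les variables ne sont pas liées", i.e. "not all variables are bound") means the statement reached Oracle with the ? placeholder still unbound; if I remember correctly, the Execute SQL script step only applies its parameter list when "Execute for each row?" is checked. As an illustration only, here is a minimal JDBC sketch of the binding that has to happen for this statement; the connection URL, credentials and key value are placeholders, not the real ones:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BindExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; replace with the real Oracle URL and credentials.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");

        // The same statement the step runs: a value must be bound to the ?
        // before every execution, otherwise Oracle raises ORA-01008.
        PreparedStatement ps = conn.prepareStatement(
                "UPDATE RM_IND_ELAB SET IND_STATUT = 'REJ' WHERE NU_CLE = ?");
        ps.setLong(1, 12345L); // placeholder: the NU_CLE value of the current row
        ps.executeUpdate();

        ps.close();
        conn.close();
    }
}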

Executing Spoon.bat, Splash Screen appears/disappears, DI will not start; JVM Errors.

Hello,
I am experiencing an issue with Pentaho (5.0.1 Stable).
I was working on a transformation late last week and ran into some issues (mainly memory). I decided to deal with them this week...

Here is some info:
Environment: Windows 8, 16 GB RAM installed;
Pentaho 5.0.1 Stable (community version)

However, I cannot seem to launch Spoon. The splash screen appears, then disappears.
I reinstalled Java on my machine and reinstalled Pentaho.
I reset PENTAHO_JAVA_HOME to the JVM location, etc.

I ran SpoonDebug from the command line. The produced text file contains:

DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_71
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.7.0_71\bin\java

c:\Program Files\Pentaho\data-integration>"C:\Program Files\Java\jdk1.7.0_71\bin\java" "-Xmx512m" "-XX:MaxPermSize=256m" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-5.0.1-stable.jar -lib ..\libswt\win64 /level:Debug
2014/11/25 11:49:55 - Spoon - Logging is at level : Debugging
stopping

(Looks normal).

In the console, I am receiving the following messages (showing the first few lines; the rest does not seem to give any useful info):

c:\Program Files\Pentaho\data-integration>spoondebug
SpoonDebug is to support you in finding unusual errors and start problems.
-
This starts Spoon with a console output with the following options:
-
Pause after the termination?
(helps in finding start problems and unusual crashes of the JVM)
Pause? (Y=Yes, N=No, C=Cancel) Y
-
Set logging level to Debug? (default: Basic logging)
Debug? (Y=Yes, N=No, C=Cancel) Y
-
Redirect console output to SpoonDebug.txt in the actual Spoon directory?
Redirect to SpoonDebug.txt? (Y=Yes, N=No, C=Cancel) Y
-
Launching Spoon: "c:\Program Files\Pentaho\data-integration\spoon.bat" /level:Debug
Console output gets redirected to "c:\Program Files\Pentaho\data-integration\SpoonDebug.txt"
2014/11/25 11:49:55 - General - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Error starting Spoon shell
2014/11/25 11:49:55 - General - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : java.lang.RuntimeException: Unable to create the database cache:

2014/11/25 11:49:55 - General -
2014/11/25 11:49:55 - General - Couldn't read the database cache
2014/11/25 11:49:55 - General - org.pentaho.di.core.exception.KettleFileException:

2014/11/25 11:49:55 - General -
2014/11/25 11:49:55 - General - [npi Integer(15)], [entity_type_code Integer(9)], [replacement_npi Integer(15)], [employer_identification_number_ein String(9)], [provider_organization_name_legal_business_name String(70)], [provider_last_name_legal_name String(35)], [provider_first_name String(20)], [provider_middle_name String(20)], [provider_name_prefix_text String(5)], [provider_name_suffix_text String(5)], [provider_credential_text String(20)], [provider_other_organization_name String(70)], [provider_other_organization_name_type_code String(1)], [provider_other_last_name String(35)], [provider_other_first_name String(20)], [provider_other_


As you can see, the error messages seem to be related to a transformation error I had last week (I did have memory issues I was working on).

Any suggestions?

I have attached my full (edited) log messages

thanks.
John De.
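
The "Unable to create the database cache" / "Couldn't read the database cache" messages usually point at a corrupted db.cache file in the Kettle home directory; the field list dumped below them looks like a half-written cache entry from the transformation that ran out of memory. A commonly suggested remedy is to delete that file so Spoon rebuilds it on the next start. A minimal sketch, assuming the default location under the user's home directory (if KETTLE_HOME is set, look under that directory instead):

Code:

import java.io.File;

public class ClearDbCache {
    public static void main(String[] args) {
        // Assumed default Kettle home: <user home>\.kettle\db.cache on Windows.
        File cache = new File(System.getProperty("user.home"),
                ".kettle" + File.separator + "db.cache");
        if (cache.exists() && cache.delete()) {
            System.out.println("Deleted " + cache.getAbsolutePath()
                    + "; Spoon should rebuild it on the next start.");
        } else {
            System.out.println("No db.cache found at " + cache.getAbsolutePath());
        }
    }
}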

SQL Join

Hi all,
I need to run some SQL involving more than one table, which can pick parameters out of the stream and return results.
I know I can do multiple lookups, but if one of the lookups returns multiple records, the number of rows in the stream would increase, which I can't have.
I looked at the SQL script step, but it does not look like it returns results.
I am fairly new, so I'm looking for any ideas.
Thanks
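
If it helps, what is described above sounds like a parameterized lookup query over several tables that must return at most one row per input row; in PDI that is roughly what the Database join step is for, if memory serves. Purely as an illustration, here is a minimal JDBC sketch of such a query with made-up table and column names, using aggregation so that multiple matches collapse into a single result row:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ParameterizedJoinLookup {
    public static void main(String[] args) throws Exception {
        // Placeholder connection; replace with the real JDBC URL and credentials.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");

        // Hypothetical tables ORDERS and ORDER_LINES, aggregated so each
        // input key yields exactly one result row instead of one row per line.
        PreparedStatement ps = conn.prepareStatement(
                "SELECT o.order_id, COUNT(l.line_id) AS line_count, SUM(l.amount) AS total_amount "
              + "FROM orders o LEFT JOIN order_lines l ON l.order_id = o.order_id "
              + "WHERE o.order_id = ? "
              + "GROUP BY o.order_id");

        ps.setLong(1, 1001L); // placeholder: value that would come from the stream field
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getLong("order_id") + " -> " + rs.getLong("line_count")
                        + " lines, total " + rs.getBigDecimal("total_amount"));
            }
        }
        ps.close();
        conn.close();
    }
}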

Trouble with "database lookup" and error handling

Hi, I found a few threads where this topic was mentioned but it is still not clear to me what I am doing wrong.

I have a very simple input table with, say, 1000 rows. I do an (Oracle 11) Database lookup step to find rows by a key field.
I check both checkboxes:

1. Do not pass row if lookup fails
2. fail on multiple rows

I have a Write to log step as my error-handling step, but when I run the transformation the input to the lookup is 1000 rows and the output is only 998.
The Write to log step does not print the two records that supposedly failed the lookup.
If I uncheck error handling, it prints every row.

I'm using PDI 4.2
Thanks,
Rod

How to join rows in a sequential master/detail relationship?

I have a file with lines like this:

...
|C400|2D|MACH|DR091123456|001|
|C405|15112014|001|000008|
|C420|01T0700|23,25||
|C420|F1|195,48||
|C460|2D|00|000087|15112014|252,60|3,74|
|C470|54445|1,000|0,000|un|0,99|060|
|C470|66554|1,000|0,000|un|5,99|060|
|C490|000|5102|007,00|23,25|23,25|1,62||
|C400|2D|MACH|DR0655411|002|
|C405|15112014|001|000045|
|C420|01T0700|455,24||
|C420|F1|5,43||
|C460|2D|00|000555|15112014|52,25|1,20|
|C470|52415|1,000|0,000|un|0,99|060|
|C470|96854|1,000|0,000|un|5,99|060|
|C490|000|5102|007,00|965,14|555,11|55,74||
...

The hierarchy is:

1: C400
1.2: C405
1.2.1: C420
1.2.2: C460
1.2.2.1: C470
1.2.2.2: C490


Well, C490 and C470 are details of the C460;
C460 and C420 are details of the C405; and
C405 is a detail of the C400.


But I don't have any ID in the detail lines linking them to their master, just the file sequence. I mean, starting with one C400, the next C405 is a detail of the previous C400; the next C420 and C460 are details of the previous C405; and so on.


Here's my question:


Is there a way to merge this information and load it into a database?

tks.
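
One way to think about it, independent of the specific PDI steps, is to carry the most recent master keys forward while reading the file sequentially: each C400 starts a new group, each C405 belongs to the last C400, and each detail line inherits both. Below is a minimal sketch of that idea in plain Java; the file name and the field positions used as keys are assumptions based on the sample above, and in PDI the same state-carrying logic could live in a scripting step:

Code:

import java.io.BufferedReader;
import java.io.FileReader;

public class MasterDetailFlatten {
    public static void main(String[] args) throws Exception {
        String currentC400 = null; // key of the last C400 seen (assumed: the 4th value, e.g. DR091123456)
        String currentC405 = null; // key of the last C405 seen (assumed: the 2nd value, the date)

        try (BufferedReader in = new BufferedReader(new FileReader("c_block.txt"))) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = line.split("\\|"); // pipe-delimited; f[0] is empty, f[1] is the record type
                if (f.length < 2) {
                    continue;
                }
                if ("C400".equals(f[1])) {
                    currentC400 = f[4];
                } else if ("C405".equals(f[1])) {
                    currentC405 = f[2];
                } else {
                    // Detail line (C420/C460/C470/C490): emit it together with its master keys,
                    // ready to be loaded into a child table with foreign-key columns.
                    System.out.println(currentC400 + "|" + currentC405 + "|" + line);
                }
            }
        }
    }
}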

PRD chart: setting label-font-color and title-font-color

Hello Pentaho World-

Newbie here who just spent hours searching the internet and documentation trying to find a simple answer, but failed :(

I have placed a line chart and a pie chart into a report using PRD CE Version 5.2.0.0-209 (the client I use to publish to a server).

I used a black background for everything and discovered I was unable to change the chart title font color from black to white. The x and y axis labels on the line chart are also black and cannot be seen against a black background. In addition, the lines from the labels to the wedges on the pie chart are black.

I need all of these to be set to white in order to see against the black background. Help!

Feeding data sources using Spoon into Pentaho Report

Hello,
I'm new to Pentaho and the product looks very promising so far. However, I'm running into some difficulties setting up some basic transformations and I'm probably just not understanding something.

What I'm trying to do is create a report template using the Report Designer application and have the data sources fed in via a transformation. Is this possible? I think I'm supposed to use the 'external' data source for this and then create custom parameters that link to the data source being fed in. Am I on the right track?

Some of the documentation seems lacking in this area.

Thanks for any input

Automatically start a job when PDI server or Carte starts

Hi all,

I am working on encryption methods and I created a job to decrypt the passwords. But I need that job to start automatically when the PDI server or Carte process starts.

Please, I need help starting a job automatically when the PDI server or Carte process starts.


Thanks in advance.

Heap size issues linux 64bit

Hi all. How do I configure the Xmx and Xms settings for the spec below?

- Linux
- 64bit
- 6G RAM
- runs in a virtual machine

Please guide me guys... Thanks...

weka.core.converters.TextDirectoryLoader not working properly in Windows 64 bit

Environment:
Windows 8.1 64 bit
Weka: 3.7.11 64 bit
JVM: Java 8 64 bit

Steps to reproduce the error:


1. Download movie review file
http://www.cs.cornell.edu/People/pab...olarity.tar.gz


2. Unzip to a folder
C:\my\F1\Proj\ajitgithub\E20-IMDBMovieRatings


3. Open Weka explorer


4. Open Preprocess tab


5. Click the Open file button and select the txt_sentoken folder in the unzipped archive. This folder contains the classes (neg, pos) as subdirectories, which hold the corresponding movie review text files. There are 1000 files in each subdirectory, 2000 files in total.


C:\my\F1\Proj\ajitgithub\E20-IMDBMovieRatings\review_polarity\txt_sentoken


6. A "Load Instances" pop-up window appears and says "Cannot determine file loader automatically, please choose one".


7. I choose "weka.core.converters.TextDirectoryLoader", leave the default parameters and click ok.


8. I expect 2000 instances to be loaded, but the following is the outcome:


a. Only 1000 instances are loaded.
b. All instances are loaded with the "neg" class attribute.
c. All 1000 files belonging to the "pos" class are loaded under the "neg" class.


I get the same output in the SimpleCLI as well as when running the Java API from an Eclipse Java project.

Can someone help?

Regards,

Ajit Singh.
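
For what it's worth, the same load can be reproduced from the Java API in a few lines, which makes it easier to check whether both class subdirectories are really being picked up; the directory path below is the one from the post, so adjust as needed:

Code:

import java.io.File;

import weka.core.Instances;
import weka.core.converters.TextDirectoryLoader;

public class LoadReviews {
    public static void main(String[] args) throws Exception {
        // Point the loader at the parent directory containing the "neg" and "pos" subdirectories.
        TextDirectoryLoader loader = new TextDirectoryLoader();
        loader.setDirectory(new File(
                "C:\\my\\F1\\Proj\\ajitgithub\\E20-IMDBMovieRatings\\review_polarity\\txt_sentoken"));

        Instances data = loader.getDataSet();
        System.out.println("Instances loaded: " + data.numInstances()); // expected: 2000 (1000 neg + 1000 pos)
        // The last attribute is the class built from the subdirectory names; it should list both neg and pos.
        System.out.println(data.attribute(data.numAttributes() - 1));
    }
}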

BI Server: PRD with openerp-datasource

Hi All,
I'm new to the community and have an issue with the Pentaho BI Server.
I'm trying to run a PRD report that I created with the OpenERP Data Access, but I'm getting a parsing error.
No error shows in the logs (at least in catalina.out).
I'm using PRD 5 and BI Server CE 5.
I have already created other reports directly with JDBC, connecting to PostgreSQL, but when I try with the OpenERP Data Access I get the error.
I can run the report in PRD without any problem, and it says the publish was successful.
Does anyone know how to configure BI Server 5 to run a .prpt created like this?

Thx and regards.

How to Remove Duplicates Using Advanced Filter

End user with Pentaho ETL tool and reports in Pentaho User Console

Hi friends,

1. Is it possible to run reports and Kettle ETL tasks in the BI Server (User Console) without prompting the user to create a JDBC connection to the databases?

2. Is there any way a developer can define all reports and database connections through an XML/configuration method, so the user just uploads the database connection details and report details and only runs the report, without doing any development or database connection setup?


Regards,
Kapil

Restart with error for Pentaho

Hi Guys,

When I restarted my Pentaho server, I got some errors like the ones below. Can someone help me? Thanks in advance!

root# more tomcat/logs/catalina.out
Nov 26, 2014 3:09:29 PM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
Nov 26, 2014 3:09:30 PM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
Nov 26, 2014 3:09:30 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 238 ms
Nov 26, 2014 3:09:30 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Nov 26, 2014 3:09:30 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.41
Nov 26, 2014 3:09:30 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor pentaho.xml
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=Login,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=DebugHome,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=jsp,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
SEVERE: Null component Catalina:type=JspMonitor,name=BrowserLocale,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=InitFailure,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.catalina.startup.HostConfig deployDescriptor
SEVERE: Error deploying configuration descriptor pentaho.xml
java.lang.NoClassDefFoundError: javax/annotation/security/DeclareRoles
at org.apache.catalina.startup.WebAnnotationSet.loadClassAnnotation(WebAnnotationSet.java:244)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationListenerAnnotations(WebAnnotationSet.java:73)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:56)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:294)
at org.apache.catalina.startup.ContextConfig.start(ContextConfig.java:1047)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:265)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4616)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.ClassNotFoundException: javax.annotation.security.DeclareRoles
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 30 more


Nov 26, 2014 3:09:30 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory pentaho
Nov 26, 2014 3:09:30 PM org.apache.catalina.core.StandardContext preDeregister
SEVERE: error stopping
LifecycleException: Manager has not yet been started
at org.apache.catalina.session.StandardManager.stop(StandardManager.java:672)
at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4886)
at org.apache.catalina.core.StandardContext.preDeregister(StandardContext.java:5634)
at org.apache.tomcat.util.modeler.BaseModelMBean.preDeregister(BaseModelMBean.java:1095)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.preDeregisterInvoke(DefaultMBeanServerInterceptor.java:1045)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.exclusiveUnregisterMBean(DefaultMBeanServerInterceptor.java:433)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.unregisterMBean(DefaultMBeanServerInterceptor.java:415)
at com.sun.jmx.mbeanserver.JmxMBeanServer.unregisterMBean(JmxMBeanServer.java:546)
at org.apache.tomcat.util.modeler.Registry.unregisterComponent(Registry.java:575)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4468)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1079)
at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1002)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:506)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)


Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=Login,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=DebugHome,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=jsp,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=Home,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=BrowserLocale,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.tomcat.util.modeler.Registry registerComponent
SEVERE: Null component Catalina:type=JspMonitor,name=InitFailure,WebModule=//localhost/pentaho,J2EEApplication=none,J2EEServer=none
Nov 26, 2014 3:09:30 PM org.apache.catalina.startup.HostConfig deployDirectory
SEVERE: Error deploying web application directory pentaho
java.lang.NoClassDefFoundError: javax/annotation/security/DeclareRoles
at org.apache.catalina.startup.WebAnnotationSet.loadClassAnnotation(WebAnnotationSet.java:244)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationListenerAnnotations(WebAnnotationSet.java:73)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:56)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:294)
at org.apache.catalina.startup.ContextConfig.start(ContextConfig.java:1047)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:265)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4616)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1079)
at org.apache.catalina.startup.HostConfig.deployDirectories(HostConfig.java:1002)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:506)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

' "query_result" is not defined' error when viewing MapComponent with SQL Datasource

Hello everyone.

I have been trying to plot some Latitude/Longitude dots on a MapComponent.

It works well (i.e. I can see the map and three dots on it) when using a Scriptable datasource such as this one:

Code:

import org.pentaho.reporting.engine.classic.core.util.TypedTableModel;


String[] columnNames = new String[]{
"Longitude",
"Latitude",
"Description"
};


Class[] columnTypes = new Class[]{
Float.class
,Float.class
,String.class
};


TypedTableModel model = new TypedTableModel(columnNames, columnTypes);


model.addRow(new Object[]{ new Float(-56.1055), new Float(-34.54106), new String("Atelier Graphique")});
model.addRow(new Object[]{ new Float(-56.12153), new Float(-34.54568), new String("Australian Collectors, Co.")});
model.addRow(new Object[]{ new Float(-56.10966), new Float(-34.53705), new String("Signal Gift Stores")});
return model;

But it does not work when I switch to a SQL Query datasource. The map displays empty, no dots on it.

Code:

select longitud  as Longitude, latitud  as Latitude, to_char(fecha_evento) as Description from posic_01 where numero_evento_recorrido = 12488966617 order by fecha_evento

I have used the same query and connection to populate a report, so I know the connection to the database works. But when I try to use the same datasource and query to plot dots on the map, it fails.
Here are the error messages in catalina.out and pentaho.log


Code:

ERROR [org.pentaho.platform.plugin.action.javascript.JavascriptRule] 9183327c-7552-11e4-9e12-001e8cf4dd73:COMPONENT:context-1007076875-1416995821702:jtable.xaction JSRULE.ERROR_0003 - [es_89] Javascript rule execution failed
org.mozilla.javascript.EcmaError: ReferenceError: "query_result" is not defined. (<cmd>#7)
and

Code:

==> catalina.out <==
07:57:01,728 ERROR [SolutionEngine] 9183327c-7552-11e4-9e12-001e8cf4dd73:SOLUTION-ENGINE:/public/plugin-samples/pentaho-cdf/actions/jtable.xaction: Action Sequence execution failed, see details below
| Error Time: miércoles 26 de noviembre de 2014 07H57' UYST
| Session ID: admin
| Instance Id: 9183327c-7552-11e4-9e12-001e8cf4dd73
| Action Sequence: jtable.xaction
| Execution Stack:
EXECUTING ACTION: JavaScript (JavascriptRule)
| Action Class: JavascriptRule
| Action Desc: JavaScript
| Loop Index: 0
Stack Trace:org.pentaho.platform.api.engine.ActionExecutionException: RuntimeContext.ERROR_0017 - [es_18] Activity failed to execute
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeComponent(RuntimeContext.java:1211)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeAction(RuntimeContext.java:1151)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1063)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1013)
    at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:895)

Just to make sure this is not a data problem, I switched the SQL query to this, but I still get the same error.

Code:

select -56.1055 as Latitude, -34.54106 as Longitude, 'toto' as Description
union all
select -56.12153 as Latitude, -34.54568 as Longitude,'Australian Collectors, Co.' as Description
union all
select -56.10966 as Latitude, -34.53705 as Longitude, 'test' as Description


Any help will be appreciated. I'm a complete n00b to Pentaho and still trying to wrap my head around it, so pointers to any good material available are appreciated too.

Álvaro

Database connectors: native SQL or ODBC/JDBC?

Hi,

I want to know whether PDI's database connectors use ODBC/JDBC, native SQL, or both.

Is it possible to change? I would prefer native SQL for performance reasons.

Thanks in advance,

Extracting data from HL7 and loading SQL tables

I am new to PDI and have very little knowledge of all the transformation steps.

I am trying to extract data from an HL7 input source and would like to load it into tables in SQL Server. Below is a sample of what my HL7 input looks like:

MSH|^~\&|ABC||ABDDEF||||ORU^R01|HP10000000|P|2.3||||||8859/1
PID|||MRN0XX^^^^MR~""^^^^PI~J00000000000^^^^VN||Giovinazzo^Carmine||19680101|M
PV1||I|ABCXXX^^BedXX&0&0
OBR|||572d3f14-31de-4fb8-bb1c-beaa7c5068fa|XXXX|||20140506091656|||demo^^^^^^^^^^^^^^^O~""^^^^^^^^^^^^^^^V
OBX||CE|0002-FA20^PrType^MDIL|128|F002-F01F^ABC^MDIL||||||F
OBX||ST|0002-FA21^Prot.^MDIL|128|Test Data||||||F

I have created a temporary table which contains all the columns. So basically I need to fit each pipe-delimited value into its column.

Not sure if this is a very simple question. Any guidance would be greatly appreciated!!
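
As a rough illustration of what "fitting each pipe-delimited value into a column" amounts to, here is a minimal sketch in plain Java that splits a message into segments and fields (the sample is shortened to two segments). Real HL7 handling (escape sequences, repetitions, components) is more involved, so PDI's HL7 Input step or a dedicated library such as HAPI is usually the safer route:

Code:

import java.util.Arrays;

public class Hl7FieldSplit {
    public static void main(String[] args) {
        // Shortened sample; real messages separate segments with carriage returns.
        String message =
                "MSH|^~\\&|ABC||ABDDEF||||ORU^R01|HP10000000|P|2.3||||||8859/1\r"
              + "PID|||MRN0XX^^^^MR~\"\"^^^^PI~J00000000000^^^^VN||Giovinazzo^Carmine||19680101|M";

        for (String segment : message.split("\r")) {
            String[] fields = segment.split("\\|", -1); // -1 keeps trailing empty fields
            String segmentId = fields[0];               // MSH, PID, PV1, OBR, OBX, ...
            System.out.println(segmentId + " -> " + Arrays.toString(fields));
            // Each fields[i] maps to one column of the staging table;
            // e.g. in the PID segment, fields[5] is the patient name and fields[7] the date of birth.
        }
    }
}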

Problem with JSON output, content is invalid

Hi!

I'm trying to read some records from the database and create a JSON file as output. If I extract only one record the content is valid, but if I extract, for example, 3 records the content of the file is invalid. I tried to validate the JSON file on 2 sites and the response is negative: the content of the file is invalid.
Does anyone have the same problem, or am I doing something wrong?

Thanks!
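
One thing worth checking: several top-level JSON objects written back to back are not a single valid JSON document, while the same objects wrapped in one array are, and that matches the "valid with one record, invalid with three" symptom. If I remember right, the JSON Output step's block/grouping settings (e.g. the number of rows per block) control whether it writes one combined structure or several. A tiny sketch of the difference, with made-up field names:

Code:

public class JsonShape {
    public static void main(String[] args) {
        // Invalid as one document: two top-level objects simply concatenated.
        String concatenated = "{\"id\":1,\"name\":\"a\"}{\"id\":2,\"name\":\"b\"}";

        // Valid: the same records wrapped in a single array.
        String asArray = "[{\"id\":1,\"name\":\"a\"},{\"id\":2,\"name\":\"b\"}]";

        System.out.println("Concatenated (invalid as one document): " + concatenated);
        System.out.println("Wrapped in an array (valid):            " + asArray);
    }
}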

File extracted by PDI

Hello


I am using Ubuntu 14.04 LTS for my PDI 5.1 CE. I am extracting data from a MySQL server and sending it as an attachment in CSV format. I am not able to send the CSV as an attachment using a Kettle job. It gives the following error:

Problem while sending message: com.sun.mail.smtp.SMTPSendFailedException: 552-5.2.3 Your message exceeded Google's message size limits. Please visit http://support.google.com/mail/bin/a...py?answer=8770 to review our size guidelines. x10sm4022971pdr.11 - gsmtp

I found out that the attachment file (CSV) exceeds the Gmail attachment limit of 25 MB. When I use the same query directly in MySQL Workbench, I get an output file (CSV) of only 500 KB. Both files contain the same data, which I have verified manually. I don't know why this is happening. How can I reduce the file size and send the file as an attachment? Please help me with this.


Thanks for your help.


Regards
Kaushik Karan