Channel: Pentaho Community Forums

Scheduled jobs for .prpt files; email issue; file type issue

Version 5.3
When I schedule a Report Designer file to run, I obtain the report, but it is in ZIP format rather than the XLSX I specified.
In addition, I do not receive the email with the attachment... in fact, I receive no email at all.

Has anyone experienced this?

Newbie with basic questions

Hi all, I'm totally new to ETL / Pentaho. I am a data analyst teaching myself how to use it as a workaround for the unfortunate bottleneck that is our minuscule IT department. The ultimate goal is to export .csv transaction files from one of our gateway providers and load the data into tables. The exported data is not pretty. I was trying to just make modifications to the destination table as I ran into weird values (e.g., making the phone field a length of 30 to accommodate an email address that ended up in there somehow), but realized that was an exercise in futility. I need to just load the good data and kick out the rows with bad data to review later.

In order to figure out how the error handling works, I broke it down to a tiny subset of data: an output table with two columns and an input file of 10 rows containing data in two fields. One of those rows contains an obvious error so that I could figure out how this all works. Mission accomplished; I can load the "good" data and have the "error row" sent to a text file.

--ORIGINAL data in V1-10rowTest:
Acct ID Sub ID
PY3P8 EM737
PYYPQ EM667
PYK3F EM665
PYLC5 EM669
PYYPQ EM667
PYYPQ EM667
PYYPQ EM667
PYK3F EM665
PYYPQ EM667x (obvious error, field restriction of length = 5)

--What lands on the table:
acctId subId
PY3P8 EM737
PYK3F EM665
PYLC5 EM669
PYYPQ EM667
  • What causes my database Insert / Update transformation to decide that only the unique pairs of data should be inserted into the table? What if I wanted duplicate rows? I wouldn't really ever want that, but I was surprised that Kettle prevented it from happening without explicitly being told to do so.
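
(For context, as far as I can tell from the docs, the Insert/Update step does a key lookup for each incoming row and updates on a match instead of inserting. With both columns used as lookup keys, a duplicate incoming row matches the row already written and simply re-updates it, so only unique pairs land; a plain Table Output step would have kept the duplicates. In JDBC terms, a sketch of my understanding rather than Kettle's actual code:)

Code:

import java.sql.*;

// Per-row logic of the Insert/Update step as I understand it
// (illustrative JDBC, not the real Kettle implementation).
static void insertOrUpdate(Connection conn, String acctId, String subId) throws SQLException {
    try (PreparedStatement lookup = conn.prepareStatement(
            "SELECT 1 FROM target WHERE acctId = ? AND subId = ?")) {
        lookup.setString(1, acctId);
        lookup.setString(2, subId);
        try (ResultSet rs = lookup.executeQuery()) {
            if (rs.next()) {
                // Key already present: the step updates the non-key fields.
                // With both columns used as keys there is nothing left to
                // update, so a duplicate input row leaves the table as-is.
                return;
            }
        }
    }
    try (PreparedStatement insert = conn.prepareStatement(
            "INSERT INTO target (acctId, subId) VALUES (?, ?)")) {
        insert.setString(1, acctId);
        insert.setString(2, subId);
        insert.executeUpdate();
    }
}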



I then modified the input file so that there were no duplicate values and left one row with an error:

- New input file V1-10rowTest2.csv:
Acct ID Sub ID
PY3P8 EM737
PYYPQ EM667
PYK3F EM665
PYLC5 EM669
PYYPQ EM167
PYYPQ EM267
PYYPQ EM367
PYK3F EM165
PYYPQ EM667x (obvious error, field restriction of length = 5)

- What lands on the table now:
acctId subId
PY3P8 EM737
PYK3F EM165
PYK3F EM665
PYLC5 EM669
PYYPQ EM167
PYYPQ EM267
PYYPQ EM367
PYYPQ EM667


This is what goes to my error output file:
acctId;subId;error_count;error_description;error_fields;error_codes
PYYPQ;EM667;1;org.pentaho.di.core.exception.KettleDatabaseException:Error inserting/updating row Data truncation: Data too long for column 'subId' at row 1 ;;ISU001


  • Why does my error_fields value have <null> in the error output file?


  • When I look at the Preview Data tab for the error output transformation, it does not display the subId value with the error: it shows EM667 where I expected EM667x
    • Preview Data for the mapping transformation shows row #9 with EM667x for subId, as expected
    • Preview Data for the insert / update transformation shows the eight rows which were successfully loaded to the table, as expected




Thanks!
Nichole

UnifiedRepositoryException exception

All of a sudden I am getting this error when I click on Browse Files: nothing happens, and I keep seeing the error below.

Using 5.4.0.1.130 with MySQL. I have a demo tomorrow and am stuck with this error.

Any help is much appreciated !


Oct 08, 2015 11:40:18 PM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
org.pentaho.platform.api.repository2.unified.UnifiedRepositoryException: exception while getting tree rooted at path "/"


Reference number: 60f9103a-1cb2-4c4f-a328-16024bc2bd8c
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.callLogThrow(ExceptionLoggingDecorator.java:512)
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.getTree(ExceptionLoggingDecorator.java:443)
at org.pentaho.platform.repository2.unified.webservices.DefaultUnifiedRepositoryWebService.getTreeFromRequest(DefaultUnifiedRepositoryWebService.java:160)
at org.pentaho.platform.web.http.api.resources.services.FileService.doGetTree(FileService.java:1188)
at org.pentaho.platform.web.http.api.resources.FileResource.doGetTree(FileResource.java:1618)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)

RIGHT([fieldname]; 6) of Formula Step fails to extract rightmost 6 characters

This can be ignored: I forgot to trim the field, which contained 5 blanks at the end, so the real input was "1234567890" followed by five spaces. The last 6 characters are therefore "0" plus those five blanks, which is correct. Sorry.




I want to extract the last 6 digits of a 10-digit string.

numfield = "1234567890" (Type: String) (input from stream)

RIGHT([numfield]; 6) returns '0' (it seems to ignore the optional length argument and behaves as if the default value of 1 were used)
RIGHT("0123456789"; 6) returns '567890' (as expected)
[numfield] returns '1234567890' (to check whether input is correct)

Is there anything I did wrong in the formula?
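
(Follow-up for anyone who hits the same thing: trimming before taking the rightmost characters gives the expected result. A sketch, assuming the Formula step's LibFormula supports the OpenFormula TRIM function, which it appears to:)

Code:

RIGHT(TRIM([numfield]); 6)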

Thanks
Jo

Kettle Version 5.4.0.1-130 as of June 14, 2015

Can't send files to the trash in Pentaho User Console

Hello all,

I have been configuring a fresh install of Pentaho 5.3 EE and have run into an issue.
While logged in to the console as Administrator I am not able to send files to the trash can.
This is the alert that pops up when I try to delete a file:
Delete Failure.jpg

My Administration role is allowed to do everything possible within the BA server:
administration functions.PNG

Attached is what is written in the Pentaho log file when I attempt to delete a file:
Quote:

2015-10-12 11:03:06,445 ERROR [org.pentaho.platform.web.http.api.resources.services.FileService] Error getting system resource: org.pentaho.platform.api.repository2.unified.UnifiedRepositoryException: exception while deleting file with id ""
Reference number: 0d5f77ba-f9cd-45b2-8669-a0724f91560e
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.callLogThrow(ExceptionLoggingDecorator.java:512)
at org.pentaho.platform.repository2.unified.ExceptionLoggingDecorator.deleteFile(ExceptionLoggingDecorator.java:109)
at org.pentaho.platform.repository2.unified.webservices.DefaultUnifiedRepositoryWebService.deleteFileWithPermanentFlag(DefaultUnifiedRepositoryWebService.java:233)
at org.pentaho.platform.web.http.api.resources.services.FileService.doDeleteFilesPermanent(FileService.java:131)
at org.pentaho.platform.web.http.api.resources.FileResource.doDeleteFilesPermanent(FileResource.java:191)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:108)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:191)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.pentaho.ui.servlet.SystemStatusFilter.doFilter(SourceFile:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Unknown Source)
In several fresh installs of 5.3 EE I have not had this issue. What configuration did I mess up during the installation?
Any help would be appreciated.

Thank you,

Jenna

Reports with OLAP data source and SAP - very poor performance

I have a few BEx queries (OLAP cubes from a third-party client) and .prpt reports in Pentaho that use these BEx queries via an OLAP data source (OLAP4J, MDX). When I execute these BEx queries in SAP BW, or even directly as a SOAP service (XML/A Execute command via SoapUI), I get results very fast, in just 2-3 seconds. But when I execute the same BEx queries in Pentaho, I get results after 1-2 minutes. It doesn't matter whether I execute reports already published on the Pentaho BI server or run them in Report Designer. Caching is enabled.


The MDX query is very simple:
Code:

SELECT
NON EMPTY
{ [00O2TQU30NKBTE188HYVQMXKU].Members }
ON COLUMNS,
NON EMPTY
{ [00O2TQU30NKBTE184UOSJ253R].Members }
ON ROWS
FROM
[CATALOG_NAME/CUBE_NAME]
SAP VARIABLES [PARAM_1] INCLUDING [HIERARCHY_1].[${VALUE_1}]
[PARAM_2] INCLUDING [HIERARCHY_2].[${VALUE_2_1}]:[Z_DATPRB].[${VALUE_2_2}]


As captured with Wireshark, Pentaho makes multiple HTTP requests to fetch metadata about all existing cubes (and their measures too!) in the current catalogue.
It looks like this:
1. DISCOVER_PROPERTIES
2. DISCOVER_DATASOURCES
3. DBSCHEMA_CATALOGS
4. Execute required BEx query
5. DISCOVER_PROPERTIES
6. MDSCHEMA_CUBES
7. MDSCHEMA_CUBES by specified catalog
8. MDSCHEMA_MEASURES by specified cube
9. MDSCHEMA_MEMBERS by 1st measure
10. MDSCHEMA_DIMENSIONS by specified cube
11. MDSCHEMA_HIERARCHIES by measures
12. MDSCHEMA_LEVELS by measures
13. MDSCHEMA_PROPERTIES by measures
14. MDSCHEMA_MEMBERS by 2nd measure
...
~350. MDSCHEMA_MEMBERS by last measure of last cube from specified catalog
~354. Execute required BEx query at second time
...
420. MDSCHEMA_PROPERTIES by last hierarchy of specified cube.


A capture of the network packets is attached.


It is OK that Pentaho fetches metadata for the specified cube, but executing the MDX query twice and fetching metadata for all the other cubes in the catalogue is very inefficient.
I previously read about a similar problem with SSAS:
http://sourceforge.net/p/olap4j/disc...hread/161c048e
http://sourceforge.net/p/olap4j/disc...read/79fad9d7/
https://github.com/olap4j/olap4j/pull/8
But this information doesn't help with the SAP XML/A provider. Changes to the OLAP4J driver (in org.olap4j.driver.xmla.XmlaOlap4jCellSet and org.olap4j.driver.xmla.XmlaOlap4jCube) cause unstable rendering of reports.


So the main question is: how to improve performance?
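
(One thing that might be worth trying, as an assumption on my part: the olap4j XML/A driver has documented Cache.* connection properties that keep discovery responses (DISCOVER_*, MDSCHEMA_*) in an in-memory cache, so at least the repeated metadata round-trips could be served locally. I have not verified how much this helps against SAP BW. The server URL below is hypothetical:)

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import org.olap4j.OlapConnection;

public class XmlaCacheExample {
    public static void main(String[] args) throws Exception {
        Class.forName("org.olap4j.driver.xmla.XmlaOlap4jDriver");
        // Cache.* are documented olap4j XML/A driver properties; they keep
        // discovery responses in memory instead of re-asking the provider.
        Connection raw = DriverManager.getConnection(
            "jdbc:xmla:Server=http://sap-bw-host:8000/sap/bw/xml/soap/xmla" // hypothetical URL
            + ";Cache=org.olap4j.driver.xmla.cache.XmlaOlap4jNamedMemoryCache"
            + ";Cache.Name=BExMetadataCache"
            + ";Cache.Mode=LFU"      // eviction policy
            + ";Cache.Timeout=600"   // seconds an entry stays valid
            + ";Cache.Size=1000");   // maximum cached entries
        OlapConnection conn = raw.unwrap(OlapConnection.class);
        System.out.println("connected: " + !conn.isClosed());
        conn.close();
    }
}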

report_sap_bex.txt

Scheduler for reports - distribution lists

We do not want to hard-code the email addresses of the recipients who receive reports as email attachments.

How can we dynamically update the email addresses stored in the scheduler?

ERROR with PostgreSQL - data type bit

I am doing a transformation from a txt file to a database table on PostgreSQL, but I get the following error:

ERROR: column «status_r» is of type bit but expression is of type bigint

I tried to cast my column to bit but I got the same error.
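
(If it helps: PostgreSQL has no implicit bigint-to-bit conversion, so the cast has to happen in the SQL itself rather than on the column. A JDBC sketch with hypothetical table and connection details:)

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BitInsert {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:postgresql://localhost/mydb", "user", "pass"); // placeholders
             PreparedStatement ps = conn.prepareStatement(
                 // Cast the parameter explicitly; sending a bigint into a
                 // bit column is what raises the reported error.
                 "INSERT INTO my_table (status_r) VALUES (CAST(? AS bit))")) {
            ps.setString(1, "1"); // a single bit, sent as a string literal
            ps.executeUpdate();
        }
    }
}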

ERRORS column issue in PDI-CE-5.3

Hi All,

I am trying to find the logic I am missing regarding the ERRORS column in the job LOG table. As you probably know, we can track job run history with the columns provided in the job log table.

The scenario is this: I have a MAIN job, and under it I maintain 2 sub-jobs. When the MAIN job runs successfully, its ERRORS column is stored as 0, but the sub-jobs (job1 and job2) are stored as null.

How can I get the ERRORS column to show 0 for all my jobs when they have all run successfully?

I'm using pdi-ce-5.3 with a file repository and Java 1.7 (MySQL database).

Errors column.png Errors column1.jpg

Kindly help me.
Attached Images

Role-based integration issue in Pentaho BI UI

Hi ,

I have customized the Pentaho UI completely by re-using existing code. Now I am looking to achieve role-based integration for a few of my menu items with the help of the code below:
Code:

<script type="text/x-handlebars-template">{{#if canCreateContent}}item2, item3,item4{{/if}}</script>
To make it more understandable, let me explain with an example: I have item1, item2, item3, item4, item5 in the left navigation menu, of which item3, item4, and item5 should not be available to certain logged-in users. The problem is that if I use the above logic in home/index.jsp in the pentaho folder, it works. But if I try to use the same logic through "/mantle/Mantle.jsp", then item3, item4, and item5 are not available to any user, including Admin.

Need priority assistance on this.

Spoon: Arrgh! lost Schedule from Action and Perspective during upgrade

Hi Folks,

I've been experimenting with 5.4 and its service packs--what I failed to notice until after I'd applied the service packs for 5.4.0.2, .3, & .4 was that I no longer have the Schedule function available in Spoon (not on the Action menu or a Perspective button).

Any suggestions of where to look for something that got deleted, overwritten, or otherwise disabled in that process?

Thanks for any hints, links, suggestions, or what-have-you!

John

Chinese characters issue in flat files

Hi,

(MySQL DB, pdi-ce-5.3, Java 1.7), file repository.

I am trying to load data from a DATABASE to a FLAT FILE, and I am facing an issue with Chinese characters: they are not transferred correctly, so I am getting mismatched data. If I move the same data from DATABASE to DATABASE, it works fine.

My example Chinese data is 中文(繁體). When I use the Text file output step I get data like 中文(ç¹�é«”), totally different from the original (I used the UTF-8 setting as well). Do I need to apply some other setting?

How can I load Chinese data into flat files? Please advise.
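
(For what it's worth, that garbage looks exactly like UTF-8 bytes being decoded as Windows-1252, which would mean the file itself is written correctly and whatever reads it afterwards is using the wrong encoding. A quick Java check of that assumption:)

Code:

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class MojibakeCheck {
    public static void main(String[] args) {
        String original = "中文(繁體)";
        // Encode correctly as UTF-8, then misread the bytes as Windows-1252:
        byte[] utf8 = original.getBytes(StandardCharsets.UTF_8);
        String misread = new String(utf8, Charset.forName("windows-1252"));
        // Prints garbage of the same shape as reported (ç¹ / é«” etc.)
        System.out.println(misread);
    }
}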



Thank you

Library to build an ARFF file

Hi all,
In my Java code I have a List<List<String>>;
let's say its content is the following:

word1 word2
word1 word3 word4
word2 word5
...

I have to carry out basket analysis on it, using the Apriori algorithm to get association rules.
How can I convert it to a suitable ARFF file?
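
(A sketch of what I believe is the standard Weka market-basket encoding: one nominal attribute per distinct item, with the single value "t" set when the item occurs in a basket and left missing otherwise, since Apriori skips missing values. Written from memory against the Weka 3.7 API, so treat the details as assumptions:)

Code:

import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

import weka.associations.Apriori;
import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ArffSaver;

public class BasketToArff {
    public static void main(String[] args) throws Exception {
        List<List<String>> baskets = Arrays.asList(
            Arrays.asList("word1", "word2"),
            Arrays.asList("word1", "word3", "word4"),
            Arrays.asList("word2", "word5"));

        // One nominal attribute per distinct word; value "t" = present.
        Set<String> vocabulary = new LinkedHashSet<>();
        baskets.forEach(vocabulary::addAll);
        ArrayList<Attribute> attrs = new ArrayList<>();
        for (String word : vocabulary) {
            attrs.add(new Attribute(word, Arrays.asList("t")));
        }
        Instances data = new Instances("baskets", attrs, baskets.size());

        for (List<String> basket : baskets) {
            Instance row = new DenseInstance(attrs.size()); // all values missing
            row.setDataset(data);
            for (String word : basket) {
                row.setValue(data.attribute(word), "t");
            }
            data.add(row);
        }

        // Write the ARFF file, then run Apriori directly on the same data.
        ArffSaver saver = new ArffSaver();
        saver.setInstances(data);
        saver.setFile(new File("baskets.arff"));
        saver.writeBatch();

        Apriori apriori = new Apriori();
        apriori.buildAssociations(data);
        System.out.println(apriori); // prints the discovered association rules
    }
}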

Pan & Spoon provide different results for calculator step

Hi,

I'm having a strange issue that I don't seem to be able to solve.

The setup is really simple: I use Spoon (5.x) on an Ubuntu machine. The .ktrs are executed with Pan (4.x) on a Raspberry Pi.
I get some data from XML and load it into a MySQL DB on the Raspberry Pi.

All the fields are transferred fine except one (RemainingHrs). Every time I run from Spoon, everything works as expected. But the same transformation run from Pan NULLs all the entries for RemainingHrs.
I had some errors where I did not provide a type in the Calculator step, but those are cleared up. There are no more errors in the log.

The calculation is Date A - Date B (hours). The result is full hours in Spoon, whereas Pan just updates the table with NULLs for this field. As I said, everything else works. Another calculation (Date A + B hours) works perfectly.
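
(For reference, my understanding of what Date A - Date B (hours) should produce is the whole-hour difference between the two dates, i.e.:)

Code:

import java.util.Date;

public class HoursDiff {
    // Expected Calculator semantics (my assumption): whole hours between dates.
    static long hoursBetween(Date a, Date b) {
        return (a.getTime() - b.getTime()) / (60 * 60 * 1000L);
    }

    public static void main(String[] args) {
        Date now = new Date();
        Date earlier = new Date(now.getTime() - 5 * 60 * 60 * 1000L);
        System.out.println(hoursBetween(now, earlier)); // prints 5
    }
}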


I don't know where else to look for the issue. Any hints would be greatly appreciated :)

CSS default page Pentaho

Hi all,
In Pentaho 5.4 I have an HTML page on the home screen. To keep the same look on the left side, where the "Browse Files" and "Create New" buttons are: where is the CSS for this section? How can I create buttons similar to "Browse Files" and "Create New"?
If the images have a background, is there a way to fit them without showing contrast between the image background and the page background (from the Pentaho CSS)?
Thanks in advance

Help: Stop running Kettle Job/Transformation using Java

Hello,

I'm developing a web-app based ETL tool (with the Kettle engine), using Java.

I'm running into issues while trying to stop a running Job. I'm not sure whether using CarteSingleton.java is correct; I'm currently using a custom singleton map.

My code is as below
Code:

Job job = new Job(null, jobMeta);
job.setLogLevel(LogLevel.DETAILED);
job.setGatheringMetrics(true);


job.start();

Once job.start() is invoked, I store that Job object in a custom singleton map. A second REST call then retrieves the exact Job object from the map and, while the Job's status is RUNNING, invokes stop() on it.
But that doesn't stop the running job; the Kettle engine is not notified of it, and job execution continues. The .kjb / .ktr was created using Spoon, although I'm not using Spoon to run or stop the execution.

Is there any Kettle API configuration I have to change to be able to use

Code:

// same job object
job.stopAll();

Could you please shed some light on the API, with a sample example if any, of how to stop a running job or transformation using Java?

Any pointers or help here would be great! Thanks, again.
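
(For reference, here is a minimal sketch of the pattern I'm describing, with hypothetical names. It assumes the Kettle 5.x API, where Job.stopAll() only raises a stop flag that steps check between rows:)

Code:

import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import org.pentaho.di.job.Job;

// Hypothetical registry of running jobs, keyed by an id handed back to the
// REST client that started the job.
public class JobRegistry {
    private static final ConcurrentMap<String, Job> RUNNING = new ConcurrentHashMap<String, Job>();

    public static String register(Job job) {
        String id = UUID.randomUUID().toString();
        RUNNING.put(id, job);
        job.start();
        return id;
    }

    public static boolean stop(String id) {
        Job job = RUNNING.get(id);
        if (job == null || !job.isActive()) {
            return false;
        }
        // Raises the stop flag; steps only notice it between rows, so a step
        // that is blocked (e.g. on a long query) may take a while to stop.
        job.stopAll();
        job.waitUntilFinished();
        RUNNING.remove(id);
        return true;
    }
}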

Regards,
Sanjeev

European Number Format

Hi all,
In Pentaho 5.4 I would like to use the European number format.
In previous versions I changed pentaho-solutions/system/pentaho-CDF/js/Dashboards/Utils.js. In 5.4 I changed this file, but the number format did not change. What else should I change?


Thanks in advance
Ettore Coletti

Table Component onmouseover

Hi all,
In Pentaho 5.4 I created a table with Expand Container objects. Since I cannot get the onmouseover event to fire, I would like to know how to enable it, or how to show a tooltip such as "click for details".
How can I give an indication that click-for-details is active (mouse cursor shape, tooltip, ...)?


Thanks in advance
Ettore Coletti

Help: How to output CSV with dynamic columns

Hi all,

I'm a newbie. I have a CSV source file; I want to read this file and export it to another CSV file containing only columns 1, 2, 3, ... (the column numbers are read from a config file).
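
(In case a plain-Java fallback is acceptable, here is a minimal sketch, with hypothetical file names, that copies only the configured columns; the Kettle-native route for a dynamic field list would be metadata injection, which is harder to show briefly:)

Code:

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.StringJoiner;

public class CsvColumnFilter {
    public static void main(String[] args) throws Exception {
        // config.txt holds 1-based column numbers, e.g. "1,2,3".
        String cfg = new String(Files.readAllBytes(Paths.get("config.txt")),
                                StandardCharsets.UTF_8).trim();
        String[] parts = cfg.split(",");
        int[] cols = new int[parts.length];
        for (int i = 0; i < parts.length; i++) {
            cols[i] = Integer.parseInt(parts[i].trim()) - 1;
        }

        List<String> lines = Files.readAllLines(Paths.get("source.csv"), StandardCharsets.UTF_8);
        StringBuilder out = new StringBuilder();
        for (String line : lines) {
            String[] fields = line.split(",", -1); // naive split: quoted commas not handled
            StringJoiner row = new StringJoiner(",");
            for (int c : cols) {
                row.add(c < fields.length ? fields[c] : "");
            }
            out.append(row).append('\n');
        }
        Files.write(Paths.get("target.csv"), out.toString().getBytes(StandardCharsets.UTF_8));
    }
}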

Thanks,

Embed Kettle in WebApp: use datasource

$
0
0
We're going to fully embed Kettle in a WebApp, running in WebLogic.
The goal is to load a CSV file, run a transformation and store in a database.

This means that Kettle is running in a container, with the datasource available via a JNDI lookup.

How can I use this datasource instead of specifying connection details in a database XML?

Code:

Context initCtx = new InitialContext();
DataSource ds = (DataSource) initCtx.lookup(jndiName);
java.sql.Connection conn = ds.getConnection();


[edit]
Am I correct to assume that using database access type JNDI with the correct JNDI name should be exactly what I need?
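
(I believe so. In the Kettle API the access type lives on DatabaseMeta, and for JNDI access the slot that normally holds the database name carries the JNDI name instead. A sketch under that assumption, with a hypothetical connection name and JNDI name:)

Code:

import org.pentaho.di.core.database.DatabaseMeta;

// name, type, access, host, db name (JNDI name for JNDI access), port, user, password
DatabaseMeta dbMeta = new DatabaseMeta(
    "etl_ds", "MYSQL", "JNDI", null, "jdbc/etlDS", null, null, null);
// Steps in the transformation then reference the connection "etl_ds", and the
// container (WebLogic here) resolves jdbc/etlDS at runtime.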
