Channel: Pentaho Community Forums

Get the current logged in user and his role

Hi Guys,

Can I (and if so, how?) get the currently logged-in user (their username) and their role via JavaScript?

Best regards,
Matthias
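In a CDF dashboard the platform typically exposes session information on the dashboard context object. The property names below ("user", "roles") are assumptions that can vary by version, so treat this as a sketch rather than a definitive API:

```javascript
// Sketch: read the logged-in user and their roles from a CDF-style
// context object (e.g. Dashboards.context in a CDF dashboard).
// The "user" and "roles" property names are assumptions - check
// the context object your CDF version actually provides.
function getUserInfo(context) {
  if (!context) {
    return null;
  }
  return {
    user: context.user || null,
    roles: context.roles || []
  };
}

// In a dashboard you would call something like:
//   var info = getUserInfo(Dashboards.context);
```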

Pentaho - Still has memory problems running loops

Hi there--

I thought I solved this one by increasing the Java size allocation (to about 2 gigs). Now that I added a bit more data for it to read + write each loop, Pentaho is crashing again due to a heap space error in a fairly regular pattern.

Based on this, I can only assume that the data being read/ written per loop is somehow related to the "overhead garbage" Pentaho seems to like collecting.

Other than that, I've done heap dumps and taken a look at the largest memory hogs ---

This seems to be org.pentaho.di.job.Job (I think this is mostly the Job Entry "Results" -- which includes log files and meta-data). It appears to be collecting this each time a job entry is "looped" and never letting go. Other big memory hogs seem to be the strings that comprise the results.logText ... probably related.

It's weird though --- the exact data read/ written doesn't affect the size of the logText or "results" meta-data, yet still makes the java heap crash more readily --- so unless these are added ... I don't know. It's hard to figure out what's going on.

Why does Pentaho not "release" all this extraneous garbage memory once a job entry is completed?

Anyway, for now, I've deleted the loop and simply did a "Repeat" on the "Start" step .... to repeat every 2 seconds (once the job is done). For some reason, THIS method seems to not carry some extraneous baggage around after every step.


EDIT: My mistake --- the 'repeat' on the 'start' step still carries some memory baggage forward .... just less ... went 4x as many loops before it started throwing heap errors every repeat. It's weird ... it's like the "garbage memory" being accumulated is not reset unless Pentaho is exited completely. ... very frustrating. What is there to do? End each job with a shell script that kills Pentaho, then starts up the job again? That would be ridiculous ....


To add more specifics, of the nearly 2 GB of retained heap, 1 GB came from org.pentaho.di.job.Job @ 0x78od31c30 Thread-2715---

Sub entries include repeats of something called org.pentaho.di.core.Result @ 0x7bfe48f50 (contains about 17 MB each, there are 20+ entries).

These sub entry "Result" objects contain 'Statics' like XML_ROW_TAG (result-rows), XML_TAG (result), and attributes like logText, logChannelID, nrLinesWritten, resultFiles, rows (java.util.ArrayList)

I don't know ... surely I'm not the only person to run into heap space errors here? Especially during initial data warehouse loading, where many loops are needed? I don't even think the data transfer I'm doing is particularly large. The final table will be well under 1 GB in total size, and about 350,000 rows.

Another curious thing ... when I click "Perform Garbage Collection" -- in VisualVM -- it seems to drive the memory overhead down ... without affecting any processes ... okay if there's any easy GC fix that Pentaho isn't doing, why not?

Google Analytics Input Step (Using GA 2.4 APIs) Error

Hi everyone,


I'm using the Google Analytics step to extract some data, but a few hours ago it just stopped and I received an error; nothing changed in the API key or password.

erro_ga_step.jpg

So, I asked a friend and the same error showed up for him.

And now, I want to know if somebody can reproduce the same error or can explain the reason, because the same query works in the Google Analytics Query Explorer, so I don't think it's a connection error. Maybe Google has updated the API again. True?

Best,
Átila Neumann

Pentaho Bug - Execute SQL statement doesn't escape "--" in parameters

Hey I mentioned this before, but I just downloaded the latest version and it's still giving me grief.

Pentaho doesn't escape "--" which may be present in text fields in the "Execute MySQL" step (and hence comments out the rest of the line).

This is unusual because in most MySQL clients, a quote-bound "--" is exactly that: a literal "--". In Pentaho's case, however, it acts as a rest-of-line comment marker.



I have parameters that dynamically accept text fields to insert into a DB.

I may switch to the Insert/ Update step as opposed to the Execute MySQL step, if that works.


This is undoubtedly a bug, though.
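Until the step handles this itself, one workaround might be to neutralize the comment marker in a Modified Java Script Value step before the value reaches the SQL. A sketch (the field names are only examples):

```javascript
// Replace "--" with "- -" so it can no longer start a MySQL
// end-of-line comment when the value is spliced into a statement.
function neutralizeSqlComment(value) {
  if (value == null) {
    return value;
  }
  return String(value).replace(/--/g, "- -");
}

// Example field mapping in a Modified Java Script Value step:
//   var safe_text = neutralizeSqlComment(free_text);
```

Bound "?" parameters, where the step supports them, are the safer fix; mangling the data like this is a last resort.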

BI for a web hosting company

Hi guys,

Would anyone have a data model or at least something to start me off? What are the key dimensions?

Chosen multiple select placeholder

Anyone know how I can change the default placeholder text ('Select some options') for a chosen multiple select?
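The Chosen plugin documents a `placeholder_text_multiple` option for multiple selects, and it also reads a `data-placeholder` attribute on the `<select>` element. A minimal init snippet (the `.chosen-select` selector is just an example):

```javascript
// Chosen takes the placeholder for a multiple select either from the
// placeholder_text_multiple option or from a data-placeholder
// attribute on the <select> element itself.
$(".chosen-select").chosen({
  placeholder_text_multiple: "Pick one or more values"
});

// Or, in the markup:
//   <select multiple class="chosen-select"
//           data-placeholder="Pick one or more values">...</select>
```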

Data Integration Scheduler not working

Hello all.

I have lots of jobs set up in a scheduler on Spoon. All of these jobs run perfectly fine when run manually. However, the scheduler has completely stopped working for over a week and we cannot find out why.

The scheduler, when refreshed, will show the next run time, but then nothing happens at all.

I've tried stopping and starting the Data Integration server many times. Sometimes the scheduler will run the jobs for maybe 10 minutes, but then it just stops again.

In Data Integration logs, these are the only errors I can see:

2015-05-27 11:00:15,226 INFO [org.pentaho.di] 2015/05/27 11:00:15 - General - Logging plugin type found with ID: CheckpointLogTable
2015-05-27 11:00:38,265 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/MM-P4$
2015-05-27 11:00:39,400 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener: DatasourceSystemListener.ERROR_0003 - Unable to pool datasource object: AgileBI caused by java.sql.SQLException: Invalid URL: jdbc:monetdb://localhost:50000/pentaho-instaview
2015-05-27 11:00:40,569 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/MM-P4$
2015-05-27 11:00:40,604 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener: DatasourceSystemListener.ERROR_0003 - Unable to pool datasource object: My DB caused by com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure


The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
2015-05-27 11:00:40,820 ERROR [hive.ql.exec.HiveHistory] Unable to create log directory /tmp/MM-P4$
2015-05-27 11:00:40,850 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener: DatasourceSystemListener.ERROR_0003 - Unable to pool datasource object: PDB caused by java.sql.SQLException: Access denied for user 'root'@'77.239.105.79' (using password: YES)

ANY information at all would be extremely helpful.

Thank you in advance.

Adam

Unable to view a report in Pentaho User Console

Hi Everyone,

I'm stuck at a very basic step: publishing a report to the Pentaho User Console.

I have created a report in Pentaho Report Designer. The report runs perfectly fine in PRD. After that I published the report to the User Console. When I try to open the report in the User Console it throws an error saying "Failed at query: [Query used in the report]".

My understanding is there is something wrong with the connection. Am I missing something very basic? Do I need to publish/create the connection in the Pentaho User Console?

Is there another way to publish reports to the User Console? Are there any other steps involved?

Thanks
Manojit

Updating reports in real time

Good day, I'm new to Pentaho and I have the following challenge:

I'm generating reports with Report Designer 3.9.0-GA using SQL queries without parameters and publishing them to Pentaho 4.8.0-stable.
The problem is that when I update my tables in the database, Pentaho does not reflect those changes unless I log out and log back in.
I need the report to pick up newly added table data when refreshed with the View Report button.

Thanks a lot for your help, regards.

How to build Pentaho Data Integration on Sun Solaris?

Hi,

I built Kettle on a Windows server and used it to build the ETL workflow for my business.

Now I need to build it on Sun Solaris and use the interface from Windows clients.

Is it possible? Please help me :D

Thanks all

ERROR: MODIFIED_DATE is empty when saving a job

Hello people,
I've just copied the 3.1 release of Spoon from a CentOS 5 machine to a new CentOS 7 one.
I've also restored the repo_kettle DB from the old CentOS 5 MySQL to the new MariaDB on CentOS 7.
I finally got everything to work again on CentOS 7: Spoon starts, I can connect to the repository and open a JOB.
The only problem is that when I SAVE a JOB or a TRANSFORMATION, the record updated in the R_JOB or R_TRANSFORMATION tables shows an empty value for the MODIFIED_DATE field.
This is the log of the INSERT of the mariadb repo_kettle database:

INSERT INTO R_TRANSFORMATION (ID_TRANSFORMATION, NAME, DESCRIPTION, EXTENDED_DESCRIPTION, TRANS_VERSION, TRANS_STATUS, ID_STEP_READ, ID_STEP_WRITE, ID_STEP_INPUT, ID_STEP_OUTPUT, ID_STEP_UPDATE, ID_DATABASE_LOG, TABLE_NAME_LOG, USE_BATCHID, USE_LOGFIELD, ID_DATABASE_MAXDATE, TABLE_NAME_MAXDATE, FIELD_NAME_MAXDATE, OFFSET_MAXDATE, DIFF_MAXDATE, CREATED_USER, CREATED_DATE, MODIFIED_USER, MODIFIED_DATE, SIZE_ROWSET, ID_DIRECTORY) VALUES ( 734, 'PGT_Telai_S_000_1', '1', NULL, NULL, 0, -1, -1, -1, -1, -1, -1, NULL, 'Y', 'N', -1, NULL, NULL, 0, 0, 'admin', '2009-12-22 10:50:17', 'admin', '', 10000, 63)

As you can see, I have an empty '' value for the MODIFIED_DATE column.

How can I troubleshoot this issue?

Thanx,

Jurij

Users complaining can't view reports

Some of our users are experiencing problems running reports: the screen just won't load the report, having previously worked perfectly. What are common troubleshooting tips for this? Could it be a browser issue?

Setting Default Values

I am new to Kettle, so excuse me if this is an obvious question.

I am moving a text file into a database (Oracle). The files contain dates in the yyyyMMdd format. This is fine for 90% of the inbound records, but occasionally a date comes in as empty or 00000000. I set lenient to true, but this turns the date into 30-NOV-01. I want it to be null, but can't seem to figure out how to do this. I tried "Null If", but to no avail.
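One approach that may work is to read the column as a String in the input step and convert it in a Modified Java Script Value step, mapping empty or all-zero values to null before the date is parsed. A sketch (the field name in the usage comment is hypothetical):

```javascript
// Convert a yyyyMMdd string to a Date, treating empty or all-zero
// values as null so they load as NULL instead of a bogus date.
function parseYyyymmdd(s) {
  if (s == null || s === "" || s === "00000000") {
    return null;
  }
  var y = parseInt(s.substring(0, 4), 10);
  var m = parseInt(s.substring(4, 6), 10) - 1; // JS months are 0-based
  var d = parseInt(s.substring(6, 8), 10);
  return new Date(y, m, d);
}

// Example field mapping in a Modified Java Script Value step:
//   var order_date = parseYyyymmdd(order_date_str);
```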

Any suggestions?

Thanks in advance.

Error: The CDA Datafactory could not process the query

Hello,

I've installed the Saiku Reporting plugin into the Pentaho server, and it shows several errors. First, when I open the Saiku plugin:

Code:

2015-05-27 14:42:32,995 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0004 - Resource /saiku-adhoc/web/js/adhoc/plugins/I18n/po/fr.json not found in plugin Saiku-Adhoc
and when I try to visualize data, it shows this error:

ReportGenerator.ERROR_0001 - The CDA Datafactory could not process the query


When I check the stack trace, I find this error:

Code:

2015-05-27 14:24:35,824 ERROR [org.saiku.adhoc.service.cda.PluginUtils] Failed to execute call to plugin: java.lang.NullPointerException
2015-05-27 14:24:35,825 ERROR [org.saiku.adhoc.service.report.ReportGeneratorService] org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed to send request
2015-05-27 14:24:35,841 ERROR [org.saiku.adhoc.rest.QueryResource] Cannot generate report (CF9AF9A9-CF72-FA83-BD84-CC5228CB8008)
org.saiku.adhoc.exceptions.SaikuAdhocException: ReportGenerator.ERROR_0001 - The CDA Datafactory could not process the query
    at org.saiku.adhoc.service.report.ReportGeneratorService.processReport(ReportGeneratorService.java:293)
    at org.saiku.adhoc.service.report.ReportGeneratorService.renderReportHtml(ReportGeneratorService.java:153)
    at org.saiku.adhoc.rest.QueryResource.generateReport(QueryResource.java:271)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:168)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:71)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:280)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1341)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1273)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1223)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1213)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:414)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at org.codehaus.enunciate.modules.jersey.EnunciateJerseyServletContainer.service(EnunciateJerseyServletContainer.java:239)
    at org.saiku.adhoc.service.EnunciateJerseyPluginServlet.service(EnunciateJerseyPluginServlet.java:76)
    at org.saiku.adhoc.service.ServletAdapterContentGenerator.createContent(ServletAdapterContentGenerator.java:80)
    at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:142)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:84)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
    at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
    at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
    at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
    at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
    at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:60)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:470)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:861)
    at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:579)
    at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1584)
    at java.lang.Thread.run(Thread.java:745)

I appreciate any kind of help

Thank you

set the width for the parameter lists

Is there a way to set the width for the parameter lists?
I have several multi-value list parameters, and some of them only show the first characters of the options; I cannot see the full values.
I need to make the boxes wider. I tried by adding spaces to the left of the values but I don't see any difference.
thanks

PDI: memory leak when job workflow is "too long"

Hello,

I'm observing serious memory leaks when using Spoon or Kitchen to run "long" workflows of Kettle jobs.

Example: A root job calls three sub-jobs in sequence.
If I start each sub job independently, they run just fine.
If I start the root job, PDI at some point starts to consume tremendous amounts of memory (up to the configured JVM memory limit of 16 GB; see attached screenshot). In consequence, the job slows down dramatically or aborts with an out-of-memory exception.

PDI 5.3.0.0-213 seems to create lots (> 19 million) objects of class org.pentaho.di.core.row.ValueMeta - see the attached screenshot of jvisualvm. I also observed the effect in pdi 4.4.0 - but it seems a bit more pronounced in PDI 5.

The jobs usually do not use Kettle transformations to process large amounts of data - the data stays in the database and is transformed by calling a sequence of SQL scripts.

Has anyone around here observed a similar behavior - and found a solution?

Environment:
Windows Server 2008 R2
Java 1.6, Java 1.7
pdi 5.3.0.0-213, pdi 4.4.0

Best regards,

Bernd

Using ${Internal.Transformation.Filename.Directory} to define location of shared.xml

Hi,

I'm trying to use ${Internal.Transformation.Filename.Directory}/shared.xml as the location of the shared object file, inside a transformation to pull in shared.xml file which will be in the directory where the transform lives (instead of the default .kettle directory).

I'm running into issues. The first is that I discovered I needed to save the transformation first, before trying to set this parameter in the transformation settings, which I suppose is reasonable, since otherwise it doesn't know where it is saved.

However, even once I save, and then set the shared object file setting in "Transformation settings", it is able to pull in the shared.xml objects (it pulls in data sources), but when I try to save the transformation afterwards, it throws a null pointer exception.

Any ideas? Or other approaches to solve this problem? I'd like to set up a shared.xml with data sources that all of my transforms can use, whether used in the context of a job or just individually.

(Kettle 5.3.0.0-213)

Stack trace is below:
org.pentaho.di.core.exception.KettleException:
Unable to save shared ojects
at org.pentaho.commons.launcher.Launcher.main (Launcher.java:92)
at java.lang.reflect.Method.invoke (Method.java:606)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (NativeMethodAccessorImpl.java:-2)
at org.pentaho.di.ui.spoon.Spoon.main (Spoon.java:654)
at org.pentaho.di.ui.spoon.Spoon.start (Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose (Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch (Spoon.java:1316)
at org.eclipse.swt.widgets.Display.readAndDispatch (null:-1)
at org.eclipse.swt.widgets.Display.runDeferredEvents (null:-1)
at org.eclipse.swt.widgets.Widget.notifyListeners (null:-1)
at org.eclipse.swt.widgets.Widget.sendEvent (null:-1)
at org.eclipse.swt.widgets.Widget.sendEvent (null:-1)
at org.eclipse.swt.widgets.Widget.sendEvent (null:-1)
at org.eclipse.swt.widgets.Display.sendEvent (null:-1)
at org.eclipse.swt.widgets.EventTable.sendEvent (null:-1)
at org.eclipse.swt.widgets.TypedListener.handleEvent (null:-1)
at org.pentaho.ui.xul.swt.tags.SwtToolbarbutton$1.widgetSelected (SwtToolbarbutton.java:96)
at org.pentaho.ui.xul.swt.tags.SwtToolbarbutton.access$100 (SwtToolbarbutton.java:48)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke (AbstractXulComponent.java:141)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke (AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke (AbstractXulDomContainer.java:313)
at java.lang.reflect.Method.invoke (Method.java:606)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:57)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (NativeMethodAccessorImpl.java:-2)
at org.pentaho.di.ui.spoon.Spoon.saveFile (Spoon.java:4935)
at org.pentaho.di.ui.spoon.Spoon.saveToFile (Spoon.java:4997)
at org.pentaho.di.trans.TransMeta.saveSharedObjects (TransMeta.java:5382)
at org.pentaho.di.shared.SharedObjects.saveToFile (SharedObjects.java:222)
at org.pentaho.di.shared.SharedObjects.createOrGetFileBackup (SharedObjects.java:359)
at org.pentaho.di.shared.SharedObjects.createFileBackup (SharedObjects.java:379)
at org.pentaho.di.shared.SharedObjects.copyFile (SharedObjects.java:399)


at org.pentaho.di.trans.TransMeta.saveSharedObjects(TransMeta.java:5384)
at org.pentaho.di.ui.spoon.Spoon.saveToFile(Spoon.java:4997)
at org.pentaho.di.ui.spoon.Spoon.saveFile(Spoon.java:4935)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtToolbarbutton.access$100(SwtToolbarbutton.java:48)
at org.pentaho.ui.xul.swt.tags.SwtToolbarbutton$1.widgetSelected(SwtToolbarbutton.java:96)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.notifyListeners(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1316)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:654)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: java.lang.NullPointerException
at org.pentaho.di.shared.SharedObjects.copyFile(SharedObjects.java:399)
at org.pentaho.di.shared.SharedObjects.createFileBackup(SharedObjects.java:379)
at org.pentaho.di.shared.SharedObjects.createOrGetFileBackup(SharedObjects.java:359)
at org.pentaho.di.shared.SharedObjects.saveToFile(SharedObjects.java:222)
at org.pentaho.di.trans.TransMeta.saveSharedObjects(TransMeta.java:5382)
... 29 more

CDA doesn't show results for view queries in MySQL on Pentaho 5.3

I have Pentaho 5.2.0.0.209 with CDA working with a MySQL view query on Windows.

Now I have installed Pentaho 5.3 on Debian Linux, and the same query shows an empty result. If I query a table, it shows the data. Obviously, the same query against the view, executed directly in MySQL, brings a non-empty result.

I'm using the connector mysql-connector-java-5.1.34-bin.jar.

Where can I find saiku analytics plugin

Hello everybody,

Can anyone give me a link where I can download the Saiku Analytics plugin, and explain how to install it?
I've installed saiku-reporting and there are no charts included.

I hope you can help me with that

Is there a way to get # rows inserted, # rows updated with Insert/ Update Step?

Hi there ... I'm using Insert/ Update instead of Execute SQL statement for now, because free-form fields with double hyphens (--) break the latter.

I may use a JS step to remove (--) from fields manually, still thinking about the tradeoff.

Nevertheless, the Execute SQL statement step has field options for the # of inserts and # of updates to be passed on as rows. Insert/ Update doesn't. Is there a step I could use that can find this information?

