Channel: Pentaho Community Forums

Charts not loading on first login attempt everyday

Hi,
I am using the Pentaho 6.1 BI server. I have integrated a dashboard into my web portal using an iframe.
Every day, on the first login to my web portal, some or all charts fail to load and show the error "Error processing component".
This happens only on the first login attempt; when I refresh the page, all charts load.
I checked the error in the browser console; it says "501 Not Implemented".
In catalina.log I see the following error. Where can I set the autoReconnect=true property?


18-Aug-2016 03:47:38.612 WARNING [http-apr-8080-exec-17] com.sun.jersey.spi.container.servlet.WebComponent.filterFormParameters A servlet request, to the URI http://54.229.245.5:8080/pentaho/plugin/cda/api/doQuery, contains form parameters in the request body but the request body has been consumed by the servlet or a servlet filter accessing the request parameters. Only resource methods using @FormParam will work as expected. Resource methods consuming the request body by other means will not work as expected.
18-Aug-2016 03:47:38.613 WARNING [http-apr-8080-exec-20] com.sun.jersey.spi.container.servlet.WebComponent.filterFormParameters A servlet request, to the URI http://54.229.245.5:8080/pentaho/plugin/cda/api/doQuery, contains form parameters in the request body but the request body has been consumed by the servlet or a servlet filter accessing the request parameters. Only resource methods using @FormParam will work as expected. Resource methods consuming the request body by other means will not work as expected.
18-Aug-2016 03:47:38.644 SEVERE [http-apr-8080-exec-17] com.sun.jersey.spi.container.ContainerResponse.logException Mapped exception to response: 501
javax.ws.rs.WebApplicationException: pt.webdetails.cda.dataaccess.QueryException: The last packet successfully received from the server was 54,126,232 milliseconds ago. The last packet sent successfully to the server was 54,127,647 milliseconds ago. is longer than the server configured value of 'wait_timeout'. You should consider either expiring and/or testing connection validity before use in your application, increasing the server configured values for client timeouts, or using the Connector/J connection property 'autoReconnect=true' to avoid this problem.
at pt.webdetails.cda.CdaUtils.doQuery(CdaUtils.java:185)
at pt.webdetails.cda.CdaUtils.doQueryPost(CdaUtils.java:133)
at sun.reflect.GeneratedMethodAccessor160.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1511)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1442)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:112)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.pentaho.platform.web.servlet.JAXRSPluginServlet.service(JAXRSPluginServlet.java:117)
at pt.webdetails.cpf.JAXRSCLPluginServlet.service(JAXRSCLPluginServlet.java:37)
at org.pentaho.platform.web.servlet.PluginDispatchServlet.service(PluginDispatchServlet.java:89)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:399)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:189)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:188)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:522)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1095)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2500)
at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2489)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Unknown Source)
Caused by: pt.webdetails.cda.dataaccess.QueryException: The last packet successfully received from the server was 54,126,232 milliseconds ago. The last packet sent successfully to the server was 54,127,647 milliseconds ago. is longer than the server configured value of 'wait_timeout'. You should consider either expiring and/or testing connection validity before use in your application, increasing the server configured values for client timeouts, or using the Connector/J connection property 'autoReconnect=true' to avoid this problem.
at pt.webdetails.cda.dataaccess.PREDataAccess.performRawQuery(PREDataAccess.java:173)
at pt.webdetails.cda.dataaccess.SimpleDataAccess.queryDataSource(SimpleDataAccess.java:136)
at pt.webdetails.cda.dataaccess.AbstractDataAccess.doQuery(AbstractDataAccess.java:263)
at pt.webdetails.cda.CdaEngine.doQuery(CdaEngine.java:140)
at pt.webdetails.cda.CdaEngine.doExportQuery(CdaEngine.java:166)
at pt.webdetails.cda.CdaCoreService.doQuery(CdaCoreService.java:76)
at pt.webdetails.cda.CdaUtils.doQueryInternal(CdaUtils.java:138)
at pt.webdetails.cda.CdaUtils.doQuery(CdaUtils.java:171)
... 91 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: The last packet successfully received from the server was 54,126,232 milliseconds ago. The last packet sent successfully to the server was 54,127,647 milliseconds ago. is longer than the server configured value of 'wait_timeout'. You should consider either expiring and/or testing connection validity before use in your application, increasing the server configured values for client timeouts, or using the Connector/J connection property 'autoReconnect=true' to avoid this problem.
at sun.reflect.GeneratedConstructorAccessor463.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1116)
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3352)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1971)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2281)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.parametrizeAndQuery(SimpleSQLReportDataFactory.java:311)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:178)
at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SQLReportDataFactory.queryData(SQLReportDataFactory.java:142)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:172)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:154)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:67)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:411)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryData(CachingDataFactory.java:299)
at pt.webdetails.cda.dataaccess.PREDataAccess.performRawQuery(PREDataAccess.java:127)
... 98 more
Caused by: java.net.SocketException: Software caused connection abort: socket write error
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(Unknown Source)
at java.net.SocketOutputStream.write(Unknown Source)
at java.io.BufferedOutputStream.flushBuffer(Unknown Source)
at java.io.BufferedOutputStream.flush(Unknown Source)
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3333)
... 114 more
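For reference, the place I have been looking so far: if the dashboard's data source is a JNDI connection, the MySQL pool for the BI server is normally defined in tomcat/webapps/pentaho/META-INF/context.xml, and a Connector/J property such as autoReconnect would go on the JDBC URL there (for a plain JDBC connection it could be appended to the URL in the connection definition instead). This is only a sketch of what I am considering; the resource name, credentials and database are placeholders, and validationQuery/testOnBorrow is the "test connection validity before use" alternative that the driver message itself suggests:

Code:

<!-- tomcat/webapps/pentaho/META-INF/context.xml (sketch; names and credentials are placeholders) -->
<Resource name="jdbc/MyDatasource" auth="Container" type="javax.sql.DataSource"
          driverClassName="com.mysql.jdbc.Driver"
          url="jdbc:mysql://localhost:3306/mydb?autoReconnect=true"
          username="dbuser" password="dbpass"
          validationQuery="SELECT 1" testOnBorrow="true"/>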

Optimization, merging, blocking, nr of rows limitations

I have a group of closely related problems.

This is the transformation (see the attached screenshot):


First, rows are loaded from a database table. Then rows are sorted based on a column value in the Switch step.

There are two sub-types, both connected to the same fact table. Each row has one IP Source value and one IP Destination value. These are strings, saved in an IP dimension table with an auto-generated ID key.

To avoid concurrency issues, I need a wait step ("Wait_IP_SRC_...") between the IP source step and the IP destination step. I also need a wait step ("Block_2") between sub-type 1 and sub-type 2 so they don't write to the same tables at once.

Also, the wait steps make the transformation freeze if the number of rows in the DB is larger than the "Nr of rows in rowset" setting.

Are there any possible improvements to this transformation?

Connecting to the Pentaho Data Service from within PDI

Hello,

I am trying to connect to the Pentaho Data Service from PDI. I have set up a "Generic database" connection as shown below and tried to test the connection, but it does not work (I get the log below).
I am using the CE version of PDI 6.0.1.0-386 on Windows.
The transformation where the data service is created is in a directory shared with other users.

Can anyone help me on this?

Attached image: Captura.jpg


----------


Error connecting to database [ejemplo data service] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database


Error connecting to database: (using class org.pentaho.di.trans.dataservice.jdbc.ThinDriver)
You don't seem to be getting a connection to the server. Check the host and port you're using and make sure the sever is up and running.




org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database


Error connecting to database: (using class org.pentaho.di.trans.dataservice.jdbc.ThinDriver)
You don't seem to be getting a connection to the server. Check the host and port you're using and make sure the sever is up and running.




at org.pentaho.di.core.database.Database.normalConnect(Database.java:459)
at org.pentaho.di.core.database.Database.connect(Database.java:357)
at org.pentaho.di.core.database.Database.connect(Database.java:328)
at org.pentaho.di.core.database.Database.connect(Database.java:318)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2734)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:588)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:136)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:389)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:318)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:470)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:457)
at org.pentaho.di.ui.spoon.Spoon.newConnection(Spoon.java:8750)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1339)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7939)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9214)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:653)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.pentaho.di.trans.dataservice.jdbc.ThinDriver)
You don't seem to be getting a connection to the server. Check the host and port you're using and make sure the sever is up and running.


at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:572)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:443)
... 55 more
Caused by: java.sql.SQLException: You don't seem to be getting a connection to the server. Check the host and port you're using and make sure the sever is up and running.
at org.pentaho.di.trans.dataservice.jdbc.ThinConnection.testConnection(ThinConnection.java:129)
at org.pentaho.di.trans.dataservice.jdbc.ThinDriver.connect(ThinDriver.java:65)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:554)
... 56 more


Custom URL : jdbc:pdi://localhost:9081/kettle
Custom Driver Class: org.pentaho.di.trans.dataservice.jdbc.ThinDriver
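In case it helps to narrow things down, I was also planning to try the same thin driver outside Spoon with a tiny standalone JDBC test, using the exact URL and driver class from the dialog above (the data-service client jars would need to be on the classpath; the user, password and data service name queried below are placeholders):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ThinDriverTest {
    public static void main(String[] args) throws Exception {
        // Same driver class and URL as in the connection dialog above
        Class.forName("org.pentaho.di.trans.dataservice.jdbc.ThinDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:pdi://localhost:9081/kettle", "user", "password");
             Statement stmt = conn.createStatement();
             // "my_data_service" is a placeholder for the actual data service name
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_data_service")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}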

What does log "Triggering heartbeat signal" mean?

Hello guys. I am finishing my job, with about 20 transformations, that synchronizes two databases.

Lately, I often see this message in the Kettle log: "Triggering heartbeat signal for <name_here> at every 10 seconds".

One example:
Code:

2016/08/18 11:58:05 - read from [used_discounts].0 - SQL query : SELECT * FROM used_discounts WHERE id > ? ORDER BY id
2016/08/18 11:58:05 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:58:14 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:58:15 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:58:24 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:58:25 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:58:34 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:58:35 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:58:44 - write to [used_discounts].0 - Prepared statement : INSERT INTO used_discounts (invoice_id, ticket_id, device_id, org_id, discount_id, entry_time, exit_time, pay_time, validation_time, validated_fee, var_symbol, flexcore_device_id, id, car_id, park_uuid) VALUES ( ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?,  ?)
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:44 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:44 - report - Commit on database connection [report]
2016/08/18 11:58:45 - report - Commit on database connection [report]
2016/08/18 11:58:45 - report - Commit on database connection [report]
2016/08/18 11:58:45 - report - Commit on database connection [report]
2016/08/18 11:58:45 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:58:54 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:58:55 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:04 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:05 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:14 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:15 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:24 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:25 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:31 - report - Commit on database connection [report]
2016/08/18 11:59:32 - report - Commit on database connection [report]
2016/08/18 11:59:34 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:35 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:44 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:45 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds
2016/08/18 11:59:54 - p6_copy_used_discounts_2_report - Triggering heartbeat signal for p6_copy_used_discounts_2_report at every 10 seconds
2016/08/18 11:59:55 - p6_park2report - Triggering heartbeat signal for p6_park2report at every 10 seconds

So I would like to ask: what does this message mean? I guess it is something related to the database. My wild guess is that Postgres is too slow and not answering. Can anybody explain what this could mean, whether it is a warning that something isn't going well, or what exactly is going on here?

Thanks for any advice.

(I am using Kettle Version 6.0.1.0-386)

Losing logging (sent to separate Java window) when invoking spoon from spoon

Hi All -

Not sure if this is an issue with Spoon, Java, or the OS. For memory optimization purposes we've got Spoon calling itself with the "Execute a Shell Script" step. The thought here is that we free up the memory used by the JVM on our ETL boxes faster after the job called by this step completes, while subsequent steps in the parent job (pictured here) continue to run. This approach does indeed work and we've been using it for about a year.

We've got an identical deployment on 3 separate ETL machines for separate regions of the world. On 2 of these machines, all of the logging output from the shell-script-invoked job is directed to the logging results area within Spoon, which is exactly what we want. On the third machine (the one pictured here), however, the logging results are shown in a separate window that appears upon invocation, lasts throughout the run, and then disappears. The problem is that when I get an SNMP trap and need to go back and troubleshoot, I cannot actually see any messaging for the error, so I'm flying blind. I've tried invoking through the command line and directing results to a text file, to no avail.
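For context, the kind of thing I tried from the command line / the shell script step was simply redirecting stdout and stderr of the inner invocation to a file (the job path, log level and log file here are placeholders, not our real setup):

Code:

# Redirect all console output of the inner job to a file (placeholder paths)
./kitchen.sh -file=/opt/etl/jobs/inner_job.kjb -level=Basic > /opt/etl/logs/inner_job.log 2>&1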

I hope this makes sense. Does anybody know how to tune / fix this behavior?

Problems using Kettle with Scala & IntelliJ

I'm trying to set up a little program that calls a Kettle transformation, using Scala and developing in IntelliJ (as the title says). I have researched as much as I could to make this work, so I need a little bit of help from people with more experience. First, I added the Kettle dependencies using SBT:


val kettleVersion = "5.3.0.0-200"
resolvers ++= Seq(
"pentaho-releases" at "http://repository.pentaho.org/artifactory/repo/")

libraryDependencies ++= Seq(
"pentaho-kettle" % "kettle-engine" % kettleVersion % "provided",
"pentaho-kettle" % "kettle-core" % kettleVersion % "provided",
"pentaho-kettle" % "kettle-db" % "4.4.0-GA" % "provided"
)

Next, this is my kettle function to call the transformation:


import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public void kettlecall(String filename) {
    try {
        KettleEnvironment.init();
        TransMeta metaData = new TransMeta(filename);
        Trans trans = new Trans(metaData);
        trans.execute(null);
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            System.out.print("Error Executing transformation");
        }
    } catch (KettleException e) {
        e.printStackTrace();
    }
}

So I can compile without errors and sbt seems to pick up everything OK, but for some reason when I try to execute it I get this error:

[RuntimeException: java.lang.NoClassDefFoundError: org/pentaho/di/core/exception/KettleException]


I don't know if I'm not importing the correct versions, if I'm missing some dependencies, or if there is a problem in the code itself. I would appreciate it if someone could help me with this.
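For what it's worth, one thing I have not ruled out yet on my side: all three Kettle artifacts are declared with "provided" scope, and as far as I understand sbt, that keeps them on the compile classpath but off the runtime classpath used by the run task, which would explain a NoClassDefFoundError at execution time even though compilation succeeds. A minimal sketch of the build change I am considering (same hypothetical version numbers as above):

Code:

// build.sbt (sketch): drop the "provided" scope so the Kettle jars
// are also on the runtime classpath used by "sbt run"
val kettleVersion = "5.3.0.0-200"

resolvers += "pentaho-releases" at "http://repository.pentaho.org/artifactory/repo/"

libraryDependencies ++= Seq(
  "pentaho-kettle" % "kettle-engine" % kettleVersion,
  "pentaho-kettle" % "kettle-core"   % kettleVersion,
  "pentaho-kettle" % "kettle-db"     % "4.4.0-GA"
)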

Thanks in advance.

Microsoft Excel Input Step - Failed to open file

Hi
I'm trying to open an xlsx file (it seems it was created using http://www.spreadsheetgear.com/) and I am not able to open it.

Does anybody know how to solve this?

Thanks in advance

Query Validation failed (MySQL)

Hi,

I have set up a connection to a MySQL database successfully (when hitting Test it says that the connection is successful). However, when I try to run a query such as SELECT 1 (without semicolons!) I get this error:

Query validation failed: Query validation failed: {0}

Any idea what could be going wrong?

Get Variable step not working

Hi,
I am having trouble with the Get Variable step. It seemed to be working fine at first, but I noticed that I am getting an error and the Get Variable step is not getting the corresponding value. In the first transformation I have the Set Variable step, and in the second transformation and the rest of them, this variable value is required and retrieved with the Get Variable step. Any suggestions as to how I can solve this? Thank you in advance.

Attached images: GetVariable.jpg, GetVar.jpg, Preview.PNG

Really need help with a REST Client GET - please

The machine is OS X. I have been using it for a lot of stuff with no issues. As a 3+ year Pentaho user I usually don't get stuck, but I am totally stuck right now. I have never had to do a GET before and it's not working for me. I know this URL works with curl (it works just fine when I curl it in Terminal), and I am passing this URL with all the parameters from the input step to the REST Client. I am trying to use the login and password within the passed URL because I also can't make it work when I put the credentials in the SSH section of the REST Client. In fact, when I do that, I get this same error. Here are my error results. ANY help would be appreciated!


ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Because of an error, this step can't continue:
2016/08/18 17:31:26 - REST Client.0 - Can not result from [https://username:password@api.websit...9?format=json]
2016/08/18 17:31:26 - REST Client.0 - javax.net.ssl.SSLException: java.lang.RuntimeException: Could not generate DH keypair
2016/08/18 17:31:26 - Hit Our LRN tool - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2016/08/18 17:31:26 - REST Client.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/08/18 17:31:26 - REST Client.0 - Can not result from [https://username:password@api.websit...9?format=json]
2016/08/18 17:31:26 - REST Client.0 - javax.net.ssl.SSLException: java.lang.RuntimeException: Could not generate DH keypair
2016/08/18 17:31:26 - REST Client.0 -
2016/08/18 17:31:26 - REST Client.0 - at org.pentaho.di.trans.steps.rest.Rest.callRest(Rest.java:201)
2016/08/18 17:31:26 - REST Client.0 - at org.pentaho.di.trans.steps.rest.Rest.processRow(Rest.java:397)
2016/08/18 17:31:26 - REST Client.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2016/08/18 17:31:26 - REST Client.0 - at java.lang.Thread.run(Thread.java:695)
2016/08/18 17:31:26 - REST Client.0 - Caused by: com.sun.jersey.api.client.ClientHandlerException: javax.net.ssl.SSLException: java.lang.RuntimeException: Could not generate DH keypair
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.client.apache.DefaultApacheHttpMethodExecutor.executeMethod(DefaultApacheHttpMethodExecutor.java:213)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.client.apache.ApacheHttpClientHandler.handle(ApacheHttpClientHandler.java:175)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.api.client.Client.handle(Client.java:648)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.api.client.WebResource.handle(WebResource.java:680)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.api.client.WebResource$Builder.get(WebResource.java:507)
2016/08/18 17:31:26 - REST Client.0 - at org.pentaho.di.trans.steps.rest.Rest.callRest(Rest.java:152)
2016/08/18 17:31:26 - REST Client.0 - ... 3 more
2016/08/18 17:31:26 - REST Client.0 - Caused by: javax.net.ssl.SSLException: java.lang.RuntimeException: Could not generate DH keypair
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.Alerts.getSSLException(Alerts.java:190)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1747)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1708)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1691)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1617)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:105)
2016/08/18 17:31:26 - REST Client.0 - at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
2016/08/18 17:31:26 - REST Client.0 - at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpConnection.flushRequestOutputStream(HttpConnection.java:828)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.MultiThreadedHttpConnectionManager$HttpConnectionAdapter.flushRequestOutputStream(MultiThreadedHttpConnectionManager.java:1565)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpMethodBase.writeRequest(HttpMethodBase.java:2116)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpMethodBase.execute(HttpMethodBase.java:1096)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:398)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
2016/08/18 17:31:26 - REST Client.0 - at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.jersey.client.apache.DefaultApacheHttpMethodExecutor.executeMethod(DefaultApacheHttpMethodExecutor.java:210)
2016/08/18 17:31:26 - REST Client.0 - ... 9 more
2016/08/18 17:31:26 - REST Client.0 - Caused by: java.lang.RuntimeException: Could not generate DH keypair
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.DHCrypt.<init>(DHCrypt.java:114)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.ClientHandshaker.serverKeyExchange(ClientHandshaker.java:559)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:186)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.Handshaker.processLoop(Handshaker.java:593)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.Handshaker.process_record(Handshaker.java:529)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:943)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1188)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:654)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:100)
2016/08/18 17:31:26 - REST Client.0 - ... 19 more
2016/08/18 17:31:26 - REST Client.0 - Caused by: java.security.InvalidAlgorithmParameterException: Prime size must be multiple of 64, and can only range from 512 to 1024 (inclusive)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.crypto.provider.DHKeyPairGenerator.initialize(DashoA13*..)
2016/08/18 17:31:26 - REST Client.0 - at java.security.KeyPairGenerator$Delegate.initialize(KeyPairGenerator.java:627)
2016/08/18 17:31:26 - REST Client.0 - at com.sun.net.ssl.internal.ssl.DHCrypt.<init>(DHCrypt.java:107)
2016/08/18 17:31:26 - REST Client.0 - ... 27 more
2016/08/18 17:31:26 - Hit Our LRN tool - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
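In case it is relevant: from what I can tell, the "Could not generate DH keypair ... Prime size must be multiple of 64, and can only range from 512 to 1024" failure is a limitation of older JREs (for example the stock Apple Java 6 on OS X), which cannot handle the larger Diffie-Hellman parameters many HTTPS servers now send. A minimal check I am planning to try, assuming spoon.sh honours PENTAHO_JAVA_HOME (the JDK path below is hypothetical):

Code:

# See which Java Spoon is currently picking up
java -version

# Point Spoon at a newer JRE (Java 8 here; the path is hypothetical) and restart
export PENTAHO_JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_101.jdk/Contents/Home
./spoon.sh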

Odoo master slave implementation with Pentaho.

Hi, hope this is the right section to ask.

We are running Pentaho 7 against our current Odoo (7.x) with a PostgreSQL server under Ubuntu 14.x x64.

We have around 110 users on our Odoo master server.

PostgreSQL streaming replication, syncing our slave (Pentaho) server from the master, is working.

We set up Odoo on the slave; the idea is that if a user needs a report from Pentaho, they access Odoo there and run the report. But we are facing issues, because once we run our report we get an error screen:

Quote:

You cannot perform this operation. New Record Creation is not allowed for this object as this object is for reporting purpose.
Postgres logs say:

Quote:

2016-08-18 17:13:53 PDT STATEMENT: SELECT value FROM ir_translation
WHERE lang='en_US'
AND type in ('code', 'sql_constraint')
AND src='UserError'
2016-08-18 17:13:53 PDT ERROR: current transaction is aborted, commands ignored until end of transaction block
2016-08-18 17:13:53 PDT STATEMENT: SELECT value
FROM ir_translation
WHERE lang='en_US'
AND type in ('code', 'sql_constraint')
AND src='You cannot perform this operation. New Record Creation is not allowed for this object as this object is for reporting purpose.'
I know that on a streaming replica the slave is read-only; Odoo tracks user sessions and tries to save them to the DB, and the replica won't allow that. We understand that.

But Odoo is trying to add records, and that crashes our report.

This simple report works on the master.

Is our implementation possible?

Master (Odoo) - slave (Odoo + Pentaho)
Slave just for reports only.

The idea is to move that DB load off the master and onto the slave, but for reports only.

Does this make sense?

Or did we forget some setting on the slave Pentaho server?

Thanks all for your time.

Getting error when using "Launch Next Entries in parallel" in job.

I am trying to use a kind of recursion in Pentaho by calling the same job multiple times, and I am getting the error below.

Please let me know the solution for this issue. Thanks in advance.

2016/08/19 10:38:29 - TRX_newval - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : java.util.ConcurrentModificationException
2016/08/19 10:38:29 - TRX_newval - at java.util.Hashtable$Enumerator.next(Unknown Source)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.core.variables.Variables.initializeVariablesFrom(Variables.java:124)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.trans.step.BaseStep.initializeVariablesFrom(BaseStep.java:3254)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:839)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.trans.Trans.execute(Trans.java:578)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:1037)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.job.Job.execute(Job.java:678)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.job.Job.execute(Job.java:815)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.job.Job.access$000(Job.java:110)
2016/08/19 10:38:29 - TRX_newval - at org.pentaho.di.job.Job$1.run(Job.java:793)
2016/08/19 10:38:29 - TRX_newval - at java.lang.Thread.run(Unknown Source)
2016/08/19 10:38:36 - Transformation - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unable to open transformation: null
2016/08/19 10:38:36 - Transformation - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : java.util.ConcurrentModificationException
2016/08/19 10:38:36 - Transformation - at java.util.Hashtable$Enumerator.next(Unknown Source)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.core.variables.Variables.initializeVariablesFrom(Variables.java:124)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.trans.step.BaseStep.initializeVariablesFrom(BaseStep.java:3254)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:839)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.trans.Trans.execute(Trans.java:578)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:1037)
2016/08/19 10:38:36 - Transformation - at org.pentaho.di.job.Job.execute(Job.java:678)

pentaho metadata injection

Hi,

I wanted to know more about Pentaho ETL metadata injection. I took the example mentioned at this link: https://help.pentaho.com/Documentati...data_Injection

The execution stops at the ETL Metadata Injection step; it is not able to pass data to the final transformation. It gives this error: "failed to initialize at least one step. Execution can not begin!" Is there anyone who can help me with a possible solution?

Any kind of help and suggestion would be highly appreciated.

Thanks and Regards,
sukanya_cg

Getting Exception when connect Pentaho report designer with Hive data source

Hello,

I have set up hadoop-2.6.0 and apache-hive-1.1.0-bin, and I am using the pdi-ce-6.1.0.1-196 report designer.
There is a plugin.properties file at this path:
/home/kumar/reportdesigner/plugins/pentaho-big-data-plugin/plugin.properties
I have set this value in the plugin.properties file:
active.hadoop.configuration=cdh55
because there is a cdh55 directory at this path:
report-designer/plugins/pentaho-big-data-plugin/pentaho-big-data-plugin/hadoop-configurations/cdh55

I have added the following jar files to the jdbc folder (report-designer/lib/jdbc):
hive-jdbc-1.1.0.jar
hive-jdbc-1.1.0-standalone.jar

I am trying to connect the Hive data source with Pentaho Report Designer.
I give the following parameters:
Connection Type: Hadoop Hive
Host Name: localhost
Database Name: default
Port number: 10000
I am getting this exception when testing the connection:

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
No suitable driver found for jdbc:hive://localhost:10000/default
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:579)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:450)
... 123 more
Caused by: java.sql.SQLException: No suitable driver found for jdbc:hive://localhost:10000/default
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:233)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:565)
... 124 more

When I give Connection Type: Hadoop Hive 2, I am getting this exception when testing the connection:

Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused
org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database
Error connecting to database: (using class org.apache.hive.jdbc.HiveDriver)
Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused


When I add the following content to report-designer.sh:

HADOOP_CORE=$(ls $HADOOP_HOME/hadoop-*-core.jar)
CLASSPATH=.:$HADOOP_CORE:$HIVE_HOME/conf

for i in ${HIVE_HOME}/lib/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done

CLASSPATH=$CLASSPATH:launcher.jar
echo java -XX:MaxPermSize=512m -cp $CLASSPATH -jar launcher.jar
java -XX:MaxPermSize=512m -cp $CLASSPATH org.pentaho.commons.launcher.Launcher


and use Connection Type: Hadoop Hive, I am getting this exception when testing the connection:

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
Unable to load Hive JDBC driver for the currently active Hadoop configuration
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:579)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:450)
... 123 more
Caused by: java.sql.SQLException: Unable to load Hive JDBC driver for the currently active Hadoop configuration
at org.apache.hadoop.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:108)
at org.apache.hadoop.hive.jdbc.HiveDriver.callWithActiveDriver(HiveDriver.java:122)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:133)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:233)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:565)
... 124 more
Caused by: java.lang.Exception: Unable to locate Hadoop Configuration Registry
at org.pentaho.hadoop.hive.jdbc.HadoopConfigurationUtil.findHadoopConfigurationProvider(HadoopConfigurationUtil.java:101)
at org.pentaho.hadoop.hive.jdbc.HadoopConfigurationUtil.getProvider(HadoopConfigurationUtil.java:107)
at org.pentaho.hadoop.hive.jdbc.HadoopConfigurationUtil.getActiveConfiguration(HadoopConfigurationUtil.java:119)
at org.pentaho.hadoop.hive.jdbc.HadoopConfigurationUtil.getActiveHadoopShim(HadoopConfigurationUtil.java:130)
at org.apache.hadoop.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:103)


When I use Connection Type: Hadoop Hive 2 with the changes in report-designer.sh, I am getting this exception when testing the connection:
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:215)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:163)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:233)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:565)
... 124 more
Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:187)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190)
... 129 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.thrift.transport.TSocket.open(TSocket.java:182)
... 132 more
Hostname : localhost
Port : 10000
Database name : default
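One thing I still need to rule out myself is whether HiveServer2 is actually listening on port 10000, since the Hive 2 attempts end in "Connection refused". A minimal check from the Hive machine, assuming a standard apache-hive-1.1.0-bin install with HIVE_HOME set as above:

Code:

# Start HiveServer2 if it is not already running
$HIVE_HOME/bin/hiveserver2 &

# Verify something is listening on port 10000
netstat -an | grep 10000

# Try the same JDBC URL the report designer uses, via beeline
$HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000/default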


Please help me to connect the Hive data source with Pentaho Report Designer.


Thanks.

Unable to Insert data into SQL Azure Database Table using Table Output step (PDI5.4)

I am trying to insert data into a SQL Azure database table using the Table Output step, but I am getting the error: "Reference to database and/or server name in '<schema name>..<table name>' is not supported in this version of SQL Server." However, I was able to read data from a table in the same database successfully using the Table Input step. Any help/pointers would be really appreciated. Thank you.
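From the wording of the error, my guess is that the generated statement is qualifying the table with a database (or server) name, which SQL Azure rejects, while a plain schema-qualified name is accepted. A sketch of the difference, with placeholder table and column names:

Code:

-- Accepted by SQL Azure: two-part, schema-qualified name
INSERT INTO dbo.my_table (col1, col2) VALUES ('a', 1);

-- Triggers the same error as above: database-qualified name
-- INSERT INTO mydb..my_table (col1, col2) VALUES ('a', 1);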

Trouble in setting up Hadoop File Input step reading CSV file from HDFS directory

I have installed Pentaho 6.1.x.x (both PDI and Pentaho for Hadoop) on a Windows 7 machine. I have also set up a Hadoop cluster (HDP 2.2) with VMware Workstation on the same Windows 7 machine. I am trying to create a scenario in PDI that reads a CSV file from an HDFS directory, runs some transformation, and loads the data back onto an HDFS directory. I have 2 questions.
1) In the Hadoop File Input step, I am not able to browse the HDFS directory. Do I need to create a connection, similar to database connections, to access the HDFS directory? I don't see the Browse button for "File or directory" to connect to the HDFS directory. Is there anything else to be done to connect to Hadoop and browse files?
2) How do I transfer a CSV or any other file from the Windows 7 host or the VMware workstation to an HDFS directory? (What I have pieced together so far is shown below.)
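For question 2, this is what I have pieced together so far, assuming the CSV is first copied onto the HDP VM (for example with WinSCP or a shared folder); the local and HDFS paths below are placeholders:

Code:

# On the HDP VM, after copying the CSV there
hdfs dfs -mkdir -p /user/pdi/input
hdfs dfs -put /tmp/input.csv /user/pdi/input/
hdfs dfs -ls /user/pdi/input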

Json input step fails silently when using filename from field

After upgrading from 6.0 to 6.1, one of my jobs stopped working correctly. The job is launched on CentOS 6.4 using kitchen.sh and a database repository.

It looks like the Json input step does absolutely nothing when passed a filename in a field. There are no errors shown and no output is generated. If I uncheck "Source is defined in a field?" and enter the source file into the step manually, it works.

To test this, I've taken one of the samples ("JsonInput read a dynamic file") and wrapped a job around it.
  • On my local Windows 7 copy of Spoon 6.1 it runs flawlessly.
  • On CentOS running it with kitchen from the file runs flawlessly.
  • If I run the same job from the repository on either Windows 7 or CentOS, (having changed the file reference to not use ${Internal.Job.Filename.Directory}), the dynamic version of Json Input doesn't output any rows, nor does it give any warning that no file was supplied or found.


In the repository version, I'm passing 'file:///home/pentaho/JsonInput/jsonfile.js' for the dynamic version and have entered '/home/pentaho/JsonInput/jsonfile.js' statically. I've tried passing '/home/pentaho/JsonInput/jsonfile.js' in the field as well, but no luck.

This might be related: http://jira.pentaho.com/browse/PDI-7448

Can anyone help me figure out what's going on here? I want to upgrade for the performance improvements, but there's no point if I can't pass it files to process.

Which CE version to use

Hi Guys,

I'm going to start a new project with the Pentaho CE platform. I have a few questions and hope you'll be able to help me.
Which version of Pentaho is the best to use?
I saw by following this link ( https://sourceforge.net/projects/pen...ence%20Server/ ) that the latest "stable" version is 4.8.0.
Should I use this version for my project, or can I use the 6.1.0 version?

Thanks in advance for your replies, guys.

Best regards

Octet_Length () in PDI

Hello,
I have a 'bytea' column in my data source, and I want to calculate the length of each row's value and put it in another table using Pentaho Data Integration. It works perfectly as a PostgreSQL query; I tried to use the SQL Script step in PDI but it doesn't work.

Any ideas how to do it?
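For reference, this is the kind of query that works for me directly in PostgreSQL and that I was hoping to reproduce in PDI (for example in a Table Input step feeding a Table Output step); the table and column names are placeholders:

Code:

-- Compute the byte length of a bytea column row by row (placeholder names)
SELECT id,
       octet_length(payload) AS payload_bytes
FROM   source_table;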

Execute SQL Script step not working

I have created an Execute SQL Script step that deletes from a table in an Oracle DB. It's a simple, straight delete from this temp table; however, it does not delete. It shows it ran successfully, and the connection works just fine when I check it. I'm not sure what the issue is. Under the same user in SQL Developer I can delete from the table with no problem, but using the same connection and user info I cannot delete from the Pentaho Execute SQL Script step. To make things even stranger, I did a Table Input and I am able to select from some tables, but not from the table I want to delete from. I checked the permissions again, all under the same account; no idea what is going on.

I even tried:
DELETE FROM WEB.NAMEOFTABLE;
COMMIT;

And that did not work either. I tried various configurations and statements and nothing has worked. Sometimes it does not even run; I hit start and nothing happens for up to 30 minutes.

