Channel: Pentaho Community Forums

ConnectWise

Hi All,

I was fiddling around in Pentaho (Kettle 6) and tried to connect to ConnectWise, without any success. :(

Does anyone know whether this is possible and if so, how?


As an example, this is what the ConnectWise API-tester sends over, captured with Wireshark:


<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<soap:Body>
<ProcessClientAction xmlns="http://tempuri.org/">
<actionString>
<?xml version="1.0" encoding="utf-8"?>
<GetConnectWiseVersionAction xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<CompanyName>yzx</CompanyName>
<IntegrationLoginId>zyx</IntegrationLoginId>
<IntegrationPassword>xyz</IntegrationPassword>
<ConnectWiseVersionInfo><Version></Version>
<IsCloud>false</IsCloud>
</ConnectWiseVersionInfo></GetConnectWiseVersionAction>
</actionString></ProcessClientAction></soap:Body></soap:Envelope>

This returns the version number of ConnectWise in XML.
As a first step in Pentaho I would really like to get this working; once I know how the "formatting" works, I can move on to queries etc.
Btw, I got it working in SoapUI.


Thanks in advance!
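In case it's useful to anyone answering: in PDI this kind of SOAP call is usually made with an HTTP Post step, with the envelope as the request body. Below is a rough shell sketch of the equivalent request for testing connectivity outside PDI; the endpoint URL is a made-up placeholder and the envelope body is abbreviated, so substitute your own values. The echo only prints the curl command; remove it to actually send the request.

```shell
# Write the captured envelope to a file. The body is abbreviated here;
# paste in the full envelope from the post. The endpoint is a placeholder.
ENDPOINT="https://cw.example.com/v4_6_release/apis/2.0/ProcessClientAction.asmx"
cat > /tmp/cw-envelope.xml <<'EOF'
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ProcessClientAction xmlns="http://tempuri.org/">
      <actionString>...</actionString>
    </ProcessClientAction>
  </soap:Body>
</soap:Envelope>
EOF
# The same two headers go into the HTTP Post step in Spoon.
# echo prints the command for inspection; drop it to really send the request.
echo curl -X POST \
  -H 'Content-Type: text/xml; charset=utf-8' \
  -H 'SOAPAction: "http://tempuri.org/ProcessClientAction"' \
  --data-binary @/tmp/cw-envelope.xml \
  "$ENDPOINT"
```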

Plugin for checkpoint feature in PDI 5.x

Question about debugging jobs and transformations

I have a question about debugging a job.

I have a job with 3 transformations. The dataset results of each transformation have to be passed to the next transformation.
I used the "Copy rows to result" and "Get rows from result" steps to pass the datasets between transformations.

In order to test each transformation one by one, I use temporary files:
tran1 -> make steps...write file1
tran2 -> read file1 make steps write file2 ( I test it individually)
tran3 -> read file2 make steps... ( I test it individually)

Once each transformation is tested, I create the job and replace the temporary files with "copy/get rows to result".

My problem now is: if, while testing my job, I get an error in a transformation, how do I test that transformation individually?
Do I have to go back to the temporary files?

I would like to know the optimal way to test jobs and transformations.

Any help will be greatly appreciated.

Thanks

Question about "data validator" step

I am using the "Data Validator" step to validate data types and field lengths in my job.
When this step detects invalid data it generates an error. Is it possible to use the Data Validator to validate without generating an error in the job?

Thanks in advance

Question about autodoc

Hi, I am using "Automatic Documentation Output".

I have selected all the doc options to have a detailed doc.

I want to have documentation at the step level; is that possible?

I don't find many places to insert comments in the steps. Can you insert comments in steps and use them in autodoc?

I don't find the Automatic Documentation Output very useful; is there any other way?

Thanks

Timestamp Datatype predicate with .0

Hi,

I am stuck on one issue with the timestamp datatype.
My cube has a few fields of timestamp data type, and in the UI they come out as 22-01-2015 23:55:59.0; I want to omit the trailing ".0".
I have tried:

<Level name="Created On" visible="true" column="CREATED_ON_DATE" type="Timestamp" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" description="Created On updd">
<NameExpression>
<SQL dialect="generic">
<![CDATA[TO_TIMESTAMP(CREATED_ON_DATE,'DD-MM-YYYY HH:MM:SS')]]>
</SQL>
</NameExpression>
but it is not working; the result still contains the .0.

<Level name="Created On" visible="true" column="CREATED_ON_DATE" type="Timestamp" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" description="Created On updd">
<NameExpression>
<SQL dialect="generic">
<![CDATA[TO_CHAR(CREATED_ON_DATE,'DD-MM-YYYY HH:MM:SS')]]>
</SQL>


</NameExpression>

this is working, but I want the result to remain a timestamp.
I am using Vertica as the database.

Is there a way I can achieve this?

Please suggest.

Thanks,
Malay

Cannot create a new dashboard, vanilla CE 6 on Windows Server

I have a vanilla install on Windows Server 2012 with 64-bit Java 1.8.
When I try to create a new CDE dashboard, I get "Sorry. We really did try. Something went wrong" ...



pentaho.log:

2015-11-23 12:07:39,071 ERROR [org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput] Error generating content from content generator with id [new]
java.lang.NullPointerException
at pt.webdetails.cdf.dd.util.JsonUtils.toJXPathContext(JsonUtils.java:87)
at pt.webdetails.cdf.dd.model.meta.reader.datasources.DataSourcesModelReader.read(DataSourcesModelReader.java:52)
at pt.webdetails.cdf.dd.MetaModelManager.readDataSourceComponents(MetaModelManager.java:144)
at pt.webdetails.cdf.dd.MetaModelManager.readModel(MetaModelManager.java:123)
at pt.webdetails.cdf.dd.MetaModelManager.<init>(MetaModelManager.java:59)
at pt.webdetails.cdf.dd.MetaModelManager.getInstance(MetaModelManager.java:45)
at pt.webdetails.cdf.dd.render.DependenciesManager.getInstance(DependenciesManager.java:57)
at pt.webdetails.cdf.dd.editor.DashboardEditor.buildReplacementTokenMap(DashboardEditor.java:76)
at pt.webdetails.cdf.dd.editor.DashboardEditor.getEditor(DashboardEditor.java:63)
at pt.webdetails.cdf.dd.api.RenderApi.getEditor(RenderApi.java:496)
at pt.webdetails.cdf.dd.api.RenderApi.newDashboard(RenderApi.java:385)
at pt.webdetails.cdf.dd.DashboardDesignerContentGenerator.createContent(DashboardDesignerContentGenerator.java:83)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.generateContent(GeneratorStreamingOutput.java:236)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutput.write(GeneratorStreamingOutput.java:163)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:54)
at org.pentaho.platform.web.http.api.resources.GeneratorStreamingOutputProvider.writeTo(GeneratorStreamingOutputProvider.java:33)
at com.sun.jersey.spi.container.ContainerResponse.write(ContainerResponse.java:306)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1479)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1391)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1381)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:538)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:716)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:109)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.pentaho.platform.web.servlet.JAXRSServlet.service(JAXRSServlet.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:399)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:191)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:115)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:263)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:188)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SystemStatusFilter.doFilter(SystemStatusFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:114)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.WebappRootForwardingFilter.doFilter(WebappRootForwardingFilter.java:70)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.pentaho.platform.web.http.filters.PentahoPathDecodingFilter.doFilter(PentahoPathDecodingFilter.java:34)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2503)
at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2492)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Unknown Source)

Text file export - remove white spaces after each field

I used the Pentaho Data Integration tool to export some data in text format, but after each field it adds an extra white space. I am using fixed column lengths; for strings I want the column padded with spaces when the full length is not used, but the export file also has an extra space after that.
If I have two fields, 123 and abc, both with length = 3,
I want them to appear as 123abc,
but what I get is 123 abc,
so I get an extra white space between the two fields.

I have set the column delimiter to blank as well. Is there a way to fix this? Please help!
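For reference, the desired fixed-width behaviour can be sketched with printf: each field is left-justified and padded to its declared width, and fields are simply concatenated with no separator. In the Text File Output step that corresponds to clearing the Separator field (which, if I recall correctly, defaults to ";") and using the fixed-length format:

```shell
# '%-3s' left-justifies a string in a 3-character field; concatenating
# two such fields gives fixed-width output with no separator.
printf '%-3s%-3s\n' 123 abc   # -> 123abc
printf '%-3s%-3s\n' 12 abc    # -> "12 abc" (short value padded with a space)
```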

PDI 6.0.0 Stable?

Hello.
Is the site wrong, or has the naming of the PDI archive changed? The site says PDI 6.0.0 Stable in the download section, but after downloading I see pdi-ce-6.0.0.0-353.zip.

Add geometry type to Kettle

Hi everyone,

I would like to know how I can add a geometry type so I can read from and write to PostGIS.

My approach is to add the plugin https://github.com/mattyb149/pdi-valuemeta-map from Matthew Burgess, which allows adding key/value pairs (a map) as a value type in Pentaho Data Integration.

ValueMetaGeometry.jpg

Can I add a main class directly from GeoKettle? Where is the main class? Is this a good approach, and how can I do it?

Thanks in advance.

Error when installing PDI on Win 7

Hi there,

I'm trying to install Pentaho Data Integration on my Windows 7 (64-bit) laptop for a student project.
Unfortunately it doesn't work, and even after consulting several forums and asking my teachers I could not solve the problem.

Here are the steps I've taken so far.
I downloaded Pentaho Data Integration and saved it under C:\Windows\pentaho. I also downloaded JDK 1.8.0_65 and created an environment variable PENTAHO_JAVA_HOME pointing to the JDK.

When I then try to run Spoon.bat, the console opens briefly and then closes again. After a few more seconds the Pentaho start screen (see attached pic) opens, but then it also closes.

pentaho data integration start screen.jpg


Here's what I get in the SpoonDebug.txt file:


DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.8.0_65
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.8.0_65\bin\java.exe


C:\Windows\pentaho\data-integration>"C:\Program Files\Java\jdk1.8.0_65\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-6.0.0.0-353.jar -lib ..\libswt\win64 /level:Debug
Invalid entry, ignoring 'C:\Windows\pentaho\data-integration\launcher\..\ui'
Invalid entry, ignoring 'C:\Windows\pentaho\data-integration\launcher\..\ui\images'
13:48:44,413 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at /C:/Windows/pentaho/data-integration/./syst ***
*** em/karaf//data1 ***
*** Karaf Port:8801 ***
*** OSGI Service Port:9050 ***
*******************************************************************************
Nov 23, 2015 1:48:46 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFORMATION: Lock acquired. Setting startlevel to 100
Nov 23, 2015 1:48:55 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFORMATION: Registered blueprint namespace handler for http://cxf.apache.org/ws/addressing
Nov 23, 2015 1:48:55 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFORMATION: Registered blueprint namespace handler for http://cxf.apache.org/ws/rm/manager
Nov 23, 2015 1:48:55 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFORMATION: Registered blueprint namespace handler for http://schemas.xmlsoap.org/ws/2005/02/rm/policy
Nov 23, 2015 1:48:55 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFORMATION: Registered blueprint namespace handler for http://cxf.apache.org/clustering
13:49:02,402 ERROR [KarafLifecycleListener] The Kettle Karaf Lifycycle Listener failed to execute properly. Releasing lifecycle hold, but some services may be unavailable.
2015/11/23 13:49:08 - Spoon - Logging is at level : Debugging
2015/11/23 13:49:08 - General - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : Error starting Spoon shell
2015/11/23 13:49:08 - General - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : java.lang.ExceptionInInitializerError
2015/11/23 13:49:08 - General - at org.pentaho.di.ui.spoon.Spoon.init(Spoon.java:813)
2015/11/23 13:49:08 - General - at org.pentaho.di.ui.spoon.Spoon.createContents(Spoon.java:9180)
2015/11/23 13:49:08 - General - at org.eclipse.jface.window.Window.create(Window.java:426)
2015/11/23 13:49:08 - General - at org.eclipse.jface.window.Window.open(Window.java:785)
2015/11/23 13:49:08 - General - at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9211)
2015/11/23 13:49:08 - General - at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:653)
2015/11/23 13:49:08 - General - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2015/11/23 13:49:08 - General - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2015/11/23 13:49:08 - General - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2015/11/23 13:49:08 - General - at java.lang.reflect.Method.invoke(Method.java:497)
2015/11/23 13:49:08 - General - at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
2015/11/23 13:49:08 - General - Caused by: java.util.MissingResourceException: Can't find bundle for base name ui/laf, locale de_DE
2015/11/23 13:49:08 - General - at java.util.ResourceBundle.throwMissingResourceException(ResourceBundle.java:1564)
2015/11/23 13:49:08 - General - at java.util.ResourceBundle.getBundleImpl(ResourceBundle.java:1387)
2015/11/23 13:49:08 - General - at java.util.ResourceBundle.getBundle(ResourceBundle.java:1082)
2015/11/23 13:49:08 - General - at org.pentaho.di.ui.spoon.XulSpoonResourceBundle.<clinit>(XulSpoonResourceBundle.java:65)
2015/11/23 13:49:08 - General - ... 11 more
stopping
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0




I'd appreciate any help.
Regards

Can't view Schedule perspective in Kettle (Enterprise Version)

$
0
0
Hi all.

I am using Pentaho Enterprise 5.3 on Linux CentOS 6.6, and when I open Kettle I can't find the Schedule perspective or the Schedule option in the Action menu.

Could somebody help me?

Thanks in advance!

How to find the kettle.properties file on a UNIX machine?

I have a kettle.properties file on my local Windows machine and now want to copy it to a UNIX machine, but I am not sure where it should go.
Please give me detailed notes on how to find kettle.properties on UNIX, how to restart Spoon on UNIX, and where to copy the kettle.properties file.

Your help is much appreciated.

Many Thanks in Advance.

Regards
Amar
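For reference, Spoon on UNIX reads kettle.properties from the .kettle directory under $HOME, or under $KETTLE_HOME when that variable is set. A minimal sketch, using a throwaway KETTLE_HOME so nothing real is overwritten (the property name is invented for illustration):

```shell
# PDI looks for kettle.properties in $KETTLE_HOME/.kettle, falling back
# to $HOME/.kettle. A throwaway KETTLE_HOME keeps this demo harmless.
export KETTLE_HOME=/tmp/kettle-demo
mkdir -p "$KETTLE_HOME/.kettle"
# Stand-in for the real file copied over from Windows (e.g. via scp/WinSCP):
printf 'MY_DB_HOST=db.example.com\n' > "$KETTLE_HOME/.kettle/kettle.properties"
cat "$KETTLE_HOME/.kettle/kettle.properties"
# Restart Spoon (spoon.sh) afterwards so the variables are picked up.
```

In practice you would omit KETTLE_HOME and copy the file straight into ~/.kettle/ for the user that runs Spoon.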

Command line to run jobs in UNIX

I just know we can run jobs from the command line with kettle.sh. Can someone please explain what to code in kettle.sh to run jobs in UNIX? Please give me an example as well.

Thanks and Regards
Amar.
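One note that may help: a standard PDI install ships kitchen.sh for running jobs and pan.sh for transformations; there is no kettle.sh. A sketch of building the kitchen.sh call, with all paths as placeholders for your environment; the script only prints and saves the command rather than executing it:

```shell
#!/bin/sh
# Build the kitchen.sh command line for a PDI job.
# PDI_HOME and the .kjb path are placeholders; adjust to your server.
PDI_HOME=/opt/pentaho/data-integration
JOB=/home/amar/jobs/daily_load.kjb
CMD="$PDI_HOME/kitchen.sh -file=$JOB -level=Basic -logfile=/tmp/daily_load.log"
# Save the command as a small wrapper script and show it; on the server,
# run the wrapper (or the command directly) instead of just printing it.
printf '%s\n' "$CMD" > /tmp/run_daily_load.sh
echo "$CMD"
# kitchen.sh returns a non-zero exit code on failure, handy for cron checks.
```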

reading files from cobol to csv

Hello, is it possible to export from COBOL to CSV using Kettle?

Does anyone have an example or a workaround?

Thanks in advance.

Linux: Unsupported major.minor version 51.0

Hello everyone,

We've been using an older version of PDI on a CentOS server to execute jobs; the new version 5.5 and version 6 had some useful plugins, so we upgraded.
Everything was going fine until I attempted to upgrade the server's version of PDI.

After uploading and replacing the contents of
Quote:

/app/pentaho/design-tools/data-integration
with the new version we got this message.


Quote:

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/pentaho/commons/launcher/Launcher : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.pentaho.commons.launcher.Launcher. Program will exit.


My research kept pointing to Java compatibility, so we tried upgrading the Java on the box from


Quote:

[dmi@sc-dmipdi1 ~]$ java -version
java version "1.7.0_45"
OpenJDK Runtime Environment (rhel-2.4.3.3.el6-x86_64 u45-b15)
OpenJDK 64-Bit Server VM (build 24.45-b08, mixed mode)
to the official Oracle Java 1.7:

Quote:

[dmi@sc-dmipdi1 ~]$ java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
and then, out of desperation, even tried downgrading back to 1.6:

Quote:

[dmi@sc-dmipdi1 ~]$ java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)


While I've worked indirectly with PDI for two years, I'm very new to this server-side setup.
I wondered if there was a settings file from the old PDI, installed before my time, so I compared as many files as I could, but didn't see anything relevant to our setup that needed to transfer.

We did save the old PDI at
Quote:

/app/pentaho/design-tools/data-integration-old/
and the kitchen.sh still works there with older scripts.

I am unsure where to turn next, so I was hoping someone here may have an answer or could point me in the right direction.

Please let me know if any additional information from my side would be helpful.

Thank you for your time and attention

Edit 1:
Note: I downloaded the PDI version 6.0 package directly from http://community.pentaho.com/projects/data-integration/
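A detail worth knowing here: class-file version 51.0 corresponds to Java 7, so the JVM that actually launches the PDI scripts is still older than 7, whatever `java -version` prints in your interactive shell. The launch scripts prefer PENTAHO_JAVA_HOME when it is set, so a stale value, or a different JVM on a cron or service account's PATH, can silently win. A small sketch; the JDK path is a placeholder:

```shell
# Class-file major versions start at 45 for Java 1.1, so major - 44
# gives the Java release that produced the class: 51 - 44 = 7.
MAJOR=51
echo "class file $MAJOR.0 needs Java $((MAJOR - 44)) or newer"
# Point PDI's launch scripts at a modern JDK (placeholder path), then
# re-run kitchen.sh from the same shell/account that showed the error:
export PENTAHO_JAVA_HOME=/usr/java/jdk1.7.0_79
echo "PENTAHO_JAVA_HOME=$PENTAHO_JAVA_HOME"
```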

Preventing server components from starting under Windows 8.

I have a Pentaho server on an Ubuntu VPS. I want to use the client tools (i.e. Spoon, Schema Workbench, etc.) on my local machine (Windows 8). I've done the install and can access the client tools; however, the server-side components (PDI and BA server) are using up 95% of my RAM and killing my machine. I've looked for the startup entry in the Windows registry but can't find it. How do I prevent the server components from starting under Windows 8?

How do I implement this report?

Tables:
Purchase Order( order_id, product_id, quantity, Status)

Product(product_id, name)



The report we want is this:


select name,
  (select quantity from PurchaseOrder where product_id = a.product_id and order_id = 'AA001') as qty_aa001,
  (select quantity from PurchaseOrder where product_id = a.product_id and order_id = 'AA002') as qty_aa002,
  (select quantity from PurchaseOrder where product_id = a.product_id and order_id = 'AA003') as qty_aa003,
  (select sum(quantity) from PurchaseOrder where product_id = a.product_id and order_id in ('AA001', 'AA002', 'AA003')) as total
from product a


It should display a table of products, with a column for each purchase order.
Ideally the font color would be rendered according to the order's status (received, pending).

Thank you very much.


What is the correct way to pass a JS function to the dataTable component - CDE

Dear all,

What is the correct way to execute the js code below in a given component:

$(document).ready( function () {
  // note: this.htmlObject is only defined inside a CDE component
  // (e.g. in its postExecution), not in a bare document.ready handler
  var table = $('#' + this.htmlObject + ' table').DataTable();
  var pageInfo = table.page.info();
  var interval = setInterval(function(){
    // "Next" ...
    table.page( 'next' ).draw( 'page' );
    if ( table.page()+1 === pageInfo.pages ) // page() is 0-based; pages is the page count
      clearInterval(interval);
  }, 3000); // 3 seconds
} );

Thanks in advance.

Pentaho Logging and Opening Files

Hi everyone,
I'm running PDI 5.2 and 6.0, and cannot seem to locate any log files. In versions before 5.2, the log files would be located in the %TEMP% folder, with a name starting with spoon and ending in .log. I do see several folders that are created now (starting with hsperfdata and jetty-localhost), but none of these contain log files, and they're deleted after the app is closed. If I start spoon in debugging mode, the SpoonDebug file is populated, but it doesn't really contain a lot of detail. The kinds of logging I am looking for are logs from jobs/transformations that are run, and application logs.
The main reason I'm looking for these logs (tonight, anyway) is that Spoon is completely locking up when I try to locate a transformation or file for a Job step. We do have a rather large repository (2,047 files in the public folder, with various numbers in the home directory, and we develop in a file-based repo, but export the final product to an enterprise repo), but using the Open dialog or locating a repository item typically took 4 or 5 minutes. Tonight I waited more than 20 minutes for the Open dialog with no response. So, I'd like to see if I can get some sort of log to see what is happening.
If you have a suggestion, that'd be great. Thanks!

Details
Spoon 5.2.1.0-148
Windows 10 x64
Java 7

Spoon 6
Java 8 (installed with PDI)

