Channel: Pentaho Community Forums
Viewing all 16689 articles

Decimal point is suppressed on text column of text file

I am reading a text file in Kettle with the Text file input step. Each column of the file is determined by its position in the row.

I have a column with following text:

-76.81889

The Text file input step processes my file but erases the decimal point. I get this:

Code:

-76 81889
Decimal point is suppressed...

I don't know what is happening...

Thanks for your help.

Does anyone encounter this exception when launching Spoon?

Hi guys,

I'm a total newcomer to Pentaho Kettle for ETL.
When I launched the application I got the error below:

10:25:36,775 ERROR [KarafLifecycleListener] The Kettle Karaf Lifecycle Listener failed to execute properly after waiting for 100 seconds. Releasing lifecycle hold, but some services may be unavailable.

My environment is CentOS 5.11 and the Pentaho version is pdi-ce-7.0.

dotted line in cccMetricLineChart

Hi All,

We are using cccMetricLineChart. We want to make the line between the last two points dotted while the remaining segments stay solid.


The link below shows an example chart; a dotted line is required at the place highlighted by the arrow.

http://share.pho.to/AaRrF

Is there any option for this in the MetricLineChart component? Please help me with this one.

Log Access Login on Pentaho 6.0

Is it possible to retrieve login information for any user?
It would be great if you could even log which resources were run, but simply logging the login would be fine.

Thanks

Kitchen not loading job from file

Hey folks,

Hate to be that guy whose first post is asking for help, but... deadlines, y'know?

I've got my first job created, but our use case requires running jobs from files. Here is the kitchen output:

$ ./kitchen.sh -norep=Y -level Rowlevel -f /Users/1500164/ETLa/ProofOfConceptJob.kjb ~/ETLa/sample_users.csv
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
18:04:11,674 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
18:04:11,801 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 2 at /Users/1500164/Downloads/data-integration/. ***
*** /system/karaf/caches/kitchen/data-1 ***
*** Karaf Port:8803 ***
*** OSGI Service Port:9052 ***
*******************************************************************************
Jan 12, 2017 6:04:12 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2017/01/12 18:04:12 - Kitchen - Logging is at level : Row Level (very detailed)
2017/01/12 18:04:12 - Kitchen - Start of run.
2017/01/12 18:04:12 - Kitchen - Allocate new job.
ERROR: Kitchen can't continue because the job couldn't be loaded.

Running OS X 10.11.6

Java is
$ java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)

Any ideas? I also tried it without the extra file argument, with the same result. As you can see, the logging is not particularly helpful.

Pentaho CDE - manipulate component parameter

Hi Guys,

After setting a parameter, I call Dashboards.getParameterValue('param_previous_month') and it returns the values: "10","20","30".
Now I need to manipulate these values and transform them into "40","50","60".
I tried JavaScript array functions but I got an error.

Thanks in advance for the help
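For illustration, a minimal sketch of such a transformation, assuming the parameter holds an array of numeric strings and the intended mapping is simply adding 30 to each value (both that mapping and the `Dashboards` calls shown in the comment are assumptions, not taken from the post):

```javascript
// Hypothetical helper: turn an array of numeric strings such as
// ["10", "20", "30"] into ["40", "50", "60"] by adding 30 to each.
function transformMonths(values) {
    return values.map(function (v) {
        return String(Number(v) + 30);
    });
}

// In a CDE hook this might be used roughly like:
//   var months = Dashboards.getParameterValue('param_previous_month');
//   Dashboards.setParameter('param_previous_month', transformMonths(months));
```

Note that `values.map(...)` throws if the parameter value is a single string rather than an array, which is one common reason array functions "get an error" in this situation.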

Unable to find repository

Hi

Need quick help!

I am trying to expose the data from browser by hitting the url "http://localhost:9080/pentaho-di/kettle/executeTrans/?rep=PentahoRepo&user=admin&pass=password&trans=%2Fhome%2Fadmin%2Fservlet.ktr".

The servlet.ktr transformation is saved in the repository named PentahoRepo, under the home folder and then the admin folder. I am getting the error below:

<webresult>
<result>ERROR</result>
<message>Unexpected error executing the transformation:
org.pentaho.di.core.exception.KettleException:
Unable to find repository: PentahoRepo

 at org.pentaho.di.www.ExecuteTransServlet.openR...)
 at org.pentaho.di.www.ExecuteTransServlet.doGet...)
 at org.pentaho.di.www.CarteServlet.doGet(C...)
 at com.pentaho.pdi.www.RoleBoundCarteServlet.doG...)
 at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
 at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
 at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
 at org.pentaho.platform.web.http.filters.PentahoWebContextFilter.doFilter(PentahoWebContextFilter.java:185)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
 at org.pentaho.platform.web.http.filters.PentahoRequestContextFilter.doFilter(PentahoRequestContextFilter.java:87)
 at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
 at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
 at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:399)
 at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
 at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
 at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
 at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
 at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
 at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
 at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
 at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
 at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
 at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:191)
 at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:411)
 at

Google Analytics Service Account Support

Hi All,


I am working on Google Analytics using PDI and am stuck at one step. I am referring to the official Pentaho document - http://wiki.pentaho.com/display/EAI/Google+Analytics . Now, on the Google console side, when I try to create a new service account, it by default assigns XXXXX@YYYY.iam.gserviceaccount.com. Instead of this I must have XXXXX@developer.gserviceaccount.com


When I asked about this in the official Google Analytics forum - http://stackoverflow.com/questions/4...ervice-account - they suggested that @developer.gserviceaccount.com is an old format and I can't get a service account in that format; I will have to use XXXXX@YYYY.iam.gserviceaccount.com. When I go ahead and use XXXXX@YYYY.iam.gserviceaccount.com, PDI shows the error "Authentication failure occurred when contacting Google analytics. Please verify the credentials in the service account email and key file fields as well as your network connectivity." There's no problem with my network.


Can someone share some insight on whether PDI supports this new format (XXXXX@YYYY.iam.gserviceaccount.com)? If yes, how should I use it? And if not, what is the workaround for this issue?


P.S. - I have granted access to this service account email on the Google Analytics side and downloaded the .p12 file, which is referenced from the input step.

PDI 6: commons-vfs2

Hi,

I'm currently migrating from PDI 4 to 6. I was wondering about commons-vfs2-2.1-20150824.jar in the lib directory. What is the difference from commons-vfs2-2.1.jar?
Thanks and Regards,

Nicolas

PDI 7 and MS SQL

Hi Sanket Kelkar,

Thanks for your solution, which helped me fix the same issue listed above. But I got another issue: PDI cannot create a database connection to MS SQL Server 2008. It suggests installing the SQL Server driver class (a JAR). I do not know how to do this or what I need to watch out for. Can you help me? Many thanks!

My PDI env:
PDI: ce 7.0
JDK: 1.8 or above
JRE: 1.8 or above
OS: Win7-64bit

Pentaho Community Edition - Supports Dashboard

Hi Team,

Is the Pentaho Community Edition available for version 5.0.4?
Does it support dashboards (xactions)?

Thanks
LekshmI

Branching select normalisation - unexpected error

Hi,

I have pairs of amount columns and pairs of date columns


  • One pair "Month_1" & "Month_2" contain Amounts
  • The other pair "Month1_Date" & "Month2_Date" contain the dates.


I would like to normalise this and produce an "Amount" column and a "Date" column. So I figured that I would create two branching "Select values" steps and then merge them together. However, while my branch to select Amount & Date for month 1 works fine, the branch to select Amount & Date for month 2 fails due to "Unexpected Error java.lang.ClassCastException". Month2_Date contains some null dates, which I have tried filtering out in a preceding step, but I still get the same error.

Can you suggest what is going wrong?

Thanks,

Andy

Initialization Error BA Server 7

Hello guys,
I am testing version 7 of BA Server.
Some errors occur and the server does not start correctly.

localhost log:

13-Jan-2017 14:49:12.156 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log No Spring WebApplicationInitializer types detected on classpath
13-Jan-2017 14:49:15.427 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log Initializing Spring root WebApplicationContext
13-Jan-2017 14:49:46.952 SEVERE [localhost-startStop-1] org.apache.tomcat.util.descriptor.web.SecurityConstraint.findUncoveredHttpMethods For security constraints with URL pattern [/jsp/*] only the HTTP methods [POST GET] are covered. All other methods are uncovered.

Catalina log:
13-Jan-2017 14:49:47.312 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/pentaho has finished in 53,401 ms
13-Jan-2017 14:49:47.313 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/sw-style
13-Jan-2017 14:49:47.499 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
13-Jan-2017 14:49:47.501 SEVERE [localhost-startStop-1] org.apache.jasper.EmbeddedServletOptions.<init> The scratchDir you specified: /opt/pentaho/v7/pentaho-server/tomcat/work/Catalina/localhost/sw-style is unusable.
13-Jan-2017 14:49:47.501 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/sw-style has finished in 188 ms
13-Jan-2017 14:49:47.502 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/ROOT
13-Jan-2017 14:49:47.677 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
13-Jan-2017 14:49:47.678 SEVERE [localhost-startStop-1] org.apache.jasper.EmbeddedServletOptions.<init> The scratchDir you specified: /opt/pentaho/v7/pentaho-server/tomcat/work/Catalina/localhost/ROOT is unusable.
13-Jan-2017 14:49:47.678 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/ROOT has finished in 176 ms
13-Jan-2017 14:49:47.679 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/pentaho-style
13-Jan-2017 14:49:47.854 INFO [localhost-startStop-1] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
13-Jan-2017 14:49:47.855 SEVERE [localhost-startStop-1] org.apache.jasper.EmbeddedServletOptions.<init> The scratchDir you specified: /opt/pentaho/v7/pentaho-server/tomcat/work/Catalina/localhost/pentaho-style is unusable.
13-Jan-2017 14:49:47.855 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/pentaho/v7/pentaho-server/tomcat/webapps/pentaho-style has finished in 176



Browser:


"The following errors were detected
One or more system listeners failed. These are set in the systemListeners.xml.
org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error while trying to execute startup sequence for org.pentaho.platform.engine.services.connection.datasource.dbcp.DynamicallyPooledDatasourceSystemListener

Please see the server console for more details on each error detected."


Thank you for your help! :)

André.

Do not update row - create new version when value changes

Hello!

I need to input data from a CSV file generated every day into a very simple PostgreSQL table with only 4 attributes:

Code:

CREATE TABLE public.sensor (   
    sensor_id            SERIAL NOT NULL,
    sensor_code            INTEGER NOT NULL,
    sensor_name            VARCHAR(20) NOT NULL,
    sensor_valid_to        TIMESTAMP NULL
)
;
ALTER TABLE public.sensor ADD CONSTRAINT sensor_pk PRIMARY KEY (sensor_id);

When I insert a previously nonexistent sensor, I set the sensor_valid_to attribute to NULL, indicating that this is the current version of the sensor.

But the CSV source file can sometimes change the sensor_name of an existing sensor while keeping the same sensor_code. For historical purposes I cannot update the row in the database table; I need to create a new row with the same code and the changed name. The new row must have sensor_valid_to set to NULL, and the older row must have it filled with the timestamp of the ETL execution. For example:

A new sensor is created
CSV file 1:

Code:

154892;Bottom Rotational Speed;

INSERT INTO public.sensor(sensor_code, sensor_name) VALUES(154892, 'Bottom Rotational Speed');

The same sensor created before has the name updated
CSV file 2:

Code:

154892;Bottom Rotational Speed - Inverted Axis;

UPDATE public.sensor SET sensor_valid_to = CURRENT_TIMESTAMP WHERE sensor_code = 154892 AND sensor_valid_to IS NULL;
INSERT INTO public.sensor(sensor_code, sensor_name) VALUES(154892, 'Bottom Rotational Speed - Inverted Axis');

Any ideas on how to do this using PDI transformations?
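The versioning rule described above can be sketched as follows. This is a hypothetical in-memory illustration of the logic, not PDI code; in PDI itself, the "Dimension lookup/update" step implements this kind of Type 2 slowly-changing-dimension handling (closing the old version and inserting a new current one).

```javascript
// Sketch of the rule: when an incoming (code, name) differs from the
// current row for that code, close the current row by stamping validTo,
// then append a new current row with validTo = null.
function applySensorUpdate(rows, incoming, etlTimestamp) {
    var current = rows.find(function (r) {
        return r.code === incoming.code && r.validTo === null;
    });
    if (current && current.name === incoming.name) {
        return rows; // nothing changed, keep the current version
    }
    if (current) {
        current.validTo = etlTimestamp; // close the old version
    }
    rows.push({ code: incoming.code, name: incoming.name, validTo: null });
    return rows;
}
```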

Pentaho Analyzer Mondrian - return all 2-dimension values independent from facts

I have to produce a report in Pentaho Analyzer (server) with the following components: dimension store, dimension date, and measure sales.
My aim is to present all stores and all dates, and when there are no sales on a date we show the number 0 (by default).
In Pentaho Schema Workbench I can do this with one dimension, using a calculated member with the COALESCEEMPTY() function: COALESCEEMPTY((Measures.[Sales], [Date].[Year].CURRENTMEMBER, [Store].[Store].CURRENTMEMBER), 0)
But I am not able to do it with 2 dimensions, because when I use both dimensions in Pentaho Server (version 5) they are related through the existing sales (facts), and only the stores that have sales on some date appear (even if I don't include the measure).
Simply speaking, I want to make a LEFT JOIN from the 2 dimensions to the fact table, showing empty cells with a default value.
Thank you.

Pentaho CDE Piechart Click Action: Dashboards not defined

Hello guys, I'm trying to build a simple interaction between a chart and a table using a parameter.
The operation is simple: when I click the pie chart, the table shows the type for the selected category.
The table in the sample above is created using a parameter on the select component.

I'm trying to use this function in my pie chart click action and it returns the error: Dashboards is not defined
Code:

function(scene){
    var vars = scene.vars;
    var c = vars.category.value;
    var v = vars.value.value;
    Dashboards.fireChange(param_tipe, v);
}

Any tips?
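One possible fix, assuming a newer RequireJS/AMD-style CDE dashboard where the legacy global `Dashboards` object is not defined: fire the change through `this.dashboard` instead, and pass the parameter name as a string (the original snippet passes `param_tipe` as a bare identifier):

```javascript
// Hypothetical click action for the pie chart: `this` is assumed to be
// the chart component, which exposes the owning dashboard.
function clickAction(scene) {
    var vars = scene.vars;
    var v = vars.value.value;
    // Pass the parameter name as a string, not a bare identifier:
    this.dashboard.fireChange("param_tipe", v);
}
```

In a legacy (non-AMD) dashboard the equivalent call would be `Dashboards.fireChange("param_tipe", v)`; either way, the unquoted `param_tipe` in the original snippet would itself raise a ReferenceError.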

Row level security in reports possible?

Hi,

is it possible to have row-level security in Pentaho reports?

As I understand it, the report only executes SQL... so there is no additional security.
Or am I wrong?

Integrating Pentaho BI 7 CE with CAS

The instructions in the doc were followed:
https://help.pentaho.com/Documentati...curity/060/000

But at startup I always get a SEVERE problem:

Code:

14:29:28,880 ERROR [ContextLoader] Context initialization failedorg.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Failed to import bean definitions from relative location [applicationContext-spring-security-cas.xml]
Offending resource: file [/var/opt/pentaho70/pentaho-server/pentaho-solutions/system/pentaho-spring-beans.xml]; nested exception is org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from file [/var/opt/pentaho70/pentaho-server/pentaho-solutions/system/applicationContext-spring-security-cas.xml]; nested exception is java.lang.RuntimeException: Cannot find class for publish type: INTERFACES specified on publish of bean id: casAuthenticationProvider.memoryUserDetailsService
        at org.springframework.beans.factory.parsing.FailFastProblemReporter.error(FailFastProblemReporter.java:70)
        at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:85)
        at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:76)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.importBeanDefinitionResource(DefaultBeanDefinitionDocumentReader.java:259)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseDefaultElement(DefaultBeanDefinitionDocumentReader.java:184)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:169)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:142)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:94)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:392)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:336)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:304)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:181)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:217)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:188)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:125)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:94)
        at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:129)
        at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:612)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:513)
        at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:444)
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:326)
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:107)
        at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
        at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1092)
        at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1834)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from file [/var/opt/pentaho70/pentaho-server/pentaho-solutions/system/applicationContext-spring-security-cas.xml]; nested exception is java.lang.RuntimeException: Cannot find class for publish type: INTERFACES specified on publish of bean id: casAuthenticationProvider.memoryUserDetailsService
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:414)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:336)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:304)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.importBeanDefinitionResource(DefaultBeanDefinitionDocumentReader.java:243)
        ... 32 more
Caused by: java.lang.RuntimeException: Cannot find class for publish type: INTERFACES specified on publish of bean id: casAuthenticationProvider.memoryUserDetailsService
        at org.pentaho.platform.engine.core.system.objfac.spring.BeanPublishParser.decorate(BeanPublishParser.java:142)
        at org.springframework.beans.factory.xml.NamespaceHandlerSupport.decorate(NamespaceHandlerSupport.java:99)
        at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.decorateIfRequired(BeanDefinitionParserDelegate.java:1448)
        at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.decorateBeanDefinitionIfRequired(BeanDefinitionParserDelegate.java:1435)
        at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.decorateBeanDefinitionIfRequired(BeanDefinitionParserDelegate.java:1415)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.processBeanDefinition(DefaultBeanDefinitionDocumentReader.java:301)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseDefaultElement(DefaultBeanDefinitionDocumentReader.java:190)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:169)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:142)
        at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:94)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:392)
        ... 35 more
Caused by: java.lang.ClassNotFoundException: org.springframework.security.cas.authentication.CasAuthenticationProvider
        at org.pentaho.platform.engine.core.system.objfac.spring.BeanPublishParser.findClass(BeanPublishParser.java:163)
        at org.pentaho.platform.engine.core.system.objfac.spring.BeanPublishParser.decorate(BeanPublishParser.java:101)
        ... 46 more
12-Jan-2017 14:29:28.889 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more listeners failed to start. Full details will be found in the appropriate container log file
12-Jan-2017 14:29:35.112 INFO [localhost-startStop-1] org.apache.catalina.util.SessionIdGeneratorBase.createSecureRandom Creation of SecureRandom instance for session ID generation using [SHA1PRNG] took [6,221] milliseconds.
12-Jan-2017 14:29:35.113 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal Context [/pentaho] startup failed due to previous errors

Same issue was reported in StackOverflow:

http://stackoverflow.com/questions/4...ition-with-cas

Any ideas?

Thanks...

Database join parameters and NULL values

Hi,

I am performing a database join and I am having an issue with NULL values.

In the DB join, the SQL uses nested CASE WHEN statements, one of which is something like CASE WHEN field1 IS NOT NULL THEN 1 ELSE 2

However, when the step is run, all NULL values fail this check and return 1 when it should be 2.

The SQL uses parameters from a previous step, so I think what is happening is that the parameter is being passed through not as NULL but as "something", so when the condition runs it is no longer NULL.

Has anyone come across this before? Is there any way around it?

I have considered creating a step to convert NULL to something that would not naturally show up but it would require a lot of rewriting which I would like to avoid if possible :)

Thanks

Integrating Pentaho BA 7 Community Edition with CAS

Recently I have been working to integrate Pentaho BA (BI Server) 7 CE with a CAS server. I followed the official Pentaho documentation, https://help.pentaho.com/Documentati...50/010/060/000 , but unfortunately it doesn't work! Actually, it's kind of hard to make sense of section 5 of the documentation. Where do I have to set
Code:

casAuthenticationProvider.MemoryUserDetailsService
in my configuration? All I get from the engine is an error saying it couldn't find the CasAuthenticationProvider class. I added all the necessary JARs as the Pentaho documentation indicated. The error is:
Code:

Cannot find class for publish type: INTERFACES specified on publish of  bean id: casAuthenticationProvider

