I can't figure out why, but the Job Executor step in my transformation is only sending the last row. My job calls another transformation and passes the variables down to it, but it definitely receives only the last row. Eventually I want to spawn multiple copies of the Job Executor (using the "number of copies to start" option) to run things in parallel. What am I missing?
↧
Job Executor only sending last row
↧
Pentaho reports are very slow using MDX
We are using Pentaho for our reporting solution. We have developed a Mondrian schema with multiple cubes (data marts) and virtual cubes (combining all the data marts), and we use this virtual cube in our reports. Everything uses MS SQL Server database views as the source for the schema. We have configured all the performance-related settings in mondrian.properties based on the various recommendations available.
The issue we are facing is that our MDX query in the PRPT report takes a lot of time to execute, and in SQL Server Profiler we see that multiple SQL queries get executed. We are not able to figure out where the bottleneck is or whether we are missing some configuration.
It looks like something may be wrong in the Mondrian schema or at the configuration level.
Please let us know what we are missing.
We checked that the MS SQL Server views themselves are very fast, but going through Mondrian with MDX is very slow.
Please find below the list of properties in the mondrian.properties file located in the pentaho-solutions/system/mondrian/ folder. [IMG]D:\Mondrian_properties.png[/IMG]
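For reference (the poster's actual settings are only attached as an image), these are the mondrian.properties switches most often reviewed when MDX over relational views is slow; the values shown are illustrative, not the poster's:

```
# Illustrative values -- compare against your own mondrian.properties.
# Let Mondrian push TopCount/Filter/NonEmpty/CrossJoin evaluation down into SQL.
mondrian.native.topcount.enable=true
mondrian.native.filter.enable=true
mondrian.native.nonempty.enable=true
mondrian.native.crossjoin.enable=true
# Log the SQL Mondrian generates, to match it against SQL Server Profiler.
mondrian.rolap.generate.formatted.sql=true
```

With SQL logging on, each of the "multiple SQL queries" seen in Profiler can be traced back to the MDX construct that produced it.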
↧
↧
Scheduled Report and Email - re-sending if mail server is down
I have a number of scheduled reports that run and are emailed, set up via the scheduling option in the Pentaho 5 CE web UI.
At a report's run time, the report is generated and the email tries to send, but the mail server is down for a reboot or something similar. Is it possible for Pentaho to detect this and retry sending the report after an hour or so?
Right now the report is generated and shows up in the output folder, but there is no indication that the email failed until someone notices they did not receive it.
Any suggestions?
↧
Slow response time with Kettle 5.0.1 and MYSQL repository database
Hi,
We have Kettle 5.0.1 installed on a Windows 2008 R2 Server (a 64-bit OS), with MySQL 5.5.34 installed on the same machine.
We have created a repository in the MySQL server to host our ETL transformations and jobs, but when we try to open transformations or jobs from the repository, the response time is really slow. We also see slow response times when saving transformations or jobs.
Has anyone faced a similar issue using a repository-based approach? Please let me know the best way to resolve it.
Thanks,
Vishnu
↧
CDA cache issue using Mondrian access-control
Hi,
I'm using the latest version of CDA in Pentaho 5.0.1 CE.
It seems that different Mondrian roles share the same CDA cache.
In my Mondrian schema I've defined two roles: one can access all members of a hierarchy, the other only a subset of them.
The same query (which uses that hierarchy) executed under the two roles should return different results, but it doesn't... the result is the same!
Only if I clear the cache do I get the right behavior.
Any help is appreciated.
Luca
↧
↧
list environment variables
I need to create a list of a Kettle job's named parameters (names and values). Google searches returned solutions using ActiveX objects, which is not an option. Are there any other methods that will accomplish this?
Thanks.
Windows Spoon 4.4
↧
table input -> if exists combine with return to create calculated value -> insert/upd
I am very new to Pentaho and am trying to use the steps as efficiently as possible rather than resorting to an Execute SQL Script step.
I have a Table Input query (setA). The results of that query go to a lookup. If the lookup returns a row, I would like to replace some existing field values in setA with the lookup results summed into the corresponding columns of setA, then run an Insert/Update step with the newly calculated values. Anything that was not found should be inserted as-is, and anything that was found should be inserted with the values calculated from the lookup step.
Can anyone recommend the best practice to achieve this?
And if anything is unclear, please let me know! If there is a good resource with complex examples like this, that would be helpful too.
Thank you!
↧
Duplicated records on the filters
Good night!
Guys, I need your help to resolve a problem whose cause I don't know.
I'm using Mondrian with the User Console and the Saiku plugin. The records in my filters are duplicated, causing confusion when I try to select a value.
See the attached screenshots, which show better what I mean.
Is this a problem with the Mondrian cube? Can someone help me?
Thanks.
cidade.jpg
tipo_pessoa.jpg
↧
How to transfer Active Directory users to OpenLDAP using LDAP Input
Hi All,
I have an Active Directory user tree containing
cn=admin,ou=resource,dc=mydomain,dc=com
and I want to transfer that ou=resource to another OpenLDAP server:
cn=admin,dc=mydomain,dc=net
I see there is an LDAP Input step in Kettle, but I don't know how to use it. I tried a search base of cn=admin,dc=mydomain,dc=com and it failed.
Please help me solve this.
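Outside of Kettle, the same export/import can be sketched with the standard OpenLDAP command-line tools; the host names below are placeholders, and the bind DNs are taken from the question:

```shell
# Export the ou=resource subtree from Active Directory to LDIF
# (placeholder host; -W prompts for the bind password).
ldapsearch -LLL -H ldap://ad.mydomain.com \
  -D "cn=admin,dc=mydomain,dc=com" -W \
  -b "ou=resource,dc=mydomain,dc=com" "(objectClass=user)" > resource.ldif

# Import the exported entries into the target OpenLDAP server.
ldapadd -H ldap://ldap.mydomain.net \
  -D "cn=admin,dc=mydomain,dc=net" -W \
  -f resource.ldif
```

Note that Active Directory-specific attributes (objectClass=user, sAMAccountName, and so on) are not part of the standard OpenLDAP schema, so the LDIF usually needs attribute mapping before the import will succeed.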
↧
↧
How to consume a named set into report
Hi all,
I am very new to Pentaho and need your help.
I have created a named set called "Top Seller" to calculate the top 5 product sellers by unit price.
The formula I am using is:
TopCount([Product].[Product Category].MEMBERS, 5, [Measures].[Unit price])
Now I want to consume this named set in the report I am creating in the User Console.
I guess I need to use this named set (Top Seller) in some dimension in Schema Workbench, and then I would be able to use it.
But how can I use this named set in a dimension in Schema Workbench?
Thanks in advance
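For reference, Mondrian schemas declare named sets on a cube (or at the schema level, for use across virtual cubes) rather than inside a dimension; a sketch of the poster's set might look like the following (the cube name is a placeholder, and the formula is copied from the question):

```xml
<Cube name="Sales">
  <!-- ...dimensions and measures... -->
  <NamedSet name="Top Seller">
    <Formula>TopCount([Product].[Product Category].MEMBERS, 5, [Measures].[Unit price])</Formula>
  </NamedSet>
</Cube>
```

A set declared this way can then be referenced by name in MDX queries against that cube.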
↧
Unable to create log directory
Hi All,
I need your help with one more thread.
I am not able to see anything in the pentaho.log file, as it gives an error:
ERROR [hive.ql.exec.HiveHistory] Unable to create log directory
What should I do, and why is the log not being generated? Please reply.
Thanks
↧
Problems upgrading Pentaho BI Server from 3.6 to 5.0.1
Hi!
I deployed the new BI Server WAR into my Tomcat server. I work with:
MySQL 5, Tomcat 6, Pentaho 5.0.1, Java 7
I've added the configuration for my own datasources and LDAP and CAS security, and I'm trying to embed reporting in my Java GWT project already running on the server.
But when I try to launch a report, I find the error below in the log:
2014-03-11 10:20:41,626 ERROR [org.pentaho.platform.web.servlet.GenericServlet] GenericServlet.ERROR_0002 - Could not get content generator: KPI=XXX&solution=xxx&path=dashboard/general&name=XXX.prpt&locale=%257Blocale%257D&PERIODO=2014-01&COD_CONTRATTO=XXX%202013&CONTRATTO=Contratto%20XXX%202013
java.lang.NullPointerException
at org.pentaho.reporting.platform.plugin.ReportContentGenerator.getMimeType(ReportContentGenerator.java:160)
at org.pentaho.platform.engine.services.solution.SimpleContentGenerator.createContent(SimpleContentGenerator.java:52)
at org.pentaho.platform.web.servlet.GenericServlet.doGet(GenericServlet.java:261)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:304)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:378)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109)
at org.springframework.security.intercept.web.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.ExceptionTranslationFilter.doFilterHttp(ExceptionTranslationFilter.java:101)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.SecurityStartupFilter.doFilter(SecurityStartupFilter.java:103)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.providers.anonymous.AnonymousProcessingFilter.doFilterHttp(AnonymousProcessingFilter.java:105)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.RequestParameterAuthenticationFilter.doFilter(RequestParameterAuthenticationFilter.java:169)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.basicauth.BasicProcessingFilter.doFilterHttp(BasicProcessingFilter.java:174)
at org.pentaho.platform.web.http.security.PentahoBasicProcessingFilter.doFilterHttp(PentahoBasicProcessingFilter.java:88)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.AbstractProcessingFilter.doFilterHttp(AbstractProcessingFilter.java:278)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.ui.logout.LogoutFilter.doFilterHttp(LogoutFilter.java:89)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.security.HttpSessionReuseDetectionFilter.doFilter(HttpSessionReuseDetectionFilter.java:134)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.context.HttpSessionContextIntegrationFilter.doFilterHttp(HttpSessionContextIntegrationFilter.java:235)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.jasig.cas.client.session.SingleSignOutFilter.doFilter(SingleSignOutFilter.java:76)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:240)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:203)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:562)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:164)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:108)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at it.twoheads.valve.JNDIValve.invoke(JNDIValve.java:82)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:379)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:242)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:259)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:281)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)
My pentahoObject.spring.xml is configured as follows:
<bean id="ISolutionEngine" class="org.pentaho.platform.engine.services.solution.SolutionEngine" scope="prototype"/>
<bean id="ISolutionRepository" class="org.pentaho.platform.repository.solution.dbbased.DbBasedSolutionRepository" scope="session" />
<bean id="ISolutionRepositoryService" class="org.pentaho.platform.repository.solution.SolutionRepositoryServiceImpl" scope="session" />
<bean id="IContentRepository" class="org.pentaho.platform.repository.content.ContentRepository" scope="session" />
<bean id="IUnifiedRepository" class="org.pentaho.platform.repository2.unified.fs.FileSystemBackedUnifiedRepository" scope="prototype" />
<bean id="IAuditEntry" class="org.pentaho.platform.engine.services.audit.AuditFileEntry" scope="singleton"/>
<bean id="IUITemplater" class="org.pentaho.platform.web.http.WebTemplateHelper" scope="singleton"/>
<!-- Concrete implementation of IMetadataDomainRepository -->
<bean id="IMetadataDomainRepositoryImpl"
class="org.pentaho.platform.plugin.services.metadata.SecurityAwarePentahoMetadataDomainRepository" scope="singleton">
<!-- <constructor-arg>
<ref bean="unifiedRepository"/>
</constructor-arg> -->
</bean>
<!-- Wrap the concrete IMetadataDomainRepository implementation with one that caches domains per session -->
<!-- <bean id="IMetadataDomainRepository"
class="org.pentaho.platform.plugin.services.metadata.SessionCachingMetadataDomainRepository" scope="singleton">
<constructor-arg>
<ref bean="IMetadataDomainRepositoryImpl"/>
</constructor-arg>
</bean> -->
<!-- <bean id="IMetadataDomainRepository" class="org.pentaho.platform.plugin.services.metadata.SecurityAwarePentahoMetadataDomainRepository" scope="singleton"/> -->
<!-- Use this schema factory to disable PMD security -->
<bean id="IUserSettingService" class="org.pentaho.platform.repository.usersettings.UserSettingService">
<!-- <constructor-arg ref="unifiedRepository"/> -->
</bean>
<!-- <bean id="IUserSettingService" class="org.pentaho.platform.repository.usersettings.UserSettingService" scope="session" /> -->
<bean id="IEmailService" class="org.pentaho.platform.plugin.services.email.EmailService" scope="session"/>
<bean id="file" class="org.pentaho.platform.plugin.outputs.FileOutputHandler" scope="session"/>
<bean id="contentrepo" class="org.pentaho.platform.repository.content.ContentRepositoryOutputHandler" scope="session"/>
<bean id="vfs-ftp" class="org.pentaho.platform.plugin.outputs.ApacheVFSOutputHandler" scope="session"/>
<bean id="IAclPublisher" class="org.pentaho.platform.engine.security.acls.AclPublisher" scope="singleton"/>
<bean id="IAclVoter" class="org.pentaho.platform.engine.security.acls.voter.PentahoBasicAclVoter" scope="singleton"/>
<bean id="IVersionHelper" class="org.pentaho.platform.util.VersionHelper" scope="singleton"/>
<bean id="ICacheManager" class="org.pentaho.platform.plugin.services.cache.CacheManager" scope="singleton"/>
<bean id="IScheduler2" class="org.pentaho.platform.scheduler2.quartz.QuartzScheduler" scope="singleton"/>
<bean id="IBlockoutManager" class="org.pentaho.platform.scheduler2.blockout.PentahoBlockoutManager" scope="singleton"/>
<bean id="IConditionalExecution" class="org.pentaho.platform.plugin.condition.javascript.ConditionalExecution"
scope="prototype"/>
<bean id="IMessageFormatter" class="org.pentaho.platform.engine.services.MessageFormatter" scope="singleton"/>
<!--
IDBDatasourceService - options are:
org.pentaho.platform.engine.services.connection.datasource.dbcp.JndiDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.NonPooledOrJndiDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.NonPooledDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledOrJndiDatasourceService (Default option)
-->
<bean id="IDatasource"
class="org.pentaho.platform.repository.datasource.Datasource"
scope="singleton"/>
<bean id="IPasswordService" class="org.pentaho.platform.util.Base64PasswordService" scope="singleton"/>
<bean id="IPluginProvider" class="org.pentaho.platform.plugin.services.pluginmgr.SystemPathXmlPluginProvider"
scope="singleton"/>
<bean id="IPluginManager" class="org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager"
scope="singleton"/>
<bean id="IPluginResourceLoader" class="org.pentaho.platform.plugin.services.pluginmgr.PluginResourceLoader"
scope="singleton"/>
<bean id="IPluginPerspectiveManager"
class="org.pentaho.platform.plugin.services.pluginmgr.perspective.DefaultPluginPerspectiveManager"
scope="singleton"/>
<bean id="IServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.DefaultServiceManager"
scope="singleton">
<property name="serviceTypeManagers">
<list>
<ref bean="gwtServiceManager"/>
<ref bean="axisServiceManager"/>
</list>
</property>
</bean>
<bean id="ITempFileDeleter" class="org.pentaho.platform.web.http.session.SessionTempFileDeleter" scope="prototype"/>
<bean id="gwtServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.GwtRpcServiceManager"
scope="singleton"/>
<bean id="axisServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.AxisWebServiceManager"
scope="singleton"/>
<bean id="IChartBeansGenerator" class="org.pentaho.platform.plugin.action.chartbeans.DefaultChartBeansGenerator"
scope="singleton"/>
<bean id="systemStartupSession" class="org.pentaho.platform.engine.security.session.TrustedSystemStartupSession" scope="prototype" />
<!-- Data connections. Connections objects should be accessed through PentahoConnectionFactory,
not directly from the PentahoObjectFactory. -->
<bean id="connection-SQL" class="org.pentaho.platform.plugin.services.connections.sql.SQLConnection"
scope="prototype">
<property name="fallBackToNonscrollableOnError" value="true"/>
</bean>
<bean id="connection-MDX" class="org.pentaho.platform.plugin.services.connections.mondrian.MDXConnection"
scope="prototype">
<property name="useExtendedColumnNames" value="true"/>
</bean>
<bean id="connection-MDXOlap4j" class="org.pentaho.platform.plugin.services.connections.mondrian.MDXOlap4jConnection" scope="prototype" />
<bean id="connection-XML" class="org.pentaho.platform.plugin.services.connections.xquery.XQConnection"
scope="prototype"/>
<bean id="connection-HQL" class="org.pentaho.platform.plugin.services.connections.hql.HQLConnection"
scope="prototype"/>
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!-- <alias name="unifiedRepository" alias="IUnifiedRepository"/> -->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!-- <alias name="backingRepositoryLifecycleManager" alias="IBackingRepositoryLifecycleManager"/>-->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!--<alias name="authorizationPolicy" alias="IAuthorizationPolicy"/>-->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!--<alias name="roleAuthorizationPolicyRoleBindingDaoTxn" alias="IRoleAuthorizationPolicyRoleBindingDao"/>-->
<!-- actual bean defined in applicationContext-pentaho-security-*.xml; aliased here -->
<!-- <pen:bean id="IUserRoleListService" class="org.pentaho.platform.api.engine.IUserRoleListService">
<pen:attributes>
<pen:attr key="providerName" value="${security.provider}"/>
</pen:attributes>
<pen:publish as-type="INTERFACES">
<pen:attributes>
<pen:attr key="priority" value="50"/>
</pen:attributes>
</pen:publish>
</pen:bean>
<pen:bean id="UserDetailsService" class="org.springframework.security.userdetails.UserDetailsService">
<pen:attributes>
<pen:attr key="providerName" value="${security.provider}"/>
</pen:attributes>
</pen:bean>-->
<bean id="ehCacheUserCache" class="org.springframework.security.providers.dao.cache.EhCacheBasedUserCache">
<property name="cache">
<bean class="org.springframework.cache.ehcache.EhCacheFactoryBean">
<property name="cacheManager">
<bean class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean">
<property name="shared" value="true"/>
</bean>
</property>
<property name="cacheName" value="userCache"/>
</bean>
</property>
</bean>
<bean id="cachingUserDetailsService" class="org.pentaho.platform.plugin.services.security.userrole.PentahoCachingUserDetailsService">
<constructor-arg>
<ref bean="UserDetailsService"/>
</constructor-arg>
<constructor-arg ref="tenantedUserNameUtils"/>
<property name="userCache" ref="ehCacheUserCache"/>
<!-- <pen:publish as-type="INTERFACES">
<pen:attributes>
<pen:attr key="priority" value="50"/>
</pen:attributes>
</pen:publish>-->
</bean>
<!-- actual bean defined in applicationContext-spring-security.xml; aliased here -->
<!-- <alias name="authenticationManager" alias="AuthenticationManager"/> -->
<bean id="IMondrianCatalogService" class="org.pentaho.platform.plugin.action.mondrian.catalog.MondrianCatalogHelper"
scope="singleton"/>
<bean id="IDatabaseDialectService" class="org.pentaho.database.service.DatabaseDialectService" scope="singleton"/>
<bean id="IDatasourceMgmtService" class="org.pentaho.platform.repository.datasource.DatasourceMgmtService" scope="prototype">
<!-- <constructor-arg ref="unifiedRepository"/>
<constructor-arg ref="IDatabaseDialectService"/> -->
</bean>
<!-- This mondrian user/role mapper assumes that roles from the platform also exist in mondrian -->
<!--
Disabled by default in 3.5.2. In trunk, this should be enabled.
-->
<bean id="Mondrian-UserRoleMapper"
name="Mondrian-One-To-One-UserRoleMapper"
class="org.pentaho.platform.plugin.action.mondrian.mapper.MondrianOneToOneUserRoleListMapper"
scope="singleton" />
<!-- <bean id="ReportCache" class="org.pentaho.reporting.platform.plugin.cache.NullReportCache" scope="singleton"/>
<bean id="PentahoNameGenerator" class="org.pentaho.reporting.platform.plugin.repository.TempDirectoryNameGenerator"
scope="prototype"/>-->
<bean id="MondrianConnectionProvider"
class="org.pentaho.reporting.platform.plugin.connection.PentahoMondrianConnectionProvider" scope="singleton"/>
<bean id="metadataqueryexec-SQL"
class="org.pentaho.platform.plugin.services.connections.metadata.sql.SqlMetadataQueryExec" scope="prototype"/>
<bean id="sqlGenerator" class="org.pentaho.metadata.query.impl.sql.SqlGenerator" scope="prototype"/>
<bean id="IThemeManager" class="org.pentaho.platform.web.html.themes.DefaultThemeManager" scope="singleton"/>
<bean id="ICacheExpirationRegistry" class="org.pentaho.platform.plugin.services.cache.CacheExpirationRegistry"
scope="singleton"/>
<bean id="IUserFilesComponent" class="org.pentaho.platform.web.refactor.UserFilesComponent" scope="session" />
<bean id="IBackgroundExecution" class="org.pentaho.platform.scheduler.SecurityAwareBackgroundExecutionHelper" scope="singleton" />
<bean id="BackgroundSubscriptionExecution" class="org.pentaho.platform.scheduler.SecurityAwareBackgroundSubscriptionHelper" scope="singleton" />
<bean id="ISubscriptionRepository" class="org.pentaho.platform.repository.subscription.SubscriptionRepository" scope="singleton" />
<bean id="ISubscriptionScheduler" class="org.pentaho.platform.scheduler.QuartzSubscriptionScheduler" scope="singleton" />
<bean id="INavigationComponent" class="org.pentaho.platform.uifoundation.component.xml.NavigationComponent" scope="prototype" />
<bean id="IVersionCheckDataProvider" class="org.pentaho.platform.plugin.services.versionchecker.PentahoVersionCheckDataProvider" scope="prototype" />
<bean id="IMenuProvider" class="org.pentaho.platform.web.html.HtmlMenuProvider" scope="singleton" />
</beans>
I also have not understood how to set up unifiedRepository, because for resetRepository I use ISolutionRepository.
Thanks for your help!
at org.pentaho.platform.web.http.filters.HttpSessionPentahoSessionIntegrationFilter.doFilter(HttpSessionPentahoSessionIntegrationFilter.java:265)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.wrapper.SecurityContextHolderAwareRequestFilter.doFilterHttp(SecurityContextHolderAwareRequestFilter.java:91)
at org.springframework.security.ui.SpringSecurityFilter.doFilter(SpringSecurityFilter.java:53)
at org.springframework.security.util.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:390)
at org.springframework.security.util.FilterChainProxy.doFilter(FilterChainProxy.java:175)
at org.springframework.security.util.FilterToBeanProxy.doFilter(FilterToBeanProxy.java:99)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.jasig.cas.client.session.SingleSignOutFilter.doFilter(SingleSignOutFilter.java:76)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.pentaho.platform.web.http.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:113)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:242)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:240)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:203)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:562)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:164)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:108)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at it.twoheads.valve.JNDIValve.invoke(JNDIValve.java:82)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:379)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:242)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:259)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:281)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:619)
My pentahoObjects.spring.xml is configured as follows:
<bean id="ISolutionEngine" class="org.pentaho.platform.engine.services.solution.SolutionEngine" scope="prototype"/>
<bean id="ISolutionRepository" class="org.pentaho.platform.repository.solution.dbbased.DbBasedSolutionRepository" scope="session" />
<bean id="ISolutionRepositoryService" class="org.pentaho.platform.repository.solution.SolutionRepositoryServiceImpl" scope="session" />
<bean id="IContentRepository" class="org.pentaho.platform.repository.content.ContentRepository" scope="session" />
<bean id="IUnifiedRepository" class="org.pentaho.platform.repository2.unified.fs.FileSystemBackedUnifiedRepository" scope="prototype" />
<bean id="IAuditEntry" class="org.pentaho.platform.engine.services.audit.AuditFileEntry" scope="singleton"/>
<bean id="IUITemplater" class="org.pentaho.platform.web.http.WebTemplateHelper" scope="singleton"/>
<!-- Concrete implementation of IMetadataDomainRepository -->
<bean id="IMetadataDomainRepositoryImpl"
class="org.pentaho.platform.plugin.services.metadata.SecurityAwarePentahoMetadataDomainRepository" scope="singleton">
<!-- <constructor-arg>
<ref bean="unifiedRepository"/>
</constructor-arg> -->
</bean>
<!-- Wrap the concrete IMetadataDomainRepository implementation with one that caches domains per session -->
<!-- <bean id="IMetadataDomainRepository"
class="org.pentaho.platform.plugin.services.metadata.SessionCachingMetadataDomainRepository" scope="singleton">
<constructor-arg>
<ref bean="IMetadataDomainRepositoryImpl"/>
</constructor-arg>
</bean> -->
<!-- <bean id="IMetadataDomainRepository" class="org.pentaho.platform.plugin.services.metadata.SecurityAwarePentahoMetadataDomainRepository" scope="singleton"/> -->
<!-- Use this schema factory to disable PMD security -->
<bean id="IUserSettingService" class="org.pentaho.platform.repository.usersettings.UserSettingService">
<!-- <constructor-arg ref="unifiedRepository"/> -->
</bean>
<!-- <bean id="IUserSettingService" class="org.pentaho.platform.repository.usersettings.UserSettingService" scope="session" /> -->
<bean id="IEmailService" class="org.pentaho.platform.plugin.services.email.EmailService" scope="session"/>
<bean id="file" class="org.pentaho.platform.plugin.outputs.FileOutputHandler" scope="session"/>
<bean id="contentrepo" class="org.pentaho.platform.repository.content.ContentRepositoryOutputHandler" scope="session"/>
<bean id="vfs-ftp" class="org.pentaho.platform.plugin.outputs.ApacheVFSOutputHandler" scope="session"/>
<bean id="IAclPublisher" class="org.pentaho.platform.engine.security.acls.AclPublisher" scope="singleton"/>
<bean id="IAclVoter" class="org.pentaho.platform.engine.security.acls.voter.PentahoBasicAclVoter" scope="singleton"/>
<bean id="IVersionHelper" class="org.pentaho.platform.util.VersionHelper" scope="singleton"/>
<bean id="ICacheManager" class="org.pentaho.platform.plugin.services.cache.CacheManager" scope="singleton"/>
<bean id="IScheduler2" class="org.pentaho.platform.scheduler2.quartz.QuartzScheduler" scope="singleton"/>
<bean id="IBlockoutManager" class="org.pentaho.platform.scheduler2.blockout.PentahoBlockoutManager" scope="singleton"/>
<bean id="IConditionalExecution" class="org.pentaho.platform.plugin.condition.javascript.ConditionalExecution"
scope="prototype"/>
<bean id="IMessageFormatter" class="org.pentaho.platform.engine.services.MessageFormatter" scope="singleton"/>
<!--
IDBDatasourceService - options are:
org.pentaho.platform.engine.services.connection.datasource.dbcp.JndiDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.NonPooledOrJndiDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.NonPooledDatasourceService
org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledOrJndiDatasourceService (Default option)
-->
<bean id="IDatasource"
class="org.pentaho.platform.repository.datasource.Datasource"
scope="singleton"/>
<bean id="IPasswordService" class="org.pentaho.platform.util.Base64PasswordService" scope="singleton"/>
<bean id="IPluginProvider" class="org.pentaho.platform.plugin.services.pluginmgr.SystemPathXmlPluginProvider"
scope="singleton"/>
<bean id="IPluginManager" class="org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager"
scope="singleton"/>
<bean id="IPluginResourceLoader" class="org.pentaho.platform.plugin.services.pluginmgr.PluginResourceLoader"
scope="singleton"/>
<bean id="IPluginPerspectiveManager"
class="org.pentaho.platform.plugin.services.pluginmgr.perspective.DefaultPluginPerspectiveManager"
scope="singleton"/>
<bean id="IServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.DefaultServiceManager"
scope="singleton">
<property name="serviceTypeManagers">
<list>
<ref bean="gwtServiceManager"/>
<ref bean="axisServiceManager"/>
</list>
</property>
</bean>
<bean id="ITempFileDeleter" class="org.pentaho.platform.web.http.session.SessionTempFileDeleter" scope="prototype"/>
<bean id="gwtServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.GwtRpcServiceManager"
scope="singleton"/>
<bean id="axisServiceManager" class="org.pentaho.platform.plugin.services.pluginmgr.servicemgr.AxisWebServiceManager"
scope="singleton"/>
<bean id="IChartBeansGenerator" class="org.pentaho.platform.plugin.action.chartbeans.DefaultChartBeansGenerator"
scope="singleton"/>
<bean id="systemStartupSession" class="org.pentaho.platform.engine.security.session.TrustedSystemStartupSession" scope="prototype" />
<!-- Data connections. Connections objects should be accessed through PentahoConnectionFactory,
not directly from the PentahoObjectFactory. -->
<bean id="connection-SQL" class="org.pentaho.platform.plugin.services.connections.sql.SQLConnection"
scope="prototype">
<property name="fallBackToNonscrollableOnError" value="true"/>
</bean>
<bean id="connection-MDX" class="org.pentaho.platform.plugin.services.connections.mondrian.MDXConnection"
scope="prototype">
<property name="useExtendedColumnNames" value="true"/>
</bean>
<bean id="connection-MDXOlap4j" class="org.pentaho.platform.plugin.services.connections.mondrian.MDXOlap4jConnection" scope="prototype" />
<bean id="connection-XML" class="org.pentaho.platform.plugin.services.connections.xquery.XQConnection"
scope="prototype"/>
<bean id="connection-HQL" class="org.pentaho.platform.plugin.services.connections.hql.HQLConnection"
scope="prototype"/>
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!-- <alias name="unifiedRepository" alias="IUnifiedRepository"/> -->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!-- <alias name="backingRepositoryLifecycleManager" alias="IBackingRepositoryLifecycleManager"/>-->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!--<alias name="authorizationPolicy" alias="IAuthorizationPolicy"/>-->
<!-- actual bean defined in repository.spring.xml; aliased here -->
<!--<alias name="roleAuthorizationPolicyRoleBindingDaoTxn" alias="IRoleAuthorizationPolicyRoleBindingDao"/>-->
<!-- actual bean defined in applicationContext-pentaho-security-*.xml; aliased here -->
<!-- <pen:bean id="IUserRoleListService" class="org.pentaho.platform.api.engine.IUserRoleListService">
<pen:attributes>
<pen:attr key="providerName" value="${security.provider}"/>
</pen:attributes>
<pen:publish as-type="INTERFACES">
<pen:attributes>
<pen:attr key="priority" value="50"/>
</pen:attributes>
</pen:publish>
</pen:bean>
<pen:bean id="UserDetailsService" class="org.springframework.security.userdetails.UserDetailsService">
<pen:attributes>
<pen:attr key="providerName" value="${security.provider}"/>
</pen:attributes>
</pen:bean>-->
<bean id="ehCacheUserCache" class="org.springframework.security.providers.dao.cache.EhCacheBasedUserCache">
<property name="cache">
<bean class="org.springframework.cache.ehcache.EhCacheFactoryBean">
<property name="cacheManager">
<bean class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean">
<property name="shared" value="true"/>
</bean>
</property>
<property name="cacheName" value="userCache"/>
</bean>
</property>
</bean>
<bean id="cachingUserDetailsService" class="org.pentaho.platform.plugin.services.security.userrole.PentahoCachingUserDetailsService">
<constructor-arg>
<ref bean="UserDetailsService"/>
</constructor-arg>
<constructor-arg ref="tenantedUserNameUtils"/>
<property name="userCache" ref="ehCacheUserCache"/>
<!-- <pen:publish as-type="INTERFACES">
<pen:attributes>
<pen:attr key="priority" value="50"/>
</pen:attributes>
</pen:publish>-->
</bean>
<!-- actual bean defined in applicationContext-spring-security.xml; aliased here -->
<!-- <alias name="authenticationManager" alias="AuthenticationManager"/> -->
<bean id="IMondrianCatalogService" class="org.pentaho.platform.plugin.action.mondrian.catalog.MondrianCatalogHelper"
scope="singleton"/>
<bean id="IDatabaseDialectService" class="org.pentaho.database.service.DatabaseDialectService" scope="singleton"/>
<bean id="IDatasourceMgmtService" class="org.pentaho.platform.repository.datasource.DatasourceMgmtService" scope="prototype">
<!-- <constructor-arg ref="unifiedRepository"/>
<constructor-arg ref="IDatabaseDialectService"/> -->
</bean>
<!-- This mondrian user/role mapper assumes that roles from the platform also exist in mondrian -->
<!--
Disabled by default in 3.5.2. In trunk, this should be enabled.
-->
<bean id="Mondrian-UserRoleMapper"
name="Mondrian-One-To-One-UserRoleMapper"
class="org.pentaho.platform.plugin.action.mondrian.mapper.MondrianOneToOneUserRoleListMapper"
scope="singleton" />
<!-- <bean id="ReportCache" class="org.pentaho.reporting.platform.plugin.cache.NullReportCache" scope="singleton"/>
<bean id="PentahoNameGenerator" class="org.pentaho.reporting.platform.plugin.repository.TempDirectoryNameGenerator"
scope="prototype"/>-->
<bean id="MondrianConnectionProvider"
class="org.pentaho.reporting.platform.plugin.connection.PentahoMondrianConnectionProvider" scope="singleton"/>
<bean id="metadataqueryexec-SQL"
class="org.pentaho.platform.plugin.services.connections.metadata.sql.SqlMetadataQueryExec" scope="prototype"/>
<bean id="sqlGenerator" class="org.pentaho.metadata.query.impl.sql.SqlGenerator" scope="prototype"/>
<bean id="IThemeManager" class="org.pentaho.platform.web.html.themes.DefaultThemeManager" scope="singleton"/>
<bean id="ICacheExpirationRegistry" class="org.pentaho.platform.plugin.services.cache.CacheExpirationRegistry"
scope="singleton"/>
<bean id="IUserFilesComponent" class="org.pentaho.platform.web.refactor.UserFilesComponent" scope="session" />
<bean id="IBackgroundExecution" class="org.pentaho.platform.scheduler.SecurityAwareBackgroundExecutionHelper" scope="singleton" />
<bean id="BackgroundSubscriptionExecution" class="org.pentaho.platform.scheduler.SecurityAwareBackgroundSubscriptionHelper" scope="singleton" />
<bean id="ISubscriptionRepository" class="org.pentaho.platform.repository.subscription.SubscriptionRepository" scope="singleton" />
<bean id="ISubscriptionScheduler" class="org.pentaho.platform.scheduler.QuartzSubscriptionScheduler" scope="singleton" />
<bean id="INavigationComponent" class="org.pentaho.platform.uifoundation.component.xml.NavigationComponent" scope="prototype" />
<bean id="IVersionCheckDataProvider" class="org.pentaho.platform.plugin.services.versionchecker.PentahoVersionCheckDataProvider" scope="prototype" />
<bean id="IMenuProvider" class="org.pentaho.platform.web.html.HtmlMenuProvider" scope="singleton" />
</beans>
I also don't understand how to set up unifiedRepository, because for resetRepository I use ISolutionRepository.
Thanks for your help!
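Regarding unifiedRepository: in a stock 5.x pentahoObjects.spring.xml the actual unifiedRepository bean is defined in repository.spring.xml and is only aliased here, and the repository-backed services receive it as a constructor argument. A sketch of what that wiring looks like when un-commented — the bean names are taken from the commented-out lines above, so verify them against your own repository.spring.xml before enabling:

```xml
<!-- Sketch: wire the repository defined in repository.spring.xml.
     The alias and constructor-arg names mirror the commented-out lines
     in this file; confirm they match your configuration. -->
<alias name="unifiedRepository" alias="IUnifiedRepository"/>

<bean id="IMetadataDomainRepositoryImpl"
      class="org.pentaho.platform.plugin.services.metadata.SecurityAwarePentahoMetadataDomainRepository"
      scope="singleton">
  <constructor-arg>
    <ref bean="unifiedRepository"/>
  </constructor-arg>
</bean>
```

Note that this replaces the FileSystemBackedUnifiedRepository prototype bean shown earlier; mixing both definitions for IUnifiedRepository in one file is a likely source of the NullPointerException in ReportContentGenerator.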
↧
Repository - Missing folder after short time
Hi,
After creating a folder in the PUC repository, it's visible, but only for a short time (< 5 minutes). Why :mad: ?
Thank you for your help!
↧
↧
Problem with MDX queries in CDE
Hi everyone!
I am trying to display some information from a data source I created from a database, but I am having trouble with the queries. I get results when I run them using JPivot or Saiku, but when I copy the query into the CDE "mdx over mondrianJndi" data source, it says "No data found".
I have read a post or two about this, but none of them provided a solution. These are the steps I follow:
1. My star schema and the data source I created look like this:
schema and datasource.jpg
2. Then I configure a Saiku OLAP Wizard:
mdx_query.jpg
3. I also configure the mdx query settings:
mdx_query_conf.jpg
4. Then I attach the query to a pie chart, which is displayed in a panel:
chart_conf.jpg
5. And the final result is 'No data found':
result.jpg
Any ideas of what I am doing wrong?
Thanks in advance and kind regards,
Alain
↧
Job level database transactions
Hi All,
Are job-level database transactions supported in the Pentaho 5.0.1 stable community version?
I am eagerly waiting for a reply.
↧
Not Working in Command Prompt but working with Eclipse
Hi,
I have created a utility (using Java) which is responsible for executing Kettle transformations and jobs. I built it using the Eclipse IDE.
From Eclipse, I am able to execute the transformations and jobs successfully.
But when I create a runnable jar file and try to execute a job, I get the exception shown below. Can anyone please help me with this?
INFO 11-03 18:17:18,171 - Using "C:\Users\KATSA05\AppData\Local\Temp\vfs_cache" as temporary files store.
Job Path: C:\UniversalETLService\universal_etl\etl_repo\DASHBOARD_DATA_LOAD\Dashboard_Data_Loading.kjb
ERROR 11-03 18:17:18,685 - null.0 - Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
ERROR 11-03 18:17:18,686 - null.0 - org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
at org.pentaho.di.job.entry.JobEntryCopy.<init>(JobEntryCopy.java:110)
at org.pentaho.di.job.JobMeta.loadXML(JobMeta.java:922)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:726)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:693)
at com.ca.businessreporting.etl.Utils.executeJob(Utils.java:564)
at com.ca.businessreporting.etl.UniversalETLService.executeJobs(UniversalETLService.java:428)
at com.ca.businessreporting.etl.UniversalETLService.init(UniversalETLService.java:559)
at com.ca.businessreporting.etl.UniversalETLService.main(UniversalETLService.java:579)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.simontuffs.onejar.Boot.run(Boot.java:342)
at com.simontuffs.onejar.Boot.main(Boot.java:168)
org.pentaho.di.core.exception.KettleXMLException:
Unable to load the job from XML file [C:\UniversalETLService\universal_etl\etl_repo\DASHBOARD_DATA_LOAD\Dashboard_Data_Loading.kjb]
Unable to load job info from XML node
Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
No valid step/plugin specified (jobPlugin=null) for TRANS
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:734)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:693)
at com.ca.businessreporting.etl.Utils.executeJob(Utils.java:564)
at com.ca.businessreporting.etl.UniversalETLService.executeJobs(UniversalETLService.java:428)
at com.ca.businessreporting.etl.UniversalETLService.init(UniversalETLService.java:559)
at com.ca.businessreporting.etl.UniversalETLService.main(UniversalETLService.java:579)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.simontuffs.onejar.Boot.run(Boot.java:342)
at com.simontuffs.onejar.Boot.main(Boot.java:168)
Caused by: org.pentaho.di.core.exception.KettleXMLException:
Unable to load job info from XML node
Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
No valid step/plugin specified (jobPlugin=null) for TRANS
at org.pentaho.di.job.JobMeta.loadXML(JobMeta.java:968)
at org.pentaho.di.job.JobMeta.<init>(JobMeta.java:726)
... 11 more
Caused by: org.pentaho.di.core.exception.KettleXMLException:
Unable to read Job Entry copy info from XML node : org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
No valid step/plugin specified (jobPlugin=null) for TRANS
at org.pentaho.di.job.entry.JobEntryCopy.<init>(JobEntryCopy.java:134)
at org.pentaho.di.job.JobMeta.loadXML(JobMeta.java:922)
... 12 more
Caused by: org.pentaho.di.core.exception.KettleStepLoaderException:
No valid step/plugin specified (jobPlugin=null) for TRANS
at org.pentaho.di.job.entry.JobEntryCopy.<init>(JobEntryCopy.java:110)
... 13 more
Thanks & Regards,
Samanth
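The "No valid step/plugin specified (jobPlugin=null) for TRANS" error usually appears when the Kettle environment — and with it the plugin registry — was never initialized before JobMeta is constructed; running inside Eclipse can mask this, while the standalone jar exposes it. A minimal sketch of the likely fix, assuming a file-based .kjb and the 5.x API (constructor signatures vary slightly between Kettle versions):

```java
// Sketch (assumption): register the TRANS/JOB entry plugins by initializing
// the Kettle environment before loading any job or transformation metadata.
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJob {
    public static void main(String[] args) throws Exception {
        // Must run once, before any JobMeta/TransMeta is created.
        KettleEnvironment.init();

        // 5.x-style constructor; older versions take extra arguments.
        JobMeta jobMeta = new JobMeta(
            "C:\\UniversalETLService\\universal_etl\\etl_repo\\DASHBOARD_DATA_LOAD\\Dashboard_Data_Loading.kjb",
            null /* repository */);

        Job job = new Job(null /* repository */, jobMeta);
        job.start();
        job.waitUntilFinished();

        if (job.getErrors() > 0) {
            throw new RuntimeException("Job finished with errors");
        }
    }
}
```

If the error persists, also check that your fat-jar packaging (One-JAR, per the stack trace) preserves the plugin registration resources inside the Kettle jars (e.g. kettle-job-entries.xml); some bundlers merge or drop duplicate resource files.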
↧
Change CDE refresh method
Hi all,
Apologies if this question has already been asked in this forum, but I've searched a lot here without finding anything :(.
My problem:
I'm building a dashboard with the new CDE editor in Pentaho 5. Is there a way to remove the spinner and/or the transparency effect when the dashboard refreshes itself?
Why? Because I have a Google map that shows some information that changes over time, and I want to avoid the continuous auto-refreshing effect.
Thanks in advance :)
↧
↧
Race Conditions In My Transformation
Hi all,
I'm using Kettle to import data from our clients' systems, which we are integrating our product with. It's a legacy system that was built to be cross-database, so the underlying DB uses almost no database features; instead, the system manually creates GUIDs for IDs, which I have to replicate in Kettle.
The way I implemented this: I do a DB Lookup on our client table to match it against the client data in the import. If the lookup returns a GUID, I save it in a field and use it in an Insert/Update step later. If the client is new, the lookup returns a blank client GUID, and I use a Modified Java Script step to generate a new GUID, which I then use in my Insert/Update step.
There are multiple entities I have to perform this basic process for. It works fine until one new client is repeated multiple times in a single import file, which, given the nature of the data I have to work with, is unavoidable. Kettle then queries the DB, gets back two blank GUIDs, generates two new GUIDs, and ends up inserting the same client into the DB twice.
I tried to solve this by calling my transformation from a Single Threader in another transformation, but it said Blocking Steps are not supported in a single threader. So I removed them, and then it caused DB deadlocks when I called it, which I wouldn't expect in a single thread, but apparently it happens.
I'm sure I can't be the first person to encounter this. Any tips or advice? Is there a way to check the data stream for duplicate fields or the like?
I've attached two pictures of my transformation with and without the blocking steps.
with_blocking_steps.jpg
without_blocking_steps.jpg
Cheers and thanks in advance for the help all,
Wally.
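One way to avoid the race, sketched below as plain JavaScript suitable for a Modified Java Script step: cache the GUIDs you generate per natural key, so repeated rows for the same new client reuse one GUID instead of each generating its own. The field name clientCode and the generateGuid helper are hypothetical — adapt them to your actual lookup fields. (Alternatively, a Unique rows step on the client key before the lookup removes the duplicates entirely.)

```javascript
// Sketch: per-run GUID cache for a Kettle "Modified Java Script" step.
// Assumption: "clientCode" is a natural-key field identifying the client,
// and "lookupGuid" is the (possibly blank) result of the DB Lookup step.
var seenGuids = {}; // persists across rows within one transformation run

function guidForClient(clientCode, lookupGuid) {
  // The client already exists in the database: keep its GUID.
  if (lookupGuid != null && lookupGuid != "") {
    return lookupGuid;
  }
  // New client: generate a GUID only the first time this key is seen,
  // so duplicate rows in the same import file share one GUID.
  if (!(clientCode in seenGuids)) {
    seenGuids[clientCode] = generateGuid();
  }
  return seenGuids[clientCode];
}

function generateGuid() {
  // Simple RFC 4122 v4-style GUID; a stand-in for the system's own generator.
  return "xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g, function (c) {
    var r = (Math.random() * 16) | 0;
    var v = c == "x" ? r : (r & 0x3) | 0x8;
    return v.toString(16);
  });
}
```

This only deduplicates within one run; if two parallel step copies process the same file, they each have their own cache, so keep the number of copies at 1 for this step or partition the rows by client key.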
↧
Pentaho 4.6 parameterize query in Analyzer Report
Hi, I need a simple answer:
Is it possible in Pentaho BA 4.6 to parameterize a query in an Analyzer Report?
It is available in v. 4.8.
I would appreciate a response.
Regards.
Nico
↧
Tomcat won't start
Hi everyone, this is my first post in the community.
Here's my situation: on my personal machine (32-bit) I installed and configured Pentaho without any problems. However, I now need to install it on a machine running Windows Server 2008 (64-bit). Everything goes fine until the second Tomcat window launches; then the window simply closes and does nothing.
Any idea what might be going on?
Thanks in advance, everyone ;)
↧