Channel: Pentaho Community Forums

Doing GeoLocation in PDI - Pentaho Data Integration (Kettle)

Geo Location

Geo location is something we often need in ETL work. And while we had a step that worked in PDI 5.x and earlier releases, we just noticed it's not currently working.

Until this morning, that is :p

I just forked Matt's initial project and applied the relevant changes to make it compatible with Pentaho 6+

The basics

Well, easy to understand... We have an IP address, we want to know where it comes from!


Geolocation transformation - Let me see if it finds out where I am...


Once I execute this, I get the following result:


Yep, this is where I am...


I am indeed in Porto Salvo, Portugal, so this is right. Can't get any easier than this!




Making it work

So, how to make this work? First, you have to get the plugin from the PDI marketplace


This plugin is available through the marketplace. Just go ahead and install it.


PDI Marketplace - Get your goodies from here


After installing it and restarting PDI, you'll see the GeoIP Lookup step in the Lookup folder. Configuring it is straightforward: point it to the stream field containing the IP address, point to the IP database files, and specify which fields you want back:


Configuring the step


Getting the IP Database files

You need to get the files from MaxMind, and from my experience these guys do a great job here. They have some great commercial offerings but also a GeoLite database for country and city location. You can get them from here


Getting the GeoIP data files


And you should be done! This even works great in a MapReduce job.
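For reference, the lookup this step performs can also be reproduced directly with MaxMind's Java library. The sketch below is purely illustrative (the marketplace plugin bundles its own reader and uses the files configured in the step); it assumes the current GeoIP2 API and a GeoLite2-City.mmdb file on disk:

import java.io.File;
import java.net.InetAddress;
import com.maxmind.geoip2.DatabaseReader;
import com.maxmind.geoip2.model.CityResponse;

public class GeoIpLookupSketch {
    public static void main(String[] args) throws Exception {
        // Open a MaxMind city database downloaded from maxmind.com (path is hypothetical).
        DatabaseReader reader = new DatabaseReader.Builder(new File("GeoLite2-City.mmdb")).build();
        // Resolve a sample address to country and city, just like the step does per row.
        CityResponse response = reader.city(InetAddress.getByName("8.8.8.8"));
        System.out.println(response.getCountry().getName() + " / " + response.getCity().getName());
        reader.close();
    }
}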







Get a file with FTP when file name contains white space

Hello,

How do I get a file from a Unix server (AIX) when the file name contains white space?
I'm using the "Get a file with FTP" job entry.
The regular expression is .*txt$
The filename is: AGT_013 E001_17062016145232.txt

Error message (translated from French):
Unexpected error: com.enterprisedt.net.ftp.FTPException: 550 E001_17062016145232.txt: A file or directory in the path name does not exist.
Error while retrieving files from the FTP server: The success condition was violated! We have 1 error(s)
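For what it's worth, the wildcard itself is not the problem: .*txt$ does match a name containing a space, and the 550 reply above only echoes the part of the name after the space, which suggests the name is being truncated somewhere between the entry and the server. A quick, purely illustrative check of the pattern in plain Java:

import java.util.regex.Pattern;

public class FtpWildcardCheck {
    public static void main(String[] args) {
        String filename = "AGT_013 E001_17062016145232.txt";
        // The wildcard from the job entry matches despite the embedded space...
        System.out.println(Pattern.matches(".*txt$", filename));                     // true
        // ...as does a more specific pattern that spells the space out.
        System.out.println(Pattern.matches("AGT_013\\sE001_\\d+\\.txt$", filename)); // true
    }
}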

Thanks,

Best regards,

Franck

dynamic transformation creation & parallel execution

Hello,

I have a requirement where I read 'x' (any integer) from a table and, based on 'x', have to create 'x' transformations dynamically. The template for these dynamic transformations will be similar. I also have to make sure that these transformations execute in parallel.

Please provide some inputs.
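This is normally driven from a job, but as a sketch of the underlying mechanics, the Kettle Java API can launch several copies of a template transformation in parallel. Illustrative only; the template path and variable name are hypothetical, and 'x' would really come from the driver table:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class ParallelTemplateRunner {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();
        int x = 5; // would normally be read from the driver table
        Thread[] workers = new Thread[x];
        for (int i = 0; i < x; i++) {
            final int copy = i;
            workers[i] = new Thread(() -> {
                try {
                    // Each thread runs its own copy of the same template transformation.
                    TransMeta meta = new TransMeta("template.ktr");     // hypothetical template
                    Trans trans = new Trans(meta);
                    trans.setVariable("COPY_NR", String.valueOf(copy)); // parameterise each copy
                    trans.execute(null);
                    trans.waitUntilFinished();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
            workers[i].start();
        }
        for (Thread t : workers) {
            t.join();
        }
    }
}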

Thanks,
Sachin

Date formatting

Hi there!

First of all, sorry if this post doesn't belong here. This is my first day here in the forums so I'm not sure where things are located...

I started working on a BI project in my company using Pentaho. The development team was already set up, so I came late to the party. They write XML files and upload them to the Pentaho servers, so we're not using any of the Pentaho design tools.

With this in mind, I need to add some properties to a level. Some of these properties are dates, and when I add them they're shown in the report as datetime objects. I only need to display the date portion as 'DD/MM/YY'. I've tried to search online for a solution, but most people are using the Report Designer, so I haven't found a solution for this.

Could anyone tell me how to fix this problem?
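For reference, in Java pattern terms the target format is dd/MM/yy; a minimal sketch of the formatting itself (how to wire it into the schema XML is the open question and is not shown here):

import java.text.SimpleDateFormat;
import java.util.Date;

public class DateOnlyFormat {
    public static void main(String[] args) {
        // 'dd/MM/yy' keeps only the date portion of a datetime value.
        System.out.println(new SimpleDateFormat("dd/MM/yy").format(new Date()));
    }
}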

Thanks!

ColorMap with MetricDot

Hi!
I'm trying to create a Metric Dot chart with dynamic colors according to the value of a specific column.

For some reason the colorMap property is not working; it's just being ignored.

How can I set these dynamic colors? For example: value 10 → black, value 20 → green, and so on.

Below is my example attempting this; any help is appreciated!

new pvc.MetricDotChart({
    canvas: 'cccMetricDotExample5',
    width: 600,
    height: 550,
    crosstabMode: false,
    readers: 'quantity, sales, expectedSales, previousSales',

    // Data
    ignoreNulls: false,

    // Visual Roles
    visualRoles: {
        // Main plot's
        x: 'quantity',
        y: 'sales',
        color: 'expectedSales',
        size: 'previousSales'
    },

    // Chart/Interaction
    colorMap: {
        '10': 'green',
        '20': 'black',
        '30': 'red',
        '40': 'yellow'
    }
})
.setData({
    "resultset": [
        [1, 1, 10, 10],
        [2, 2, 20, 20],
        [3, 3, 30, 30],
        [4, 4, 40, 40]
    ],
    "metadata": [
        { "colType": "Numeric", "colName": "quantity" },
        { "colType": "Numeric", "colName": "sales" },
        { "colType": "Numeric", "colName": "expectedSales" },
        { "colType": "Numeric", "colName": "previousSales" }
    ]
})
.render();

Parameter limitation in ETL Metadata injection

Hi,
I'm using ETL Metadata Injection to load data from an Excel file into a table. As the structure of the Excel file differs from time to time, I'm using injection: the table holds the column names, which are joined with the Excel headers, and the matching data is pushed through. However, I'm getting an error if there are more than 2000 columns. Can that limit be increased?

Pentaho Reporting Output step using a prpt from a BA Server

Hi,

I'm trying to set up a Pentaho Reporting Output step in a transformation using a report that is stored in a BA Server repository. Whenever I use a filesystem path for "Report definition file" the step works, but if I point it at a report on the BA Server by setting "Report definition file" to something like "http://pentaho_user:password@localhost:8080/pentaho/api/repos/%3Apublic%3Asales_report_day.prpt" I get an HTTP 401 Unauthorized response. I also tested substituting the "%3A" with a literal colon (:), with the same result. I have the user and password embedded in the URL since there is no option to specify them anywhere in the Pentaho Reporting Output step. Testing the URL in a browser like Firefox works as expected.

I have also tested with "http://localhost:8080/pentaho/api/repos/%3Apublic%3Asales_report_day.prpt?userid=pentaho_user&password=password" without success.

What should be the correct value for the "Report definition file"?

The transformation itself is on the BA Server too.
BA Server 6.1
PDI 6.1
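For what it's worth, credentials embedded in a URL (user:password@host) are only honoured if whatever opens the URL explicitly parses them; plain java.net.HttpURLConnection, for instance, ignores the userinfo part and expects an Authorization header instead. A small sketch for testing the endpoint outside PDI, reusing the URL and example credentials from above (how the Reporting Output step itself opens the URL may differ):

import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PrptFetchCheck {
    public static void main(String[] args) throws Exception {
        String url = "http://localhost:8080/pentaho/api/repos/%3Apublic%3Asales_report_day.prpt";
        String auth = Base64.getEncoder().encodeToString(
                "pentaho_user:password".getBytes(StandardCharsets.UTF_8));
        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        // Send Basic auth as a header instead of relying on user:password@ in the URL.
        con.setRequestProperty("Authorization", "Basic " + auth);
        System.out.println("HTTP " + con.getResponseCode());
    }
}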

Embedded CDE DASHBOARD in HTML - Losing images and styles -

I am embedding a dashboard (one included in the User Console). The basic elements such as graphs and selectors are rendered, but the images and styles are lost.

When opened in the User Console it is displayed correctly.

When embedded, the images and styles are lost.

Is this the expected behaviour? Am I doing something wrong? My code:
<script type="text/javascript" src="http://localhost:8080/pentaho/plugin/pentaho-cdf-dd/api/renderer/cde-embed.js"></script>

<script type="text/javascript">
    require(['/pentaho/plugin/pentaho-cdf-dd/api/renderer/getDashboard?path=/public/plugin-samples/pentaho-cdf-dd/pentaho-cdf-dd-require/tests/ExportButton/exportbutton.wcdf'],
        function(Dash) {
            var dash = new Dash("content1");
            dash.render();
        });
</script>

Thanks in advance!

Join multiple databases with subreports

Hi,

I have a Redshift database and a MySQL database, and I would like to join information from both together.

There is data in the Redshift one that does not exist in the MySQL one, so I would like to be able to pass this information over to the MySQL one.

I know this can be done with PDI, but I was hoping to avoid that as I am not that familiar with it. Another method I saw was using subreports with different data sources.

Is it possible to pass information from master to subreport, or vice versa? I know parameters can be shared, but since this information is not in one of the data sources, I am struggling to see how to use parameters for this.

Any advice is greatly appreciated.

CDE Dashboard Error

I discovered that the sqlserver.hibernate.cfg.xml file had two mappings, one for CE and one for EE. I just had to comment out the one for EE and restart the server. It would be nice if this step could be added to the documentation.

SOLVED

I have installed pentaho-server-ce-7.0.0.0-25 and set up Pentaho to use MS SQL for Hibernate, Jackrabbit and Quartz. The console loads, and I was able to log in and change the admin password. I have set up my data source and made the necessary table joins. After selecting Tools > App Builder, the App Builder tab loads and I get the error "Could not load dashboard: null". I am assuming this is because I have not created a dashboard. After selecting "New CDE Dashboard", the CDE Dashboard tab loads with "New Dashboard" as the name. Nothing happens when I select Save or Save As. Below are the errors from the log file:

2017-01-11 08:38:50,200 ERROR [pt.webdetails.cpf.InterPluginCall] No bean factory for plugin: pentaho-cdf
2017-01-11 08:38:50,203 ERROR [pt.webdetails.cdf.dd.model.inst.writer.cdfrunjs.dashboard.legacy.CdfRunJsDashboardWriter] Failed to get cdf includes
2017-01-11 08:38:52,858 ERROR [pt.webdetails.cpf.InterPluginCall] No bean factory for plugin: pentaho-cdf
2017-01-11 08:38:52,859 ERROR [pt.webdetails.cdf.dd.api.RenderApi] Could not load dashboard: null
java.lang.NullPointerException
at pt.webdetails.cpf.InterPluginCall.run(InterPluginCall.java:216)
at pt.webdetails.cpf.InterPluginCall.call(InterPluginCall.java:365)
at pt.webdetails.cpf.InterPluginCall.call(InterPluginCall.java:67)
......
......

Any help would be greatly appreciated.

Joe Coyle

Errors starting pentaho 7.0 server

I solved the issue I reported below. The hsql.hibernate.cfg.xml file had two mappings, one for CE and one for EE. I just needed to comment out the EE mapping and restart the server. It would be good to have this step added to the documentation.

SOLVED

I have followed the manual MS SQL instructions for installing pentaho-server-ce-7.0.0.0-25. When I start the server I am receiving these errors below. I am able to log into the web console on port 8080 and was able to change the admin password.

I notice in the log that Pentaho is looking for sqlserver.EE.hbm.xml, which is not the correct file. In hibernate-settings.xml I have set the following: <config-file>system/hibernate/hsql.hibernate.cfg.xml</config-file>

Any Help would be greatly appreciated.

Joe Coyle

2017-01-11 10:10:48,544 INFO [org.pentaho.platform.engine.core.system.status.PeriodicStatusLogger] Caution, the system is initializing. Do not shut down or restart the system at this time.
2017-01-11 10:10:49,511 INFO [org.pentaho.platform.osgi.OSGIBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
2017-01-11 10:10:54,781 INFO [org.pentaho.platform.osgi.KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at C:\pentaho\server\pentaho-server\pentaho-so ***
*** lutions\system\karaf\caches\default\data-1 ***
*** Karaf Port:8802 ***
*** OSGI Service Port:9051 ***
*** JMX RMI Registry Port:11099 ***
*** RMI Server Port:44445 ***
*******************************************************************************
2017-01-11 10:11:18,548 INFO [org.pentaho.platform.engine.core.system.status.PeriodicStatusLogger] Caution, the system is initializing. Do not shut down or restart the system at this time.
2017-01-11 10:11:22,894 ERROR [org.pentaho.platform.repository.hibernate.HibernateUtil] HIBUTIL.ERROR_0006 - Building SessionFactory failed.
org.hibernate.MappingNotFoundException: resource: hibernate/sqlserver.EE.hbm.xml not found
at org.hibernate.cfg.Configuration.addResource(Configuration.java:799)
at org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:2344)
at org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:2310)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:2290)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:2243)
at org.hibernate.cfg.Configuration.configure(Configuration.java:2216)
at org.pentaho.platform.repository.hibernate.HibernateUtil.initialize(HibernateUtil.java:138)
at org.pentaho.platform.repository.hibernate.HibernateUtil.<clinit>(HibernateUtil.java:90)
at org.pentaho.cdf.environment.configurations.PentahoHibernanteConfigurations.getConfiguration(PentahoHibernanteConfigurations.java:30)
at org.pentaho.cdf.utils.PluginHibernateUtil.initialize(PluginHibernateUtil.java:58)
at org.pentaho.cdf.storage.StorageEngine.getInstance(StorageEngine.java:61)
at org.pentaho.cdf.storage.StorageApi.<init>(StorageApi.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:142)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:89)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1098)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1050)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:510)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:776)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:861)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:286)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:176)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:40)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:442)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:83)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:364)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:331)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:227)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:162)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1092)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1834)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2017-01-11 10:11:22,902 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2017-01-11 10:11:22,902 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager: PluginManager.ERROR_0011 - Failed to register plugin pentaho-cdf
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'storage' defined in file [C:\pentaho\server\pentaho-server\pentaho-solutions\system\pentaho-cdf\plugin.spring.xml]: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.pentaho.cdf.storage.StorageApi]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1105)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1050)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:510)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:482)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:776)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:861)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:286)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:176)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:40)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:442)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:83)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:364)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:331)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:227)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:162)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1092)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1834)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.pentaho.cdf.storage.StorageApi]: Constructor threw exception; nested exception is java.lang.ExceptionInInitializerError
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:154)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:89)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1098)
... 38 more
Caused by: java.lang.ExceptionInInitializerError
at org.pentaho.platform.repository.hibernate.HibernateUtil.initialize(HibernateUtil.java:199)
at org.pentaho.platform.repository.hibernate.HibernateUtil.<clinit>(HibernateUtil.java:90)
at org.pentaho.cdf.environment.configurations.PentahoHibernanteConfigurations.getConfiguration(PentahoHibernanteConfigurations.java:30)
at org.pentaho.cdf.utils.PluginHibernateUtil.initialize(PluginHibernateUtil.java:58)
at org.pentaho.cdf.storage.StorageEngine.getInstance(StorageEngine.java:61)
at org.pentaho.cdf.storage.StorageApi.<init>(StorageApi.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:142)
... 40 more
Caused by: org.hibernate.MappingNotFoundException: resource: hibernate/sqlserver.EE.hbm.xml not found
at org.hibernate.cfg.Configuration.addResource(Configuration.java:799)
at org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:2344)
at org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:2310)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:2290)
at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:2243)
at org.hibernate.cfg.Configuration.configure(Configuration.java:2216)
at org.pentaho.platform.repository.hibernate.HibernateUtil.initialize(HibernateUtil.java:138)
... 50 more
2017-01-11 10:11:22,906 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2017-01-11 10:11:31,278 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2017-01-11 10:11:31,278 ERROR [org.pentaho.platform.util.logging.Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager: PluginManager.ERROR_0011 - Failed to register plugin pentaho-cdf
java.lang.NullPointerException
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.registerContentGenerators(PentahoSystemPluginManager.java:565)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.registerPlugin(PentahoSystemPluginManager.java:375)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:322)
at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.reload(PentahoSystemPluginManager.java:176)
at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.startup(PluginAdapter.java:40)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:442)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:433)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:83)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:364)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:412)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:361)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:331)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:227)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:162)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4853)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:145)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1092)
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1834)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2017-01-11 10:11:31,279 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2017-01-11 10:11:41,794 INFO [org.pentaho.platform.engine.core.system.status.PeriodicStatusLogger] The system has finished initializing.

Export Popup Component not displayed correctly

Hi,

All my reports have an "Export Popup Component" that works fine, except in one report where I am unable to click on the popup, even though all the reports have the same properties when I compare them.
When I inspect the element in both reports, the HTML snippets are as follows.

Working Popup:

<div class="popupComponent exportOptions south ui-draggable" style="top: 52.8906px; bottom: auto; left: 578.5px; right: auto;">
<div class="exportElement">Export Table</div><a class="close">&nbsp;</a><div class="arrow" style="left: 26.5px;"></div></div>

Non working Popup:
<div class="popupComponent exportOptions south ui-draggable" style="top: 52.8906px; bottom: auto; left: 607.5px; right: auto;">
<a class="close">&nbsp;</a><div class="arrow" style="left: 0px;"></div></div>

The <div class="exportElement">Export Table</div> element is missing in the non-working one.
Did I miss anything?
Working.PNG NonWorking.PNG
Thanks,
Padma Priya N.

Reporting Output Folder Path

I have a step in a transformation with a Pentaho Reporting Output, and I dynamically build the output file path in a JavaScript step based on the year and month. Unfortunately, it seems the Reporting Output step does not create the folder path if it does not exist. Is there any way to fix this, or a work-around anyone can think of?
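One common work-around, assuming nothing else creates the directories for you, is to make sure the folder exists before the Reporting Output step runs, for example with a "Create a folder" job entry in the parent job or an extra line in the same JavaScript step. In plain Java terms the operation is just the following (the path is hypothetical):

import java.nio.file.Files;
import java.nio.file.Paths;

public class EnsureOutputFolder {
    public static void main(String[] args) throws Exception {
        // Hypothetical path built from year and month, mirroring the JavaScript step.
        String outputDir = "/reports/2017/01";
        // Creates the whole chain of missing parent folders; a no-op if it already exists.
        Files.createDirectories(Paths.get(outputDir));
        System.out.println("ready: " + outputDir);
    }
}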

Table input error "couldn't get row from result set"

I get an error in Pentaho Data Integration while creating an ETL that extracts from a MySQL database.
Specifically, the error occurs in the very first step, "Table Input". The connection is fine, so I know that is not the problem; I get the error even on the very first row.
This is the relevant part of the error log:


2017/01/11 12:26:00 - IN R.A.L..0 - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy): Unexpected error
2017/01/11 12:26:00 - IN R.A.L..0 - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy): org.pentaho.di.core.exception.KettleDatabaseException:
2017/01/11 12:26:00 - IN R.A.L..0 - Couldn't get row from result set
2017/01/11 12:26:00 - IN R.A.L..0 -
2017/01/11 12:26:00 - IN R.A.L..0 - Unable to get value 'Date' from database resultset, index 63
2017/01/11 12:26:00 - IN R.A.L..0 - Value '0000-00-00' can not be represented as java.sql.Timestamp
2017/01/11 12:26:00 - IN R.A.L..0 -
2017/01/11 12:26:00 - IN R.A.L..0 -

I can "suppose" it is a value in the row declared as timestamp that has a 0000-00-00, but I checked and double checked, and that is not it! I do have a timestamp but its value is not 0000-00-00 as it says in the log, and the one column that does have a 0000-00-00 is not a timestamp but a Date, so...
I'm obviously missing something, otherwise I wouldn't be asking, so thanks in advance for your help!
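For reference, the "can not be represented as java.sql.Timestamp" message comes from the MySQL JDBC driver when it meets a zero date while fetching a row, and it can be triggered by DATE columns as well, depending on how the value is read. Connector/J has a zeroDateTimeBehavior option that turns such values into NULL instead of failing; it can be added to the connection (for example in the connection's Options tab in PDI) or directly in a JDBC URL, as in this illustrative sketch with made-up host and credentials:

import java.sql.Connection;
import java.sql.DriverManager;

public class ZeroDateCheck {
    public static void main(String[] args) throws Exception {
        // zeroDateTimeBehavior=convertToNull makes the driver return NULL for '0000-00-00'
        // instead of throwing "Value '0000-00-00' can not be represented as java.sql.Timestamp".
        String url = "jdbc:mysql://localhost:3306/mydb?zeroDateTimeBehavior=convertToNull";
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("connected: " + !con.isClosed());
        }
    }
}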

Change Date format through metadata injection

I'm trying to copy some 100+ tables from MySQL (Table Input with SELECT *) to Snowflake (SF Bulk Loader). When copying the data across, the timestamp fields need to be converted from YYYY/MM/DD HH:MM:SS to YYYY-MM-DD HH:MM:SS for Snowflake to accept them. The problem is that I don't know a priori which fields contain timestamps, and I want to avoid mapping each table manually.

So I'm wondering if anyone knows a way of using metadata injection to identify the inbound timestamp fields and apply a format of YYYY-MM-DD HH:MM:SS to them automatically.
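Illustrative only: the per-field conversion the injected transformation would have to apply. Detecting which incoming fields are timestamps is the part metadata injection would need to drive, and it is not shown here:

import java.text.SimpleDateFormat;
import java.util.Date;

public class TimestampReformat {
    public static void main(String[] args) throws Exception {
        SimpleDateFormat in  = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        // Sample value; in PDI this would be the format applied to each timestamp field.
        Date d = in.parse("2017/01/11 12:26:00");
        System.out.println(out.format(d)); // 2017-01-11 12:26:00
    }
}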

thx, sebastian

Is there a utility for this ? (formatted text files from KTR/KJB files)

We are not using the PDI repository and instead are using SVN and the local file system.
We have a TON of SQL in many KTR transforms.
I'd like to be able to take an entire directory and extract the SQL into a nice, formatted set of text files.
Also, for the KJB files, I would simply like a listing of the transforms and other jobs being called in their respective order of course.

I take it I'll need to use some sort of XML transform process to do this, with the correct XML schemas?
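Since a KTR file is plain XML, a small standalone program can do this just as well as an XSLT. A sketch using standard Java XPath, under the assumption that Table Input steps keep their query in an <sql> element beneath each <step> whose <type> is TableInput (the file name is hypothetical):

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class KtrSqlExtractor {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File("my_transform.ktr"));
        XPath xp = XPathFactory.newInstance().newXPath();
        // Assumption: Table Input steps serialize as <step><type>TableInput</type>...<sql>...</sql></step>.
        NodeList steps = (NodeList) xp.evaluate(
                "/transformation/step[type='TableInput']", doc, XPathConstants.NODESET);
        for (int i = 0; i < steps.getLength(); i++) {
            String name = xp.evaluate("name", steps.item(i));
            String sql = xp.evaluate("sql", steps.item(i));
            System.out.println("-- step: " + name + "\n" + sql + "\n");
        }
    }
}

Looping this over a directory, and doing the equivalent for the entries section of KJB files, is then ordinary file handling.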

Dimension Lookup Update technical key as uuid

How can I create a technical key as a UUID in Dimension Lookup/Update?

I am using MariaDB.

MongoDB Input - limiting the result

Hello All.

Using a MongoDB Input, how do I limit the resultset? I know that I could use "$limit". However, from what I have researched thus far, this is only available if I check off "Query is aggregation pipeline".

Is there a way to limit the resultset without checking off "Query is aggregation pipeline"?
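For comparison, outside PDI a limit is an ordinary cursor modifier in the MongoDB Java driver and needs no aggregation pipeline. A minimal sketch with hypothetical host, database and collection names:

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class MongoLimitExample {
    public static void main(String[] args) {
        MongoClient client = new MongoClient("localhost", 27017);
        try {
            MongoCollection<Document> col = client.getDatabase("mydb").getCollection("mycollection");
            // An empty filter with a cursor limit: the find() equivalent of "$limit".
            for (Document d : col.find(new Document()).limit(10)) {
                System.out.println(d.toJson());
            }
        } finally {
            client.close();
        }
    }
}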

Thanks.

Pentaho Business Analytics MongoDB Connectivity

I am running Pentaho Business Analytics on Windows 10 and trying to connect to MongoDB running on a Linux server, but I am not able to see MongoDB in the Data Source - Database Type list.

I downloaded the MongoDB JDBC driver and put it in C:\pentaho-server-ce-7.0.0.0-25\pentaho-server\tomcat\server\lib, but I still cannot get a connection.

Could you please suggest the proper way to get connected?
I am attaching the connections (Data Source) screen:

Capture.jpg

thanks
Pravin Dwiwedi

Use formula to calculate end of month using year & month in string format

Hi,

I'm struggling to create a valid end-of-month date using year and month values stored as strings. I get the attached error ("I cannot convert the specified value to data type 0").

I've been trying to use this formula.

DATEVALUE(DATE([Source_Year]; [Source_Month1] + 1; 1) - 1)

Can you suggest where I am going wrong?

Thanks,

Andy

PS: [Source Month 2], another field, has some null values in it, so I will need to check for this using if(isblank). Can I use NULL() as a result if the isblank condition is met?
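For reference, the same end-of-month logic in plain Java, including the string-to-number conversion the year and month inputs need (the sample values are made up):

import java.time.LocalDate;
import java.time.YearMonth;

public class EndOfMonth {
    public static void main(String[] args) {
        String sourceYear = "2016";
        String sourceMonth = "06";
        // "First day of next month minus one day" is the same as the last day of this month.
        LocalDate endOfMonth = YearMonth.of(
                Integer.parseInt(sourceYear),
                Integer.parseInt(sourceMonth)).atEndOfMonth();
        System.out.println(endOfMonth); // 2016-06-30
    }
}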

pdi1.jpg pdi2.jpg