Channel: Pentaho Community Forums
Viewing all 16689 articles

Data Analysis Reporting/Scheduling - Email Subject line needs modification

Currently, the scheduler generates the subject line from the report name.

Is this definition stored in an XML file somewhere, so that we could modify the subject line on the fly and insert a date or other information?

Multiple Reports sent in a single email

Is it possible with Report Designer?

I don't think it's possible with data analysis reports, which seem to each have their own scheduled task.

Essentially, I am looking to have multiple report sections in a single report, each with an independent data source.

Steps for processing DOM XML

There was a need to process data that has a discrete, non-relational, but multilevel nested structure.
It can be done with standard Kettle XML steps ("Concat Fields", "Add XML", "XSL Transformation"), but that puts high pressure on the garbage collector.

We are trying to introduce DOM XML as a field data type to reduce memory allocation and GC pressure.
You can see a sample plugin that includes a few modified steps using Document objects as a basis here: https://github.com/griddynamics/xml-...tle-etl-plugin

Run PDI remotely on linux from Powershell

OK, I know it's Friday and it's late, but I've been banging my head against a wall on this the whole day...

I'm trying to migrate a PDI process from a "local" (next room) Windows server to a remote (really far away) Debian server. I found a way to use SSH from PowerShell, but when I try to use it to run kitchen.sh I get a "java not found" error... =(

Code:

PS C:\Users\my-user> $SshSessions.'linux-server'.RunCommand('/opt/data-integration/kitchen.sh')
CommandText          : /opt/data-integration/kitchen.sh
CommandTimeout      : -00:00:00.0010000
ExitStatus          : 0
OutputStream        : Renci.SshNet.Common.PipeStream
ExtendedOutputStream : Renci.SshNet.Common.PipeStream
Result              :
Error                : /opt/data-integration/spoon.sh: 209: /opt/data-integration/spoon.sh: java: not found

But I have no idea why, since I've added this to kitchen.sh:
Code:

PENTAHO_JAVA_HOME=/opt/jdk/jdk1.7.0_65/bin
When I try it on PuTTY, it works as it should...
Code:

my-user@linux-server:~$ /opt/data-integration/kitchen.sh
Options:
  -rep            = Repository name
  -user          = Repository username
  -pass          = Repository password
  -job            = The name of the job to launch
  -dir            = The directory (dont forget the leading /)
  -file          = The filename (Job XML) to launch
  -level          = The logging level (Basic, Detailed, Debug, Rowlevel, Error, Nothing)
  -logfile        = The logging file to write to
  -listdir        = List the directories in the repository
  -listjobs      = List the jobs in the specified directory
  -listrep        = List the available repositories
  -norep          = Do not log into the repository
  -version        = show the version, revision and build date
  -param          = Set a named parameter <NAME>=<VALUE>. For example -param:FILE=customers.csv
  -listparam      = List information concerning the defined parameters in the specified job.
  -export        = Exports all linked resources of the specified job. The argument is the name of a ZIP file.
  -custom        = Set a custom plugin specific option as a String value in the job using <NAME>=<Value>, for example: -custom:COLOR=Red
  -maxloglines    = The maximum number of log lines that are kept internally by Kettle. Set to 0 to keep all rows (default)
  -maxlogtimeout  = The maximum age (in minutes) of a log line while being kept internally by Kettle. Set to 0 to keep all rows indefinitely (default)

A better description of what's going on here:
- The process currently lives in a batch file; I'm migrating it to PowerShell.
- I'm not the only one who must be able to run it.
- The SSH-over-PowerShell solution isn't strictly needed, as long as there's a way to easily make the solution available to anyone who can access the Windows fileshare where the file-based PDI repository is. That one was easy to fit into this prerequisite, because I can save the script files on the fileshare and have PowerShell copy the files, enable the module, disable the module, and remove the files on the local machine where the batch/PS script is running.
- The batch/PS script interacts with the user, but the PDI process doesn't. I just need to run it, so if I can get /opt/data-integration/kitchen.sh to execute, I know the jobs will run.
- SSH per se isn't needed, but I do need a way to tell the Linux server, from PowerShell, to run a couple of jobs (yeah, there are a few, not just one).
- Optional: a way to copy all related jobs, transformations, templates, and validation files (ktr, kjb, xlsx, xsl and xsd files) from the fileshare to the Linux server. A way to copy one folder and all its contents is enough, since all these files are gathered in a single folder (and its subfolders).

So... can anyone gimme a light here?

(edit) Just tried doing it with plink and I get the same error in PowerShell: "java: not found".
(edit 2) Oh yeah, if I try to check Java, either through plink or $SshSessions.'linux-server'.RunCommand(), I can reach it...
Code:

PS C:\Users\my-user> $SshSessions.'linux-server'.RunCommand('/opt/jdk/jdk1.7.0_65/bin/java -version')
CommandText          : /opt/jdk/jdk1.7.0_65/bin/java -version
CommandTimeout      : -00:00:00.0010000
ExitStatus          : 0
OutputStream        : Renci.SshNet.Common.PipeStream
ExtendedOutputStream : Renci.SshNet.Common.PipeStream
Result              :
Error                : java version "1.7.0_65"
                      Java(TM) SE Runtime Environment (build 1.7.0_65-b17)
                      Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

Code:

PS C:\Users\my-user> \\fileshare\utils\putty\PLINK.EXE -ssh my-user@linux-server /opt/jdk/jdk1.7.0_65/bin/java -version
java version "1.7.0_65"
Java(TM) SE Runtime Environment (build 1.7.0_65-b17)
Java HotSpot(TM) 64-Bit Server VM (build 24.65-b04, mixed mode)

Table Component Highlight onClick

Hello all,

Curious if anyone has tried to highlight a table component during an onClick event? Basically, we need to click a row in the table, which will load other charts, but we also need that row to highlight to show which row was selected.

Any guidance would be much appreciated.

Thanks!

Table Component Row Button or Link

I'm trying to determine the best approach for adding either a delete button or a delete hyperlink to rows. I need the capability to delete a row. Any direction, examples, or guidance would be much appreciated.

Thanks!

Row and Column Index of "Preview Data" for a transformation

Hi Community,

What are the row and column indexes of the preview data of a transformation?

Use case: processing a CSV file with JavaScript.

How do I insert the values into the respective fields in the image below?

data.jpg

Thank you in advance for any help!

Error "HTTP 404" on localhost:8080

Hi there.

I'm trying to run BI Server 5.4.0.1-130 but get this error in the browser (http://localhost:8080):

http.jpg

I'm on Linux openSUSE 13.2.

#>java -version
java version "1.8.0_25"
Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.25-b02, mixed mode)


pentaho.log

Code:

2015-08-22 11:39:52,145 ERROR [org.springframework.web.context.ContextLoader] Context initialization failed
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.h2.tools.Server' defined in file [/home/coicoi/Descargas/biserver-ce/pentaho-solutions/system/GettingStartedDB-spring.xml]: Invocation of init method failed; nested exception is org.h2.jdbc.JdbcSQLException: Exception opening port "9092" (the port may be in use), cause: "java.net.BindException: Address already in use"
Exception opening port "9092" (port may be in use), cause: "java.net.BindException: Address already in use" [90061-131]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1338)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:473)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:380)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:264)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:261)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:185)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:164)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:429)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:728)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:380)
    at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
    at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
    at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
    at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:802)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
    at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
    at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1068)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1060)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:759)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.h2.jdbc.JdbcSQLException: Exception opening port "9092" (the port may be in use), cause: "java.net.BindException: Address already in use"
Exception opening port "9092" (port may be in use), cause: "java.net.BindException: Address already in use" [90061-131]
    at org.h2.message.DbException.getJdbcSQLException(DbException.java:316)
    at org.h2.message.DbException.get(DbException.java:156)
    at org.h2.util.NetUtils.createServerSocketTry(NetUtils.java:175)
    at org.h2.util.NetUtils.createServerSocket(NetUtils.java:141)
    at org.h2.server.TcpServer.start(TcpServer.java:200)
    at org.h2.tools.Server.start(Server.java:330)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1414)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1375)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1335)
    ... 39 more
Caused by: java.net.BindException: Address already in use
    at java.net.PlainSocketImpl.socketBind(Native Method)
    at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:382)
    at java.net.ServerSocket.bind(ServerSocket.java:375)
    at java.net.ServerSocket.<init>(ServerSocket.java:237)
    at java.net.ServerSocket.<init>(ServerSocket.java:128)
    at org.h2.util.NetUtils.createServerSocketTry(NetUtils.java:171)
    ... 49 more
2015-08-22 11:39:52,200 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2015-08-22 11:39:52,200 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: PentahoSystem.ERROR_0015 - Error while attempting the shutdown sequence of org.pentaho.platform.engine.services.connection.datasource.dbcp.DynamicallyPooledDatasourceSystemListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ICacheManager' defined in file [/home/coicoi/Descargas/biserver-ce/pentaho-solutions/system/pentahoObjects.spring.xml]: Instantiation of bean failed; nested exception is org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.pentaho.platform.plugin.services.cache.CacheManager]: Constructor threw exception; nested exception is java.lang.NullPointerException
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:883)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:826)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:440)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$1.run(AbstractAutowireCapableBeanFactory.java:409)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:380)
    at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:264)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:261)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:185)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:164)
    at org.pentaho.platform.engine.core.system.objfac.spring.SpringPentahoObjectReference.getObject(SpringPentahoObjectReference.java:69)
    at org.pentaho.platform.engine.core.system.objfac.AggregateObjectFactory.get(AggregateObjectFactory.java:276)
    at org.pentaho.platform.engine.core.system.objfac.AggregateObjectFactory.get(AggregateObjectFactory.java:136)
    at org.pentaho.platform.engine.core.system.PentahoSystem.getCacheManager(PentahoSystem.java:1210)
    at org.pentaho.platform.engine.services.connection.datasource.dbcp.PooledDatasourceSystemListener.shutdown(PooledDatasourceSystemListener.java:71)
    at org.pentaho.platform.engine.core.system.PentahoSystem.shutdown(PentahoSystem.java:1027)
    at org.pentaho.platform.web.http.context.SolutionContextListener.contextDestroyed(SolutionContextListener.java:243)
    at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4249)
    at org.apache.catalina.core.StandardContext.stop(StandardContext.java:4890)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4754)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:802)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
    at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
    at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1068)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1060)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:759)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [org.pentaho.platform.plugin.services.cache.CacheManager]: Constructor threw exception; nested exception is java.lang.NullPointerException
    at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:115)
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:61)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:877)
    ... 42 more
Caused by: java.lang.NullPointerException
    at org.pentaho.platform.engine.core.system.SystemSettings.getAbsolutePath(SystemSettings.java:177)
    at org.pentaho.platform.engine.core.system.PathBasedSystemSettings.getAbsolutePath(PathBasedSystemSettings.java:88)
    at org.pentaho.platform.engine.core.system.SystemSettings.getFile(SystemSettings.java:156)
    at org.pentaho.platform.engine.core.system.SystemSettings.getSystemSettingsDocument(SystemSettings.java:129)
    at org.pentaho.platform.engine.core.system.SystemSettings.getSystemSetting(SystemSettings.java:82)
    at org.pentaho.platform.engine.core.system.SystemSettings.getSystemSetting(SystemSettings.java:94)
    at org.pentaho.platform.engine.core.system.PathBasedSystemSettings.getSystemSetting(PathBasedSystemSettings.java:68)
    at org.pentaho.platform.plugin.services.cache.CacheManager.<init>(CacheManager.java:152)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:100)
    ... 44 more
2015-08-22 11:39:52,201 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
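The first stack trace points at the underlying problem: H2 cannot bind TCP port 9092 because something else (often a leftover BI Server or another H2 instance) is already listening on it. As a rough sketch, independent of Pentaho, you can probe whether the port is free from plain Java (class and method names here are invented for illustration):

```java
import java.io.IOException;
import java.net.ServerSocket;

/** Minimal probe: try to bind a TCP port; failure suggests another process holds it. */
public class PortProbe {
    public static boolean isFree(int port) {
        try (ServerSocket s = new ServerSocket(port)) {
            return true;   // bind succeeded: the port was free
        } catch (IOException e) {
            return false;  // bind failed: already in use (or not permitted)
        }
    }

    public static void main(String[] args) {
        System.out.println("9092 free? " + isFree(9092));
    }
}
```

If 9092 turns out to be taken, stopping the other process, or changing the H2 port configured in GettingStartedDB-spring.xml, would be the places to look.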

Any ideas?

Thanks in advance.

HELP! Data Source Model Editor - Slow and Clunky? Am I doing something wrong?

I really need some help! I'm using the Data Source Model Editor to model my star schema (measure names, aggregate settings, etc.) for both reports and ad hoc analysis.

This tool is crazy clunky for me, for a few reasons:

1. I cannot multi-select anything.
2. I need to set data types for 100+ measures, and I have to do this one at a time.
3. I also want to move my measures up and down the list for presentation purposes. When I use the clickable up and down arrows, it takes 10+ seconds to move one spot. When I drag a measure up or down, I cannot scroll the window while I have the measure grabbed. With 100+ measures, this means I have to drag it to the top of the screen, scroll up, drag again, scroll up, drag, and so on. To make matters worse, after I drop it where I want it, it takes 20+ seconds to take effect.
4. I cannot create subfolders for measures. I have 100+ measures and need to group them into subfolders for presentation. This is not possible.

I'm pretty frustrated at this point. I was very excited to find this tool, but these issues make modeling too painful to use.

Am I doing something wrong? Is my environment just slow? Is there a better tool for configuring the star schema, measure names, and measure types?

Multiple value mapper conversions

I'm a novice with PDI. I'm looking for advice on the best way to map/convert values in different columns of the input stream to values held in a metadata dictionary. Is it simply a matter of instantiating a Value Mapper for each column, or is there a more efficient approach? I worry that with each addition, the memory and processing-time footprint of the overall stream will balloon.

If I were to implement this in Java, I'd have a single iteration cycle and perform all the lookups/mappings at once.
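For what it's worth, that single-pass idea in Java would look roughly like this (an illustrative sketch only; the class, method names, and the row-as-String-array shape are all made up, not Kettle API):

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch: apply several per-column dictionaries in one pass over each row. */
public class RowMapper {
    // One dictionary per column index.
    private final Map<Integer, Map<String, String>> dictionaries = new HashMap<>();

    public void addDictionary(int columnIndex, Map<String, String> dict) {
        dictionaries.put(columnIndex, dict);
    }

    /** Map every configured column of a row in a single iteration. */
    public String[] mapRow(String[] row) {
        String[] out = row.clone();
        for (Map.Entry<Integer, Map<String, String>> e : dictionaries.entrySet()) {
            String value = out[e.getKey()];
            // Fall back to the original value when the dictionary has no entry.
            out[e.getKey()] = e.getValue().getOrDefault(value, value);
        }
        return out;
    }
}
```

The cost per row is one dictionary lookup per mapped column, regardless of how many Value Mapper-style rules exist, which is why a single combined pass tends to scale better than chaining many separate mapping steps.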

Thank you

Report Design Client & Server Data Sources

Hi;

I am a newbie. I created a report with Report Designer on OS X using a JDBC data source to MS SQL Server, and I also created a data source on the server with the same name, also using JDBC.

But when I publish the report to the server, the report fails with a "validation failed" error.

How can I create reports on different machines and have the server pick up the correct data source?

Strange: error java.lang.String cannot be cast to java.lang.Long working with integer

I have a problem with data types and the Formula step.

The formula is:
IF(isNA([idLesionDate]);[idLesionDate];[daysToStart])
and the result is stored in 'daysToStart'.

I get this error:
Integer(4) : Unable to compare with value [Integer]
Unexpected conversion error while converting value [daysToStart Integer(4)] to an Integer
java.lang.String cannot be cast to java.lang.Long

daysToStart is defined as TIMESTAMPDIFF(DAY, lesionDate, startNPT) AS daysToStart.

The problem happens when the value of 'daysToStart' or 'idLesionDate' in the first row is null.

If I preview the result after the previous step, I get the right values of 'daysToStart' in an Integer(15) field. But in the preview after the Formula step I only get an error, and all the values of that field are set to null (idLesionDate isn't null). The error is:
daysToStart Integer : There was a data type error: the data type of java.lang.String object [2018] does not correspond to value meta [Integer]
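For comparison, the intent of that formula, written with a single consistent result type, would be something like this plain-Java sketch (hypothetical names, not Kettle API code; the point is that both branches of the IF must resolve to the same value type, and a type mismatch between branches is one way "String cannot be cast to Long" style errors can appear):

```java
/** Mirrors IF(isNA([idLesionDate]); [idLesionDate]; [daysToStart]) with one result type. */
public class DaysCoalesce {
    static Long coalesce(Long idLesionDate, Long daysToStart) {
        // When idLesionDate is null (isNA), the result is null;
        // otherwise the existing daysToStart value is kept. Both
        // branches yield Long, so no conversion is needed.
        return (idLesionDate == null) ? null : daysToStart;
    }
}
```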

Best practice: use separate steps, or do as much as possible in input steps?

Hello,

I used to use several steps (Value Mapper, Select Values, Null If, ...) to perform the necessary transformations. However, in some situations I have had problems with these, and I have decided to do the transformations in the Table Input step instead. In fact, I think performance is better if you do the operations in the database. If you need to speed up the reading process, you will want to use simple queries, but if the source database is a staging database, perhaps the best option is to get all the work done in the reading step.

So, what's your opinion: is it best practice to do as much as possible in the input step?

Read database parameter from a properties file while running job

Hi,

I have a job with only three transformations. I want to read the database parameters from a properties file and use the same parameters in all the three transformations.
I have used the following code to set the parameters in the job:
Code:

JobMeta jobMeta1 = new JobMeta(filename, repo);
Job job = new Job(repo, jobMeta1);
job.initializeVariablesFrom(null);

// Load the connection settings from config.properties on the classpath
Properties prop = new Properties();
InputStream ins = getClass().getClassLoader().getResourceAsStream("config.properties");
if (ins != null) {
    prop.load(ins);
}

// Copy every property onto the job metadata as a Kettle variable
Enumeration<?> enumer = prop.propertyNames();
while (enumer.hasMoreElements()) {
    String key = (String) enumer.nextElement();
    jobMeta1.setVariable(key, prop.getProperty(key));
}

job.setLogLevel(LogLevel.DEBUG);
jobMeta1.setInternalKettleVariables(job);
jobMeta1.activateParameters();
job.start();

But I get the following error:
Invalid number format for port number

Please help. It's urgent!

Value Mapper - default field

Hi,

I'd like to replace a subset of my input field's values with specified values via the Value Mapper step, but I'd also like non-matching rows to default to their existing value. However, the current functionality only allows a constant value to be specified as the default for non-matching rows. Does anybody else think this could be a valuable change request for the Value Mapper step? And how would you deal with it instead? Via a case/switch?
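The case/switch idea does express the behavior exactly; a minimal Java sketch (the mapped values here are examples only):

```java
/** Value-mapper-style lookup where non-matching rows keep their existing value. */
public class MapOrKeep {
    static String map(String value) {
        switch (value) {
            case "Y": return "Yes";
            case "N": return "No";
            default:  return value;  // default to the existing value, not a constant
        }
    }
}
```

The `default` branch returning the input unchanged is the behavior the Value Mapper step lacks: its default can only be a fixed constant.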

Thanks

Automated testing of transformations

Hi

I am looking at setting up some automated testing for transformations. I was planning to use RowDetail logging to log all input and output data for each step, and load this detail into some sort of test tool to compare expected vs. actual for each step. However, it appears that not every step actually does this.

Another option I was going to look at would be extending the source code, extending the Hop class so that it performs this level of logging.

I don't want to rely on having to add "Log" steps throughout transformations, but that may also be the only way.

Has anyone done anything similar (i.e. automated testing) so that I can see the inputs and outputs of every step in a transformation?

Thanks
Calum

run transformations in parallel through a loop

Hi all,

Is there a possibility to run transformations in parallel through a loop? I've got many tables to reload. Some of them are reloaded once a day, some a few times a day, etc.
Info about the tables is stored in a database (load history, load cycles, status, etc.). I've got a transformation that determines which tables should be reloaded at a particular time and then runs them through a loop. To improve performance it would be great to load these tables in parallel, but I've got a problem doing this. The image below shows my schema.

loop_parallel.png

set_threads determines which tables to reload and sends the results to the output (Copy rows to result). In the next step, the job run_threads is executed for every input row. Everything works fine except the parallelism: as I noticed, the option "Launch next entries in parallel" doesn't work as I expected. What should be done in this case?
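For reference, the general pattern being asked for, submitting each table reload to a bounded pool and waiting for all of them, looks like this in plain Java (a sketch only; the lambda body is a stand-in for a real job/transformation run, and all names are invented):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

/** Illustrative only: run N table reloads on a bounded thread pool. */
public class ParallelReload {
    public static List<String> reloadAll(List<String> tables, int threads)
            throws InterruptedException {
        List<String> done = Collections.synchronizedList(new ArrayList<>());
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (String table : tables) {
            pool.submit(() -> done.add(table)); // stand-in for one transformation run
        }
        pool.shutdown();                         // stop accepting new work
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return done;
    }
}
```

The fixed-size pool caps how many reloads run at once, which matters when each reload opens its own database connections.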

Many thanks for help
kostek

how to upload huge file into Pentaho User Console

Hi all,

I have a CSV file of 300 MB; its zipped size is 49 MB. I'm able to upload the zipped file to the BI Server, but I'm not able to view it in the Pentaho User Console.

Could anyone suggest or provide the steps to upload a huge file into the Pentaho User Console?

regards,
naveen.

Validate fields against one another

Hi all,

I'm new to this forum, and quite new to Kettle.

Is there a way (apart from JavaScript) to validate two fields against each other?

If field A has value "x", then field B must be filled in; something like this...
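The rule itself is simple to state in code; as a sketch of the logic (field names invented), it is just a conditional check:

```java
/** Cross-field rule: when fieldA equals "x", fieldB must be non-blank. */
public class CrossFieldRule {
    static boolean isValid(String fieldA, String fieldB) {
        if ("x".equals(fieldA)) {
            return fieldB != null && !fieldB.trim().isEmpty();
        }
        return true; // the rule only constrains rows where fieldA is "x"
    }
}
```

Within PDI itself, a Filter Rows step comparing the two fields may cover this without scripting, though I'd treat that as something to verify against your version.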

any help is appreciated

Spiffo

Json

Hi Sir/Madam,

I am new to JSON-related tasks. Could you please help me with how to use the JSON steps in PDI CE 5.3? I followed this example (https://www.youtube.com/watch?v=J0xA9vyZc8w) but had no luck; I get the error below.

I have attached the sample transformation and a sample JSON file. Can you please provide a sample transformation if possible?

Locale.txt
JSON.ktr

2015/08/24 11:35:31 - /JSON - Dispatching started for transformation [/JSON]
2015/08/24 11:35:31 - Json Input.0 - ERROR (version 5.3.0.0-213, build 1 from 2015-02-02_12-17-08 by buildguy) : Could not open file #1 : file:///D:/ETL /samplemize.js --> org.pentaho.di.core.exception.KettleException:
2015/08/24 11:35:31 - Json Input.0 - Error parsing file [Unexpected token COLON(:) at position 7.]!
2015/08/24 11:35:31 - Json Input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2015/08/24 11:35:31 - /JSON - Transformation detected one or more steps with errors.
2015/08/24 11:35:31 - /JSON - Transformation is killing the other steps!

Thank you