Channel: Pentaho Community Forums
Viewing all 16689 articles

Regarding XML component setting for parsing large XML data

Hi All,

My existing Java application uses Apache DdlUtils to generate a database schema that is very complex and large, so when I use the XML component, Kettle hangs for about 15 minutes. Which parser does it use for traversing the XML structure (SAX or DOM), where are those settings defined, and how can I change them?


Regards,
S.A. Mateen
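As background to the SAX-versus-DOM question: DOM builds the entire document tree in memory before anything can be read, which is the usual cause of hangs on very large files, while SAX streams the document and fires callbacks as elements arrive. A minimal JDK-only sketch (the element name and input are invented for illustration; this is not Kettle's internal code):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCountDemo {

    // Count occurrences of one element name without ever building a DOM tree.
    static int countElements(InputStream in, String elementName) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if (qName.equals(elementName)) {
                    count[0]++;
                }
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String xml = "<schema><table/><table/><table/></schema>";
        InputStream in = new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8));
        System.out.println(countElements(in, "table")); // prints 3
    }
}
```

If memory serves, recent PDI versions also ship an "XML Input Stream (StAX)" step designed specifically for very large documents, which may sidestep the problem entirely.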

Regarding the Tomcat server which comes with Pentaho CE 4.5

Hi All,

I am new to Pentaho BI. I have tried to integrate my Java web application with the Tomcat that comes along with the BI server, but when I deploy it and restart Tomcat, it does not start. My main aim is to run a single Tomcat on one system, so:
Can I integrate the Pentaho BI server with the existing Tomcat that I already had before installing Pentaho BI?

Regards,
S.A. Mateen

How to rename/refactor a Pentaho project

I am new to Pentaho, and all I want is to rename/refactor my project.

Let's suppose my project is deployed at
http://localhost:8080/pentaho/login
but I want to rename "pentaho" so that it becomes
http://localhost:8080/myname/login
How can I do it?
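One commonly described approach for the Tomcat-packaged CE server, sketched here under the assumption of a default archive install (paths and parameter names may differ between versions), is to rename the webapp directory and update the URL recorded in its web.xml:

```
# Stop the BA server first, then (assumed default layout):
cd biserver-ce/tomcat/webapps
mv pentaho myname

# The base URL is also recorded in the webapp's web.xml, so update it to match:
#   <context-param>
#     <param-name>fully-qualified-server-url</param-name>
#     <param-value>http://localhost:8080/myname/</param-value>
#   </context-param>
# (edit myname/WEB-INF/web.xml, then restart the server)
```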

Can I use Kettle 5.0 with Pentaho BA 4.8?

Hi,

We have installed Pentaho BA 4.8,

and we are using Kettle 4.3; now I want to use Kettle 5.0.

I am using Kettle standalone: jobs are run using shell scripts and a job scheduler, and there is no direct interaction between BA 4.8 and Kettle.

So can I upgrade to Kettle 5.0 and keep using it like this?

Intermittent java.util.ConcurrentModificationException HashMap$HashIterator Errors

All,
I am running Pentaho 5.0.1-stable and I get the error below on several different jobs, randomly throughout the day.


java.util.ConcurrentModificationException
at java.util.HashMap$HashIterator.nextEntry(HashMap.java:926)
at java.util.HashMap$EntryIterator.next(HashMap.java:966)
at java.util.HashMap$EntryIterator.next(HashMap.java:964)
at java.util.HashMap.putAllForCreate(HashMap.java:558)
at java.util.HashMap.clone(HashMap.java:800)
at org.pentaho.di.core.plugins.KettleURLClassLoader.closeClassLoader(KettleURLClassLoader.java:237)
at org.pentaho.di.core.plugins.BasePluginType.registerPluginJars(BasePluginType.java:564)
at org.pentaho.di.core.plugins.BasePluginType.searchPlugins(BasePluginType.java:117)
at org.pentaho.di.core.plugins.PluginRegistry.registerType(PluginRegistry.java:517)
at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:489)
at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:457)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:110)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:65)
at org.pentaho.di.kitchen.Kitchen$1$1.call(Kitchen.java:96)
at org.pentaho.di.kitchen.Kitchen$1$1.call(Kitchen.java:91)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)

Looking through the bug database, this type of issue gets closed because it cannot be reproduced.
I am just wondering: is anybody else out there getting this exact error? And do you have a simple job/transformation that can reproduce it?

Thanks,
-Eric
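For what it's worth, the exception itself is easy to demonstrate outside Kettle: HashMap iterators are fail-fast, so any structural modification while iterating raises it on the next call to next(). This is only a single-threaded illustration of the mechanism, not the reproducer being asked for (the Kitchen trace is presumably a race between threads sharing a plugin-registry map):

```java
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;

public class CmeDemo {

    // Adding a NEW key mid-iteration is a structural modification, so the
    // iterator's next fail-fast check throws ConcurrentModificationException.
    static boolean triggersCme() {
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        map.put("b", 2);
        try {
            for (String key : map.keySet()) {
                map.put("c", 3); // structural modification during iteration
            }
        } catch (ConcurrentModificationException e) {
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(triggersCme()); // prints "true"
    }
}
```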

CDE Chart doesn't display date correctly

Hi guys!

I have a problem making a chart: I'm practicing building charts with CDE.
My problem is, as you can see in the attachment, that the x-axis, which represents time, is treated as categorical and is not scaled over the total time range.

CCC-problem.jpg

Practically, in the image you can see that 00.05 is equidistant from 00.00 and 01.00, whereas I want the axis rescaled continuously.
I found some clues at http://www.webdetails.pt/ctools/ccc....me-series-line but I couldn't find the "dimensions" field in CDE, and I couldn't figure out how to inject category: {valueType: Date, isDiscrete: true } into the CCC Line Chart component.

How did you resolve this? Thank you in advance.
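I can't speak for the exact CDE property names in every version, but the "dimensions" block from the CCC docs usually maps onto options you can set in the component's preExecution hook. The following is only a sketch (both property names and the format string are assumptions, untested against this dashboard):

```javascript
// preExecution of the CCC Line Chart component (sketch, untested)
function() {
    this.chartDefinition.timeSeries = true;          // treat the x-axis as a continuous time scale
    this.chartDefinition.timeSeriesFormat = "%H.%M"; // guessed to match labels like "00.05"
}
```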

How to increase Memory?

Hi all,

I am new to Pentaho. I have a transformation that reads data from a PostgreSQL table and exports it into a flat file.

When I run the transformation, it generates an error:

2014/02/24 12:04:23 - Spoon - Asking for repository
2014/02/24 12:04:23 - Version checker - OK
2014/02/24 12:10:22 - Spoon - Transformation opened.
2014/02/24 12:10:22 - Spoon - Launching transformation [DatabaseExampleToprovideparameters2]...
2014/02/24 12:10:22 - Spoon - Started the transformation execution.
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - Dispatching started for transformation [DatabaseExampleToprovideparameters2]
2014/02/24 12:10:22 - Retrieve Products.0 - Finished reading query, closing connection.
2014/02/24 12:10:22 - Text file output.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : UnexpectedError:
2014/02/24 12:10:22 - Text file output.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : java.lang.OutOfMemoryError: Requested array size exceeds VM limit
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.convertStringToBinaryString(TextFileOutput.java:338)
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.formatField(TextFileOutput.java:282)
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeField(TextFileOutput.java:387)
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.writeRowToFile(TextFileOutput.java:245)
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.processRow(TextFileOutput.java:193)
2014/02/24 12:10:22 - Text file output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2014/02/24 12:10:22 - Text file output.0 - at java.lang.Thread.run(Thread.java:744)
2014/02/24 12:10:22 - Retrieve Products.0 - Finished processing (I=10, O=0, R=0, W=10, U=0, E=0)
2014/02/24 12:10:22 - Text file output.0 - Finished processing (I=0, O=1, R=1, W=0, U=0, E=1)
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2014/02/24 12:10:22 - Spoon - The transformation has finished!!
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Errors detected!
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - DatabaseExampleToprovideparameters2
2014/02/24 12:10:22 - DatabaseExampleToprovideparameters2 - DatabaseExampleToprovideparameters2

I know the error has something to do with the JVM heap size.

How do I increase the JVM heap for PDI on my 64-bit Windows 7 machine?

Thanks,

Ron
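Two hedged notes. First, this particular flavour of OutOfMemoryError ("Requested array size exceeds VM limit") can also mean one enormous field or row rather than an undersized heap, so a bigger heap may not be the whole story. Second, on Windows the heap for the client tools is normally raised via the launcher script; a sketch assuming the stock Spoon.bat of PDI 5.0.x (the exact variable name and defaults may differ in your copy):

```
REM Either edit the -Xmx value inside data-integration\Spoon.bat, or set the
REM environment variable the script honours before launching (values are examples):
set PENTAHO_DI_JAVA_OPTIONS=-Xmx2048m -XX:MaxPermSize=256m
Spoon.bat
```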

Cannot append to a zip file

Hi,
I'm trying to create a zip file from two input files.

The first 'Zip file' step works fine, but the second 'Zip file' step, even with the option
'If zip file exists' set to 'Append files to the existing',
fails with 'Could not create file XXXXXX, exception: URI is not absolute'.

I've tested everything I can imagine, but no luck: if the zip file exists I cannot append.

What am I doing wrong?

Thanks

Mail Issues

I have tried sending a mail via the job; that failed since the file is located on a network share. I then put a Mail step in the transformation and I do get an email, but one email per line of the file. Is there a way to send the file only once? I heard someone else had this issue, but I didn't find the fix for it.

pdf-events

I was able to render my .prpt report (converted to HTML) as a PDF file within the browser window, but I'm looking to give the user the option to open it with a PDF reader and/or save it to the desktop. I'm unsure how to accomplish this within the HTML attributes tab. I have the following code in the append-body section, which renders the report correctly as PDF.

Code:

<form style="display:inline"
id="pdfLink"
action="index.html"
method="get">

<input type="hidden"
name="reportName"
value="dashboard"/>

<input type="hidden"
name="outputFormat"
value="pdf"/>

<input type="submit"
value="View As PDF"/>


</form>

Your help is appreciated.

Error: operator does not exist: character varying = bigint (when inserting new records)

Hi all,

I have a simple transformation that imports data from a flat file and inserts it into a PostgreSQL table.

When I execute the transformation, I get the error below:
2014/02/24 17:40:55 - Update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Error in step, asking everyone to stop because of:
2014/02/24 17:40:55 - Update.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2014/02/24 17:40:55 - Update.0 - Error looking up row in database
2014/02/24 17:40:55 - Update.0 - ERROR: operator does not exist: character varying = bigint
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 61
2014/02/24 17:40:55 - Update.0 -
2014/02/24 17:40:55 - Update.0 - at org.pentaho.di.core.database.Database.getLookup(Database.java:2636)
2014/02/24 17:40:55 - Update.0 - at org.pentaho.di.core.database.Database.getLookup(Database.java:2611)
2014/02/24 17:40:55 - Update.0 - at org.pentaho.di.trans.steps.update.Update.lookupValues(Update.java:104)
2014/02/24 17:40:55 - Update.0 - at org.pentaho.di.trans.steps.update.Update.processRow(Update.java:327)
2014/02/24 17:40:55 - Update.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
2014/02/24 17:40:55 - Update.0 - at java.lang.Thread.run(Thread.java:744)
2014/02/24 17:40:55 - Update.0 - Caused by: org.postgresql.util.PSQLException: ERROR: operator does not exist: character varying = bigint
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.


I have attached the transformation for the reference.

Thanks for your help.

Ron
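The hint in the error message is literal: PostgreSQL refuses to compare character varying with bigint implicitly, so the lookup key in the Update step is a string column being compared against an integer stream field (or vice versa). Either convert the stream field inside the transformation (e.g. with a Select values step) or cast on the SQL side; an illustrative query (table and column names are invented):

```sql
-- Hypothetical names; the point is making both sides of the comparison the same type.
SELECT id, name
FROM   products
WHERE  product_code = CAST(42 AS varchar);   -- cast the literal/parameter to the column's type
-- or cast the column instead, if its contents are guaranteed numeric:
-- WHERE product_code::bigint = 42
```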

Null values when Joining MSSQL -> MySQL

Hello all,

I've been trying to create a pretty straightforward job that takes a few thousand records from an MSSQL database and pulls certain foreign keys from a MySQL table. Once I have these reference keys, I'd like to insert the records into the MySQL database. (For those curious, I'm taking my Dynamics GP invoice line items and putting them in MySQL to run batch e-invoicing...)

My issue is that no matter what tool I use, I get null values from the MySQL records. I've sorted everything on the given key in ascending order and used Merge Join, Stream Lookup, Database Lookup, and probably one or two other options that I can't recall.

Does anyone know why I might be having issues? Both fields are formatted as strings (the key generally fits the format `CSL-####` or `SSL-####`), so I don't think it's a data type issue. After about 15 hours of tinkering, I think I'm up against a wall here.

Any help is appreciated!

Edit - solved, see below, was a silly issue!

Multiple Classifications Problem

Hello everyone,

I just started using Weka. I have read and watched multiple tutorials that explain how to use Weka for classification, but now I am not sure whether Weka is the right tool for my problem.

My dataset contains 10 networks. Each network is described by a number of attributes. I then have to classify my 10 networks into three classes A, B, and C, according to four different tests.

For example, according to their sizes, my networks will be classified into A, B, C. Then,
according to another criterion, my networks will be classified again into A, B, C, and so on, four times in total.

Obviously, each network may appear in a different class each time, and this is where I wanted to use Weka.

Does Weka's classification support this? Everything I have seen so far uses a single class attribute.

Any help will be greatly appreciated.


Thank you.

Removing Special Characters from a string field in Oracle

Today during consultancy work I faced the issue of loading a table from Oracle into PostgreSQL; when I checked the logs I saw that some Oracle varchar fields had strange characters at the end of … Continue reading →


Displaying error message while adding table component

Hi,
I am new to Pentaho CDE. I created some sample dashboards, but now I get a display error when a table component is added to the panel; the same query executes fine for chart components. How can I rectify this?

Thanks in advance.

Displaying error message while adding parameters to the table component

Hi,
I am new to Pentaho CDE. I created some sample dashboards, but now I get a display error when a table component is added to the panel; the same query executes fine for chart components. How can I rectify this?

Thanks in advance.


Which is the better option for ordering on an attribute column: .Value or .Name?

Using the Foodmart sample cube, we are trying to sort data based on an attribute column.


The following is a sample MDX where we apply sorting (here we use CurrentMember.Value to sort the data):


SELECT {[Measures].[Store Sales]} ON 0, NON EMPTY ORDER({[Customers].[Customers].[State Province].AllMembers}, [Customers].[Customers].CurrentMember.Value, BDESC) ON 1
FROM [Warehouse and Sales]


In the above MDX, since we are performing a descending sort based on the attribute column, it should sort on that column; instead, it incorrectly sorts the values based on the measure column present in the MDX.


Now, if we use CurrentMember.Name, it correctly sorts the data based on the attribute column. The following is a sample MDX query for the same:


SELECT {[Measures].[Store Sales]} ON 0, NON EMPTY ORDER({[Customers].[Customers].[State Province].AllMembers}, [Customers].[Customers].CurrentMember.Name, BDESC) ON 1
FROM [Warehouse and Sales]

Please find attached screenshots of the queries and results, both executed via the Saiku Analytics plugin on the Pentaho server.


Our observation is that CurrentMember.Name gives the correct ordering whereas CurrentMember.Value does not. Which, in your view, is the better option?


Thanks,
Hardik

Pentaho PRD 5.0.1-stable: hide null values

Hi all!

Could you please help me with an issue in Report Designer 5.0.1: hiding null values. I guess this function was removed in this version, because I used it in PRD 3.9.

I need to hide a line on my subreport when its data is null. Labels can be hidden using the if-null attribute and an IF formula in the expression, but the line element doesn't have that attribute.

How can I resolve it?

Create a root folder in the User Console

Hi,
I need to create a root folder like '1CAAP' in the Pentaho User Console. We can create folders under the Home folder, but I need a root folder.

Right now I achieve this by creating the folder from Report Designer while publishing my reports, on a Windows machine.

How will I create it on a Linux machine?

Thanks in advance