Channel: Pentaho Community Forums

Salesforce Java Heap space error

I am not sure where the issue lies, but here is the breakdown.
I am using a Salesforce Input step to read our Opportunities object and update/insert into a MySQL 5.5 table, using a date range of about one day.
I have tried this on two boxes, and it runs out of heap space every time. I can truncate the MySQL table and do a full load with the Salesforce Insert step without a problem.

I am able to do this with other Salesforce objects the same way without issue.

I have already increased the memory settings in my spoon.bat (set PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m" "-XX:MaxPermSize=4096m").

I know the Opportunities object is large, but it never even seems to start processing.

2015/04/03 09:03:42 - Spoon - Starting job...
2015/04/03 09:04:05 - sf_Update_Opportunities - Dispatching started for transformation [sf_Update_Opportunities]
2015/04/03 09:05:55 - Salesforce Input.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : UnexpectedError:
2015/04/03 09:05:55 - Salesforce Input.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : java.lang.OutOfMemoryError: Java heap space
2015/04/03 09:05:55 - Salesforce Input.0 - at org.apache.axis.message.SAX2EventRecorder$objArrayVector.add(SAX2EventRecorder.java:254)

Any thoughts?
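
In case it helps others hitting the same wall, here is a minimal sketch of how those two flags are usually balanced, assuming a 64-bit JVM and enough physical RAM. In the setting above, MaxPermSize (4096m) is far larger than the heap (-Xmx1024m), yet the OutOfMemoryError is heap exhaustion, so it is normally the -Xmx value that needs raising:

Code:

    set PENTAHO_DI_JAVA_OPTIONS="-Xmx4096m" "-XX:MaxPermSize=256m"

The stack trace ends in org.apache.axis.message.SAX2EventRecorder, which suggests the whole SOAP response is being buffered in memory, so a wide object like Opportunities can need far more heap than other objects.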

Java characters encoding

$
0
0
Hi,

I'm trying to convert URLs containing foreign characters (ã, ó, ñ, ...) with the User Defined Java Class step. My problem is that the code (attached) works well in Eclipse but not as I expect in Pentaho. Here are some examples:

url to convert : http://fr.wikipedia.org/wiki/São Paulo

url converted in Eclipse : http://fr.wikipedia.org/wiki/S%C3%A3o Paulo
url converted in Pentaho : http://fr.wikipedia.org/wiki/Sᅢᆪo Paulo

I wonder why. Is this a JVM encoding issue? Is it caused by a Java library? Or something else? I don't know.

Configuration :
Eclipse Luna (java-7-openjdk-amd64)
ubuntu 14.04 LTS
Pentaho 5.0.1

Do you have any ideas to solve this?

Thanks
Attached Files
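
Without seeing the attached class, a likely culprit is the JVM default encoding: Eclipse on Ubuntu typically runs with file.encoding=UTF-8, while Spoon may launch with a different default, so any conversion that falls back to the platform charset mangles multi-byte characters. A minimal sketch that avoids the default entirely by passing "UTF-8" explicitly (class and variable names are illustrative):

Code:

    import java.net.URLEncoder;

    public class UrlEncodeSketch {
        public static void main(String[] args) throws Exception {
            String segment = "São Paulo";
            // Naming the charset explicitly removes the dependency on
            // file.encoding, which is what usually differs between
            // Eclipse and Spoon.
            String encoded = URLEncoder.encode(segment, "UTF-8");
            System.out.println(encoded); // S%C3%A3o+Paulo ("+" encodes the space)
        }
    }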


enable/disable logging

How can I enable or disable the logging feature based on a property value that we set? Is there any way to do that?

Any suggestions?
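
In case a sketch helps, one common approach, assuming the job is launched from a wrapper script: drive Kettle's log level from an environment variable or property file entry when calling Kitchen (the variable name and paths here are hypothetical):

Code:

    REM hypothetical wrapper; LOG_LEVEL could be read from a property file
    set LOG_LEVEL=Minimal
    kitchen.bat /file:C:\etl\my_job.kjb /level:%LOG_LEVEL%

Setting the level to Nothing effectively disables logging, while Basic or Detailed turns it back on.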

How to pass "config.set" command to PDI to enable Hadoop File Copy

Hello PDI Users,

I am using PDI 5.2 CE with the Big Data plugin on Windows. Our Hadoop cluster is on Linux (Hortonworks 2.x). The "Hadoop Copy Files" job entry fails to copy files from or to HDFS; I cannot make it work due to an internal network issue on the Hadoop cluster. A known workaround is to set this property: config.set("dfs.client.use.datanode.hostname", "true"); I would like some input as to where to set this property in PDI, so that Hadoop Copy (to/from) works for me.

I've tested this with a simple Java program and know that the config.set call above works (without it, that Java program cannot copy files to/from HDFS either). I also know that my PDI install is good, as I can use the same job to copy files to another sandbox Hadoop cluster that has only one node!

Please note that though I have extensive ETL experience, my Java skills are very basic and I do not have thorough knowledge of Kettle's internal architecture. :o

Thank you in advance.
Sean
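
A minimal sketch of where such client-side properties usually live, assuming the standard shim layout: PDI's Big Data plugin reads the Hadoop client configuration from the *-site.xml files of the active shim, so the property can typically be added there rather than set in code (the hdp21 folder name is an assumption; use whichever shim is active in plugin.properties):

Code:

    <!-- plugins/pentaho-big-data-plugin/hadoop-configurations/hdp21/hdfs-site.xml -->
    <property>
      <name>dfs.client.use.datanode.hostname</name>
      <value>true</value>
    </property>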

LDAP Output modify dn/rdn transformation

Where can I find example transformations or documentation about this subject?

I need to update some entries in LDAP with a set of values.

I managed to retrieve the information from two different LDAP sources, apply the filter and lookup, match the values, set the fields, and save the result in a text file, but I'm struggling to figure out how to use the LDAP Output step to save the data. For me, the LDAP Input step is easy to use and understand, but the LDAP Output step isn't.

I desperately need help to accomplish this task; unfortunately, I couldn't find useful documentation, examples, or sample transformations to learn from.

Thanks for the help.

LDAP Output: iterate a set of key/values to perform a modify on a set of attributes

I'm having trouble with the LDAP Output step.

I filtered a set of fields and values from two distinct LDAP sources, then performed some transformations to get the desired fields and set values for them.

In the end, I have a set of ten entries, each with an id field and the value of one other field, as shown below:


Code:

    ------------------------
    id       | uniqueID
    ------------------------
    12345678 | xyz
    12345679 | qwe
    12345670 | 123
    ...(seven more similar entries, captured with a text/CSV output step)

The distinguished name for each entry is shown below:

Code:

id=12345678, ou=0, ou=people, globalid=00000000000000000000, ou=acme

And for each entry, a set of attributes is present:

Code:

cn=Brenda Lee
designated=10/10/2013
uniqueid=xyz


So I need to figure out how to use the LDAP Output step, together with another flow or filter step, to look up each id value, find the corresponding distinguished-name object, and set the appropriate uniqueid value for each one, for example:

id=12345679, ou=0, ou=people, globalid=00000000000000000000, ou=acme

uniqueid=qwe

id=12345670, ou=0, ou=people, globalid=00000000000000000000, ou=acme

uniqueid=123

And so on.

So I need a lookup or loop to find each LDAP entry by its id and set the correct uniqueid for each.

How can I accomplish this in my Kettle transformation?

Currently I'm using a Set Values step to map values before the LDAP Output step.
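
For readers comparing notes, here is a minimal JNDI sketch of the modify operation described above, i.e. what the update amounts to at the protocol level: replace one attribute on the entry identified by a DN. Host and values are placeholders; the LDAP Output step's update operation is meant to do the equivalent once the DN field and the attribute-to-field mapping are configured:

Code:

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.BasicAttribute;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.ModificationItem;

    public class LdapModifySketch {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389"); // placeholder host

            DirContext ctx = new InitialDirContext(env);
            String dn = "id=12345679, ou=0, ou=people, globalid=00000000000000000000, ou=acme";
            ModificationItem[] mods = {
                new ModificationItem(DirContext.REPLACE_ATTRIBUTE,
                        new BasicAttribute("uniqueid", "qwe"))
            };
            ctx.modifyAttributes(dn, mods); // one modify per id/uniqueid row
            ctx.close();
        }
    }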

Save logs

Hi,

Is there any way to save the transformation and job name along with the execution start_date and end_date to a flat file, instead of loading the data into a database? If yes, how can we do it?

Also, is there any way to load the transformation name, job name, start date, and end date into the table only when a log_flag is set to "Y"?

Dimension Lookup/update

Hi,

I am using Dimension Lookup/update to handle type 2 SCDs. In the basic use of that step, when a change occurs in a dimension attribute, Dimension Lookup/update sets the system timestamp in the column specified as the table date range end.

Is there any way to set the timestamp in that column from a data column (some field from the current or a previous step) instead?

Regards,
-Asrar

Json Output File Error

Hi,

I have a problem with the JSON output: I am only getting one record in the output file, and only a 1 KB file is created.

Id  Code  Amount  Cat
1   A1    100     B
1   A1    200     P
2   b1    300     M
3   C1    300     D

I have tried setting the JSON block name to "Id" and "Nr. rows in a block" to 4.
In the Fields section I added all the columns.
However, I couldn't get it to work.
Can anyone please help me get JSON output for all four rows?
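
For reference, a minimal sketch of the output the Json output step should produce for those four rows, assuming a block name of "data", "Nr. rows in a block" set to 4, and field names matching the columns above:

Code:

    {"data":[
      {"Id":1,"Code":"A1","Amount":100,"Cat":"B"},
      {"Id":1,"Code":"A1","Amount":200,"Cat":"P"},
      {"Id":2,"Code":"b1","Amount":300,"Cat":"M"},
      {"Id":3,"Code":"C1","Amount":300,"Cat":"D"}
    ]}

With the block size set to 4, all four rows belong to a single block and should land in one JSON document.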

Locally Weighted Learning

Dear all,

I have a question about the locally weighted learning algorithm: is it a memory-based algorithm that stores the whole data set in memory?
If so, how does the update function work (given that it does not update the model)?

If it is memory-based, is there any other instance-based, purely incremental algorithm that really updates the model and doesn't store the data in memory, such as locally weighted projection regression?

Thank you in advance
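
For anyone experimenting with this, a minimal sketch of LWL's behaviour in the Weka API, assuming iris.arff is available locally. LWL is a lazy, instance-based scheme: "updating" simply stores the new instance, and a locally weighted model is rebuilt around each query point at prediction time:

Code:

    import weka.classifiers.lazy.LWL;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class LwlSketch {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("iris.arff"); // placeholder path
            data.setClassIndex(data.numAttributes() - 1);

            LWL lwl = new LWL();
            lwl.buildClassifier(data);

            // LWL is updateable: the "update" just adds the instance to the
            // stored training data; no global model is refit here.
            lwl.updateClassifier(data.instance(0));

            System.out.println(lwl.classifyInstance(data.instance(0)));
        }
    }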

how to fetch a MySQL database for Weka Knowledge Flow from another machine

Hello,


I want to access a MySQL database on a private cloud instance from another machine on the LAN where Weka is running. I am using Weka Server for remote execution of a clustering algorithm, along with the Knowledge Flow. I cannot figure out how to fetch the MySQL database from the cloud to the local machine for Weka.

In the Knowledge Flow, should I connect the MySQL server on the cloud to a DatabaseLoader instance? And what would be the exact procedure for that?
Can I use "Remote experiment" (https://weka.wikispaces.com/Remote+Experiment) for this connectivity?

Thank you in advance.
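
In case it is useful, a minimal sketch of the usual way Weka is pointed at a remote MySQL server, assuming the MySQL JDBC driver jar is on Weka's classpath: Weka reads its JDBC settings from a DatabaseUtils.props file (looked up in the current directory and then the home directory), so the Knowledge Flow's DatabaseLoader mainly needs the URL to point at the cloud host. Host, port, and database name below are placeholders:

Code:

    # DatabaseUtils.props
    jdbcDriver=com.mysql.jdbc.Driver
    jdbcURL=jdbc:mysql://cloud-host.example.com:3306/mydatabase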

Remote host scheduler in Weka knowledge flow

Hello,

My Weka Server is running properly, started with the "java weka.Run WekaServer" command.
The next step is to add tasks, which should then be displayed on the browser page. I have loaded the sample cluster-compare template in the Weka Knowledge Flow and am trying to test the connection on the 'Remote host scheduler' tab.

I got the error "unable to connect to server". I have also set the correct iris.arff dataset path in the ArffLoader.

Let me know what mistake I am making here, and the proper procedure for scheduling a task on the Weka Server.



Thanks,
NAZNEEN

CDE preview not showing charts

Hello all,

Can anyone tell me why charts are not showing in my dashboards? I tested the SQL connection and the output is exactly as I want it to be. As seen in many tutorials, all I should need to do is set this datasource on the chart, but it is not working. I also tried setting the attributes I need as category, series, and values in the advanced options, but nothing changed. I tried turning off crosstabMode, but the result is the same. Even the sample-data charts are not showing. I saw in a post that reinstalling the server and the Java version could fix it, but I would rather not do that, because I had a hard time installing the server in the first place and making it work.

I appreciate your help.
Thanks in advance!

spoon core dumped

HI all,

I am new to Pentaho. I have currently configured it and have been working with it. While trying to put data into a database, Pentaho got stuck and closed, and I am getting the following error in my terminal. Could anyone suggest a fix?


java: cairo-misc.c:380: _cairo_operator_bounded_by_source: Assertion `NOT_REACHED' failed.
./spoon.sh: line 205: 19924 Aborted (core dumped) "$_PENTAHO_JAVA" $OPT -jar "$STARTUP" -lib $LIBPATH "${1+$@}"




Thanks & Regards
Amithsha

Installation of BI Suite with MySQL database on Windows 7 HomePremium operating sys.

Hi all,
Please boost me up by responding to my first post. :)
I want to learn Pentaho BI, and hence want to install the Pentaho Suite for this purpose.

To make sure that I don't run into installation problems, I read a few posts/articles on this and found that for errors during installation, the common reply is "...oh, maybe it doesn't work on version XXX. Try using version YYY instead."

So I wanted to know which software versions (Java JDK, MySQL, Pentaho Suite, etc.) gel well and work together error-free if I want to work with the latest version of the Pentaho Suite with a MySQL database on Windows 7 Home Premium.
Also, let me know the order in which the installations should be done.
I presume it to be (please correct me if I am wrong):
1. Java JDK
2. MySQL
3. Pentaho BI Suite

I wish (and would be happy) if there were one link that would take me to step-by-step instructions for installing the above.

Regards

BISeeker


how to save a report with the current date and time appended to the filename

$
0
0
Hi guys,
I have a doubt about Pentaho Report Designer:
how can a report be saved with the current date and time appended to the filename automatically?
When the user saves a report as PDF or Excel, the file name must include the current date and time.
Is this possible?
If so, please let me know.

regarding sending single mail with multiple attachments

Hi All,


I have a strict requirement to send a single mail with multiple attachments using the transformation Mail step. The attachment files are dynamic, consisting of zip files and job log files, but the Mail step in a transformation sends a separate mail for each attachment path in the input rows. Is there any way to send one mail with multiple files using the transformation Mail step?



test-send-mail.ktr

Regards,
Mateen
Attached Files

How to create data models and visualizations using Excel as a data source in DI?

I'm not able to find a way to use Excel as a data source for directly creating new data models and visualizations in Data Integration.
Can someone help me with this or provide a link to the documentation for it?

Thanks.