Channel: Pentaho Community Forums

Get File with FTP : no download

Hello

I am trying to get files with FTP, but nothing is downloaded.
I checked the parameters, and the log says:

2016/09/20 15:39:13 - Get a file with FTP - Found 1 files (or folders) in the remote ftp directory
2016/09/20 15:39:13 - Get a file with FTP - set binary transfer mode
2016/09/20 15:39:13 - Get a file with FTP - Analysing remote file [20141007_000001_XPL.xml] ...
2016/09/20 15:39:13 - Get a file with FTP - =======================================
2016/09/20 15:39:13 - Get a file with FTP - Nr errors : 0
2016/09/20 15:39:13 - Get a file with FTP - Nr files downloaded : 0
2016/09/20 15:39:13 - Get a file with FTP - =======================================

So one file is found, but it is not downloaded :(
Any hints?

thanks in advance

Execute SQL Job not honouring constraints

Hi,
we recently added an "Execute SQL script" job step to clean up some records on a Postgres database table (tableA). There is a second table (TableB) with a foreign key defined with a constraint ON DELETE CASCADE over the id of the deleted records.

To our surprise, the primary table records were deleted but the referenced table records were not. We couldn't believe it, so we checked. And rechecked. And tried deleting manually (and yes, of course, the records were then deleted as expected).

Incredibly, we don't know of any way to disable constraints in Postgres, and yet we now have records in TableB whose column holds the value of an id that no longer exists in TableA! This goes against everything we know about databases. Even if the Postgres constraints had somehow been disabled and re-enabled, re-enabling them should fail, because the key value in TableB is not present in TableA!

Any idea what can be happening? We are really mad at this one

Thanks

PS. Pentaho Kettle 4.4.0 Stable

How to use a parameter value?

Hi,

in a quite complex transformation I need to set some constant values that I use many times.

I would like to pass these values using parameters (Transformation properties/Parameters) but I cannot figure out how to do that.

For example, I add some fields (using "Add constant values") and I would like to insert the parameter value in the Value column, but that is not possible.
I thought it might work the same way as in an output step, where I use a parameter to define the name of a file (e.g. Filename: /${parameter.name}_filename.xls ).

How could I do the same for a field?

Thanks for any suggestions!

LK
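For reference, the ${name} substitution PDI applies in text fields (like the Filename example above) follows a simple token-replacement rule. A minimal JavaScript sketch of that rule — the `substitute` function and sample values are illustrative, not PDI API:

```javascript
// Minimal sketch of ${name} token substitution as used in PDI text fields.
// "substitute" and the sample parameter map are illustrative, not PDI API.
function substitute(template, params) {
  return template.replace(/\$\{([\w.]+)\}/g, function (match, key) {
    // Leave unknown tokens untouched.
    return Object.prototype.hasOwnProperty.call(params, key) ? params[key] : match;
  });
}

console.log(substitute("/${parameter.name}_filename.xls", { "parameter.name": "report" }));
// -> /report_filename.xls
```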

calling oracle procedures from pentaho

How can I call a procedure in Oracle using Pentaho?

vinodh1978

Connecting Pentaho DI to Cloudera VM (problems reading file)

Hi,

I am trying to read a file from HDFS in Hadoop from Pentaho DI and I am running into problems:

- Pentaho DI (open source) in a local machine Win 7 version 6.1
- HDFS in a Virtual Machine Cloudera Quick Start 5.4

I have indicated my Hadoop distribution (cdh55) in Tools, and I have created a cluster in 'View'; when I test the cluster, it works properly.

Then I create a Hadoop File Input step; I can browse the cluster and select a file, and I specified the attributes (Unix format, delimiter, ...), but I have problems when trying to read it. I get the following error:

......

Hadoop File Input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Couldn't open file #1 : hdfs://cloudera:***@192.168.109.128:8020/user/cloudera/prueba.txt --> org.pentaho.di.core.exception.KettleFileException:
2016/09/20 13:27:02 - Hadoop File Input.0 -
2016/09/20 13:27:02 - Hadoop File Input.0 - Exception reading line: org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-286282631-127.0.0.1-1433865208026:blk_1073742215_1393 file=/user/cloudera/prueba.txt
2016/09/20 13:27:02 - Hadoop File Input.0 - Could not obtain block: BP-286282631-127.0.0.1-1433865208026:blk_1073742215_1393 file=/user/cloudera/prueba.txt
2016/09/20 13:27:02 - Hadoop File Input.0 -
2016/09/20 13:27:02 - Hadoop File Input.0 - Could not obtain block: BP-286282631-127.0.0.1-1433865208026:blk_1073742215_1393 file=/user/cloudera/prueba.txt
2016/09/20 13:27:02 - Hadoop File Input.0 - Procesamiento finalizado (I=0, O=0, R=0, W=0, U=1, E=1)
2016/09/20 13:27:02 - C:\temp\ejercicios pentaho\tr_conexion_hdfs.ktr : tr_conexion_hdfs - Transformación detectada
2016/09/20 13:27:02 - C:\temp\ejercicios pentaho\tr_conexion_hdfs.ktr : tr_conexion_hdfs - Transformación está matando los otros pasos!

.....

I am not sure if the problem is in Pentaho or HDFS. I have checked the HDFS service and it is up and running, as are Zookeeper and the other Hadoop services. I have created the connection with the Cloudera user.

Can anybody help me? Any advice will be greatly appreciated.

Thanks in advance,

Juan

displaying the summed-up value in the center of a pie chart

All,

I am creating a pie chart in CDE like the one in the link below:
http://pentaho-bi-suite.blogspot.com...ntaho-cde.html

However, I am using code like the example below to get a value to display in the center of the pie chart:

http://jsfiddle.net/duarteleao/mx4pr55t/

new pvc.PieChart({
    canvas: 'cccExample',
    width: 300,
    height: 300,

    animate: false,
    selectable: true,
    hoverable: true,

    valuesVisible: true,
    valuesLabelStyle: 'inside',
    valuesFont: '35px sans-serif',
    valuesMask: '{value.percent}',

    label_visible: function() { return !this.index; },
    label_left: null,
    label_top: null,
    label_textAngle: 0,
    label_textAlign: 'center',
    label_textBaseline: 'middle',
    label_strokeStyle: 'black',

    slice_innerRadiusEx: '85%',

    categoryRole: "series",

    slice_fillStyle: function() {
        return this.index === 0 ? this.delegate() : "lightgray";
    }
})
.setData(relational_04b, {crosstabMode: false})
.render();



However, is there a way to show the sum of all of the values in the center of my pie instead of just the percentage? Say I have 4 values, each equal to 10; then I want the number 40 displayed in the center. Any ideas?
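One approach, sketched without testing against CCC itself: compute the grand total from the same array you pass to setData, and return it from the first slice's label instead of the percentage mask. The rows below are a stand-in for relational_04b:

```javascript
// Stand-in for the relational_04b array: rows of [category, value].
var rows = [["A", 10], ["B", 10], ["C", 10], ["D", 10]];

// Grand total across all slices.
var total = rows.reduce(function (sum, row) { return sum + row[1]; }, 0);

console.log(total); // -> 40

// In the chart options, drop valuesMask and return the total from the
// first slice's label (same !this.index trick as label_visible above):
// label_text: function () { return this.index === 0 ? String(total) : ""; }
```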

Global database connection

Hi,
I have around 10 transformations which use the same database connection. Is there a way to reuse the same connection settings (like a global connection) instead of creating one for every transformation?

Thanks,
Padma Priya

Meta data injection with dynamic where clause

Hi together,

I am doing some tests with PDI and metadata injection; I now have an SQL query for a "Table input" step with a dynamic where clause.

For example my SQL looks like:
Code:

select id,
      name
from person
where 1=1
${where_clause}

Now PDI should replace the variable "where_clause" with 'and last_update >= ${input_date}',
and also replace the new variable "input_date" with '18.09.2016' from the previous transformation.

After metadata injection, the SQL should look like:
Code:

select id,
      name
from person
where 1=1
and last_update >= '18.09.2016'

Does anyone have an idea how I can implement this? Thank you for your support.


Regards
Florian
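The two replacements described above behave like two passes of ${...} substitution. A self-contained JavaScript sketch of that behaviour — the `substitute` function is illustrative, not PDI API:

```javascript
// Replace ${name} tokens from a map, leaving unknown tokens untouched
// so a later pass can still resolve them.
function substitute(sql, vars) {
  return sql.replace(/\$\{(\w+)\}/g, function (match, key) {
    return Object.prototype.hasOwnProperty.call(vars, key) ? vars[key] : match;
  });
}

var sql = "select id, name from person where 1=1 ${where_clause}";

// Pass 1: inject the where clause, which itself contains a variable.
var pass1 = substitute(sql, { where_clause: "and last_update >= ${input_date}" });

// Pass 2: resolve the variable introduced by pass 1.
var pass2 = substitute(pass1, { input_date: "'18.09.2016'" });

console.log(pass2);
// -> select id, name from person where 1=1 and last_update >= '18.09.2016'
```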

Stored Procedure in Pentaho Report Designer

Hello, I would like to know whether you can call a stored procedure from Pentaho Report Designer and use parameters to run it.

I do not see button "Change" (Pentaho bi server ce 5.0, plugin Change Password)

I installed the Change Password plugin, then I went to the menu option ARCHIVO/CHANGE PASSWORD, but I do not see the "Change" button. Help!

why does my Pentaho Report Designer not have data-cache options?

How to give symbol to the orthoAxis?

Hello Everyone,
I want to give the orthoAxis a label with a symbol, i.e. an orthoAxis label like: Temp (°C).
How can I do that?

I tried to use extension point, but it didn't help.

Thanks
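If a plain axis title is enough, a string containing the degree sign may do the job; a hedged sketch, assuming CCC's orthoAxisTitle option (the rest of the chart options are omitted):

```javascript
// Sketch: the degree sign is just a Unicode character in the title string,
// so no extension point is needed for a static title (assumes orthoAxisTitle).
var chartOptions = {
  orthoAxisTitle: "Temp (\u00B0C)" // renders as Temp (°C)
};

console.log(chartOptions.orthoAxisTitle); // -> Temp (°C)
```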

how to set a vertical line to 100% in the details section?

Data format BC

Good morning / afternoon.
The Table Input component gives an error (ORA-01856: BC/B.C. or AD/A.D. required) because of this field:
Quote:

NVL(to_date(TO_CHAR(my.date1, 'dd.mm.yyyy ad'), 'dd.mm.yyyy ad'), to_date(
'01.01.4712 bc', 'dd.mm.yyyy bc')) AS t$rcdt
Please help me.

Versions 5.0.1 and 6.1, ojdbc14_g.jar.

Capture exact message from log field

When using default logging in Pentaho PDI, the complete execution log is captured in the log field. Can we capture just the exact error message in the log field instead?
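One workaround, assuming the full log is available as a string (e.g. read back from the log field and post-processed in a JavaScript step), is to filter it down to the error lines. A minimal sketch with a made-up log excerpt:

```javascript
// Made-up two-line log excerpt standing in for the captured log field.
var log = "2016/09/21 01:12:01 - Pan - Start of run.\n" +
          "2016/09/21 01:12:01 - Pan - ERROR: something went wrong.";

// Keep only lines that contain an error marker.
var errors = log.split("\n").filter(function (line) {
  return line.indexOf("ERROR") !== -1;
});

console.log(errors.join("\n"));
// -> 2016/09/21 01:12:01 - Pan - ERROR: something went wrong.
```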

ERROR: No repository provided, can't load transformation

C:\Users\Administrator\Desktop\data-integration\pan.bat Add a checksum - Basic CRC32 example /level Basic > C:\Users\Administrator\Desktop\data-integration\trans.log


-----------------------------------------------
I am running the above sample and getting the error below.



Where am I making a mistake?

DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jre1.8.0_66
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jre1.8.0_66\bin\java.exe
C:\Users\Administrator\Desktop\data-integration>"C:\Program Files\Java\jre1.8.0_66\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\pentaho-application-launcher-6.1.0.1-196.jar -lib ..\libswt\win64 -main org.pentaho.di.pan.Pan \file D:\repository\REPOSITORY_TEST.ktr /level Basic
01:11:54,806 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 2 at C:\Users\Administrator\Desktop\data-integra ***
*** tion\.\system\karaf\caches\pan\data-1 ***
*** Karaf Port:8803 ***
*** OSGI Service Port:9052 ***
*******************************************************************************
01:11:54,806 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
2016/09/21 01:11:57 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/09/21 01:12:01 - Pan - Logging is at level : Basic
2016/09/21 01:12:01 - Pan - Start of run.
ERROR: No repository provided, can't load transformation.

vinodh1978

Creating a Pentaho report with data coming from PDI...

Hi

Can someone please explain to me the point of this type of report?
I made a report like this in my training, but I don't see the benefit.

What is the advantage of using PDI as a datasource in place of JDBC when I build my report?

Regards,

PRD & Crosstab

Display Total at the end of Stacked Bar Chart

Hi folks,

I am wondering if there is an easy way to display the total of all columns in a stacked bar chart?

Thank you in advance.

Value Mapper

I have a table where I'm converting county names to ISO codes for the UK. However, there are also some addresses from other countries for which I don't have codes, and I want the existing value to stay as-is in the new column. What can I put in the "Default upon non-matching" field to keep the entry rather than changing it to a null?
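For comparison, the pass-through behaviour wanted here can be sketched in plain JavaScript — the county-to-ISO map below is a made-up example, not your actual lookup data:

```javascript
// Hypothetical county-to-ISO map; real data would come from your table.
var isoByCounty = { "Kent": "GB-KEN", "Essex": "GB-ESS" };

// Map a value, falling back to the original when there is no match --
// the behaviour wanted for "Default upon non-matching".
function mapCounty(value) {
  return Object.prototype.hasOwnProperty.call(isoByCounty, value)
    ? isoByCounty[value]
    : value;
}

console.log(mapCounty("Kent"));  // -> GB-KEN
console.log(mapCounty("Paris")); // -> Paris (kept as-is)
```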