Channel: Pentaho Community Forums

Insert / Update: Unable to compare with value...

Hi,
I am using Kettle version 5.0.1-stable.

An Insert / Update step sporadically fails with this error message:

org.pentaho.di.core.exception.KettleValueException:
BigNumber : Unable to compare with value [BigNumber(17, 2)]

At first it failed only occasionally. The transformation is restarted by the job after 10 seconds, and after a while the row is written to the target.
By now it is happening with every 10th record.

We are experiencing the problem in production only (Ubuntu 12.04.4 LTS).
It does not happen on my Windows development machine nor in our Ubuntu Beta environment.
Even if I copy the same source row into the beta database it will not fail.

Because the record IS written after several retries (sometimes it takes hours, but finally the row gets through), I assume the error is not in the Insert / Update statement itself.
Could it be something in the synchronization/preparation of the data flow, i.e. that the data type isn't settled at the moment the update is checked?
I put a blocking step in front to wait for all lookups and calculations, but it didn't change anything.

The job ran perfectly for some weeks.
Then we added some calculated currency fields to the target table (decimal 17,2).
Each row always had (and has) an individual (local) currency.
In a first step I convert them to USD after a lookup step for the conversion rate.
In a second step I fetch the EUR, GBP, INR and JPY to USD rates in one Table Input to calculate these values.
Since we added this, we have had the update problems.

What can cause it to sometimes work and sometimes fail for the same input rows?
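
For illustration: Kettle's BigNumber type wraps java.math.BigDecimal, whose comparison behaviour is scale-sensitive in some operations and null-intolerant in all of them. Below is a minimal standalone sketch of that class of failure, with hypothetical values (this is not the actual Kettle internals):

Code:

import java.math.BigDecimal;

public class BigNumberCompare {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("100.50"); // scale 2, like decimal(17,2)
        BigDecimal b = new BigDecimal("100.5");  // scale 1, e.g. from a calculation step

        System.out.println(a.compareTo(b)); // 0: compareTo ignores scale
        System.out.println(a.equals(b));    // false: equals() is scale-sensitive

        // A null on either side (e.g. a failed currency lookup) makes the
        // comparison itself throw, the kind of sporadic failure that an
        // "Unable to compare with value" wrapper would then surface.
        BigDecimal c = null;
        try {
            a.compareTo(c);
        } catch (NullPointerException e) {
            System.out.println("comparison failed: " + e);
        }
    }
}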

Thanks for any help.

Job Executor - Grouping rows

Hi Guys,

I have a stream of rows that contains an ID, and I need to process these rows grouped by this ID.

My stream looks like this:

rows.png

I'm using a Job Executor to handle the rows and process them in a grouped way, but it didn't work like I expected. The job was called only once, and I expected it to be called at least 3 times.

This job is responsible for calling a sub-transformation that processes the grouped rows. I configured the Job Executor to group the rows using the ID column, but when I executed the main transformation, only the last rows were processed by the sub-transformation.

The project is attached.

Thanks!

BTable dynamic dimension change

When we use the BTable component to drill from a dashboard to a table (BTable), we must set the MDX dimension in the component properties. Is there any solution for changing the dimension with a parameter (something like using a parameter in the dimension value)?

example

dimension:
arg: [Product].[Name]
value: include:p_specific_product

example custom parameter:
param_1 = Dashboards.getQueryParameter("param_product");

How can I change the dimension [Product].[Name] when param_1 changes value?

Thx.

org.pentaho.di.core.exception.KettleDatabaseException:

Hello all,

I have a Kettle transformation that connects to a PostgreSQL Amazon RDS instance.

When I set up the connection, the test ran successfully, but when I run the job, it fails to connect.

Below is the log with the errors:

2014/03/20 12:04:59 - Save to Landing.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : An error occurred intialising this step:
2014/03/20 12:04:59 - Save to Landing.0 - Error occured while trying to connect to the database
2014/03/20 12:04:59 - Save to Landing.0 -
2014/03/20 12:04:59 - Save to Landing.0 - Error occured while trying to connect to the database
2014/03/20 12:04:59 - Save to Landing.0 -
2014/03/20 12:04:59 - Save to Landing.0 - Unable to pre-load connection to the connection pool
2014/03/20 12:04:59 - Save to Landing.0 - FATAL: password authentication failed for user "sidw_dev"
2014/03/20 12:04:59 - Save to Landing.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Error initializing step [Save to Landing]
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [Trim Operation.0] initialized flawlessly.
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [Data Conversion.0] initialized flawlessly.
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Step [Save to Landing.0] failed to initialize!
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [Generate Rows.0] initialized flawlessly.
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [GetYesterday.0] initialized flawlessly.
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [filename.0] initialized flawlessly.
2014/03/20 12:04:59 - Load_Data_FRM_SRC_To_Landing - Step [ReadCSV.0] initialized flawlessly.
2014/03/20 12:04:59 - Save to Landing.0 - Signaling 'output done' to 0 output rowsets.
2014/03/20 12:04:59 - Save to Landing.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Unexpected error rolling back the database connection.
2014/03/20 12:04:59 - Save to Landing.0 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException:
2014/03/20 12:04:59 - Save to Landing.0 - Unable to get database metadata from this database connection
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.run (Job.java:407)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.execute (Job.java:500)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.execute (Job.java:815)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.execute (Job.java:815)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.execute (Job.java:815)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.Job.execute (Job.java:678)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute (JobEntryTrans.java:1037)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.trans.Trans.execute (Trans.java:578)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.trans.Trans.prepareExecution (Trans.java:1043)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.dispose (TableOutput.java:708)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.rollback (Database.java:739)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.rollback (Database.java:747)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.getDatabaseMetaData (Database.java:2653)
2014/03/20 12:04:59 - Save to Landing.0 -
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.getDatabaseMetaData(Database.java:2655)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.rollback(Database.java:747)
2014/03/20 12:04:59 - Save to Landing.0 - at org.pentaho.di.core.database.Database.rollback(Database.java:739)
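
To rule the Kettle layer out, the same credentials can be tried with plain JDBC. A minimal sketch (the RDS endpoint, database name and password below are placeholders; substitute the real ones):

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class PgConnCheck {
    public static void main(String[] args) throws Exception {
        // Same user as in the log above; endpoint and database are placeholders.
        String url = "jdbc:postgresql://your-instance.rds.amazonaws.com:5432/yourdb";
        try (Connection c = DriverManager.getConnection(url, "sidw_dev", "secret")) {
            System.out.println("connected: " + !c.isClosed());
        }
    }
}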

Thanks,

Ron

Using an MS SQL Server as a Data Source (MS Project Reporting Server)

Hi,

I'm having trouble figuring out how I'd configure this, since apparently the server is on a dynamic port.

Here is the info given by my DBA (hostname removed for privacy's sake):


Server connection: XXXXXXXXX\SPS (dynamic port)
Version: MSSQL 2008 R2

Database: ProjectServer_Reporting

Can this be done?
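
For what it's worth, both common SQL Server JDBC drivers can resolve a named instance's dynamic port through the SQL Server Browser service, so no fixed port is needed. A hedged sketch (hostname and credentials are placeholders):

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class MssqlNamedInstance {
    public static void main(String[] args) throws Exception {
        // Microsoft driver: instanceName is resolved via the Browser service (UDP 1434).
        String url = "jdbc:sqlserver://XXXXXXXXX;instanceName=SPS;"
                + "databaseName=ProjectServer_Reporting";
        // jTDS alternative:
        // jdbc:jtds:sqlserver://XXXXXXXXX/ProjectServer_Reporting;instance=SPS
        try (Connection c = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("connected");
        }
    }
}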

How to execute CDE charts

When I set "Execute at start" to false, what's the code to execute it, and where can I place that code: in Post Fetch, Pre Change, or Post Change?

"Metadata structure" step for a file

Hi:
Used "Metadata structure" step in database now trying to use for a file it works but generates default as "date" type

Input file ("CSV file input" step, semicolon-delimited):
------
21091;;2
1423;;2
1424;;2
6022;;2

Output
--------
Position;Fieldname;Comments;Type;Length;Precision;Origin
1;Field_000;;Integer;15;0;CSV file input
2;Field_001;;Date;-1;-1;CSV file input
3;Field_002;;Integer;15;0;CSV file input

When there are no values in a column, the "Metadata structure" step assumes Date as the default type. Is there any way to make the default "String"? I checked that if there is at least one row with a value it works. For new transformations we can supply default values, but for already existing files we need to be able to handle these null columns.


Basically, I tried "CSV file input" with Get Fields and it defaults null columns to "Date", so is there a way for "InputRowMetaData" to set the default to "String"? Any quick help will be appreciated.
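
If nothing built-in exists, a post-processing workaround over the guessed row metadata might look like the sketch below. This is a hypothetical helper against the Kettle API (the method name is mine); the length/precision of -1 matches the Date guesses in the output above:

Code:

import org.pentaho.di.core.row.RowMetaInterface;
import org.pentaho.di.core.row.ValueMetaInterface;

public class DemoteNullDates {
    // Demote Date fields that were guessed from all-null samples back to String.
    public static void demoteNullDatesToString(RowMetaInterface rowMeta) {
        for (ValueMetaInterface v : rowMeta.getValueMetaList()) {
            if (v.getType() == ValueMetaInterface.TYPE_DATE && v.getLength() < 0) {
                v.setType(ValueMetaInterface.TYPE_STRING);
            }
        }
    }
}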
Thanks
Michael

Report doesn't show subreport in HTML view!

Hi! I've published a report on my BA Server version 4.8. But when I display it inside my frame in HTML, I don't see the subreports, only the header of the master report.
With Report Designer it works fine! I've already set up table body fragment = true in the HTML output.
Does anyone have an idea to help me?
Thanks a lot

Passing multiple parameters to an 'Execute SQL Statements' step from the input stream

Hi All,

I need to execute a couple of SQL statements for every input row coming from a file. For this I need to pass 3 input columns from the file as parameters to the SQL step. Here is an example:

Columns in the Input file:
Key-val1
Key-val2
Key-val3

And here are the SQL statements that I need to execute inside the 'Execute SQL Statements' step:

Update table1 set target_column1 = 'abc' where Key_column = Key-val1;

Update table2 set target_column2 = '123' where Key_column = Key-val2;

Delete from table3 where Key_column = Key-val3;

How can I pass these columns as parameters inside the 'Execute SQL Statements' step? I know we can pass one parameter using '?', but how can we do it for multiple parameters?
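
For reference, '?' placeholders are positional: several of them bind to the listed parameter fields in order. In plain JDBC the same idea looks like the sketch below (a hedged illustration reusing the table and column names from the example above; the helper name is mine):

Code:

import java.sql.Connection;
import java.sql.PreparedStatement;

public class PerRowUpdates {
    // One round of statements for a single input row's three key values.
    static void runForRow(Connection conn, String keyVal1, String keyVal2,
                          String keyVal3) throws Exception {
        try (PreparedStatement u1 = conn.prepareStatement(
                "UPDATE table1 SET target_column1 = 'abc' WHERE Key_column = ?")) {
            u1.setString(1, keyVal1);
            u1.executeUpdate();
        }
        try (PreparedStatement u2 = conn.prepareStatement(
                "UPDATE table2 SET target_column2 = '123' WHERE Key_column = ?")) {
            u2.setString(1, keyVal2);
            u2.executeUpdate();
        }
        try (PreparedStatement d = conn.prepareStatement(
                "DELETE FROM table3 WHERE Key_column = ?")) {
            d.setString(1, keyVal3);
            d.executeUpdate();
        }
    }
}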

relative path inside JCR

So... I'm in another mess with JCR, and I actually don't even know if I should post this here, in the PDI section, or even in the CTools section.

The deal is: I've designed a transformation as a datasource for a dashboard. This transformation uses ETL Metadata Injection into another transformation. I've set the file in the Injection step to ${Internal.Transformation.Filename.Directory}\XX_historicoDLO_injected.ktr

But when I execute the dashboard, all I get is this error:
Code:

2014/03/20 17:17:19 - ETL Metadata Injection.0 - Unable to read file [file:///C:/Program Files (x86)/Pentaho/biserver-ce/pentaho-solutions/home/DLO/XX_historicoDLO_injected.ktr]
2014/03/20 17:17:19 - ETL Metadata Injection.0 - Could not read from "file:///C:/Program Files (x86)/Pentaho/biserver-ce/pentaho-solutions/home/DLO/XX_historicoDLO_injected.ktr" because it is a not a file.

Apparently, ${Internal.Transformation.Filename.Directory} and the JCR don't get along. What can I do here to solve this?

Meat and Potatoes

Newbie:

I have training data like this:

iip,lat,lon,email,date,tag,confidence
64,35,-89,a@b,2013,Jim,1
64,35,-89,ugh,2014,Jim,0

I want to fill in the confidence in real-time...

Q: Can anyone point me to a textbook or place I might start?

Or perhaps I should just write rules and forgo Weka. Not sure...
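
To frame the Weka route: training a classifier on data shaped like the above takes only a few lines of the Weka Java API. A minimal sketch (the filename is hypothetical, and the 0/1 confidence column is converted to nominal so a tree learner accepts it as the class):

Code:

import weka.classifiers.Classifier;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.NumericToNominal;

public class ConfidenceModel {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("training.csv"); // hypothetical file

        // J48 needs a nominal class, so convert the numeric 0/1 last column.
        NumericToNominal toNominal = new NumericToNominal();
        toNominal.setAttributeIndices("last");
        toNominal.setInputFormat(data);
        data = Filter.useFilter(data, toNominal);
        data.setClassIndex(data.numAttributes() - 1);

        Classifier cls = new J48();
        cls.buildClassifier(data);

        // Predict the confidence class of the first row as a sanity check.
        System.out.println(cls.classifyInstance(data.instance(0)));
    }
}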

Thanks for any leads!

Jim Pruett
Wikispeedia.org

Connect Pentaho User Console Enterprise Edition to SQL SERVER 2000

I'm really confused right now. I tried to make a connection between the Pentaho User Console and SQL SERVER 2000, but it failed. When I make the same connection in Pentaho Data Integration, it succeeds.
Then I want to create a pie-chart report in the Pentaho User Console, but every time I try to make the database connection to SQL SERVER 2000, the connection fails.
How can I solve this problem?

Is there a Bug in Pentaho Report Designer 3.9.1?

I installed Report Designer 3.9.1 on my Linux server. I created a report and published it to BI Server Community Edition 4.8. When I checked the report in Pentaho, I noticed that on some pages the topmost entry under the Details band is covered by the header. How can I ensure that the height of the Details band is respected throughout the pages, so that every detail can be seen? Can anyone help me? Thanks in advance.

R Weka integration errors

Hi,

I am getting the following errors when trying to run Weka 3.7.10 under Windows 7 with R 3.0.2:

---Registering Weka Editors---
Getting parent classloader....
Injecting JRI classes into the root class loader...
Windows 32 bit OS
Trying R_LIBS_USER (C:/Users/user1/Documents/R/win-library/3.0)
Found rJava installed in C:\Users\user1\Documents\R\win-library\3.0\rJava
Trying to load R library from C:\Users\user1\Documents\R\win-library\3.0\rJava\jri\i386\jri.dll
Engine class: class org.rosuda.JRI.Rengine ClassLoader: sun.misc.Launcher$ExtClassLoader@1a1472d
Unable to load R library from C:\Users\user1\Documents\R\win-library\3.0\rJava\jri\i386\jri.dll: C:\Users\user1\Documents\R\win-library\3.0\rJava\jri\i386\jri.dll: Can't find dependent libraries
Getting REngine....
Exception in thread "Thread-3" java.lang.UnsatisfiedLinkError: org.rosuda.JRI.Rengine.rniSetupR([Ljava/lang/String;)I
org.rosuda.JRI.Rengine.rniSetupR(Native Method)
org.rosuda.JRI.Rengine.setupR(Rengine.java:170)
org.rosuda.JRI.Rengine.run(Rengine.java:635)

at org.rosuda.JRI.Rengine.rniSetupR(Native Method)
at org.rosuda.JRI.Rengine.setupR(Rengine.java:170)
at org.rosuda.JRI.Rengine.run(Rengine.java:635)


Loading freezes at this point.

I have:
R_LIBS_USER = C:/Users/user1/Documents/R/win-library/3.0
PATH = %SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files\R\R-3.0.2\bin;C:\Program Files\Calibre2\

I would highly appreciate any help or suggestions.

Connect Pentaho to SQL SERVER 2000

I am a newbie. I need help making a connection between the Pentaho User Console and a SQL SERVER 2000 database using JDBC access.
I already followed the tutorial from this page: http://wiki.pentaho.com/display/Serv...005+Connection

Since the SQL Server 2000 JDBC driver that I am looking for is no longer available for download on the Microsoft SQL Server page, I downloaded a JDBC driver from http://sourceforge.net/projects/jtds/files/

Based on the tutorial above, I made the following change in context.xml:

Code:

<Resource name="jdbc/Microsoft" auth="Container" type="javax.sql.DataSource"
        factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
        maxWait="10000" username="sa" password="sa"
        driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver" url="jdbc:sqlserver://172.16.1.222:1433;DatabaseName=inventory"
        validationQuery="select count(*) from sys.indexes"/>

I also made the following change in jdbc.properties:

Code:

BookStore/type=javax.sql.DataSource
BookStore/driver=com.microsoft.jdbc.sqlserver.SQLServerDriver
BookStore/url=jdbc:microsoft:sqlserver://172.16.1.222:1433;DatabaseName=NPDemoData
BookStore/user=sa
BookStore/password=sa

Then I try to create a new Data Source for this connection in the Pentaho User Console, but when I test the connection, it fails. Can anyone tell me how to fix this?
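
One detail worth noting: the snippets above reference Microsoft's driver classes, while the linked download is jTDS, which ships its own driver class and URL scheme. A hedged plain-JDBC sketch using the host and credentials from the post (database name taken from context.xml):

Code:

import java.sql.Connection;
import java.sql.DriverManager;

public class JtdsConnCheck {
    public static void main(String[] args) throws Exception {
        // jTDS driver class and URL format (not the Microsoft ones).
        Class.forName("net.sourceforge.jtds.jdbc.Driver");
        String url = "jdbc:jtds:sqlserver://172.16.1.222:1433/inventory";
        try (Connection c = DriverManager.getConnection(url, "sa", "sa")) {
            System.out.println("connected");
        }
    }
}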

PostgreSQL Bulk Loader

Hi,

I can't find any documentation on how to properly use the PostgreSQL Bulk Loader. My question is: how can I feed it rows which are strings meant directly for the COPY command?

For example, this is what I need:
COPY lead_tool.profcopy
FROM stdin;
900 abcd 10 f abasd adasd {a} {c,d} fb
\.


But the Bulk Loader adds WITH CSV DELIMITER AS '\t' QUOTE AS 'null':
COPY lead_tool.profcopy ( profile_identifier, title, fan_count, is_in_analytics, url_key, profile_countries, profile_tags, profile_type ) FROM STDIN WITH CSV DELIMITER AS '\t' QUOTE AS 'null';

I don't want that in the COPY command. Why CSV? Why QUOTE? I want to pass rows/lines into the loader as tab-separated data.
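
For comparison, the PostgreSQL JDBC driver's CopyManager streams text-format rows (tab-separated, unquoted) straight into COPY ... FROM STDIN. A minimal sketch, assuming an already-open Connection named conn and the sample row from above:

Code:

import java.io.StringReader;
import java.sql.Connection;
import org.postgresql.copy.CopyManager;
import org.postgresql.core.BaseConnection;

public class PlainCopy {
    // Text format: tab separators, newline-terminated lines, no CSV quoting.
    static void copyRow(Connection conn) throws Exception {
        CopyManager cm = new CopyManager((BaseConnection) conn);
        cm.copyIn("COPY lead_tool.profcopy FROM STDIN",
                new StringReader("900\tabcd\t10\tf\tabasd\tadasd\t{a}\t{c,d}\tfb\n"));
    }
}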

Any ideas or suggestions?


thanks

How to get database-specific exceptions from a transformation

I need to get the exceptions thrown by the source database (Teradata) when there is an error in the query executed by a transformation step.

I see the exception in the console, in the form of a timestamped log, which is produced by the TransMeta methods.
The only method available on the Trans object is trans.getErrors(), which gives just the number of errors as an integer.
How can I get the error details (the cause, message, stack trace, etc.) so that I can handle them in a custom way?

The only exception I am able to catch and print a stack trace for in my code is a RuntimeException, which gives no information about the Teradata SQL error.
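
One hedged approach, borrowing the buffer-appender pattern from the Kettle 4.x API: read the transformation's log channel after the run and mine it for the Teradata error text. A sketch (the helper name is mine):

Code:

import org.pentaho.di.core.logging.CentralLogStore;
import org.pentaho.di.core.logging.Log4jBufferAppender;
import org.pentaho.di.trans.Trans;

public class ErrorTextAfterRun {
    static String errorLog(Trans trans) throws Exception {
        Log4jBufferAppender appender = CentralLogStore.getAppender();
        trans.execute(new String[0]);
        trans.waitUntilFinished();
        if (trans.getErrors() > 0) {
            // The buffer holds the logged lines, including the wrapped
            // database exception message and stack trace.
            return appender.getBuffer(trans.getLogChannelId(), true).toString();
        }
        return "";
    }
}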

How to deny login to non-Authenticated users using LDAP (BI server 5.0.1)

I've managed to get LDAP access working in Pentaho BI 5.0.1, but I'm unable to deny login to users who don't have the "Authenticated" role ("PentahoAuthenticated" in my case).

When a user without the "Authenticated" or "PentahoAuthenticated" role logs into Pentaho BI with a valid password, he can actually reach the PUC. During the login process catalina.out spits out some errors like "Access denied to this data...", so the user is unable to access the JCR repository, and the PUC only shows him the "Browse files" button (which does nothing) and the "Documentation" button.

In previous versions of Pentaho (4.10) it was possible to define ACL rules in ./pentaho-solutions/system/pentaho.xml, but that file no longer contains ACL rules.

I tried to alter applicationContext-security-ldap.properties this way:
Code:

contextSource.providerUrl=ldap\://ldap.org.intranet\:389
contextSource.userDn=cn\=root,dc\=org,dc\=intranet
contextSource.password=xxxxx

userSearch.searchBase=ou\=Users,dc\=org,dc\=intranet
userSearch.searchFilter=(uid\={0})

populator.convertToUpperCase=false
populator.groupRoleAttribute=cn
populator.groupSearchBase=ou\=Groups,dc\=org,dc\=intranet
populator.groupSearchFilter=(&(memberUid\={1})(cn="PentahoAuthenticated"))
populator.rolePrefix=
populator.searchSubtree=false

allAuthoritiesSearch.roleAttribute=cn
allAuthoritiesSearch.searchBase=ou\=Groups,dc\=org,dc\=intranet
allAuthoritiesSearch.searchFilter=(objectClass\=posixGroup)

allUsernamesSearch.usernameAttribute=uid
allUsernamesSearch.searchBase=ou\=Users,dc\=org,dc\=intranet
allUsernamesSearch.searchFilter=objectClass\=person

adminRole=cn\=PentahoAdmin,ou\=Groups,dc\=org,dc\=intranet
adminUser=cn\=root,dc\=org,dc\=intranet

I've added the condition (cn="PentahoAuthenticated") to populator.groupSearchFilter, but with no success.

Any idea?

regards

carles.

Obtaining metric information of a transformation in Java

Hi,

I would like some help obtaining metric information (number of lines read, written, etc.) from a finished transformation in my Java program. My program dynamically creates transformations, runs them, and finally shows detailed information about the finished transformations. I use the Kettle Java API version 4.4. This is my code, simplified:


KettleEnvironment.init(false);

// this creates the TransMeta object and the XML file
TransMeta transMeta = createTransformationPerQuery(t...);

if (transMeta != null) {
    Trans transformation = new Trans(transMeta);

    // adjust the log level
    transformation.setLogLevel(LogLevel.ERROR);

    // retrieve the logging appender
    Log4jBufferAppender appender = CentralLogStore.getAppender();

    // retrieve the logging lines for the transformation
    String logText = appender.getBuffer(transformation.getLogChannelId(), false).toString();

    transformation.execute(new String[0]);

    // wait for the transformation to finish
    transformation.waitUntilFinished();

    org.pentaho.di.core.Result result = transformation.getResult();

    // report on the outcome of the transformation
    logger.info(result.getNrErrors());
    logger.info(result.getNrLinesWritten()); // this is always 0
    logger.info(logText); // this is always empty
}


This should work, right? What am I doing wrong?
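
One variant worth trying (a sketch using the same 4.4 API as above): read the log buffer only after the run has finished, since at the point where it is read above the transformation has not produced any log lines yet.

Code:

transformation.execute(new String[0]);
transformation.waitUntilFinished();

// Read the buffer after the run; before execute() it is still empty.
String logText = CentralLogStore.getAppender()
        .getBuffer(transformation.getLogChannelId(), false).toString();

org.pentaho.di.core.Result result = transformation.getResult();
logger.info(result.getNrErrors());
logger.info(logText);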
Thanks in advance!

Kettle Latest version

Could someone let me know where I can download Kettle PDI 5.1.0 GA? This is the version in which the transformation executor issue is resolved, I suppose.

Thanks & Regards