Channel: Pentaho Community Forums

how to verify whether excel input date field has valid date values

Hi,

I have a transformation in which I read an Excel input feed and write it into a table.
I have a date column in my Excel file and need to check whether it contains valid date values in 'yyyy-MM-dd' format.
I retrieve the column as a string in the Excel input step and try to validate the data type using the Data Validator step with the conversion mask 'yyyy-MM-dd'; I defined error handling for that step and fed the errors to a text file output. The assumption was that rows with empty/invalid dates would be written to the error text file and only valid dates would be passed to the subsequent steps. Instead, the Data Validator step itself fails with errors like 'unparsable date'. The other option I know of is to insert the entire data set into a staging table and validate it with stored procedures.
I need to know how to validate incoming fields and log the error rows separately in Kettle itself.
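For what it's worth, the strict check behind the 'unparsable date' error can be reproduced in plain Java. The sketch below (class and method names are mine, not Kettle's) validates a 'yyyy-MM-dd' string with a non-lenient SimpleDateFormat, which is roughly what the conversion mask does internally:

```java
import java.text.ParsePosition;
import java.text.SimpleDateFormat;

public class DateCheck {
    // Returns true only if the whole string parses as a real yyyy-MM-dd date.
    static boolean isValidDate(String s) {
        if (s == null || s.isEmpty()) return false;
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        fmt.setLenient(false); // reject impossible dates like 2016-02-30
        ParsePosition pos = new ParsePosition(0);
        // parse(String, ParsePosition) returns null on failure instead of throwing;
        // also require that the parser consumed the entire string (no trailing junk).
        return fmt.parse(s, pos) != null && pos.getIndex() == s.length();
    }
}
```

A helper like this could be dropped into a Modified Java Script Value or User Defined Java Class step to route invalid rows to an error stream yourself, instead of relying on the Data Validator.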

Lourdhu.A

Attempt to allocate stack guard pages failed.

Hi,

One of my Pentaho jobs failed with the dump below. I need your help analyzing the error.

Three jobs were running together when one of them failed with this error.

Do the three child index entries below represent the three running jobs? How can I set up the Pentaho application so that I can debug this error?

2016/06/15 01:22:45 - ADM.0 - Connected to database [MASTER] (commit=1000)
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
Java HotSpot(TM) 64-Bit Server VM warning: Attempt to allocate stack guard pages failed.
child index = 21, logging object : org.pentaho.di.core.logging.LoggingObject@fd264b9 parent=fcb94b10-ad56-423f-880f-2d50084cc248
child index = 10, logging object : org.pentaho.di.core.logging.LoggingObject@549d83c9 parent=fcb94b10-ad56-423f-880f-2d50084cc248
child index = 18, logging object : org.pentaho.di.core.logging.LoggingObject@3dd2b5cc parent=fcb94b10-ad56-423f-880f-2d50084cc248


[error occurred during error reporting (null), id 0xc0000005]


#
# A fatal error has been detected by the Java Runtime Environment:
#
# EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x00000000724aa847, pid=256, tid=9288
#
# JRE version: Java(TM) SE Runtime Environment (7.0_55-b13) (build 1.7.0_55-b13)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (24.55-b03 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# V [jvm.dll+0x1ca847]

Thanks,
Rahul

CCC - orthoAxisFixedMin/leafContentOverflow usage + dom-to-image lib

Hi there,

We're combining CCC graphics output with the dom-to-image library in order to generate more WYSIWYG images than the ones produced by CGG (we made some proofs of concept and got better visual results), and we've come across a problem when using 'orthoAxisFixedMin', because the rects end up with a negative 'x' value, extending beneath the 0 axis.

Code:

rect[Attributes Style] {
      shape-rendering: crispEdges;
      cursor: pointer;
      x: -25.8906;
      y: 135.885;
      width: 89.5816;
      height: 30.5741;
      fill: rgb(0, 47, 157);
}

Reading around a little, I found 'leafContentOverflow' with visible/hidden values, but it does not solve the problem. A quick search of the pv/ccc docs turned up no clues for a workaround.
I guess it's not easy to resolve, but any ideas are appreciated.

Thx!

PS: I will check whether this problem also exists when using CGG, in order to weigh the pros/cons of using it.

Issue with SQL parameterized queries

I'm having an issue with CDA's handling of SQL parameterized queries.

For the sake of a proof of concept, I used a quite simple query.
Here is the correct initial SQL query:

SELECT * FROM DIM_RECORD
WHERE DIM_RECORD.RECORD_DURATION > 1000
ORDER BY DIM_RECORD.RECORD_NUMBER ASC

Then I created a parameter called where_clause_param which will contain the whole WHERE clause, so that the parameterized query looks like:

SELECT * FROM DIM_RECORD
${where_clause_param}
ORDER BY DIM_RECORD.RECORD_NUMBER ASC


The issue I'm facing is that the query execution fails even when I leave the parameter blank.
I expected that setting where_clause_param to null or '' (as we would with MDX) would lead to this simple query:

SELECT * FROM DIM_RECORD
ORDER BY DIM_RECORD.RECORD_NUMBER ASC
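CDA performs the substitution inside the plugin itself, but the behaviour the poster expects can be illustrated with a minimal, hypothetical template expander (class and method names are mine): replacing an unset or null ${where_clause_param} with an empty string yields exactly the valid fallback query above.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SqlTemplate {
    private static final Pattern VAR = Pattern.compile("\\$\\{(\\w+)\\}");

    // Replaces each ${name} with its value, or with "" when the value is missing or null.
    static String expand(String template, Map<String, String> params) {
        Matcher m = VAR.matcher(template);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String value = params.get(m.group(1));
            m.appendReplacement(out, Matcher.quoteReplacement(value == null ? "" : value));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

If CDA refuses a blank value outright, a common workaround is to give the parameter a harmless default such as "WHERE 1=1" so that the substituted query is always syntactically valid.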

Best regards.

PS: MDX parameterized queries are working well. The problem is with SQL queries.
I'm working with Pentaho CE 6.0.1.0-386

Manage Users list is not displaying in Administration tab of Pentaho Home

Hi All,


While creating a new user via PDI (using UserRoleDao), I passed a blank user name by mistake, and the user was created. Now, in my Pentaho Home, I cannot retrieve the available users list, nor delete the selected user (either via the UserRoleDao delete-user method or the Pentaho Home users list).



Note : Pentaho Version 5.4 CE

Thanks in Advance!!!



Create Tenant Role Names: Best Practice?

$
0
0
Looking for some advice with tenant provisioning... specifically, I am wondering what the best practice is for role names when invoking ITenantManager.createTenant().

It seems that if I use the standard role names, (Administrator, Authenticated, and Anonymous), later calls to deleteTenant() have problems deleting the roles as it thinks they are system roles despite the fact they are in a non-system tenant. If I use custom role names, some of the logic in createTenant() does not pick them up and I end up with ACLs/ACEs with a mixture of custom and system names. How should new tenant role names be specified?

Here is the createTenant() API I am referencing:

public ITenant createTenant(ITenant parentTenant, String tenantName, String tenantAdminRoleName, String authenticatedRoleName, String anonymousRoleName)

Before I start working around the other problems I have encountered, I want to make sure I am between the lines. Any help is appreciated!

Randy

Merge 2 column field into 1

Could not connect to SMTP host

Hi,

I have upgraded the BI server from 3.10 to 5.4.

From the BI server, I send email to 20 people using the Pentaho server scheduler.

It worked in 3.10, but since 5.4 only 17 or 18 of the mails arrive, while the others fail.

Sometimes user A doesn't get the mail, sometimes user B.

Do we need to change anything else?

Here is my log:


DEBUG: setDebug: JavaMail version 1.4.7
DEBUG: getProvider() returning javax.mail.Provider[TRANSPORT,smtp,com.sun.mail.smtp.SMTPTransport,Oracle]
DEBUG SMTP: useEhlo true, useAuth true
DEBUG SMTP: useEhlo true, useAuth true
DEBUG SMTP: trying to connect to host "smtp.***.***.com", port 25, isSSL false
09:21:03,343 ERROR [EmailComponent] Error Start: Pentaho Pentaho Open Source BA Server 5.4.0.1-130
09:21:03,343 ERROR [EmailComponent] 1d09209d-31ce-11e6-96f9-00163e01035f:COMPONENT:context-866088565-1465867200154:send_email.xactionEmail.ERROR_0011 - SMTP ******@***.com
javax.mail.MessagingException: Could not connect to SMTP host: smtp.***.***.com, port: 25;
nested exception is:
java.net.ConnectException: Connection timed out
at com.sun.mail.smtp.SMTPTransport.openServer(SMTPTransport.java:1961)
at com.sun.mail.smtp.SMTPTransport.protocolConnect(SMTPTransport.java:654)
at javax.mail.Service.connect(Service.java:317)
at javax.mail.Service.connect(Service.java:176)
at javax.mail.Service.connect(Service.java:125)
at javax.mail.Transport.send0(Transport.java:194)
at javax.mail.Transport.send(Transport.java:124)
at org.pentaho.platform.plugin.action.builtin.EmailComponent.executeAction(EmailComponent.java:353)

Custom error row in dialog

I'm designing a custom plugin in Kettle.
In case of error, my processRow() method sends an error row through putError() with a specific meta layout. The run-time behavior is fine, but when connecting steps at design time, the dialogs complain because they believe that the error row has a different layout.


My question is: how can I set the RowMetaInterface for the error row used by the dialog?
I've tried invoking the BaseStep.getErrorRowMeta() and BaseStep.setErrorRowMeta() methods, but the dialog still sees the wrong layout.

How to change newline character to CRLF when export report as CSV

Hi

I always publish reports that I make with Report Designer, and I export these reports as CSV.

By default, when I export a report, the newline character is LF, but I want to change it to CRLF.

How can I achieve this?

Pentaho BA Server 5.4.0 (CE)

Best regards.

Problem with roles and permissions in BI Server 6.1.0.196

Hi, first time poster, apologies if this is a dim question, but I couldn't find an answer with a google or forum search.

I've installed and configured Pentaho BI Server Community Edition v 6.1.0.196 and have had some success linking up my DB, building a Mondrian Cube and using it with Saiku. I'm now looking to provide access to the Saiku data I've generated to users for comment, and as such I've tried to set up a new user and assign security.

What I've done:

In the Administration Dashboard I've added a new user and made them a Report Author. I've set up the report author to have the permissions Read Content, Publish Content and Schedule Content, and I've removed all permissions from Authenticated. I can log in as the new user.

Problem:

My new user can see everything: existence and contents of all user folders, and the Administration Dashboard amongst others. They can also manage security.

I've not done any configuration of config files to enable security, but I had a dig around and the security.properties file contains the text below. Am I missing any vital steps to make Pentaho CE enforce security?

Thanks in advance for pointing out my bleeding obvious mistakes, and apologies if I've missed a previous post on this; like I say I couldn't find one obviously.

Code:

provider=jackrabbit
requestParameterAuthenticationEnabled=false


# This flag indicates whether or not UserDetailsService is called during creation of user's principal.
# If the service is external (e.g. LDAP or JDBC-based auth is used) then such calls can be expensive.
# On the other hand, if user has been removed within external service, then it becomes impossible to
# prevent principal creation when the verification is muted
skipUserVerificationOnPrincipalCreation=true

dealing with eMail attachment name

Dear all:

We are able to get emails and attachments. The problem we have is that the attachments have the same file name. In our case (Windows), after downloading them, each attachment file name gets an extra digit (a number) that makes the file unique. The main issue is how to associate each email file (files ending with the extension .mail) with its attachment. For example, we are using the default naming convention:

We have two emails files:
name_101607-061620126_1.mail
name_101607-061620126_2.mail

and then we have two attachment files
test email.pdf
test email.pdf0

there is no way to know which pdf goes with which email file. Did anybody run into the same issue?

Esri shapefile reader unexpected results

I am reading a polyline shape file and get results that I don't expect and have no clue how to go forward.

This is the transformation I am making:

transformation.PNG

When previewing the results from the 'Esri Shapefile reader' I get the following result (which suit me fine):

shapefilreader.jpg

When previewing the excel output I get the following result:

excel_output.jpg

In the Excel output there is only one pointnr for each polyline (the maximum number) and all records have the same coordinate. I would expect a similar output as with the 'Esri Shapefile reader' step, because besides writing the output to Excel I am not changing the data in the stream.

Is this a bug, or am I misinterpreting the function of the 'Esri Shapefile reader'?

All help is very welcome!



I am using Pentaho Data integration 6.1.

PDI 6.1 - not finding database drivers

I have installed Kettle 6.1 on a Windows 2012 server.
I set KETTLE_HOME parameter to D:\KETTLE\pdi-ce-6.1.0.1-196

Spoon starts up with no issues, but when I try to set up a connection for an Oracle database and click 'Test connection', I get:


Driver class 'oracle.jdbc.driver.OracleDriver' could not be found, make sure the 'Oracle' driver (jar file) is installed.
oracle.jdbc.driver.OracleDriver"

I installed the Oracle Client and copied ojdbc6.jar into {KETTLE_HOME}\data_integration\lib - that did not help.
I also copied ojdbc7.jar into the same directory for good measure, with the same results.

I have searched the internet for resolutions of similar issues, but could not find anything that works.
Is there a different directory this should go to, or is there some trick to setting it up? MS SQL connections seem to work OK.
I found a helpful topic here http://stackoverflow.com/questions/2...ledriver-could but none of it worked for me.

Please advise.
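In PDI 6.x the JDBC driver jars are normally picked up from the lib folder directly under the data-integration install directory, and Spoon must be restarted after copying the jar there. To check whether the driver class is actually visible to the JVM that Spoon launches, a quick classpath probe like the hypothetical helper below (names are mine) can be run from a User Defined Java Class step or a standalone test:

```java
public class DriverCheck {
    // Returns true if the given JDBC driver class can be loaded from the classpath.
    static boolean driverOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
    // e.g. driverOnClasspath("oracle.jdbc.driver.OracleDriver") should return true
    // once ojdbc6.jar is correctly placed in data-integration\lib and Spoon restarted.
}
```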

Get Session Attributes

I am trying to get the session attributes/variables for an Analyzer report using Java. I believe IPentahoSession from the org.pentaho.platform.api.engine jar is the right approach for this:
Code:

IPentahoSession session = PentahoSessionHolder.getSession();
String host = (String) session.getAttribute("host");
// Note: session.getAttributeNames() returns an Iterator over the attribute keys
// that are actually set, which can help discover the available names at runtime.

But, "host" doesn't seem to be a real attribute, since it returns null. I have tried to get several attributes which seem like likely candidates ("Host", "hostname", "Hostname", "username", etc). But they all return null.

I can't figure out where to find a list of the possible attributes that you can retrieve. Is there some documentation on this? Are there some examples of working java code to retrieve attributes?

Accessing properties in dimensions

Hi all,

I have a dimension like this,

<Dimension name="TEST">
  <Hierarchy name="OLAP" visible="true" hasAll="true" allMemberName="ALL" allLevelName="ALL" primaryKey="test_code">
    <Table name="TEST_master"/>
    <Level name="TEST" column="test_code" uniqueMembers="true">
      <Property name="label" column="test_name"/>
    </Level>
  </Hierarchy>
</Dimension>


I am trying to access the property in an MDX query like:

WITH MEMBER [Measures].[Label] AS [TEST].CurrentMember.Properties("label")
SELECT {[Measures].[Label], [Measures].[IND]} ON COLUMNS, [TEST].Members ON ROWS FROM [FACT1]

but I get an error stating that there is no such property called label. Kindly guide me in retrieving the property value.

Regards,
Nayagam

How to connect HBase using Apache Phoenix from Pentaho Kettle

Hi everyone,


I am trying to use Pentaho Kettle to work with HBase using Phoenix, but I don't know how to make the connection from Kettle to HBase.
Please help me solve this problem.
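I haven't verified this end-to-end, but Phoenix exposes HBase through a standard JDBC driver (org.apache.phoenix.jdbc.PhoenixDriver), so one approach is a Generic database connection in Kettle with the Phoenix client jar on the classpath and a URL of the form jdbc:phoenix:&lt;zookeeper quorum&gt;. A tiny helper (names mine) for building that URL:

```java
public class PhoenixUrl {
    // Builds a Phoenix JDBC URL from a ZooKeeper quorum string,
    // e.g. "zk1,zk2,zk3:2181" -> "jdbc:phoenix:zk1,zk2,zk3:2181".
    static String phoenixUrl(String zkQuorum) {
        return "jdbc:phoenix:" + zkQuorum;
    }

    // Usage sketch (requires the phoenix-client jar on the classpath):
    // java.sql.Connection conn =
    //     java.sql.DriverManager.getConnection(phoenixUrl("zkhost:2181"));
}
```

In Spoon this would translate to connection type "Generic database", the URL above, and the Phoenix driver class name.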


Many thanks,
Hoang Anh.

Strikethrough is not working in Pentaho Report Designer 6.0.

Hi all,

I have made a report where I have set the 'Strikethrough' option to true. The formatting is applied in Pentaho 5.0; however, the same formatting is not applied in Pentaho 6.0.

What could be the root cause?

Thanks in advance.

Hierarchical Clustering - Visualize Cluster Characteristics?

Hey all,

I am currently working on a dataset with 18,000 instances and want to apply hierarchical clustering.

First, I applied X-Means and got a meaningful result in the form of cluster centroids, but with hierarchical clustering I can only infer which instances belong to which cluster.
In my case that is not sufficient, as I cannot extract knowledge from it.

Is there any way to see characteristics such as cluster centroids for the different clusters in hierarchical clustering, similar to the way decision trees expose their structure?


Thank you very much in advance :).

Greetings,
Markus

Repository of DI server

Hi.
From Spoon, I was able to connect to the Pentaho Repository (Recommended) - "Enterprise ready storage..." -
and I was able to create and store a new transformation.

My question is: how can I execute those transformations via Pan or Kitchen? For example, to use Pan I need to specify the options --user, --pass and --trans.
I don't know where my transformations are stored. Where is the default repository (Pentaho Repository)?
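For reference, and assuming the repository connection is already defined (Spoon stores repository definitions in the repositories.xml file under the .kettle directory), Pan can run a repository transformation with something like the following; the option names are from the PDI documentation, and the repository, directory and credential values below are placeholders:

```
pan.sh -rep=<repository-name> -user=<user> -pass=<password> \
       -dir=/<repository-directory> -trans=<transformation-name> -level=Basic
```

Kitchen takes the same repository options with -job=<job-name> instead of -trans.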

Thanks in advance for any help.

pika

