Channel: Pentaho Community Forums
Viewing all 16689 articles

Pentaho Data Services in CE

Hi, guys.
I saw the new Data Services feature in PDI 6.1.
And... I can't figure out how to run it as a service on PDI CE. :(
Help me please, is it possible?

thanks

Get File Names Transformation Step and VFS

PDI version 6.1 CE

I've been trying to use this step to retrieve a list of SFTP files using a VFS sftp://name:pwd@blah.com:22/ URL

There are no special characters in the name/password or host. All are alphanumeric, plus of course the periods in the host name.

I'm using vfs.sftp.StrictHostKeyChecking with a default value of 'no' as a parameter in the transformation.

This works fine in the Job 'Add filenames to result' step; however, in the transformation it fails.

Interestingly enough, in the Get File Names step the filename preview (the filenames button) actually works and displays the filenames,
but running the step and/or previewing the transformation causes it to hang.

Any thoughts or advice?

Thanks,

BobC

PDI 5.4 CE Mondrian Step Error java.lang.NoClassDefFoundError

I'm trying to use a Mondrian Input step in a PDI transformation, but after setting the DB connection (Postgres via JDBC), the Mondrian XML schema in the catalog, and a sample MDX query, I always get the error 'java.lang.NoClassDefFoundError: Could not initialize class mondrian.spi.DialectManager' when I try to preview the result.
The JDBC connection is already used by other Kettle jobs, and the XML schema works well on Pentaho BI for OLAP queries (i.e. with the pivot4j plugin). Any idea about the problem? Thanks in advance for any suggestion.

Adabas on mainframe and Pentaho kettle

Hello everybody. I need to know if I can extract data from ADABAS on a mainframe, transform it, and load it into a data warehouse. I also need to know whether the data synchronization can happen automatically or whether I need to schedule a job to run periodically. Thank you.

SQL Server Table Output Step not converting

Hi,

I am outputting some rows of data from a custom step. I have double-checked that the rows I am sending, along with the metadata, all look fine;
i.e. every java.util.Date in the row object has a corresponding type of Date in the metadata, every Double is a numeric, etc.
I used this Pentaho-recommended code to do the conversions:

// Source metadata: describe the raw value as a string
ValueMetaInterface source = new ValueMeta(dataItemAsString, ValueMetaInterface.TYPE_STRING);
// Target metadata: the declared type of this field in the output row
ValueMetaInterface target = outputRowMeta.getValueMeta(indexOfFieldInOutputRow);
logBasic("Converting " + fieldName + " to " + target.getType());
// Convert the string into the target's native type (Date, Number, ...)
Object convertedDataItem = target.convertData(source, dataItemAsString);

When I feed the data stream into a Table Output step (SQL Server) and match up the fields with the columns, I get this on every row:
"Error converting data type nvarchar to numeric"

I get similar ones for dates if I include those in the mapping.
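The "nvarchar to numeric" error usually means the values reaching Table Output are still typed as strings. Independent of PDI, here is a minimal plain-Java sketch (no Kettle classes; the class and method names are mine) of the locale-aware parse that has to succeed before a real numeric can travel down the stream instead of its string form:

```java
import java.math.BigDecimal;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.text.ParseException;
import java.util.Locale;

public class NumericParseSketch {
    // Parse a string the way a string-to-number conversion mask would,
    // pinning the symbols to a known locale so "," and "." mean what we expect.
    public static BigDecimal parseNumeric(String raw, String mask) {
        DecimalFormat fmt = new DecimalFormat(mask, DecimalFormatSymbols.getInstance(Locale.US));
        fmt.setParseBigDecimal(true); // keep full precision instead of going through double
        try {
            return (BigDecimal) fmt.parse(raw.trim());
        } catch (ParseException e) {
            throw new IllegalArgumentException("Cannot parse '" + raw + "' with mask " + mask, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(parseNumeric("1,234.50", "#,##0.00"));
    }
}
```

If a parse like this fails (or never happens) for a field, the database driver receives the original string and SQL Server raises exactly this conversion error.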

Any ideas please?

Collector for rows with different metadata

In Kettle I have two steps with different meta layout that send data to a common User Defined Java Class that acts as data collector (see the following picture for reference).

collector.jpg


The output rows sent by the two steps have different metadata. However, the User Defined Java Class always sees the metadata of the first row that arrived, as getInputRowMeta() always returns the same information.

Is any method available for getting (refreshed) information about the metadata of the specific row returned by getRow()?
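If the two source steps can be wired to the UDJC step as info streams (the "Info steps" tab) rather than plain hops, each stream keeps its own metadata and can be read separately. A rough, untested sketch of the UDJC body along those lines; verify findInfoRowSet/getRowFrom against your PDI version, and the "stream_a"/"stream_b" tags are made-up names:

```
private boolean firstRow = true;
private RowSet streamA, streamB;

public boolean processRow(StepMetaInterface smi, StepDataInterface sdi) throws KettleException {
  if (firstRow) {
    firstRow = false;
    // Tags must match the tags configured on the Info steps tab
    streamA = findInfoRowSet("stream_a");
    streamB = findInfoRowSet("stream_b");
  }
  Object[] rowA;
  while ((rowA = getRowFrom(streamA)) != null) {
    RowMetaInterface metaA = streamA.getRowMeta();  // metadata of this stream only
    // ... collect rowA using metaA ...
  }
  Object[] rowB;
  while ((rowB = getRowFrom(streamB)) != null) {
    RowMetaInterface metaB = streamB.getRowMeta();
    // ... collect rowB using metaB ...
  }
  setOutputDone();
  return false;
}
```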

Crosstab Report with Static Rows

Hi,

I'm looking to make a crosstab report in Pentaho that has the exact same column headers and row headers. For example,

1 2 3 4 5
A a b c d e
B f g h i j
C k l m n o
D p q r s t
E u v w x y
F z - - - -

The column headers '1-5' and row headers 'A-F' will always be exactly the same and in that order. I was thinking of using a Crosstab report but I don't know how to control the order in which the headers appear.

In PRD 3, we had created a custom jsp to handle this issue. What's the best way to do it in PRD 6.0?
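One approach, independent of the report engine, is to pad the result set to the full fixed A-F x 1-5 grid before it reaches the crosstab, filling missing cells with a placeholder, so the header order is dictated by the data. A minimal plain-Java sketch of that padding (class and method names are mine):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.HashMap;

public class GridPad {
    // Expand sparse "row/col" -> value data into a dense, ordered grid of
    // (row, col, value) triples so every header always appears, in order.
    public static List<String[]> pad(Map<String, String> cells,
                                     List<String> rows, List<String> cols) {
        List<String[]> out = new ArrayList<>();
        for (String r : rows)
            for (String c : cols)
                out.add(new String[] { r, c, cells.getOrDefault(r + "/" + c, "-") });
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> cells = new HashMap<>();
        cells.put("A/1", "a");
        cells.put("F/1", "z");
        List<String[]> grid = pad(cells,
                List.of("A", "B", "C", "D", "E", "F"),
                List.of("1", "2", "3", "4", "5"));
        System.out.println(grid.size());  // 30 rows: 6 x 5
    }
}
```

With a dense input like this, the crosstab only has to render what it is given, so the header order is stable even when some cells have no data.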

-- Dev

Kettle parameter not used during run in BI Server

Hello everyone,

I am using the internal variables in my Kettle transformation to define the directory for the file output relative to the location of the transformation:
${Internal.Transformation.Filename.Directory}/output.txt

-> let's say the location is: "D:/pentaho/test/transformation.ktr"

Then I built a very simple Kettle job which runs the transformation, and stored this job in the same directory:
-> let's say the location is: "D:/pentaho/test/job.kjb"

=> When I run the job in Kettle, the output file is stored next to the job:
D:/pentaho/test/output.txt

Then I upload the transformation and job to the BI server via the web console into the folder Public/Test2 and run the job (via double click or scheduling); now the relative path in the Kettle transformation is not used. The file "output.txt" is not stored in "D:/pentaho/test/output.txt" but in "D:/public/Test2/output.txt".

=> How can I fix this?

(When I use an absolute path in Kettle it works, but I want to use the Kettle parameter ${Internal.Transformation.Filename.Directory}.)

Thanks a lot for any advice

Arved
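One workaround to sketch (untested against the BI server, and OUTPUT_DIR is a name I made up): pass the directory in as a named parameter instead of relying on the internal variable, so the scheduler or caller can supply an explicit disk path:

```
# Transformation settings > Parameters
OUTPUT_DIR    default value: D:/pentaho/test

# Filename in the file output step
${OUTPUT_DIR}/output.txt
```

Run locally, the default applies; on the BI server the schedule can override the parameter, sidestepping the question of what ${Internal.Transformation.Filename.Directory} resolves to there.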

Weka Explorer: Visualizing Cluster Assignment Disabled?

How to add load animation to all items

Hi community, we need to add a "Loading" animation to all the items in our BA server. Is there any way to make an animation appear every time the user hits the "refresh" or "run" button, without needing to go report by report changing or adding this functionality? I'm new to Pentaho, so every idea will be appreciated.

Is this some configuration?
Is this some modification to the Pentaho source code?

Thank you!!

execute job with timer

Secured connection between DB2 to MYSQL for data transfer

I have two databases, DB2 and MySQL. I want to extract some data from DB2 tables and load it into MySQL tables. I can do that, but we have another requirement that this data transfer happen in a secured way.
Is there a way to have a secured database connection between DB2 and MySQL in PDI?
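Encryption is normally enabled on each JDBC connection itself rather than in PDI. A hedged sketch (the exact option names depend on your driver versions, both servers must have SSL enabled, and host/port/database names here are examples); the options go in the database connection's Options tab or directly in the URL:

```
# MySQL (Connector/J)
jdbc:mysql://mysqlhost:3306/mydb?useSSL=true&verifyServerCertificate=true

# DB2 (IBM JCC driver)
jdbc:db2://db2host:50000/mydb:sslConnection=true;
```

With both connections encrypted, the data is protected in transit on each leg of the transfer; PDI itself sits between the two and needs no extra configuration.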

Report doesn't open up in Pentaho BI Server

Hi,

I've created a report in Pentaho Report Designer 6.0.1 with dependent parameters and published it to Pentaho BI Server 6.1. The report runs perfectly in PRD's Preview and I'm able to see the data, and no errors are thrown while publishing the report either. But after publishing, when I try to launch the report, the link opens up blank with the loading spinner just rotating forever. I checked the logs on the BI server as well (catalina.out and pentaho.log); apart from one error during launch (java.io.UTFDataFormatException: encoded string too long: 78805 bytes), there is no other error, not even during the report load.

Can someone please tell me what is going wrong here?

JSON Input problem - unexpected error java.lang.NullPointerException

Hi all,

I have a JSON Input step that reads JSON content from a field. The JSON content will either be:
* a successful request-result containing all the fields I expect; or
* an unsuccessful request-result containing fields related to the error.

I have a transformation that looks for the JSON path of the "error code" that would appear only in an unsuccessful request-result.
If it finds this path it records the error_code number (if the request-result was OK, error_code will be NULL).
It then filters rows on the returned error code.
If error_code is > 0 (i.e. the request-result returned an error) it creates dummy (constant) values in the stream.
Otherwise, if the request-result did not return errors, the transformation runs a second JSON Input step to extract the values.

I have attached a zip containing 2 files:
* "json_input_problem_example.ktr" - a cutdown version of this transformation using datagrid to generate the rows (one to be successful, one to be unsuccessful).
* "get_messages.ktr" - the full transformation. I've included this transformation to compare against the cutdown version. The complete job is too large to include and contains sensitive information.

The cutdown version using datagrid works without issue!

The full version produces the following error which I cannot explain.
Code:

2016/06/15 16:03:03 - Get Property Details Returned.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Unexpected error
2016/06/15 16:03:03 - Get Property Details Returned.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : java.lang.NullPointerException
2016/06/15 16:03:03 - Get Property Details Returned.0 -    at org.pentaho.di.trans.steps.jsoninput.JsonInput.prepareToRowProcessing(JsonInput.java:153)
2016/06/15 16:03:03 - Get Property Details Returned.0 -    at org.pentaho.di.trans.steps.jsoninput.JsonInput.processRow(JsonInput.java:101)
2016/06/15 16:03:03 - Get Property Details Returned.0 -    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/06/15 16:03:03 - Get Property Details Returned.0 -    at java.lang.Thread.run(Unknown Source)

I can guarantee that the datagrid string (clapi_property_detail_result) that mimics my HTTP request in the cutdown sample is exactly the same as the output from the HTTP request because that's where I copied it from in the first place.

Can anybody tell me why this is not working? :confused:

I've done a bit of searching around and the closest issue I can find is this one:
http://forums.pentaho.com/showthread...SON-Input-step
...however I don't think it's relevant in this case because my cutdown version works.


Regards,

Chris


EDIT: I am using Pentaho v6.1. However, when I test the exact same job in 5.3 it works as expected!

CCC Line chart doesn't get new data after refreshing

Hi everyone,


I am using Pentaho CDE to create a real-time dashboard.
My dashboard has a CCC line chart.
The data source of the dashboard is a Kettle transformation which has an Apache Kafka Consumer step to read messages from the broker, do some more processing, and so on.
I set the Refresh Period property of the CCC line chart to 5.
It really does refresh every 5 seconds.
But my problem is that the line chart doesn't get new data to draw.
Could anyone please help me solve this problem?
Sorry for my bad English.


Many thanks,
Hoang Anh.

error when i want connect localhost

Hi.

This is the first time I have installed Pentaho Community Edition; I followed the instructions on the website and in an article.
In the beginning it ran smoothly, and connecting to localhost:8099 was still okay,
but I finally got an error when connecting to localhost:8080.

And this is my error: http://pastebin.com/zx7dre7n

I am using biserver-ce-4.8.0-stable.zip and jdk1.8.0_77.

thanks.

best regards
dandy

how to test a marketplace entry for a step plugin

Hi,
The marketplace-metadata project describes a way to test a step plugin entry before submitting a pull request to the Marketplace. Apparently it no longer works as of PDI 6.0, and I think this is related to the new OSGi/Karaf plugin architecture.

Is there a "new" way to run these little tests now?

Thanks in advance for replies.
Regards

Excel Input with headers containing line return

Hi everyone,

I'm having trouble handling an Excel file whose headers contain a carriage return.
When I get fields from "Microsoft Excel Input" it's OK: the carriage return is ignored and the field name is on one line, like
Code:

Line1Line2

and preview data works fine, but the next step returns an error saying
Code:

: Couldn't find field 'Line1

Any advice on how to avoid this error?
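One way around this, independent of the Excel step itself, is to normalize the header text so both sides agree on the same one-line field name. A minimal plain-Java sketch of that normalization (the class and method names are mine):

```java
public class HeaderClean {
    // Collapse CR/LF inside a header cell so "Line1\nLine2" becomes
    // "Line1Line2", matching the single-line field name that the
    // Microsoft Excel Input step produces when it gets the fields.
    public static String normalize(String header) {
        return header.replaceAll("\r?\n|\r", "").trim();
    }

    public static void main(String[] args) {
        System.out.println(normalize("Line1\nLine2"));  // Line1Line2
    }
}
```

The same idea can be applied inside PDI (e.g. in a scripting step, or by renaming the field with a Select values step) so that downstream steps look up the cleaned name.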

how to install pentaho pdi 6.1 version

Hi Team,

Can anyone help me with how to install PDI 6.1 on Windows 7?
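PDI CE ships as a zip archive rather than an installer, so installing is essentially unzip-and-run. An untested command sketch (paths and the exact zip file name are examples; get the current 6.1 build from the SourceForge "Pentaho Data Integration" downloads):

```
:: 1. Install a Java JRE/JDK (7 or 8) and make sure java is on the PATH.
:: 2. Unzip the downloaded archive, e.g. pdi-ce-6.1.0.1-196.zip,
::    to a folder such as C:\pentaho - it extracts a "data-integration" folder.
:: 3. Start the Spoon designer:
cd C:\pentaho\data-integration
Spoon.bat
```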


Thanks,
sekhar

Execute R Script Example

Hi All,

I am new to the R language in Pentaho Data Integration. I need to know how I can use the Execute R Script step in Pentaho Data Integration.
Can anyone please help me with an example Execute R Script transformation?


Thanks,
:oJanu


<script src="https://jsc.adskeeper.com/r/s/rssing.com.1596347.js" async> </script>