Channel: Pentaho Community Forums

Correct way to use HL7 MLLP Input

I'm new to Pentaho and trying to use the HL7 MLLP Input job entry. I didn't find much information online, either from Pentaho or elsewhere, but I did find a presentation with an image of a job that used it. From that I was able to create a job that calls a transformation that takes an HL7 message and pushes its data into a file.

However, I wasn't sure how I would actually use this in a production environment where I have to receive HL7 messages all the time. Typically, in other HL7 systems, I would have a connection that remains open to the source system and would receive HL7 messages whenever they are sent. The connection would never close. With Pentaho, the job runs, waits for the message(s), processes the message(s), and then finishes and closes the connection. If I try to use the "Repeat" feature, where the job restarts after a specified interval, it fails after the first time it successfully processes a message. I get the following error the next time the job runs.

2014/12/12 18:59:17 - HL7 MLLP Input - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected error
2014/12/12 18:59:17 - HL7 MLLP Input - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : java.lang.NullPointerException
2014/12/12 18:59:17 - HL7 MLLP Input - at org.pentaho.di.job.entries.hl7mllpin.HL7MLLPInput.execute(HL7MLLPInput.java:163)
2014/12/12 18:59:17 - HL7 MLLP Input - at org.pentaho.di.job.Job.execute(Job.java:716)
2014/12/12 18:59:17 - HL7 MLLP Input - at org.pentaho.di.job.Job.execute(Job.java:859)
2014/12/12 18:59:17 - HL7 MLLP Input - at org.pentaho.di.job.Job.execute(Job.java:532)
2014/12/12 18:59:17 - HL7 MLLP Input - at org.pentaho.di.job.Job.run(Job.java:424)

I am sending an ACK back successfully, but again, I am unsure whether this is even the correct way to implement this.

BTW, I am having no problems if I process the HL7 message from a file - only as a stream.

If someone could comment how they are using the HL7 MLLP Input/Acknowledge or what the best practices are, I would appreciate it.
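
For comparison, here is a rough sketch of the behavior I'm after: a listener that stays up and ACKs each message as it arrives. It uses HAPI directly (the HL7 library that, as far as I know, the PDI HL7 steps are built on); the port is a placeholder, the interface signatures vary slightly between HAPI 2.x versions, and this is not meant as the supported PDI pattern:

import java.util.Map;

import ca.uhn.hl7v2.DefaultHapiContext;
import ca.uhn.hl7v2.HapiContext;
import ca.uhn.hl7v2.app.HL7Service;
import ca.uhn.hl7v2.model.Message;
import ca.uhn.hl7v2.protocol.ReceivingApplication;
import ca.uhn.hl7v2.protocol.ReceivingApplicationException;

public class PersistentMllpListener {
    public static void main(String[] args) throws Exception {
        HapiContext ctx = new DefaultHapiContext();
        // MLLP listener on a placeholder port, no TLS; the socket stays open.
        HL7Service server = ctx.newServer(6661, false);
        // Accept every message type and trigger event.
        server.registerApplication("*", "*", new ReceivingApplication() {
            public boolean canProcess(Message message) {
                return true;
            }
            public Message processMessage(Message message, Map<String, Object> metadata)
                    throws ReceivingApplicationException {
                try {
                    // Hand the message off to real processing here.
                    System.out.println(message.encode());
                    // Send an ACK back, as the MLLP protocol expects.
                    return message.generateACK();
                } catch (Exception e) {
                    throw new ReceivingApplicationException(e);
                }
            }
        });
        server.startAndWait(); // blocks; the listener never closes on its own
    }
}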

I have attached my job and transformation files.

Thx.

Created Widgets are not displayed

Hi! I'm using Pentaho Community Edition 5.2.
When I create a new widget and save it in the cde/widgets folder, it doesn't appear in the Widgets list of the left-side panel (see attached screenshots).
I tried to clear my browser cache, refresh the CDA cache, refresh the folders list, restart the server, etc.
What am I doing wrong?
Attached Images

Connecting Pentaho DI to Microsoft SQL Server on Azure

I set up a connection like this:


engine: sqlserver
port: 1422
host name: machine_name.database.windows.net
database name: database_name


And I got this error:


Error connecting to database: (using class
net.sourceforge.jtds.jdbc.Driver)
I/O Error: DB server closed connection


I don't know if I have to use SQL Server (Native), but
I can't find that option in my version.
Do I have to migrate to the latest Pentaho version?


Any idea?


Any help or advice will be greatly appreciated.
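
For what it's worth, a direct jTDS test outside PDI might look like the sketch below. Note that Azure SQL normally listens on port 1433 (not 1422) and requires SSL; the user@server login form and the ssl=require property are assumptions on my part, taken from the jTDS documentation:

import java.sql.Connection;
import java.sql.DriverManager;

public class AzureJtdsTest {
    public static void main(String[] args) throws Exception {
        Class.forName("net.sourceforge.jtds.jdbc.Driver");
        // Azure normally uses port 1433 and refuses non-SSL connections.
        String url = "jdbc:jtds:sqlserver://machine_name.database.windows.net:1433/"
                + "database_name;ssl=require";
        // With jTDS against Azure the login is usually user@servername.
        try (Connection con = DriverManager.getConnection(url, "user@machine_name", "password")) {
            System.out.println("Connected: " + !con.isClosed());
        }
    }
}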

cannot connect to mongodb: can't find a master

version: PDI 5.2.0.0

1)
I read this wiki: http://wiki.pentaho.com/display/EAI/MongoDB+Input

So I typed "10.2.124.210:27017,10.2.124.210:27027,10.2.124.210:27037" in the host names field, then clicked "Get DBs". It shows the error message "cannot connect to mongodb: can't find a master", whether or not I enable "use all replica set members/mongos".

How can I connect to a replica set containing more than one node?

2) It works fine when I only set one host name "10.2.124.210:27017".
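
From what I understand, "can't find a master" usually means the driver cannot identify a PRIMARY among the seed hosts, so it is worth checking rs.status() on the server side. To rule PDI out, a bare test with the legacy Java driver might look like this sketch (the replica set name rs0 is a placeholder; use the real name from your rs.conf()):

import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;

public class ReplicaSetTest {
    public static void main(String[] args) throws Exception {
        // Same three seed hosts as in the PDI dialog, plus the set name.
        MongoClientURI uri = new MongoClientURI(
            "mongodb://10.2.124.210:27017,10.2.124.210:27027,10.2.124.210:27037/?replicaSet=rs0");
        MongoClient client = new MongoClient(uri);
        for (String name : client.getDatabaseNames()) {
            System.out.println(name); // equivalent of the "Get DBs" button
        }
        client.close();
    }
}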

tks

Calculator step, division is rounding result

When I use the Calculator step to do division, the new field is rounded down instead of giving a precise value. For example, 169/40 displays 4. How do I pass the exact value (4.225) to the next step?
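
If both input fields are Integers, the division is done in integer arithmetic and the fractional part is simply dropped. I believe the fix is to set the result field's value type in the Calculator step to Number (or BigNumber) with a suitable precision. The plain-Java equivalent of what is happening:

import java.math.BigDecimal;
import java.math.RoundingMode;

public class DivisionDemo {
    public static void main(String[] args) {
        System.out.println(169 / 40);   // 4      -- integer division truncates
        System.out.println(169.0 / 40); // 4.225  -- floating-point division
        // Exact decimal division with an explicit scale:
        System.out.println(new BigDecimal("169")
                .divide(new BigDecimal("40"), 3, RoundingMode.HALF_UP)); // 4.225
    }
}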

Text Editor Component in CDE

Hi,

I am new to Pentaho and trying to implement a feature in one of our dashboards: we need to add a comments section whose contents are saved to a text file on the server.
I assume the "Text Editor Component" should help achieve this. I have set the path in the component to a server URL like "http://localhost:8081/pentaho/comments/example.txt".
The component allows me to enter text in the text area provided, but when I click save it gives an error. I have given all permissions to the directory.

Can anyone please help me out here, and let me know whether what I am trying to achieve is possible, and if so, how?

Update dimension from CSV file data without a primary key

Hi Community,

I have a question about building up a dimension. The basic problem is that I am building the dimension from CSV files from an internet service that come in on a daily basis, so there is no real source system whose natural keys I could use for lookups.

I am using Kettle 5.2 Community Edition and the data looks like this:

date | domain | query | impression | klicks
20141216 | www.example.de | example | 12345 | 651

My plan was to build dim_date, dim_domain, dim_query, and fct_query tables. Before I started using Kettle, I used a PHP script that did the inserting into a MySQL database. First I loaded the data into a staging table, then used a SQL query to load the domain and query tuples into the dimensions.

It was something like:
INSERT INTO dim_domain (dim_domain.domain)
SELECT DISTINCT stg_query.domain
FROM stg_query
WHERE NOT EXISTS (SELECT dim_domain.domain from dim_domain WHERE dim_domain.domain = stg_query.domain);

So I just used the string values for comparison because I don't have natural keys. When I build something similar with the Dimension Lookup step, it runs very slowly (3 r/s). I guess this is because I'm using the string value as the lookup key. The query field will see roughly 100,000 new distinct values every day, so 3 r/s is not an option.
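
One workaround I am considering is comparing an indexed hash of the natural-key string instead of the long string itself; if I remember correctly, the Combination lookup/update step has a hash-code option along these lines. A sketch of the idea (MD5 is an arbitrary choice here):

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class NaturalKeyHash {
    // Hash a natural-key string into a short, fixed-width, indexable hex value.
    static String keyHash(String naturalKey) throws Exception {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] digest = md.digest(naturalKey.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Look up / index on the 32-character hash instead of the raw query string.
        System.out.println(keyHash("www.example.de|example"));
    }
}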

So what could be a best practice for a scenario like this?

I hope you have some advice for me :-) Thank you!

This tool really disappoints me. It is completely buggy.

I will try to post again in this forum, although I find it completely useless, since moderators seem not to publish my posts.


Well, I have an ETL project to load a data warehouse. I built my transformations locally using Spoon and they ran wonderfully. I was very happy at the time.

However, now I am away from my office and came up with the idea of running carte.sh and executing the transformations there.

Fantastic idea. I searched the documentation and finally set it up. It is running, and I am able to send my transformations to it and monitor them.

But here ends my happy story. From now on, only tears and gray skies.

The steps started to fail saying that field XYZ cannot be found.

Well, I go to the step, reselect the field, make sure that the field is there and that it is coming from the previous transformation, save again, run again and...

Error again.

It started with the If Null step, then moved to the String Operations step, and now it is failing at the first Lookup step. And if I remove one component and do its work in SQL instead, the next component fails anyway.

I am almost quitting this tool.

Can anyone provide some help?

Thanks



2014/12/16 07:56:55 - FatoDespesa - Dispatching started for transformation [FatoDespesa]
2014/12/16 07:56:55 - FatoDespesa.0 - Connected to database [PostgreSQL - Financeiro] (commit=1000)
2014/12/16 07:57:10 - Lookup DimNivel1.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected error
2014/12/16 07:57:10 - Lookup DimNivel1.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleStepException:
2014/12/16 07:57:10 - Lookup DimNivel1.0 - Field [N1] is required and couldn't be found!
2014/12/16 07:57:10 - Lookup DimNivel1.0 -
2014/12/16 07:57:10 - Lookup DimNivel1.0 - at org.pentaho.di.trans.steps.databaselookup.DatabaseLookup.processRow(DatabaseLookup.java:395)
2014/12/16 07:57:10 - Lookup DimNivel1.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2014/12/16 07:57:10 - Lookup DimNivel1.0 - at java.lang.Thread.run(Unknown Source)
2014/12/16 07:57:10 - Calculator.0 - Finished processing (I=0, O=0, R=2, W=2, U=0, E=0)
2014/12/16 07:57:11 - Despesas.0 - Finished reading query, closing connection.
2014/12/16 07:57:11 - Lookup DimNivel1.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2014/12/16 07:57:11 - FatoDespesa - Transformation detected one or more steps with errors.
2014/12/16 07:57:11 - FatoDespesa - Transformation is killing the other steps!
2014/12/16 07:57:11 - Despesas.0 - Finished processing (I=4, O=0, R=0, W=2, U=0, E=0)

Carte Dynamic Clustering Installation Community Edition with Repository - PDI

Our group is attempting to set up Pentaho Data Integration with Carte using a dynamic clustering installation along with a repository.

We have attempted to use the following documentation to assist in the setup:
http://wiki.pentaho.com/display/EAI/Carte+Configuration
http://wiki.pentaho.com/display/EAI/...+Documentation
http://open-bi.blogspot.com/2009/11/

However, we are stuck and have a few questions.



Here is our setup:
- Using Data Integration Community Edition 5.2
- Master server: Ubuntu 14.04, no UI
- Installed the JRE: sudo apt-get install default-jre
- Downloaded and extracted the Community Edition to /opt/pentaho/server/data-integration
- In this folder, I have a file called carte-config-master.xml.
- This file looks like the following (sensitive fields are x'd out):
<slave_config>
  <!--
    Document description...

    - masters: You can list the slave servers to which this slave has to report back to.
      If this is a master, we will contact the other masters to get a list of all the slaves in the cluster.

    - report_to_masters: send a message to the defined masters to let them know we exist (Y/N)

    - slaveserver: specify the slave server details of this carte instance.
      IMPORTANT: the username and password specified here are used by the master instances to connect to this slave.
  -->
  <slaveserver>
    <name>Master</name>
    <hostname>xx.xx.xx.xx</hostname>
    <port>8090</port>
    <username>cluster</username>
    <password>xxxxx</password>
    <webAppName>pentaho-di</webAppName>
    <master>Y</master>
  </slaveserver>
  <repository>
    <name>xxxxxxx</name>
    <username>postgres</username>
    <password>xxxxxxx</password>
  </repository>
</slave_config>


- This appears to start up correctly with the following command: ./carte.sh carte-config-master.xml


- Slave server: Ubuntu 14.04, no UI
- Installed the JRE: sudo apt-get install default-jre
- Downloaded and extracted the Community Edition to /opt/pentaho/server/data-integration
- In this folder, I have a file called carte-config-slave.xml.
- This file looks like the following (sensitive fields are x'd out):

<slave_config>
  <masters>
    <slaveserver>
      <name>Master</name>
      <hostname>xx.xx.xx.xxx</hostname>
      <port>8090</port>
      <username>cluster</username>
      <password>xxxxxx</password>
      <webAppName>pentaho-di</webAppName>
      <master>Y</master>
    </slaveserver>
  </masters>
  <report_to_masters>Y</report_to_masters>
  <slaveserver>
    <name>slave1</name>
    <hostname>xx.xx.xx.xx</hostname>
    <port>8091</port>
    <username>cluster</username>
    <password>xxxxxxx</password>
    <master>N</master>
  </slaveserver>
  <repository>
    <name>xxxxxx</name>
    <username>postgres</username>
    <password>xxxxxxx</password>
  </repository>
</slave_config>

- This does not start up correctly with the following command:
./carte.sh carte-config-slave.xml

The following error occurs:
2014/12/16 15:44:14 - Carte - Installing timer to purge stale objects after 1440 minutes.
Attempting to load ESAPI.properties via file I/O.
Attempting to load ESAPI.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/pentaho/server/data-integration/ESAPI.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/ESAPI.properties
Not found in 'user.home' (/home/ubuntu) directory: /home/ubuntu/esapi/ESAPI.properties
Loading ESAPI.properties via file I/O failed. Exception was: java.io.FileNotFoundException
Attempting to load ESAPI.properties via the classpath.
SUCCESSFULLY LOADED ESAPI.properties via the CLASSPATH from '/ (root)' using current thread context class loader!
SecurityConfiguration for Validator.ConfigurationFile not found in ESAPI.properties. Using default: validation.properties
Attempting to load validation.properties via file I/O.
Attempting to load validation.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/pentaho/server/data-integration/validation.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/validation.properties
Not found in 'user.home' (/home/ubuntu) directory: /home/ubuntu/esapi/validation.properties
Loading validation.properties via file I/O failed.
Attempting to load validation.properties via the classpath.
validation.properties could not be loaded by any means. fail. Exception was: java.lang.IllegalArgumentException: Failed to load ESAPI.properties as a classloader resource.
SecurityConfiguration for Logger.LogServerIP not either "true" or "false" in ESAPI.properties. Using default: true
2014/12/16 15:44:14 - Carte - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unable to register to master slave server [Master] on address [10.81.39.98:8090]
2014/12/16 15:44:14 - Carte - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unable to register to master slave server [Master] on address [10.81.39.98:8090]
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: java.lang.NullPointerException
at org.pentaho.di.www.Carte.runCarte(Carte.java:183)
at org.pentaho.di.www.Carte.main(Carte.java:172)
... 5 more




My questions are as follows:
- How do I correct the error on the slave server?
- Is the setup correct?
- Is the setup of the repository correct? The documentation on using a repository with Carte is not clear.
- How can I test it once I get past the errors? (See the sketch at the end of this post.)
- Is having no UI a problem?

Additional questions:
- I was planning on using developer workstations to develop transformations. How are these deployed to the repository?
- We were planning to build a custom web page that calls the Carte web services. Is this feasible with the current setup?
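
Regarding the "how can I test it" question above: each Carte instance exposes a small web service, so our plan for a first smoke test is to hit the status page with the credentials from the <slaveserver> block. A hedged sketch (host and password are placeholders):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class CarteStatusCheck {
    public static void main(String[] args) throws Exception {
        // Use the slave's address/port and the username/password from its config.
        URL url = new URL("http://xx.xx.xx.xx:8091/kettle/status/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder().encodeToString("cluster:password".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // status of registered slaves, jobs, and transformations
            }
        }
    }
}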

Any example of Oracle Bulk Loader?

Hi all

I'm a bit new to Kettle and looking for a dynamic way to load data into Oracle. I see the Oracle Bulk Loader step; any idea how to use it?

I have around 100 files, so I want to load them and then move them to another location.
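
To illustrate the flow I have in mind (load each file, then move it out of the way), here is a plain-Java sketch; the directories are hypothetical, and in PDI the load itself would presumably be the Oracle Bulk Loader step running inside a job loop:

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class LoadAndArchive {
    public static void main(String[] args) throws Exception {
        File inDir = new File("/data/incoming");  // hypothetical source directory
        File doneDir = new File("/data/archive"); // hypothetical archive directory
        doneDir.mkdirs();
        for (File f : inDir.listFiles((dir, name) -> name.endsWith(".csv"))) {
            // load(f); -- in PDI this would be the bulk-load transformation
            Path target = doneDir.toPath().resolve(f.getName());
            Files.move(f.toPath(), target, StandardCopyOption.REPLACE_EXISTING);
        }
    }
}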

thanks for help:D

SFTP put timing out?

I'm trying to "Put a file with SFTP" (approx. 300 MB) in a job. Uploading manually works, and small files work fine. I get the following message:

2014/12/16 12:30:25 - Put a file with SFTP 2 - ERROR (version 5.0.4, build 1 from 2014-02-17_08-03-10 by buildguy) : Error while trying to send files. Exception :
2014/12/16 12:30:25 - Put a file with SFTP 2 - 4: java.net.SocketException: Connection reset by peer: socket write error
2014/12/16 12:30:25 - Put a file with SFTP 2 - java.net.SocketException: Connection reset by peer: socket write error
2014/12/16 12:30:25 - Put a file with SFTP 2 - ERROR (version 5.0.4, build 1 from 2014-02-17_08-03-10 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
2014/12/16 12:30:25 - Put a file with SFTP 2 - 4: java.net.SocketException: Connection reset by peer: socket write error
2014/12/16 12:30:25 - Put a file with SFTP 2 - java.net.SocketException: Connection reset by peer: socket write error
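
In case it helps to isolate PDI from the network, here is a hedged sketch of the same upload done directly with JSch (the SFTP library PDI ships), with a keep-alive interval set; host, credentials, and paths are placeholders. Resets like this on large files often point to an idle or size limit on a firewall between the two hosts:

import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SftpPutTest {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("user", "sftp.example.com", 22); // placeholders
        session.setPassword("password");
        session.setConfig("StrictHostKeyChecking", "no"); // for testing only
        session.setServerAliveInterval(15000); // keep-alive every 15 s during the long write
        session.connect(30000);
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        sftp.put("/local/bigfile.zip", "/remote/bigfile.zip"); // the ~300 MB file
        sftp.disconnect();
        session.disconnect();
    }
}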

Any help or pointers will be greatly appreciated.

POST versus PUT in BISERVER

Our IT department is leery of opening up an HTTPS redirect to allow PUT requests to the Pentaho server for file operations.

Is there a way to use POST operations instead? According to them, these are more secure.

SQL error is not captured in Pentaho 5.1.0 log file (after upgrade from 5.0)

Hi All,

We recently upgraded from version 5.0 to 5.1.0, and we are having an issue: Oracle (SQL) errors are not captured in the job log file.
I created a test job with an explicitly induced error, then ran the job in both the 5.0 and 5.1.0 versions, and error messages appear in the log file only in the 5.0 run. So the issue is with version 5.1.0.


Example: 5.0 log (no issue)
INFO 02-12 11:32:41,209 - test - Transformation has allocated 2 threads and 1 rowsets.
INFO 02-12 11:32:41,209 - Table input - SQL query : select 1 as val dual
INFO 02-12 11:32:41,210 - Text file output - Starting to run...
ERROR 02-12 11:32:41,262 - Table input - Unexpected error
ERROR 02-12 11:32:41,262 - Table input - org.pentaho.di.core.exception.KettleDatabaseException:
An error occurred executing SQL:
select 1 as val dual
ORA-00923: FROM keyword not found where expected
at org.pentaho.di.core.database.Database.openQuery(Database.java:1896)

Example: 5.1.0 log (errors are not captured as above)

2014/12/02 11:41:06 - Text file output.0 - Starting to run...
2014/12/02 11:41:06 - Table input.0 - SQL query : select 1 as val dual
child index = 1, logging object : org.pentaho.di.core.logging.LoggingObject@ff5784f parent=483528ca-a870-42b3-be4b-6e369629df40
2014/12/02 11:41:06 - OCS DB - Statement canceled!
2014/12/02 11:41:06 - OCS DB - Statement canceled!
2014/12/02 11:41:06 - Table input.0 - Finished reading query, closing connection.


If anyone has come across this situation, can you please tell me the solution?
NOTE: we can't go back to the old version, as it has other enhancements (to the jobs), so that is not an option we are considering.

Please help me.

Weka 3.6.12 and 3.7.12 releases

Hi everyone!

New versions of Weka are available for download from the Weka homepage:

* Weka 3.6.12 - stable book 3rd edition version. It is available as ZIP, with Win32 installer, Win32 installer incl. JRE 1.7.0_72, Win64 installer, Win64 installer incl. 64 bit JRE 1.7.0_72 and Mac OS X application (both Oracle and Apple JVM versions).

* Weka 3.7.12 - development version. It is available as ZIP, with Win32 installer, Win32 installer incl. JRE 1.7.0_72, Win64 installer, Win64 installer incl. 64 bit JRE 1.7.0_72 and Mac OS X application (both Oracle and Apple JVM versions).

Both versions contain a significant number of bug fixes, so it is recommended to upgrade to the new versions. Stable Weka 3.6 receives bug fixes only. The development version receives bug fixes and new features.

Weka homepage:
http://www.cs.waikato.ac.nz/~ml/weka/

Pentaho data mining community documentation:
http://wiki.pentaho.com/display/Pent...+Documentation

Packages for Weka>=3.7.2 can be browsed online at:
http://weka.sourceforge.net/packageMetaData/

Note: It might take a while before Sourceforge.net has propagated all the files to its mirrors.


What's new in 3.7.12?

Some highlights
---------------

In core weka:

* GUIChooser now has a plugin extension point that allows implementations of GUIChooser.GUIChooserMenuPlugin to appear as entries in either the Tools or Visualization menus
* SubsetByExpression filter now has support for regexp matching
* weka.classifiers.IterativeClassifierOptimizer - a classifier that can efficiently optimize the number of iterations for a base classifier that implements IterativeClassifier
* Speedup for LogitBoost in the two class case
* weka.filters.supervised.instance.ClassBalancer - a simple filter to balance the weight of classes
* New class hierarchy for stopwords algorithms. Includes new methods to read custom stopwords from a file and apply multiple stopwords algorithms
* Ability to turn off capabilities checking in Weka algorithms. Improves runtime for ensemble methods that create a lot of simple base classifiers
* Memory savings in weka.core.Attribute
* Improvements in runtime for SimpleKMeans and EM
* weka.estimators.UnivariateMixtureEstimator - new mixture estimator


In packages:

* New discriminantAnalysis package. Provides an implementation of Fisher's linear discriminant analysis
* Quartile estimators, correlation matrix heat map and k-means++ clustering in distributed Weka
* Support for default settings for GridSearch via a properties file
* Improvements in scripting with addition of the official Groovy console (kfGroovy package) from the Groovy project and TigerJython (new tigerjython package) as the Jython console via the GUIChooser
* Support for the latest version of MLR in the RPlugin package
* EAR4 package contributed by Vahid Jalali
* StudentFilters package contributed by Chris Gearhart
* graphgram package contributed by Johannes Schneider

As usual, for a complete list of changes refer to the changelogs.

Cheers,
The Weka Team

Collection

How do I check whether a collection exists in MongoDB? I tried the following in the command line, and it returns empty when the collection doesn't exist, but how do I use the same command from PDI?

db.system.namespaces.find( { name: 'dev.info' } );
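
For reference, the legacy Java driver can do this check directly; a minimal sketch, with the database name "dev" and collection "info" taken from the namespace above (host is a placeholder):

import com.mongodb.DB;
import com.mongodb.MongoClient;

public class CollectionExistsTest {
    public static void main(String[] args) throws Exception {
        MongoClient client = new MongoClient("localhost", 27017); // placeholder host
        DB db = client.getDB("dev");
        // collectionExists() performs the system.namespaces lookup for you.
        System.out.println(db.collectionExists("info"));
        client.close();
    }
}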

Transforming one file at a time

Hi,

I have created a transformation to fetch a CSV file and convert it into a desired format by picking data from specific rows (e.g. the 1st and 3rd rows).

The transformation works fine with one file; however, when dealing with multiple files, each step starts processing rows from the subsequent files as well. Because of this, the system is not able to pick up the correct data from file 2 onwards and gives an error.

I would like to know if there is a way to control the execution over multiple files and have the transformation process only one file at a time, i.e. first run all steps for file 1, then repeat all steps for file 2, and so on.

FYI: I am using text file input and regular expressions to pick up multiple files.
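
As far as I know, the usual pattern is a job: one transformation gets the file names and copies them to result rows, and the job then runs the processing transformation once per row ("Execute for every input row" on the transformation job entry). A plain-Java sketch of that control flow, with a hypothetical directory and regex:

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class OneFileAtATime {
    public static void main(String[] args) {
        Pattern pattern = Pattern.compile("report_.*\\.csv"); // hypothetical file pattern
        List<File> batch = new ArrayList<>();
        for (File f : new File("/data/in").listFiles()) {      // hypothetical directory
            if (pattern.matcher(f.getName()).matches()) {
                batch.add(f); // the "copy rows to result" stage
            }
        }
        for (File f : batch) {
            transform(f); // the job entry runs once per result row
        }
    }

    static void transform(File f) {
        // All steps run to completion for this file before the next file starts.
        System.out.println("Processing " + f.getName());
    }
}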

Any help would be much appreciated.

I am having issues while uploading a screenshot; I will try again later and upload it soon.

Thanks in advance.

Cheers.

Kunal

Cassandra input step error (using Cassandra 2.1.2 and CQL spec 3.2.0)

On a Windows AWS 64-bit server, running PDI version...

pdi-ce-5.2.0.0-209

with PDI cassandra plugin version 5.2.1.0-R

The Cassandra version is 2.1.2 and CQL spec = 3.2.0

When I attempt to view the schema, I am getting the following error. Any help would be greatly appreciated.

Thank you!

2014/12/16 22:13:13 - /Transformation 1 - Dispatching started for transformation [/Transformation 1]
2014/12/16 22:13:14 - Cassandra Input.0 - Connecting to Cassandra node at (removed hostname) : 9160 using keyspace eventlog ...
2014/12/16 22:13:14 - Cassandra Input.0 - Using connection options: cqlVersion=3.0.1
2014/12/16 22:13:14 - Cassandra Input.0 - Closing connection ...
2014/12/16 22:13:14 - Cassandra Input.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected error
2014/12/16 22:13:14 - Cassandra Input.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleException:
2014/12/16 22:13:14 - Cassandra Input.0 - null
2014/12/16 22:13:14 - Cassandra Input.0 - at java.lang.Thread.run (null:-1)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.di.trans.step.RunThread.run (RunThread.java:62)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.processRow (CassandraInput.java:164)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.getKeyspace (CassandraConnection.java:304)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.LegacyKeyspace.setKeyspace (LegacyKeyspace.java:100)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.setKeyspace (CassandraConnection.java:186)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.checkOpen (CassandraConnection.java:161)
2014/12/16 22:13:14 - Cassandra Input.0 -
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.processRow(CassandraInput.java:167)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2014/12/16 22:13:14 - Cassandra Input.0 - at java.lang.Thread.run(Unknown Source)
2014/12/16 22:13:14 - Cassandra Input.0 - Caused by: java.lang.NullPointerException
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.checkOpen(CassandraConnection.java:161)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.setKeyspace(CassandraConnection.java:186)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.LegacyKeyspace.setKeyspace(LegacyKeyspace.java:100)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.cassandra.legacy.CassandraConnection.getKeyspace(CassandraConnection.java:304)
2014/12/16 22:13:14 - Cassandra Input.0 - at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.processRow(CassandraInput.java:164)
2014/12/16 22:13:14 - Cassandra Input.0 - ... 2 more
2014/12/16 22:13:14 - Cassandra Input.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2014/12/16 22:13:14 - /Transformation 1 - Transformation detected one or more steps with errors.
2014/12/16 22:13:14 - /Transformation 1 - Transformation is killing the other steps!
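
One thing I notice in the log: the step negotiates cqlVersion=3.0.1 over the old Thrift port 9160, while Cassandra 2.1 speaks CQL 3.2 natively on port 9042. I don't know the PDI-side fix, but a bare test with the DataStax Java driver (2.x; hostname is a placeholder) would at least confirm that the cluster accepts native CQL connections:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Host;
import com.datastax.driver.core.Metadata;
import com.datastax.driver.core.Session;

public class CqlConnectTest {
    public static void main(String[] args) {
        Cluster cluster = Cluster.builder()
                .addContactPoint("cassandra.example.com") // placeholder hostname
                .withPort(9042)                           // native CQL port, not Thrift's 9160
                .build();
        Metadata metadata = cluster.getMetadata();
        for (Host host : metadata.getAllHosts()) {
            System.out.println("Node: " + host.getAddress());
        }
        Session session = cluster.connect("eventlog"); // keyspace from the log above
        System.out.println("Connected to keyspace: " + session.getLoggedKeyspace());
        cluster.close();
    }
}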

How to Install Pentaho on Centos and Integrate with Cloudera

Hi Experts,

I want to install Pentaho on a CentOS machine that doesn't have an internet connection, and I want to integrate it with data in an existing Hadoop cluster for visualization and analysis. Could anybody help me do this in the best way?

Thanks,
Kishore.

PDI Community Edition VS Enterprise Edition

Hi,
I am currently using PDI Community Edition. I want to know the difference between the Community Edition and the Enterprise Edition of PDI.

Regards,
Rushikesh

http://localhost:8080/pentaho/public/cas: Failed issue even though Pentaho has the same user
