Channel: Pentaho Community Forums

BigData Plugin From Java

Hello

I am trying to run a .ktr file that has a Hadoop output step from Java, and I get the error below. The same .ktr runs fine when I run it from Spoon. How do I get it to run from my Java code as well?


2015/11/18 13:47:23 - Property_Validation - ERROR (version 5.3.0.0-200, build 1 from 2015-01-20_19-50-27 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/11/18 13:47:23 - Property_Validation - Unexpected error during transformation metadata load
2015/11/18 13:47:23 - Property_Validation -
2015/11/18 13:47:23 - Property_Validation - Missing plugins found while loading a transformation
2015/11/18 13:47:23 - Property_Validation -
2015/11/18 13:47:23 - Property_Validation - Step : HadoopFileOutputPlugin
2015/11/18 13:47:23 - Property_Validation -
2015/11/18 13:47:23 - Property_Validation - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1205)
2015/11/18 13:47:23 - Property_Validation - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:648)
2015/11/18 13:47:23 - Property_Validation - at org.pentaho.di.job.Job.execute(Job.java:716)
2015/11/18 13:47:23 - Property_Validation - at org.pentaho.di.job.Job.access$000(Job.java:115)
2015/11/18 13:47:23 - Property_Validation - at org.pentaho.di.job.Job$1.run(Job.java:835)
2015/11/18 13:47:23 - Property_Validation - at java.lang.Thread.run(Thread.java:745)
2015/11/18 13:47:23 - Property_Validation - Caused by: org.pentaho.di.core.exception.KettleMissingPluginsException:
2015/11/18 13:47:23 - Property_Validation - Missing plugins found while loading a transformation
2015/11/18 13:47:23 - Property_Validation -

I have already added the Big Data plugin dependency below to my pom.xml file, but I still get the above error. Do I need to register this plugin in the Java code? If so, how do I do it?

I am using the following Java code to trigger the job. It works fine without the Hadoop output step, but once I add that step to my .ktr, I get the above error.

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

// Initialize the Kettle environment, load the job metadata from the file,
// and hand it to my own helper that builds and starts the Job.
KettleEnvironment.init();
JobMeta jobMeta = new JobMeta(filename, null, null);
Job job = initializeKettle(filename, jobMeta);

<dependency>
<groupId>pentaho</groupId>
<artifactId>pentaho-big-data-plugin</artifactId>
<version>5.2.1.0-148</version>
</dependency>
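From what I have read, PDI discovers steps like Hadoop File Output by scanning a plugins folder at startup rather than from the classpath, so the Maven dependency alone may not be enough. The sketch below is what I have been trying, based on that understanding: it registers the folder containing pentaho-big-data-plugin with the step plugin type before calling KettleEnvironment.init(). The /opt/pdi/plugins path is just a placeholder for my local PDI install, and I am not sure this is the correct fix.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.plugins.PluginFolder;
import org.pentaho.di.core.plugins.StepPluginType;

public class RunWithBigDataPlugin {
    public static void main(String[] args) throws Exception {
        // Placeholder path: the directory that contains the
        // pentaho-big-data-plugin folder from my PDI installation.
        // This must happen BEFORE KettleEnvironment.init(), because
        // init() is when the plugin folders are scanned.
        StepPluginType.getInstance().getPluginFolders()
            .add(new PluginFolder("/opt/pdi/plugins", false, true));

        KettleEnvironment.init();
        // ... load the JobMeta and run the job as above ...
    }
}
```

Alternatively, I have seen the KETTLE_PLUGIN_BASE_FOLDERS system property mentioned as a way to point Kettle at the same directory before init(), but I have not confirmed either approach works here.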

Can anyone please help?
