Hello All,
I have just started evaluating Pentaho for our business integration purposes. We have a Hortonworks Sandbox 2.0 environment that is up and running successfully.
I created a job to copy a file from the local disk to HDFS, but the job fails when I run it.
The logger prints the exception below.
2014/01/08 11:32:24 - Hadoop Copy Files - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : Can not copy file/folder [file:///E:/Data/FifteenGigaText.txt] to [hdfs://[[sandbox environment ip goes here]]:8020/user/hue/test]. Exception : [
2014/01/08 11:32:24 - Hadoop Copy Files -
2014/01/08 11:32:24 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://[[sandbox environment ip goes here]]:8020/user/hue/test' : Could not resolve file "hdfs://[[sandbox environment ip goes here]]:8020/user/hue/test".
2014/01/08 11:32:24 - Hadoop Copy Files -
2014/01/08 11:32:24 - Hadoop Copy Files - ]
2014/01/08 11:32:24 - Hadoop Copy Files - ERROR (version 5.0.2, build 1 from 2013-12-04_15-52-25 by buildguy) : org.pentaho.di.core.exception.KettleFileException:
2014/01/08 11:32:24 - Hadoop Copy Files -
2014/01/08 11:32:24 - Hadoop Copy Files - Unable to get VFS File object for filename 'hdfs://[[sandbox environment ip goes here]]:8020/user/hue/test' : Could not resolve file "hdfs://[[sandbox environment ip goes here]]:8020/user/hue/test".
2014/01/08 11:32:24 - Hadoop Copy Files -
2014/01/08 11:32:24 - Hadoop Copy Files -
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:162)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.core.vfs.KettleVFS.getFileObject(KettleVFS.java:105)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.ProcessFileFolder(JobEntryCopyFiles.java:378)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.entries.copyfiles.JobEntryCopyFiles.execute(JobEntryCopyFiles.java:326)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:678)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:815)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.Job.execute(Job.java:500)
2014/01/08 11:32:24 - Hadoop Copy Files - at org.pentaho.di.job.Job.run(Job.java:407)
2014/01/08 11:32:24 - HadoopFileCopy - Finished job entry [Hadoop Copy Files] (result=[false])
2014/01/08 11:32:24 - HadoopFileCopy - Job execution finished
2014/01/08 11:32:24 - Spoon - Job has ended.
When I googled the error, I landed on the page where Pentaho describes the configuration and compatibility of the various Hadoop distributions with PDI:
http://wiki.pentaho.com/display/BAD/...ro+and+Version
That page clearly states that HDP 2.x is not supported by Pentaho out of the box without some manual configuration effort.
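From what I can tell from the docs, PDI selects its Hadoop shim through the big-data plugin's properties file. A minimal sketch of what I believe the change looks like (the path and the `hdp20` shim name are my assumptions based on a default PDI 5.x install; the shim directory must actually exist under `hadoop-configurations`):

```
# File: plugins/pentaho-big-data-plugin/plugin.properties
# (path assumed from a default PDI 5.x layout; adjust to your install)

# Select the shim matching the cluster; the value must match a folder
# name under plugins/pentaho-big-data-plugin/hadoop-configurations/
active.hadoop.configuration=hdp20
```

If I understand correctly, Spoon has to be restarted after changing this for the new shim to load, but I have not confirmed this solves the VFS resolution error.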
Update: I found a JIRA ticket, http://jira.pentaho.com/browse/PDI-10807, filed for the same issue I have asked about here. The ticket status says it is closed, but I have no idea what it concluded or what I have to do.
Can anyone guide me through this?
Thanks in advance.
Regards,
VAP.