Channel: Pentaho Community Forums

PDI + Hadoop: Permission Denied to copy file

How can I define the user/password to be used in the 'Hadoop Copy Files' step?

I'm trying this URL:

Code:

hdfs://sicat:sicatuser@10.239.69.200:8020/user/sicat
But I'm getting the following error message:

Code:

2014/01/20 15:59:33 - Hadoop Copy Files - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : File System Exception: Could not copy "file:///d:/weblogs_rebuild.txt" to "hdfs://sicat:***@10.239.69.200:8020/user/sicat/weblogs_rebuild.txt".
2014/01/20 15:59:33 - Hadoop Copy Files - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Caused by: Could not write to "hdfs://sicat:***@10.239.69.200:8020/user/sicat/weblogs_rebuild.txt".
2014/01/20 15:59:33 - Hadoop Copy Files - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Caused by: Permission denied: user=kleysonrios, access=EXECUTE, inode="/user/sicat":sicat:hadoop:drwxr-x---
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:177)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:142)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4705)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4687)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4661)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1839)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1771)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1747)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:439)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:207)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44942)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1751)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1747)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1745)
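The last "Caused by" line shows why the copy fails: /user/sicat is owned by sicat:hadoop with mode drwxr-x---, so a user who is neither sicat nor a member of the hadoop group (here, kleysonrios) is denied EXECUTE (traverse) on the directory. If you have access on the cluster side, one possible fix (assuming it is acceptable to loosen permissions on /user/sicat, or to add the client user to the hadoop group at the OS level) would be:

```shell
# Possible server-side fix, run as the HDFS superuser or as 'sicat'.
# Assumption: widening the directory mode is acceptable on this cluster;
# the alternative is adding the client user to the 'hadoop' group.
hdfs dfs -chmod 775 /user/sicat
```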

So, PDI is trying to use my Windows username instead of the username defined in the URL.
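With simple (non-Kerberos) authentication, the Hadoop client libraries take the effective HDFS user from the HADOOP_USER_NAME environment variable before falling back to the OS login name. So one client-side workaround worth trying (assuming your cluster is not Kerberized) is to set that variable before launching Spoon:

```shell
# Possible client-side workaround (assumes simple, non-Kerberos auth):
# make the Hadoop client identify as 'sicat' instead of the OS login name.
export HADOOP_USER_NAME=sicat
sh spoon.sh
# On Windows the equivalent would be:  set HADOOP_USER_NAME=sicat  then run Spoon.bat
```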

Regards.
