
List:       hadoop-user
Subject:    RE: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG
From:       Chinnappan Chandrasekaran <chiranchandra () jos ! com ! sg>
Date:       2015-12-30 1:47:00
Message-ID: D8C9FCFC5D0EA7448BC81967D1A590641EAA04E9 () jossgmail02 ! jos ! com ! sg

Try this (the warehouse path lives in HDFS, so use the HDFS chown rather than the local one):

$ hdfs dfs -chown -R username:groupname /user/hive/warehouse
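
If the warehouse is Sentry-managed, changing its owner may not be desirable (Sentry expects it to stay hive:hive). A minimal sketch of two alternatives that satisfy the failing EXECUTE check, assuming the warehouse is hive:hive with mode drwxrwx--- as the error shows (user and group names are illustrative):

# Inspect the warehouse directory's current owner, group and mode
$ sudo -u hdfs hdfs dfs -ls -d /user/hive/warehouse

# Option 1: make the submitting user a member of the hive group, so the
# group bits of drwxrwx--- grant traversal (here via local Linux groups,
# assuming that is the NameNode's user-to-group mapping)
$ sudo usermod -aG hive edhadmsvc

# Option 2: grant traversal with an HDFS ACL instead of changing ownership
# (requires dfs.namenode.acls.enabled=true on the NameNode)
$ sudo -u hdfs hdfs dfs -setfacl -R -m user:edhadmsvc:r-x /user/hive/warehouse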



Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg
55, Ubi Avenue 1 #03-15, Singapore 408935


From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: Wednesday, 30 December, 2015 8:09 AM
To: Hue-Users; user@hadoop.apache.org
Subject: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Hi,

When I run this simple Pig script from the Pig editor in Hue, I get a permission-denied error. I can execute queries in Hive as the same user. Any idea why?

We are using Sentry for authorisation.


Here is my pig script.


LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();

STORE LOAD_TBL_A INTO '/tmp/pig_testing001/';
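
For context, HCatLoader resolves 'sandbox.suppliers' through the Hive metastore to the table's directory under /user/hive/warehouse, so the MapReduce job reads that path as the submitting user; Hive queries likely succeed because HiveServer2 runs them as the hive user. For reference, a sketch of launching the same script outside Hue with the HCatalog jars on the classpath (the file name script.pig is illustrative):

$ pig -useHCatalog script.pig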





Apache Pig version 0.12.0-cdh5.4.5 (rexported)
compiled Aug 12 2015, 14:17:24

Run pig script using PigRunner.run() for Pig version 0.8+
2015-12-30 00:00:42,435 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Apache Pig version 0.12.0-cdh5.4.5 (rexported) compiled Aug 12 2015, 14:17:24
2015-12-30 00:00:42,437 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Logging error messages to: /mnt/drive08-sdj/nm/usercache/edhadmsvc/appcache/application_1449847448721_0473/container_e64_1449847448721_0473_01_000001/pig-job_1449847448721_0473.log
2015-12-30 00:00:42,487 [uber-SubtaskRunner] INFO  org.apache.pig.impl.util.Utils  - Default bootup file /home/edhadmsvc/.pigbootup not found
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to hadoop file system at: hdfs://nameservice1
2015-12-30 00:00:42,623 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to map-reduce job tracker at: yarnRM
2015-12-30 00:00:42,627 [uber-SubtaskRunner] WARN  org.apache.pig.PigServer  - Empty string specified for jar path
2015-12-30 00:00:43,320 [uber-SubtaskRunner] INFO  hive.metastore  - Trying to connect to metastore with URI thrift://hmscdh01094p001.corp.costco.com:9083
2015-12-30 00:00:43,387 [uber-SubtaskRunner] INFO  hive.metastore  - Opened a connection to metastore, current connections: 1
2015-12-30 00:00:43,388 [uber-SubtaskRunner] INFO  hive.metastore  - Connected to metastore.
2015-12-30 00:00:43,658 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig features used in the script: UNKNOWN
2015-12-30 00:00:43,750 [uber-SubtaskRunner] INFO  org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer  - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, DuplicateForEachColumnRewrite, GroupByConstParallelSetter, ImplicitSplitInserter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, NewPartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier, PartitionFilterOptimizer]}
2015-12-30 00:00:43,769 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:43,772 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.textoutputformat.separator is deprecated. Instead, use mapreduce.output.textoutputformat.separator
2015-12-30 00:00:43,856 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler  - File concatenation threshold: 100 optimistic? false
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size before optimization: 1
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size after optimization: 1
2015-12-30 00:00:44,006 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig script settings are added to the job
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2015-12-30 00:00:44,266 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - creating jar file Job4443028594885224634.jar
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - jar file Job4443028594885224634.jar created
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2015-12-30 00:00:47,550 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - Setting up single store job
2015-12-30 00:00:47,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 1 map-reduce job(s) waiting for submission.
2015-12-30 00:00:47,618 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2015-12-30 00:00:47,667 [JobControl] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:47,929 [communication thread] INFO  org.apache.hadoop.mapred.TaskAttemptListenerImpl  - Progress of TaskAttempt attempt_1449847448721_0473_m_000000_0 is : 1.0
2015-12-30 00:00:48,076 [JobControl] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2015-12-30 00:00:48,103 [JobControl] INFO  org.apache.hadoop.mapreduce.JobSubmitter  - Cleaning up the staging area /user/edhadmsvc/.staging/job_1449847448721_0474
2015-12-30 00:00:48,112 [JobControl] WARN  org.apache.hadoop.security.UserGroupInformation  - PriviledgedActionException as:edhadmsvc (auth:SIMPLE) cause:org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

2015-12-30 00:00:48,113 [JobControl] INFO  org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob  - PigLatin:script.pig got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
        at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
        at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
        at java.lang.Thread.run(Thread.java:745)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
        at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
        at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
        at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
        ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at org.apache.hadoop.ipc.Client.call(Client.java:1468)
        at org.apache.hadoop.ipc.Client.call(Client.java:1399)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
        at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
        ... 30 more
2015-12-30 00:00:48,119 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - HadoopJobId: job_1449847448721_0474
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Processing aliases LOAD_TBL_A
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - detailed locations: M: LOAD_TBL_A[1,13] C:  R:
2015-12-30 00:00:48,123 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 0% complete
2015-12-30 00:00:53,133 [uber-SubtaskRunner] WARN  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - job job_1449847448721_0474 has failed! Stop running all dependent jobs
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 100% complete
2015-12-30 00:00:53,202 [uber-SubtaskRunner] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:53,207 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,245 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,260 [uber-SubtaskRunner] ERROR org.apache.pig.tools.pigstats.PigStatsUtil  - 1 map reduce job(s) failed!
2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.SimplePigStats  - Script Statistics:

HadoopVersion    PigVersion       UserId     StartedAt            FinishedAt           Features
2.6.0-cdh5.4.5   0.12.0-cdh5.4.5  edhadmsvc  2015-12-30 00:00:44  2015-12-30 00:00:53  UNKNOWN

Failed!

Failed Jobs:
JobId                   Alias       Feature   Message   Outputs
job_1449847448721_0474  LOAD_TBL_A  MAP_ONLY  Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
        at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
        at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
        at java.lang.Thread.run(Thread.java:745)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
        at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
        at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
        at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
        at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
        ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

        at org.apache.hadoop.ipc.Client.call(Client.java:1468)
        at org.apache.hadoop.ipc.Client.call(Client.java:1399)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
        at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
        ... 30 more
        /tmp/pig_testing001,

Input(s):
Failed to read data from "sandbox.suppliers"

Output(s):
Failed to produce result in "/tmp/pig_testing001"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_1449847448721_0474

2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Failed!
2015-12-30 00:00:53,269 [uber-SubtaskRunner] ERROR org.apache.pig.tools.grunt.GruntParser  - ERROR 2244: Job failed, hadoop does not return any error message
Hadoop Job IDs executed by Pig: job_1449847448721_0474





Thanks
Jay



on(DefaultAuthorizationProvider.java:257)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAu \
thorizationProvider.java:238)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at \
org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(Defa \
ultAuthorizationProvider.java:180)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission \
(DefaultAuthorizationProvider.java:137)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.get \
FileInfo(AuthorizationProviderProxyClientProtocol.java:526)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getF \
ileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodePr \
otocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at java.security.AccessController.doPrivileged(Native \
Method)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at \
javax.security.auth.Subject.doAs(Subject.java:415)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)<br><br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at java.security.AccessController.doPrivileged(Native \
Method)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at \
javax.security.auth.Subject.doAs(Subject.java:415)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native \
Method)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; at \
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
at java.lang.reflect.Method.invoke(Method.java:606)<br>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; \
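
Reading the trace: the check that fails is checkTraverse on inode
"/user/hive/warehouse" (owner hive, group hive, mode drwxrwx---), so a job
running as edhadmsvc, which is evidently neither the hive user nor a member
of the hive group, is denied the EXECUTE (traverse) bit it needs just to pass
through the warehouse directory. A quick way to confirm this from a shell,
plus one possible ACL-based workaround, is sketched below; the path and
username are taken from the log above, and the setfacl step is only an
illustration, assuming dfs.namenode.acls.enabled=true on the cluster and
that Sentry's HDFS permission sync is not already managing ACLs on that tree.

# Show owner/group/mode of the warehouse root itself
$ hdfs dfs -ls -d /user/hive/warehouse

# Show which groups the job user resolves to on the NameNode
$ hdfs groups edhadmsvc

# Possible workaround: grant traverse-only access to the job user
$ hdfs dfs -setfacl -m user:edhadmsvc:--x /user/hive/warehouse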
<p class="MsoNormal">Thanks</p>
<div>
<p class="MsoNormal">Jay</p>
</div>
<img src="http://img.photobucket.com/albums/v232/RiZZo66/WLAchieverAward2014BampW.jpg" \
width="200" height="126" style="margin:10px; padding:10px; border:0px solid #BFBFBF"> \
<p class="MsoNormal"><i><span style="font-size:10.0pt; color:#404040">For general \
enquiries, please contact us at JOS Enquiry Email: </span></i><a \
href="mailto:enquiry@jos.com.sg"><i><span \
style="font-size:10.0pt">enquiry@jos.com.sg</span></i></a><i><span \
style="font-size:10.0pt; color:#404040"> Hotline: (&#43;65) 6551 9611<b><span \
style="letter-spacing:-.2pt"></span></b></span></i></p> <p class="MsoNormal"><i><span \
style="font-size:10.0pt; color:#404040">For JOS Support, please contact us at JOS \
Services Email: </span></i><a href="mailto:services@jos.com.sg"><i><span \
style="font-size:10.0pt">services@jos.com.sg</span></i></a><i><span \
style="font-size:10.0pt; color:#404040"> Hotline: (&#43;65) 6484 2302</span></i></p> \
<p class="MsoNormal" style="text-autospace:none"><b><span \
style="font-size:12.0pt">&nbsp;</span></b></p> <p class="MsoNormal"><b><i><span \
style="font-size:10.0pt; color:#5F5F5F">A member of the Jardine Matheson Group, \
Jardine OneSolution is one of Asia's leading providers of integrated IT services and \
solutions with offices in Singapore, Malaysia, Hong Kong and  China. Find out more \
about JOS at </span></i></b><a href="http://www.jos.com"><b><i><span \
style="font-size:10.0pt">www.jos.com</span></i></b></a><b><span \
style="font-size:10.0pt"></span></b></p> <p class="MsoNormal"><b><i><span \
style="font-size:10.0pt"></span></i></b><span style="font-size:10.5pt; \
color:gray"><br> </span><span style="font-size:9.5pt; color:gray">Confidentiality \
Notice and Disclaimer:<br> This email (including any attachment to it) is \
confidential and intended only for the use of the individual or entity named above \
and may contain information that is privileged. If you are not the intended \
recipient, you are notified that any dissemination,  distribution or copying of this \
email is strictly prohibited. If you have received this email in error, please notify \
us immediately by return email or telephone and destroy the original message \
(including any attachment to it). Thank you.</span><b><i><span \
style="font-size:10.0pt"></span></i></b></p> <p class="MsoNormal"><span \
style="color:#1F497D">&nbsp;</span></p> <p class="MsoNormal"><span \
style="color:#1F497D">&nbsp;</span></p> <br clear="both">
______________________________________________________________________<BR>
This email has been scanned by the Symantec Email Security.cloud service.<BR>
Confidentiality Notice and Disclaimer: This email (including any attachment to it) is \
confidential and intended only for the use of the individual or entity named above \
and may contain information that is privileged.  If you are not the intended \
recipient, you are notified that any dissemination, distribution or copying of this \
email is strictly prohibited.  If you have received this email in error, please \
notify us immediately by return email or telephone and destroy the original message \
(including any attachment to it).  Thank you.<BR> \
______________________________________________________________________<BR> </body>
</html>


["image003.jpg" (image/jpeg)]
