Hadoop Oozie Study Notes: org.apache.oozie.service.AuthorizationException

lumingming 2012-08-01

Until now I had only tested Oozie in pseudo-distributed mode, where all files were read locally. Now I am testing in a distributed environment and fetching files from HDFS (which means your app also has to be uploaded to the corresponding HDFS path). The example I use here is $OOZIE_HOME/examples/apps/map-reduce, with job.properties configured as follows:

    nameNode=hdfs://localhost:9000
    jobTracker=localhost:9001
    queueName=default
    examplesRoot=examples

    oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce
    outputDir=map-reduce
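Since oozie.wf.application.path points at HDFS, the example app has to be uploaded there before submission. A minimal sketch, assuming the default $OOZIE_HOME layout and the /user/&lt;name&gt; home-directory convention (paths are my assumptions; adjust to your setup):

```shell
# Upload the bundled map-reduce example app to the HDFS location
# referenced by oozie.wf.application.path (paths are assumptions)
cd $OOZIE_HOME
hadoop fs -mkdir /user/$(whoami)/examples/apps
hadoop fs -put examples/apps/map-reduce /user/$(whoami)/examples/apps/map-reduce
# Verify the workflow definition landed where the property points
hadoop fs -ls /user/$(whoami)/examples/apps/map-reduce
```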

Then submit the app job from the console with the following command:

$OOZIE_HOME/bin/oozie job -oozie http://localhost:11000/oozie -config /home/guoyun/Hadoop/oozie-2.3.2-cdh3u2/examples/apps/map-reduce/job.properties -run

But it hit the following exception:

org.apache.oozie.service.AuthorizationException: E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun]
    at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:318)
    at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:171)
    at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:89)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:281)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.AuthFilter$2.doFilter(AuthFilter.java:123)
    at com.cloudera.alfredo.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:371)
    at org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:128)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:662)
Caused by: org.apache.oozie.service.HadoopAccessorException: E0902: Exception occured: [org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun]
    at org.apache.oozie.service.KerberosHadoopAccessorService.createFileSystem(KerberosHadoopAccessorService.java:211)
    at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:283)
    ... 22 more
Caused by: org.apache.hadoop.ipc.RemoteException: User: guoyun is not allowed to impersonate guoyun
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy22.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.oozie.service.KerberosHadoopAccessorService$3.run(KerberosHadoopAccessorService.java:203)
    at org.apache.oozie.service.KerberosHadoopAccessorService$3.run(KerberosHadoopAccessorService.java:194)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.oozie.service.KerberosHadoopAccessorService.createFileSystem(KerberosHadoopAccessorService.java:194)
    ... 23 more

This is a permissions exception, and a fairly detailed solution turned up via Google. I can only make a rough guess at the underlying principle; the details will have to be worked out over time. Since Oozie has to access the app you submitted on HDFS, it clearly needs permission on those app files, so the relevant configuration must be set up. Here my username (the user submitting the app job) is guoyun, which you can obtain with whoami. Then get the groups your current user belongs to with groups userName. With the username and group in hand, configure $HADOOP_HOME/conf/core-site.xml; mine is set as follows:
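The two lookups above can be sketched as follows (the guoyun value shown in the comment is just my case, an assumption for illustration):

```shell
# The user submitting the Oozie job, used in hadoop.proxyuser.<user>.hosts
whoami              # e.g. guoyun on my machine
# That user's groups, used for hadoop.proxyuser.<user>.groups
groups $(whoami)
```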

    <!-- for oozie, added by guoyun, 2011-11-13 -->
    <property>
        <name>hadoop.proxyuser.guoyun.hosts</name>
        <value>localhost</value>
    </property>
    <property>
        <name>hadoop.proxyuser.guoyun.groups</name>
        <value>guoyun</value>
    </property>

Replace guoyun in both hadoop.proxyuser.guoyun.hosts and hadoop.proxyuser.guoyun.groups with your own username.

After modifying core-site.xml you need to restart Hadoop. Then submit the job again, and OK! The exception is resolved.
