hadoop GLIBC_2.14 not found

砚浅 2016-07-01

While installing Hadoop 2.7.2 on CentOS 6.5, the native library reports "GLIBC_2.14 not found".

Check the system's libc version:

[hadoop@master001 native]$ ll /lib64/libc.so.6

lrwxrwxrwx. 1 root root 12 Apr 14 16:14 /lib64/libc.so.6 -> libc-2.12.so

The version shown is 2.12.
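To pin down the mismatch, it also helps to list which GLIBC version symbols the current libc exports and which ones the Hadoop native library asks for. A minimal sketch (the libhadoop.so.1.0.0 path is an assumption based on the /usr/hadoop install that appears later in this post; adjust it to your HADOOP_HOME):

# GLIBC versions provided by the system libc -- on stock CentOS 6.5 this stops at GLIBC_2.12
strings /lib64/libc.so.6 | grep GLIBC_

# GLIBC versions required by the Hadoop native library; a hit here confirms the GLIBC_2.14 requirement
objdump -T /usr/hadoop/lib/native/libhadoop.so.1.0.0 | grep GLIBC_2.14

If the second command prints GLIBC_2.14-versioned symbols, then short of rebuilding the native library against the older glibc, the fix is to install a newer glibc, which is what the steps below do.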

From http://ftp.gnu.org/gnu/glibc/ download glibc-2.14.tar.bz2 and glibc-linuxthreads-2.5.tar.bz2.
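If the node has internet access, wget can fetch both tarballs straight into the /home/hadoop/software/ directory that the extraction commands below expect (just a convenience sketch; any download method works):

wget -P /home/hadoop/software/ http://ftp.gnu.org/gnu/glibc/glibc-2.14.tar.bz2
wget -P /home/hadoop/software/ http://ftp.gnu.org/gnu/glibc/glibc-linuxthreads-2.5.tar.bz2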

[hadoop@master001 native]$ tar -jxvf /home/hadoop/software/glibc-2.14.tar.bz2

[hadoop@master001 native]$ cd glibc-2.14/

[hadoop@master001 glibc-2.14]$ tar -jxvf /home/hadoop/software/glibc-linuxthreads-2.5.tar.bz2

[hadoop@master001 glibc-2.14]$ cd ..    # must return to the parent directory

[hadoop@master001 native]$ export CFLAGS="-g -O2"    # add the optimization flag, otherwise the build errors out

[hadoop@master001 native]$ ./glibc-2.14/configure --prefix=/usr --disable-profile --enable-add-ons --with-headers=/usr/include --with-binutils=/usr/bin

[hadoop@master001 native]$ make    # compile; this runs for a long time and may fail -- if it does, just run make again

[hadoop@master001 native]$ sudo make install    # install; must be run as root

# Verify that the glibc version was upgraded

[hadoop@master001 native]$ ll /lib64/libc.so.6

lrwxrwxrwx 1 root root 12 Jun 25 02:07 /lib64/libc.so.6 -> libc-2.14.so    # now shows 2.14
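As an extra sanity check (not part of the original steps), glibc can also report its version directly; after the install both of these should mention 2.14:

getconf GNU_LIBC_VERSION    # expected to print something like: glibc 2.14
ldd --version               # the first line reports the glibc release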

Enable debug output:

[hadoop@master001 native]$ export HADOOP_ROOT_LOGGER=DEBUG,console

# If the output contains the "Loaded the native-hadoop library" lines shown below, the native library no longer errors out

[hadoop@master001 native]$ hadoop fs -text /test/data/origz/access.log.gz

15/06/25 02:10:01 DEBUG util.Shell: setsid exited with exit code 0
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL jar:file:/usr/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar!/core-default.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@71be98f5
15/06/25 02:10:01 DEBUG conf.Configuration: parsing URL file:/usr/hadoop/etc/hadoop/core-site.xml
15/06/25 02:10:01 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@97e1986
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15/06/25 02:10:02 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, about=, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
15/06/25 02:10:02 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
15/06/25 02:10:02 DEBUG security.Groups: Creating new Groups object
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
15/06/25 02:10:02 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
15/06/25 02:10:02 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
15/06/25 02:10:02 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login
15/06/25 02:10:02 DEBUG security.UserGroupInformation: hadoop login commit
15/06/25 02:10:02 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
15/06/25 02:10:02 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
15/06/25 02:10:03 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
15/06/25 02:10:03 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
15/06/25 02:10:03 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@501edcf1
15/06/25 02:10:03 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$1@7e499e08: starting with interruptCheckPeriodMs = 60000
15/06/25 02:10:04 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
15/06/25 02:10:04 DEBUG ipc.Client: The ping interval is 60000 ms.
15/06/25 02:10:04 DEBUG ipc.Client: Connecting to master001/192.168.75.155:8020
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop sending #0
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: starting, having connections 1
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop got value #0
15/06/25 02:10:04 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 122ms
text: `/test/data/origz/access.log.gz': No such file or directory
15/06/25 02:10:04 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@16e7dcfd
15/06/25 02:10:04 DEBUG ipc.Client: Stopping client
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: closed
15/06/25 02:10:04 DEBUG ipc.Client: IPC Client (577405636) connection to master001/192.168.75.155:8020 from hadoop: stopped, remaining connections 0
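A shorter way to verify the same thing, without wading through DEBUG output, is Hadoop's built-in native-library check (available in the 2.x releases used here); once glibc is upgraded, the hadoop entry should report true together with the path of the loaded libhadoop.so:

hadoop checknative -a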

==============================

After the upgrade is done, restart the cluster:

[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-dfs.sh

[hadoop@master001 ~]$ sh /usr/hadoop/sbin/start-yarn.sh

[hadoop@master001 ~]$ hadoop fs -ls /

[hadoop@master001 ~]$ hadoop fs -mkdir /usr

[hadoop@master001 ~]$ hadoop fs -ls /

Found 1 items

drwxr-xr-x   - hadoop supergroup          0 2015-06-25 02:27 /usr
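To confirm the daemons really came back after the restart, jps is a quick check; on this master node the list would typically include NameNode, SecondaryNameNode and ResourceManager, with DataNode and NodeManager showing up on the worker nodes (the exact set depends on how the cluster is laid out):

jps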
