Hadoop error: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException


Error:

org.apache.hadoop.hdfs.DFSClient:Failed to close file

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException)


Fix:

Increase the Linux limit on open files. The steps below raise the system-wide and per-process file-handle limits, a common remedy when this error appears under heavy concurrent writes and the client cannot finish closing the file before its lease expires.

# Raise the system-wide and per-process open-file limits
echo "fs.file-max = 65535" >> /etc/sysctl.conf
echo "* - nofile 65535" >> /etc/security/limits.conf
sysctl -p          # apply the kernel setting immediately

# limits.conf only affects sessions started after the change; verify with:
ulimit -n
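
To confirm the higher limit actually applies to the HDFS daemons (limits.conf changes only reach processes started after a new login, so the DataNode usually needs a restart), a quick sketch, assuming pgrep is available and the DataNode JVM is running:

# Find the DataNode JVM and read its effective open-file limit
DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode | head -n 1)
grep 'open files' /proc/"$DN_PID"/limits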

 

Update the Hadoop configuration:

vi hdfs-site.xml



<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>8192</value>
</property>
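
Restart the DataNodes after editing the file. Note that dfs.datanode.max.xcievers is the legacy (deliberately misspelled) property name; on Hadoop 2.x and later the same limit is exposed as dfs.datanode.max.transfer.threads, so on a newer cluster the equivalent setting would look like this (same value, current name):

<property>
  <name>dfs.datanode.max.transfer.threads</name>
  <value>8192</value>
</property>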


 

 

 
