Logstash webhdfs plugin: permission error when writing to HDFS

This post walks through fixing a permission error that appears when Logstash writes data to HDFS: checking file permissions, changing the owner and group, and configuring the output path correctly.


When I started Logstash normally to write data into HDFS, it reported the following error:

[WARN ][logstash.outputs.webhdfs ] Failed to flush outgoing items {:outgoing_count=>1, :exception=>"LogStash::Error", 
:backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:205:in `sprintf'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:194:in `flush'", 
"org/jruby/RubyArray.java:2409:in `collect'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:189:in `flush'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/buffer.rb:219:in `buffer_flush'", 
"org/jruby/RubyHash.java:1342:in `each'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/buffer.rb:216:in `buffer_flush'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/stud-0.0.23/lib/stud/buffer.rb:159:in `buffer_receive'", 
"/export/servers/logstash-5.6.9/vendor/bundle/jruby/1.9/gems/logstash-output-webhdfs-3.0.6/lib/logstash/outputs/webhdfs.rb:182:in `receive'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", 
"org/jruby/RubyArray.java:1613:in `each'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:22:in `multi_receive'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/output_delegator.rb:49:in `multi_receive'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/pipeline.rb:434:in `output_batch'", 
"org/jruby/RubyHash.java:1342:in `each'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/pipeline.rb:433:in `output_batch'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/pipeline.rb:381:in `worker_loop'", 
"/export/servers/logstash-5.6.9/logstash-core/lib/logstash/pipeline.rb:342:in `start_workers'"]}
I am posting the full trace so that others hitting this error can find it by searching.

Some digging showed that it was a missing write permission on HDFS.

Here is how to fix it:

  1.  First, check the permissions, owner, and group of the files on HDFS:

hdfs dfs -ls /
  2.  The listing shows the target directory belongs to the hadoop group. Switch to the hadoop user and open up the permissions:

sudo su hadoop

hdfs dfs -chmod -R 777 /hive

(Note that 777 grants everyone full access; a tighter fix is to give write access only to the user Logstash runs as.)

Then change the directory's owner and group:

hdfs dfs -chown root:root /hive

hdfs dfs -chgrp -R supergroup /


  3.  List the directory again to confirm the changes took effect:

hdfs dfs -ls /
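An alternative to loosening the directory permissions is to tell the webhdfs output which HDFS user to write as, via the plugin's `user` setting. A minimal sketch of such an output block — the host, path, and user values here are placeholders, not taken from the original setup:

```conf
output {
  webhdfs {
    host => "namenode.example.com"   # placeholder NameNode host
    port => 50070                    # default WebHDFS HTTP port
    user => "hadoop"                 # HDFS user the writes run as
    path => "/logstash/dt=%{+YYYY-MM-dd}/logstash-%{+HH}.log"
  }
}
```

If `user` points at an account that already owns (or can write to) the target directory, the `chmod -R 777` step is unnecessary.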



Update: while working through this, I found it was not purely a permission problem. In my config file, I had built the HDFS output directory name entirely out of event variables. That is incorrect: the path should not be composed solely of variables. Keep this in mind when configuring the output path.
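A path that does work keeps a fixed literal prefix and interpolates only the trailing parts, so the parent directory exists with known ownership and its permissions can be set once up front. A minimal Python sketch of this idea, mimicking Logstash's time-based interpolation (simplified to the date token only; the names are illustrative):

```python
from datetime import datetime, timezone

def expand_path(template: str, ts: datetime) -> str:
    """Mimic Logstash's %{+YYYY-MM-dd} time interpolation (date token only)."""
    return template.replace("%{+YYYY-MM-dd}", ts.strftime("%Y-%m-%d"))

ts = datetime(2018, 6, 1, tzinfo=timezone.utc)
# Fixed prefix /logstash, variable date suffix: only the suffix changes per event.
print(expand_path("/logstash/dt=%{+YYYY-MM-dd}/logstash.log", ts))
# → /logstash/dt=2018-06-01/logstash.log
```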
