List:       flume-user
Subject:    Re: Question about unable to write to channel
From:       Roshan Naik <roshan@hortonworks.com>
Date:       2015-03-23 20:52:03
Message-ID: D135CB1A.11030%roshan@hortonworks.com

Looks like a problem in the security setup; you are probably trying to
write to a secure Hadoop cluster without providing Kerberos credentials to
the HDFS sink. HDFS and Flume should work fine in HDP 2.1.
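For what it's worth, on a kerberized cluster the HDFS sink needs a
principal and keytab configured on it. A minimal sketch, using the sink
name declared in your config below and placeholder values for the
principal and the keytab path (substitute whatever your cluster's
Kerberos admin actually issued for the Flume service user):

  # placeholders - use the principal and keytab issued for your flume user
  a2.sinks.k1.hdfs.kerberosPrincipal = flume@EXAMPLE.COM
  a2.sinks.k1.hdfs.kerberosKeytab = /etc/security/keytabs/flume.service.keytab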
-roshan


On 2/6/15 10:50 AM, "David Novogrodsky" <david.novogrodsky@gmail.com>
wrote:

>All,
>
>Thanks for your help in the past.  I have a cluster managed by Ambari
>1.7 using HDP 2.1.  I was told that the versions of Flume and Hadoop
>in HDP 2.1 have a problem: specifically, the HDFS sink does not write
>to the cluster properly.  I want to test that to be sure.  The error I
>should be getting is this:
>
>java.lang.VerifyError: class
>org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto
>overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>    at java.lang.ClassLoader.defineClass1(Native Method)
>    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>
>I am starting the agent using this command line:
>[root@namenode flume]# flume-ng agent --conf conf --conf-file
>/etc/flume/conf/a2/flume.conf -n a2
>
>But instead I am getting this error, which seems unrelated to HDFS:
>15/02/06 12:31:44 ERROR source.SequenceGeneratorSource: r1 source
>could not write to channel.
>org.apache.flume.ChannelException: Unable to put event on required
>channel: org.apache.flume.channel.MemoryChannel{name: c1}
>    at org.apache.flume.channel.ChannelProcessor.processEvent(ChannelProcessor.java:275)
>
>More details follow:
>
>Here is my configuration:
>a2.sources = r1
>a2.sinks = k1
>a2.channels = c1
>
>a2.sources.r1.type = seq
>
>a2.sinks.sink_to_hdfs.type = hdfs
>a2.sinks.sink_to_hdfs.hdfs.fileType = DataStream
>a2.sinks.sink_to_hdfs.hdfs.path = /flume/events
>a2.sinks.sink_to_hdfs.hdfs.filePrefix = eventlog
>a2.sinks.sink_to_hdfs.hdfs.fileSuffix = .log
>a2.sinks.sink_to_hdfs.hdfs.batchSize = 100
>
>a2.channels.c1.type = memory
>a2.channels.c1.capacity = 10000
>a2.channels.c1.transactionCapacity = 10000
>a2.channels.c1.byteCapacityBufferPercentage = 20
>a2.channels.c1.byteCapacity = 800000
>
>a2.sources.r1.channels = c1
>a2.sinks.k1.channel = c1
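A side note on the config above: the sink is declared as k1
(a2.sinks = k1, a2.sinks.k1.channel = c1), but all of the HDFS
properties are set on sink_to_hdfs, which is never listed in a2.sinks.
With no type configured on k1, the sink fails validation and nothing
drains the memory channel, so the seq source fills it quickly; that
alone can produce exactly the "Unable to put event on required channel"
error above, independent of any Kerberos issue. A minimal consistent
sketch, assuming a single HDFS sink was intended (property values copied
from your mail):

  a2.sources = r1
  a2.sinks = k1
  a2.channels = c1

  # seq source, wired to the memory channel
  a2.sources.r1.type = seq
  a2.sources.r1.channels = c1

  # HDFS sink - all properties now on the declared name k1
  a2.sinks.k1.type = hdfs
  a2.sinks.k1.hdfs.fileType = DataStream
  a2.sinks.k1.hdfs.path = /flume/events
  a2.sinks.k1.hdfs.filePrefix = eventlog
  a2.sinks.k1.hdfs.fileSuffix = .log
  a2.sinks.k1.hdfs.batchSize = 100
  a2.sinks.k1.channel = c1

  # memory channel, unchanged
  a2.channels.c1.type = memory
  a2.channels.c1.capacity = 10000
  a2.channels.c1.transactionCapacity = 10000
  a2.channels.c1.byteCapacityBufferPercentage = 20
  a2.channels.c1.byteCapacity = 800000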
>
>David Novogrodsky
>david.novogrodsky@gmail.com
>http://www.linkedin.com/in/davidnovogrodsky
